Distillation columns: Product composition control – process identification models

This technique can help reconnect information and actions that are separated by time.

11/12/2013


Distillation columns are among the most common unit operations for separation and purification in the process industries. Control of the composition (purity) of one or more of the product streams is invariably critical to economic operation of the column. For precise composition control, on-stream process analyzers are often employed to measure the composition of the critical product, for example, to monitor the amount of a key impurity that must be kept below a maximum specification limit in the product.

For light hydrocarbon service, such as an olefins plant or natural gas liquids distillation, a gas chromatograph (GC) is often the instrument of choice for product composition analysis. One GC can be designed and programmed to analyze several streams, thereby minimizing overall installation and maintenance costs. The one major drawback of GCs is that the analyzer results are not continuous. Each analysis is made via an injection of a sample of the stream to be analyzed into the instrument, and then the analysis typically requires anywhere from 8 to 15 minutes before single-point results are available. There may also be dead time and lag associated with the stream sampling system. The overall resulting dead time and lag, combined with the discontinuous nature of the analysis, create significant challenges when attempting to use the GC measurement for closed-loop control. In these situations, board operators will often attempt to use the analyzer readings in open-loop fashion, adjusting the reflux flow and/or reboiler heat medium flow by hand to maintain product compositions near targets.

However, modern control techniques have been developed to utilize analyzer measurements from GCs for closed-loop, distillation column product composition control. One such effective technique utilizes a combination of:

• Control of an intermediate variable that is inferential of composition (e.g., a tray temperature, pressure-compensated if needed).

• Use of a process identification model (PIM) to relate the intermediate control variable to the process analyzer measurements.

As an example, consider the depropanizer shown in the accompanying figure. We’ll assume that the key product quality variable is the iso-butane (i-C4) content of the propane (overhead) product, which is being measured by an on-line GC. Maintaining i-C4 content near (but below) its maximum specification limit maximizes propane recovery. The reflux flow is adjusted in cascade from a tray temperature controller, indirectly maintaining propane product composition. The operator is supposed to adjust the tray temperature setpoint based upon feedback from the analyzer.

In practice, the tray temperature control may not work very well for a number of reasons, the most common one being dead time and lag between a change in reflux flow and its impact on the temperature. However, utilizing advanced regulatory control (ARC) techniques, such as feedforward, this cascade can be made to work. The next step is to implement the PIM and close the loop between the analyzer and tray temperature controller.
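
To make the idea concrete, here is a minimal sketch (in Python) of how a feedforward term might be combined with the temperature controller output in such a cascade. The choice of column feed rate as the feedforward variable, the gain value, and all names are assumptions for illustration; they are not details of the column described here.

def reflux_setpoint(temp_controller_output, feed_rate, nominal_feed_rate, ff_gain=0.6):
    """Add a feed-rate feedforward correction to the temperature controller output.

    temp_controller_output : reflux flow setpoint requested by the tray
                             temperature PID (the cascade master)
    feed_rate              : current column feed rate
    nominal_feed_rate      : feed rate at which the loop was tuned
    ff_gain                : reflux change per unit of feed-rate change
                             (illustrative value; in practice determined
                             from plant tests)
    """
    # Compensate for feed disturbances before they upset the tray temperature,
    # so the feedback loop only has to correct the residual error.
    return temp_controller_output + ff_gain * (feed_rate - nominal_feed_rate)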

What does our PIM need to do in this situation? Think about it: when a new reading comes in from the analyzer, what is it telling us? Is it telling us the composition of the propane product going to storage right now? No. It's telling us the composition at the time the sample was injected into the analyzer, which could have been 30 or 40 minutes ago, depending upon how many streams are being analyzed, the time between individual analyses, and any sampling system delays. So, what can we do with this old information?

Our PIM needs to be a time-adjusted relationship between the tray temperature and the analyzer reading. In the case of this depropanizer, a simple dead time and lag transfer function was perfectly adequate to delay the temperature so that it lined up with the composition. Analysis of trend data indicated that a dead time of 34.5 minutes and lag time of 14 minutes provided a suitable match between the two variables.
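
As a rough illustration, the sketch below (Python) implements such a dead-time-plus-lag transfer function as a simple digital filter applied to the tray temperature. The 0.5-minute scan interval and the class and function names are assumptions for illustration only; the dead time and lag values are those from the depropanizer example.

from collections import deque

class DeadTimeLagFilter:
    """Delay a signal by a fixed dead time, then apply a first-order lag."""

    def __init__(self, dead_time_min, lag_min, scan_min=0.5):
        # FIFO buffer holding roughly dead_time_min worth of samples
        self._buffer = deque(maxlen=max(1, int(round(dead_time_min / scan_min))))
        # Discrete first-order lag smoothing factor for one scan interval
        self._alpha = scan_min / (lag_min + scan_min)
        self._lagged = None

    def update(self, value):
        """Call once per controller scan; returns the delayed-and-lagged value."""
        self._buffer.append(value)
        delayed = self._buffer[0]              # sample from ~dead_time_min ago
        if self._lagged is None:
            self._lagged = delayed             # initialize on the first scan
        self._lagged += self._alpha * (delayed - self._lagged)
        return self._lagged

# Parameters from the depropanizer example; the 0.5-minute scan is an assumption.
pim = DeadTimeLagFilter(dead_time_min=34.5, lag_min=14.0, scan_min=0.5)
# Each scan: delayed_temp = pim.update(current_tray_temperature)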

How then are the variables related and the loop closed? With a simple model (usually linear because we’re operating over a fairly narrow composition range) we can relate the delayed tray temperature to the product composition this way:

Temperature (delayed) = K * Composition + Bias

Each time a new analyzer reading comes in (and after the reading is validated as reasonable), the controller uses the equation in two steps: first, it predicts the product composition from the delayed actual temperature, then it compares that prediction with the analyzer result. Based on the error between the predicted and actual compositions, rules are used to calculate a new equation bias. The corrected equation is then inverted to calculate a new temperature controller setpoint, which is downloaded to the temperature controller. Note that control action is taken only when a new reading comes in from the GC.
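
A minimal sketch of that update logic is shown below (Python), assuming the linear model above with a fixed gain K and a simple fractional bias correction standing in for the bias-update rules. All names and the filter_gain value are illustrative, not from the application described here.

def update_on_new_analysis(analyzer_comp, delayed_temp, K, bias, target_comp,
                           filter_gain=0.5):
    """Run once per validated GC result.

    Model assumed (from the text):  delayed_temp = K * composition + bias
    filter_gain is a simple stand-in for the bias-update rules described
    above; it applies only a fraction of the apparent model error each cycle.
    """
    # Step 1: predict the product composition from the delayed tray temperature
    predicted_comp = (delayed_temp - bias) / K

    # Step 2: compare with the analyzer and shift the bias toward the measurement
    comp_error = analyzer_comp - predicted_comp
    bias -= filter_gain * K * comp_error

    # Invert the corrected model at the composition target to get the new
    # tray temperature setpoint, then download it to the temperature controller.
    new_temp_setpoint = K * target_comp + bias
    return new_temp_setpoint, bias

In practice, the gain K would come from step tests or historical data, and the bias-update rules would typically include dead bands and clamps to reject suspect analyzer readings and limit setpoint moves.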

In summary, this technique works well when there is an intermediate control variable, such as a tray temperature (inferential of composition), that can be correlated in time with the analyzer reading. Because feedback from the analyzer is model-based, the long dead time and lag do not carry the inherent shortcomings they would under conventional PID feedback control. This technique has been applied quite effectively on many different types of distillation columns where on-line composition readings are available from GCs.

This post was written by Jim Ford. Jim is a process control consultant at MAVERICK Technologies, a leading automation solutions provider offering industrial automation, strategic manufacturing, and enterprise integration services for the process industries. MAVERICK delivers expertise and consulting in a wide variety of areas including industrial automation controls, distributed control systems, manufacturing execution systems, operational strategy, business process optimization and more.


