Understanding Process Elasticity

Lessons learned cooking french fries can teach much about process control strategy.

By Chuck Maher June 14, 2012

Editor’s note: While this story is specifically about french fries, the same concepts of control strategy apply to many types of processes, particularly batch processes. For example, the amount of time feedstocks have to spend in a reactor can vary according to all sorts of variables, and an effective control strategy has to be able to measure those variables and compensate automatically to ensure the correct outcome. The solutions in this case may help you find your own.

How many times have you stood in line at a McDonald’s, Burger King, Wendy’s, or the like, and listened to a loud cacophony of beeps, buzzes, and other assorted noises that seem to be calling attention to something urgent? They are, of course, alarms of various sorts. Quick-service restaurants (they seem to prefer this title over fast food) have an ingenious collection of microprocessor-based controllers on all their cooking appliances. They need them to fulfill their corporate goals of consistent food quality and ease of operation despite a high turnover in personnel, and to minimize the number of people needed to staff the restaurant and serve customers. These challenges sound like those encountered in many process plants these days.

McDonald’s boasts that the french fry (FF) you get in New York City is the same as the one you get in Beijing. This is, for the most part, true because much science and testing has gone into the process of cooking FFs. It may seem trivial at first: You take the raw fries out of a package and dump them in hot oil, wait a while, and take them out, right? It’s actually a complex process not all that different from many found in various phases of chemical manufacturing.

Consider these questions:

  • What is the best cooking temperature for the oil?
  • How long should you cook a batch?
  • How much does batch size affect the cooking time?
  • Are the potatoes frozen or slacked (defrosted)?

Most people know through either intuition or experience that things cook faster at a higher temperature than at a lower one, but the product can also burn or become inedible if you go too far.

A mentor once told me, “Cooking is all about boiling water.” This is true in general and particularly so with deep-fat frying. When a batch of FFs is lowered into a vat of hot oil, there is a rapid drop in temperature. Potatoes contain water, both on the surface and internally, that turns to steam, which makes the process similar to many types of endothermic reactions. It is the latent heat of vaporization absorbed from the hot oil that causes this rapid drop in temperature. If the controller can’t heat the oil quickly enough to compensate, the cooking rate slows down.

The fryer controller has two primary functions. One is to control the temperature of the oil, and the other is to time the cooking interval. Both functions are directly related. Legend has it that in the early 1960s when McDonald’s was becoming more widely established, company chefs undertook an extensive set of experiments in the test lab where small amounts of FFs were cooked at one-degree temperature intervals. These FFs were then rated as to their degree of “doneness.”

A properly cooked FF will be crisp on the outside, snapping when you bend it. Its center will be pulpy when squeezed and not dried out. The outside color will be a pleasing brown and, above all, it must taste good. There must not be any unpleasant flavor transmitted from the cooking oil. That is why fryers that cook fish should never be used for cooking FFs. The net result of all these tests was a so-called optimum “cook curve” of time versus temperature for FFs. (See Figure 1.) The lower the temperature, the longer the product had to be cooked, but if the oil temperature stayed too low for too long, then the product was unacceptable. It was undercooked, limp, greasy, and pale in color. 

Assume that after all this testing the resulting curve showed that the optimum oil temperature to start a batch cooking is 350 °F and that when cooking a small batch of FFs a cook time of 150 sec (2.5 min) produces a very good product. Problem solved? Now lower a basket with 5 lbs of frozen potatoes into the fryer and watch the upheaval.

The oil temperature will plummet in seconds to around 300 °F or lower. If you remove the FFs after 2.5 min, you will end up tossing them all in the garbage. You must lengthen the cook time based on the drop in temperature. The same applies to all sorts of chemical reactions where the rate is a function of temperature. This need is referred to by several names: elastic time, comp time, load compensation, and other similar terms.

In the early to mid-1970s, microprocessors were not yet practical for cooking appliance controllers. Electronic cooking controllers were being implemented using a combination of discrete logic (gates and counters) and monolithic analog components (transistors and operational amplifiers). The Fairchild 709 monolithic IC op amp came on the scene in 1965 and was quickly put to use in cooking controllers.

In 1978, U.S. Patent number 4,362,094, titled “Cooking Time Control System,” was issued. It made use of discrete logic and monolithic analog components only; there was no microcontroller and therefore no firmware. It was all hardwired. The patent describes a pulse train feeding a counter, with the pulse frequency changed based on the cooking rate.

The cooking rate is defined as the rate of heat flow into the product being cooked. This heat flow is known to be a function of the differential temperature between the hot oil and the product being cooked. This relationship is not a linear one. However, the patent states that this nonlinear region can be closely approximated by a constant over a narrow range of oil temperature. The patent specifies using a platinum RTD sensor for this application because of its linearity, accuracy, and stability.
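
For background (this equation does not appear in the patent, and the symbols are illustrative), convective heat flow of this kind is commonly modeled as proportional to that temperature difference:

    \dot{q} = h \, A \, (T_{\text{oil}} - T_{\text{product}})

Here \dot{q} is the heat flow, A is the wetted surface area of the product, and h is a film heat-transfer coefficient. In deep frying, h itself changes with temperature and with the steam generated at the product surface, which is one reason the overall relationship is nonlinear and why the constant approximation holds only over a narrow range.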

As the temperature drops, the frequency of the pulse train to the counter is lowered, effectively increasing the cooking time since it takes longer to count the predetermined number of pulses. While not perfect, this technique proved quite effective in improving the quality and consistency of the final product. This and similar techniques were used until the first microprocessors became available.
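
To make the idea concrete, here is a minimal C sketch of that hardwired scheme, simulated in software. The setpoint, the 0.01-per-degree scaling of the pulse rate, and the crude temperature-recovery model are illustrative assumptions, not values from the patent; the point is only that a colder vat means a slower pulse train, so the same terminal count takes longer to reach.

    #include <stdio.h>

    #define SETPOINT_F     350.0   /* ideal oil temperature, deg F                 */
    #define PULSES_TO_COOK 150     /* counter terminal count: 150 sec at setpoint  */

    static double pulse_rate_hz(double oil_temp_f)
    {
        /* 1 pulse/sec at the setpoint, slower as the oil cools.
           The 0.01-per-degree slope is purely illustrative. */
        double rate = 1.0 - 0.01 * (SETPOINT_F - oil_temp_f);
        return (rate > 0.1) ? rate : 0.1;   /* never let the timer stall */
    }

    int main(void)
    {
        double oil_temp_f  = 350.0;
        double elapsed_s   = 0.0;
        double pulse_accum = 0.0;

        while (pulse_accum < PULSES_TO_COOK) {
            elapsed_s   += 1.0;                        /* one-second simulation steps        */
            pulse_accum += pulse_rate_hz(oil_temp_f);  /* fewer pulses when the oil is cold  */

            /* Crude stand-in for the fryer: the oil sags early in the cook
               and slowly recovers as the burner catches up. */
            if (elapsed_s < 30.0)
                oil_temp_f = 350.0 - elapsed_s;
            else if (oil_temp_f < 350.0)
                oil_temp_f += 0.5;
        }

        printf("Cook ended after %.0f sec\n", elapsed_s);
        return 0;
    }

Held at the 350 °F setpoint, the counter fills in exactly 150 sec; with the simulated sag, the cook stretches noticeably past that, with no firmware involved at all.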

We should all be familiar with cook timers around the kitchen. You set the time interval for which you want a product to cook and start it running. It usually counts down so that you can see the time remaining, and when the count reaches zero, a bell or chime goes off to tell you that the time has elapsed.

Another clever engineer in this period before the advent of microcontrollers noticed that the cook curve closely resembled the resistance-versus-temperature curve of a negative-slope thermistor. He used a thermistor to measure the oil temperature and used its resistance, which increases as the temperature drops and vice versa, to vary the frequency of a pulse train operating a cook timer. In this way he was able to implement the elastic time feature very effectively. With the advent of microcontrollers, it became much easier to use lookup tables and similar techniques to implement the desired cook curve in firmware and to use different algorithms for implementing the elastic time feature.

A very common way to program a timer is to load an internal register (or counter) with a preset number and then periodically decrement or increment it and test the result to see if you have reached zero or the desired count. The secret to variable timing is the period of the pulse that is doing the counting up or down.

For example, if your pulse period is 1 sec and if you want to cook for 2.5 min, you would preset the counter to the binary equivalent of 150. If you doubled the period with the same preloaded count, then the cook would last for 5 min.
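
Here is a minimal C sketch of that countdown (the names and structure are illustrative, not any particular controller’s firmware):

    #include <stdio.h>

    /* Count a preset number of ticks down to zero.  With a 1-sec tick and a
       preset of 150, the cook lasts 2.5 min; doubling the tick period to
       2 sec with the same preset stretches the same cook to 5 min. */
    static double run_cook_timer(unsigned preset_counts, double tick_period_s)
    {
        unsigned count     = preset_counts;   /* load the register        */
        double   elapsed_s = 0.0;

        while (count > 0) {                   /* one pass per timer tick  */
            elapsed_s += tick_period_s;
            count--;
        }
        return elapsed_s;
    }

    int main(void)
    {
        printf("150 counts at 1.0 s/tick: %.0f sec\n", run_cook_timer(150, 1.0));
        printf("150 counts at 2.0 s/tick: %.0f sec\n", run_cook_timer(150, 2.0));
        return 0;
    }

The preset fixes the count; the tick period is what an elastic-time feature manipulates.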

Every microcontroller has a built-in time tick. This tick may be in microseconds, milliseconds, or in some cases even seconds. It is generally not a convenient round number, so it requires some type of scaling to get it into the units you need. The cook curve can be translated into a table format where the table is entered with a temperature and exited with a count that represents the cook time at that temperature. The lower the temperature, the larger the count, and vice versa.
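
A table-driven version might look like the following sketch. The temperatures and counts are placeholders standing in for a real cook curve, and the linear interpolation between entries is just one common choice.

    #include <stdio.h>

    /* Hypothetical cook-curve table: oil temperature (deg F) in, cook time
       in 1-sec timer counts out.  The values are placeholders only. */
    static const struct { double temp_f; unsigned counts; } cook_curve[] = {
        { 310.0, 240 },
        { 325.0, 200 },
        { 340.0, 165 },
        { 350.0, 150 },
    };
    #define CURVE_LEN (sizeof cook_curve / sizeof cook_curve[0])

    /* Enter with a temperature, exit with a count; interpolate linearly
       between entries and clamp at the table ends. */
    static unsigned counts_for_temp(double temp_f)
    {
        if (temp_f <= cook_curve[0].temp_f)
            return cook_curve[0].counts;
        if (temp_f >= cook_curve[CURVE_LEN - 1].temp_f)
            return cook_curve[CURVE_LEN - 1].counts;

        for (unsigned i = 1; i < CURVE_LEN; i++) {
            if (temp_f <= cook_curve[i].temp_f) {
                double span  = cook_curve[i].temp_f - cook_curve[i - 1].temp_f;
                double frac  = (temp_f - cook_curve[i - 1].temp_f) / span;
                double delta = (double)cook_curve[i].counts
                             - (double)cook_curve[i - 1].counts;
                return (unsigned)(cook_curve[i - 1].counts + frac * delta + 0.5);
            }
        }
        return cook_curve[CURVE_LEN - 1].counts;   /* not reached */
    }

    int main(void)
    {
        printf("Counts at 350 F: %u\n", counts_for_temp(350.0));
        printf("Counts at 318 F: %u\n", counts_for_temp(318.0));
        return 0;
    }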

A simpler way to implement elastic time is to realize that if you multiply the ideal cook temperature for a specific product by its ideal cook time, then the result is in effect the energy absorbed during an ideal cook of that product. If the actual oil temperature is measured at a constant interval and continually added to a register, the value in that register will be the actual integrated time/temperature. This number can be compared to that of the ideal cook, and when equal or slightly greater, the cook will be ended.
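
In symbols (the notation is mine, not the article’s), if the oil temperature is sampled every Δt seconds, the cook ends at the first sample N for which

    \sum_{k=1}^{N} T_{\text{oil}}(k \, \Delta t) \, \Delta t \;\ge\; T_{\text{ideal}} \times t_{\text{ideal}}

Strictly speaking, this temperature-time product is not an energy, but it serves as a consistent proxy for heat delivered because the ideal-cook target on the right is computed the same way.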

This approach loads a register with the ideal energy (ideal cook temperature multiplied by the ideal cook time) and then as the batch cooking progresses, it periodically samples and adds the actual temperature to a register, thus integrating the temperature as described above. The program then compares the contents of this register to that containing the ideal time/temperature, and when the ideal is exceeded, it signals the end of the process.
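
Here is a C sketch of that integrating approach, using a fixed one-second sample interval. The temperature-recovery profile in read_oil_temp_f is invented purely to exercise the logic.

    #include <stdio.h>

    #define T_IDEAL_F      350.0    /* ideal cook temperature, deg F       */
    #define T_IDEAL_COOK_S 150.0    /* ideal cook time at that temp, sec   */
    #define SAMPLE_S       1.0      /* temperature sample interval, sec    */

    /* Stand-in for reading the oil temperature probe: a quick sag after
       the frozen load goes in, followed by a slow recovery. */
    static double read_oil_temp_f(double t_s)
    {
        double temp = (t_s < 40.0) ? 350.0 - 1.25 * t_s        /* sag toward ~300 */
                                   : 300.0 + 0.4 * (t_s - 40.0);
        return (temp > 350.0) ? 350.0 : temp;
    }

    int main(void)
    {
        const double ideal_energy = T_IDEAL_F * T_IDEAL_COOK_S; /* deg F * sec */
        double accumulated = 0.0;   /* running integral of actual temperature  */
        double elapsed_s   = 0.0;

        while (accumulated < ideal_energy) {
            accumulated += read_oil_temp_f(elapsed_s) * SAMPLE_S;
            elapsed_s   += SAMPLE_S;
        }

        printf("Ideal cook:   %.0f sec\n", T_IDEAL_COOK_S);
        printf("Stretched to: %.0f sec\n", elapsed_s);
        return 0;
    }

With the oil held at 350 °F the loop ends at exactly 150 sec; with the sag shown, it runs somewhat longer, which is the “stretch” that Figure 2 illustrates.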

Figure 2 shows how this works. The crosshatched area at the beginning of the cook represents the ideal cook energy, and the unshaded area below the actual temperature curve is the additional energy (cooking time) needed in order for the actual energy to be equal to or slightly greater than that used during an ideal cook. Clearly the cook time has been “stretched.”

Interestingly, subsequent testing showed that the same cook curve developed for cooking FFs was equally effective when baking biscuits in an oven. There, the cook time needed to be stretched based on how many trays of uncooked biscuit dough were in the oven at the start of a cook. The same concept can apply to countless chemical processes and reactions.

Chuck Maher is an automation consultant and owner of PER Associates in Mustang, Okla.

https://www.embededdesignservices.net/