Plant managers need financial and operational data
Plant floor workers need to understand the financial implications of their decisions so they can prevent problems before they occur.
Enterprise Manufacturing Intelligence (EMI) has emerged as a key growth application, with both new and established vendors vying to give manufacturing executives insight into their operations. Almost every EMI tool provides the capability not only to view the enterprise's key performance indicators (KPIs), but also to drill down to the underlying data from which each KPI was aggregated in the first place.
The power of these tools, as claimed by their suppliers, is that they allow an executive to get to the root of a problem. For instance, Plant 6 is having a productivity issue, so the executive uses the EMI dashboard to drill down and find the cause. This leads to discovering that the turret lathe on the whatchamacallit production line has been experiencing downtime due to lubrication failures.
Much of the emphasis on putting tools to diagnose plant-floor issues in the hands of executives addresses a problem largely of executives' own making. In an effort to continually "Lean out" the business, they have eliminated layers of supervision, so that in many cases there are but two or three levels between the plant floor and senior corporate management.
So, when there are production problems, they can't turn to the managers who used to be in place (consolidating information up from each of the manufacturing operations) and ask them what was wrong and why it hadn't been fixed. Given the speed at which business operates today, however, that's not necessarily a bad thing. In the past, many operational issues were not discovered until month-end close, and the investigation and reporting might have taken additional weeks. The corrective decision might have gotten back to the plant floor a month or more afterwards, by which time the problem may no longer have existed.
Clearly, being able to see the issue in near real time and provide corrective action within hours or days is preferable to the old way of doing things. In fact, it is the evolution of the EMI tools we have today that has enabled this transformation in business.
An example from pulp and paper
Here is a true, albeit historical, anecdote from the early days of process data historians, about the results one company achieved by turning the model on its head. It was published long ago, before many of today's practitioners had even entered the industry. This particular case study is from the pulp and paper industry, in which I was working at the time.
In fine paper, like the kind used for copying and printing, the basic ingredients are wood pulp and several additives that aid in giving the paper whiteness, opacity, and a finish that holds ink or photocopy printing. The common additives are cornstarch, which aids in finish; a white clay called kaolin, which aids in opacity; dyes, which adjust color; and titanium dioxide (TiO2), which aids in whiteness, brightness, and opacity. Fine paper has a number of specifications such as whiteness, opacity, and brightness, as well as ash content, which is important for disposal and recycling purposes.
In this example, whenever quality problems such as off-whiteness or off-brightness came up, the typical reaction of an operator was to adjust the TiO2, considered to be the "miracle drug" of papermaking since it could solve virtually any problem. Of all the ingredients I listed above (wood pulp, cornstarch, clay, dye, and TiO2), care to guess which is the most expensive? Clearly TiO2.
This led to management looking at cost figures at the end of every month and complaining about the high cost of TiO2 in the process. The word would trickle back down to operators to stop using so much TiO2, but as soon as a process problem would come up, they would always resort to the wonder drug to meet production demands.
At this point a new project was introduced into the plant. The project was the implementation of the first 32-bit CPU-based data historian on the market with a user-configurable report builder and graphics front-end. With the project in place, one of the first applications the project team tackled was the paper-making process cost reduction issue. The control system used by the operators displayed all of the setpoints used to control the papermaking process in the usual units such as pounds per hour, gallons per minute, or similar units.
Whenever a quality problem surfaced, the operators knew that increasing the TiO2 slurry flow by just a few liters per minute could solve it. The other ingredients could also be adjusted, but the magnitude of those changes was always greater, since the recipe was heavy on other materials and TiO2 was among the smallest quantities in it. Of course, this habit was the very source of the costs that management kept complaining about.

The project team took an unusual approach to the problem. Since the data historian sat between the control system and the business computer, it was able to combine information from both systems. The team extracted the cost of the various papermaking ingredients from the purchasing system and the flow rates from the control system, and created a hybrid display that showed usage not in classic process units but in dollars per ton of production. The operators thus got immediate feedback on the economic implications of the adjustments they were making. Because the plant, though unionized, had implemented a profit-sharing plan, the operators were well incentivized to help control costs.
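The arithmetic behind such a hybrid display is simple: multiply each ingredient's flow rate by its unit cost, sum, and divide by the production rate. The sketch below illustrates the idea; the ingredient names, flow rates, and prices are made up for illustration and do not come from the original case study.

```python
# Illustrative sketch of a dollars-per-ton display calculation.
# All unit costs and flow rates below are hypothetical example values,
# standing in for data pulled from a purchasing system and a control system.

# Unit costs from the purchasing system, in dollars per pound (made-up values).
UNIT_COST_PER_LB = {
    "pulp": 0.08,
    "starch": 0.15,
    "clay": 0.05,
    "dye": 2.50,
    "tio2": 1.20,
}

def cost_per_ton(flow_rates_lb_per_hr, production_tons_per_hr):
    """Combine control-system flow rates (lb/hr) with purchasing-system
    unit costs to express ingredient spend in dollars per ton produced."""
    total_dollars_per_hr = sum(
        flow * UNIT_COST_PER_LB[ingredient]
        for ingredient, flow in flow_rates_lb_per_hr.items()
    )
    return total_dollars_per_hr / production_tons_per_hr

# Example snapshot of current setpoints (hypothetical values).
flows = {"pulp": 40000, "starch": 1200, "clay": 3000, "dye": 15, "tio2": 400}
print(f"${cost_per_ton(flows, production_tons_per_hr=20):.2f} per ton")
```

A display like this makes the cost of "just a few more liters per minute" of the expensive additive immediately visible, which is the whole point of the hybrid view.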
The result of giving operators business information was a savings of well over $400,000 in today's dollars. By giving plant-floor staff insight into the business drivers, a group of $30-an-hour individuals generated significant savings.
The argument can be made that we need to do more of this: letting plant-floor people understand the financial implications of their decisions, while incentivizing them to make the right ones, so problems are prevented from ever occurring. Isn't it far better to give operators the information to avoid a problem like excessive TiO2 costs in the first place than to have a $2-million-a-year executive use a tool merely to identify it after the fact?
Dan Miklovic is a research analyst for LNS Research, which has worked in conjunction with MESA International to produce the "2013-2014 Manufacturing Metrics that Really Matter" eBook.