Intelligent automation series part 4: Data analysis systems

This series explains the automation pyramid, a framework for process intelligence and improvement. Part 4 covers the fourth level: data analysis systems. Links to Parts 1 through 3 can be found at the bottom of this story.

By Alex Marcy, January 29, 2016

Data analysis tools are designed to take the raw data from a process control system and interpret it. When used properly, the information synthesized from a process control system can become extremely valuable. This article discusses some of the common analysis tools in use today and how to leverage them to understand process conditions in great detail.

Following the trend

The most common data analysis tool is a trend, providing a simple visualization of how values change over time. Trends can display data from instrumentation, process equipment, and other data entry systems such as quality control sample test results. Operators typically use trends to monitor process conditions during their shift, and process engineers use trends to perform a quick analysis of process upsets to narrow the time periods for further research.
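To make the idea concrete, a trend is ultimately just a time series plotted against the clock. The sketch below (assuming Python with matplotlib is available; the tank-level data is invented purely for illustration) shows the bare minimum a trend display does with historian data.

```python
# A minimal trend in code form: plot a process value over one shift.
# The tank-level values here are made up to show the
# "values over time" idea a trend captures.
from datetime import datetime, timedelta
import math
import matplotlib.pyplot as plt

start = datetime(2016, 1, 29, 8, 0)
times = [start + timedelta(minutes=5 * i) for i in range(96)]   # 8-hour shift
levels = [60 + 5 * math.sin(i / 8) for i in range(96)]          # fake tank level

plt.plot(times, levels)
plt.xlabel("Time")
plt.ylabel("TANK101 level (%)")
plt.title("Shift trend: tank level")
plt.show()
```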

However, trends do have limitations. While trends show how process conditions change over time, they do not provide context for the data. If maintenance wants to know how many times a pump has unexpectedly shut down during a shift, or how many minutes the pump was off during that period, collecting the information from a trend alone can be time-consuming. If a process engineer, looking at the same pump, wants to know the top three reasons the pump shut off, answering that question with only a trend requires still more effort.
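To illustrate how much bookkeeping hides behind those questions, here is a minimal sketch in Python. It assumes pump status has already been exported from a historian as regularly sampled (timestamp, running, reason) records; the one-minute sample interval and the shutdown reasons are invented for the example, not any particular historian's format.

```python
# Shift-end pump analysis: count shutdown events, total downtime,
# and tally the top reasons, from regularly sampled status records.
from collections import Counter
from datetime import datetime, timedelta

SAMPLE_MINUTES = 1  # assumed historian sampling interval

# Hypothetical one-shift (480-minute) export for a single pump
samples = [
    (datetime(2016, 1, 29, 8, 0) + timedelta(minutes=i), running, reason)
    for i, (running, reason) in enumerate(
        [(True, None)] * 150
        + [(False, "seal leak")] * 12
        + [(True, None)] * 200
        + [(False, "high vibration")] * 8
        + [(True, None)] * 95
        + [(False, "seal leak")] * 15
    )
]

shutdowns = 0
downtime_minutes = 0
reasons = Counter()
prev_running = True
for _, running, reason in samples:
    if not running:
        downtime_minutes += SAMPLE_MINUTES
        if prev_running:          # falling edge marks a new shutdown event
            shutdowns += 1
            reasons[reason] += 1
    prev_running = running

print(f"Unexpected shutdowns this shift: {shutdowns}")   # 3
print(f"Total downtime: {downtime_minutes} min")         # 35
print("Top reasons:", reasons.most_common(3))
```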

Some of the limitations of trends can be overcome using reports. Reports can include trends for a specific time period, as well as other components such as tabular data, pie charts or Pareto charts, and staff notes on process conditions or other issues arising during a shift.

Reports can add the context missing from trends. They can be generated for a specific period of time (per shift, daily, weekly, etc.) or when a specific event occurs, such as when a piece of equipment shuts down. Reports do require up-front planning and development time, so having a specific end result in mind is important for them to be as valuable as possible. A typical example of this up-front planning: the process engineer from the previous section wants to collect a handful of data points whenever a pump shuts off, to quickly assess the impact of the downtime on the rest of the process. Instead of collecting this data manually after each shutdown, the system could generate a report with the relevant data and email it out, saving time on both collecting and disseminating the information.
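A rough sketch of that event-driven report, in Python using only the standard library: the read_tag() helper, the tag names, and the email addresses are placeholders, not a particular historian's or mail system's API.

```python
import smtplib  # would be used at a real site to send the message
from email.message import EmailMessage
from datetime import datetime

def read_tag(tag_name):
    """Placeholder for a historian/OPC read; returns canned values here."""
    return {"TANK101.LEVEL": 62.4, "LINE1.FLOW": 0.0,
            "PUMP101.LAST_TRIP": "high vibration"}[tag_name]

def on_pump_shutdown():
    # Collect the handful of data points the engineer cares about
    snapshot = {tag: read_tag(tag) for tag in
                ("TANK101.LEVEL", "LINE1.FLOW", "PUMP101.LAST_TRIP")}
    msg = EmailMessage()
    msg["Subject"] = f"PUMP101 shutdown report {datetime.now():%Y-%m-%d %H:%M}"
    msg["From"] = "reports@example.com"
    msg["To"] = "process.engineer@example.com"
    msg.set_content("\n".join(f"{t}: {v}" for t, v in snapshot.items()))

    # At a real site this would go to the plant's mail relay, e.g.:
    # with smtplib.SMTP("mail.example.com") as server:
    #     server.send_message(msg)
    print(msg)  # printed here so the sketch runs standalone

on_pump_shutdown()  # in practice, triggered by the pump's run-status tag
```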

Head in the clouds

Data analysis systems are most useful when they remove most of the noise from the data, display it in a form that is easy to analyze, and help answer whatever questions might be posed. With advancements in web-based technology, data analysis systems are more flexible than ever, offering a wide variety of tools to analyze data in many different ways, available to the user at any time.

For trends and reports, this is done with parameterized reports, where a single report definition serves many different use cases, or with ad-hoc reports and trends that let the user drag and drop data points to quickly build views with selectable display formats, trend types, and real-time statistical analysis.
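The sketch below shows the parameterized idea in miniature: one report definition, in plain Python, that accepts the tag, time window, and statistics as parameters. The fetch_history() helper stands in for a real historian query and returns made-up values.

```python
from datetime import datetime, timedelta
from statistics import mean

def fetch_history(tag, start, end):
    """Placeholder returning hypothetical (timestamp, value) samples."""
    minutes = int((end - start).total_seconds() // 60)
    return [(start + timedelta(minutes=i), 50.0 + (i % 7)) for i in range(minutes)]

def run_report(tag, start, end, stats=("min", "max", "mean")):
    values = [v for _, v in fetch_history(tag, start, end)]
    funcs = {"min": min, "max": max, "mean": mean}
    return {name: round(funcs[name](values), 2) for name in stats}

# The same report definition serves a shift, a day, or a week:
end = datetime(2016, 1, 29, 16, 0)
print(run_report("LINE1.FLOW", end - timedelta(hours=8), end))
print(run_report("LINE1.FLOW", end - timedelta(days=1), end, stats=("mean",)))
```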

A plant manager might not need to analyze pump runtimes by the minute; instead, web-based dashboards can be created to track the key performance indicators (KPIs) that reflect the health of the process. Dashboards can also be created on demand, allowing nearly unlimited flexibility in how each person in the company accesses the data needed to make the best decisions for their area of responsibility.
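For example, a manager-facing dashboard tile might roll an entire shift into two or three numbers. The definitions and figures below are illustrative, assuming shift totals are already available from the sources discussed above.

```python
# Roll a shift's totals into dashboard-ready KPIs.
# Availability and quality percentages use common, simplified definitions.
def shift_kpis(runtime_min, scheduled_min, good_units, total_units):
    return {
        "availability_pct": round(100 * runtime_min / scheduled_min, 1),
        "quality_pct": round(100 * good_units / total_units, 1),
    }

print(shift_kpis(runtime_min=445, scheduled_min=480,
                 good_units=1180, total_units=1225))
# {'availability_pct': 92.7, 'quality_pct': 96.3}
```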

For more specific use cases, tools such as statistical process control (SPC) correlate process data with quality control test data, doing the heavy lifting of analyzing the data and determining ideal process conditions in order to identify and reduce the causes of rework or scrap material. Computerized maintenance management systems (CMMS) do something similar for equipment maintenance, providing asset management capabilities, reports on recent and upcoming maintenance, and condition-based scheduling driven by actual equipment runtimes and health.
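As a small taste of what SPC tooling automates, the sketch below computes individuals-chart control limits from a set of quality-control measurements and flags out-of-control samples. The 2.66 multiplier is the standard I-MR chart constant (3/d2 with d2 = 1.128 for a moving range of 2); the measurements are made up.

```python
from statistics import mean

measurements = [10.1, 10.3, 9.8, 10.0, 10.4, 9.9, 10.2, 12.0, 10.1, 9.7]

center = mean(measurements)
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
mr_bar = mean(moving_ranges)
ucl = center + 2.66 * mr_bar   # upper control limit
lcl = center - 2.66 * mr_bar   # lower control limit

print(f"Center line: {center:.2f}, UCL: {ucl:.2f}, LCL: {lcl:.2f}")
for i, x in enumerate(measurements):
    if not (lcl <= x <= ucl):
        print(f"Sample {i} out of control: {x}")   # flags the 12.0 reading
```

Real SPC packages layer run rules, subgrouping, and capability indices on top of this, but the control-limit arithmetic above is the core of the technique.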

As a preview of the next level of the pyramid, the idea of adding the right context for the person doing the analysis can be applied at any level of the organization. Using the same data available to operators via trends, to process engineers via reports and SPC, and to maintenance via CMMS, the cost per X barrels of oil processed by a refinery could be calculated in real time and displayed on a dashboard. This gives someone on the business side an easy way to understand operations at any given time.
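A back-of-the-envelope version of that KPI might look like the following, with 1,000 barrels standing in for the X above. All rates and the throughput figure are hypothetical; in practice the inputs would come from the historian, CMMS, and accounting systems already discussed.

```python
# Combine hourly cost rates with live throughput to get a
# cost-per-1,000-barrels figure suitable for a dashboard tile.
def cost_per_k_barrels(energy_usd_hr, labor_usd_hr, maint_usd_hr, barrels_per_hr):
    total_usd_hr = energy_usd_hr + labor_usd_hr + maint_usd_hr
    return total_usd_hr / (barrels_per_hr / 1000)

kpi = cost_per_k_barrels(energy_usd_hr=4200, labor_usd_hr=1800,
                         maint_usd_hr=650, barrels_per_hr=5400)
print(f"Current operating cost: ${kpi:,.2f} per 1,000 bbl")
```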

Segue to the fifth level

The important takeaway is that a process can generate a lot of data. To get the most value from that data, it needs to reach the right people, in the right format, so they can use it to make the right decisions. There is an ever-increasing number of ways to accomplish this. The first step should be to understand how the data will be most useful, and to use that as a starting point for building the right tools and improving them as time goes on.

Next part: Bridging the gap between the shop floor and the top floor by integrating process control systems with enterprise resource planning and other business systems.

– Alex Marcy, P.E., is the co-owner and president of Corso Systems, a system integration firm based in Chicago, Ill. Edited by Eric R. Eissler, Oil & Gas Engineering, eeissler@cfemedia.com

Original content can be found at Oil and Gas Engineering.