Data analysis: Matching information with the right tools

Using the proper data analysis tools leads to valuable information for facility operations.

By Alex Marcy, Corso Systems | July 31, 2016

When data from a process control system is combined with the right data analysis tools, it can become extremely valuable. There are several common analysis tools in use today that you can leverage to understand process conditions in great detail.

Following the trend

The most common data analysis tool is a trend, providing a simple visualization of how values change over time. Trends can display data from instrumentation, process equipment, and other data entry systems such as quality control sample test results. Operators typically use trends to monitor process conditions during their shift, and process engineers use trends to perform a quick analysis of process upsets to narrow the time periods for further research.
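As a minimal sketch of the idea, the following Python snippet plots a trend from historian data that has been exported to a CSV file; the file name and the "timestamp" and "flow_rate" columns are hypothetical placeholders for whatever your system provides.

```python
# Minimal trend sketch: visualize how one process value changes over
# time. Assumes historian data exported to a CSV with hypothetical
# "timestamp" and "flow_rate" columns.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])

plt.plot(data["timestamp"], data["flow_rate"])
plt.xlabel("Time")
plt.ylabel("Flow rate (gpm)")
plt.title("Pump discharge flow, day shift")
plt.show()
```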

However, trends do have limitations. While they are useful for seeing how process conditions change over time, they do not provide context for the data. If maintenance wants to track the number of times a pump has unexpectedly shut down during a shift, or how many minutes the pump was off during that period, collecting this information could be time-consuming. If a process engineer looking at the same pump wants to know the top three reasons why it shut off, even more time must be invested in analyzing the data.
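That kind of counting is straightforward to automate once the data is out of the historian. A minimal sketch, assuming a hypothetical pump-status export sampled once per minute with a "running" column (1 = running, 0 = stopped):

```python
# Count unexpected pump stops and total minutes off during a shift.
# Assumes a hypothetical CSV sampled once per minute with a "running"
# column (1 = running, 0 = stopped).
import pandas as pd

shift = pd.read_csv("pump_status.csv", parse_dates=["timestamp"])

# A stop event is a transition from running (1) to stopped (0).
stops = ((shift["running"].shift(1) == 1) & (shift["running"] == 0)).sum()

# At one sample per minute, each stopped sample is one minute off.
minutes_off = (shift["running"] == 0).sum()

print(f"Unexpected stops this shift: {stops}")
print(f"Minutes off this shift: {minutes_off}")
```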

Reports can address some of these limitations. A report can include trends for a specific time period, as well as other components such as tabular data, pie or Pareto charts, and staff notes on process conditions or other issues arising during a shift.

Reports add the context missing from trends. They can be generated for a specific period of time (per shift, daily, weekly, etc.) or when a specific event occurs, such as when a piece of equipment shuts down. Reports do require up-front planning and development time, so having a specific end result in mind is important for them to be as valuable as possible.

A typical example of this up-front planning would be the process engineer from the previous section collecting a handful of data points each time a pump shuts off to quickly assess the impact of the downtime on the rest of the process. Instead of the engineer gathering this data manually, the system could generate a report with the correct data and e-mail it out, saving time on collecting and disseminating the information.
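A minimal sketch of such an event-triggered report, with the data snapshot, addresses, and mail server as hypothetical placeholders for site-specific details:

```python
# Sketch of an event-triggered report: when the pump stops, gather a
# few data points and e-mail a summary. Addresses and the SMTP host
# are hypothetical placeholders.
import smtplib
from email.message import EmailMessage

def send_pump_stop_report(snapshot: dict) -> None:
    """E-mail a short report built from a dict of tag name -> value."""
    body = "\n".join(f"{tag}: {value}" for tag, value in snapshot.items())

    msg = EmailMessage()
    msg["Subject"] = "Pump P-101 unexpected stop report"
    msg["From"] = "historian@example.com"
    msg["To"] = "process.engineer@example.com"
    msg.set_content(body)

    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(msg)

# Hypothetical snapshot collected at the moment of the stop event.
send_pump_stop_report({
    "stop_time": "2016-07-31 14:03",
    "suction_pressure_psi": 12.4,
    "discharge_flow_gpm": 0.0,
    "downstream_tank_level_pct": 68.0,
})
```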

Head in the clouds

Data analysis systems are most useful when they remove most of the noise from the data, display it in the form that is easiest to analyze, and help answer whatever questions might be posed. With advancements in Web-based technology, data analysis systems are more flexible than ever, offering a wide variety of tools to analyze data in many different ways, available to the user at any time.

For trends and reports, this is done with parameterized reports, where one report template serves many different use cases, or with ad-hoc reports and trends that let the user drag and drop data points to quickly build views with selectable display formats, trend types, and real-time statistical analysis.
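At its simplest, a parameterized report is one function or query that takes the time window and tag list as inputs. A minimal sketch, assuming historian data is already loaded into a pandas DataFrame indexed by timestamp:

```python
# Parameterized-report sketch: one function serves many use cases by
# taking the time window and tag list as parameters. Assumes a
# DataFrame indexed by timestamp with one column per tag.
import pandas as pd

def summary_report(data: pd.DataFrame, tags: list,
                   start: str, end: str) -> pd.DataFrame:
    """Return min/mean/max for the requested tags and time window."""
    window = data.loc[start:end, tags]
    return window.agg(["min", "mean", "max"])

# Same report, different parameters per shift, per user, or per unit:
# summary_report(historian, ["flow_rate", "pressure"],
#                "2016-07-31 06:00", "2016-07-31 18:00")
```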

A plant manager, who might not need to analyze pump run-times minute by minute, can instead be given a Web-based dashboard to easily track the key performance indicators (KPIs) that reflect the health of the process. Dashboards can also be created on demand, allowing nearly unlimited flexibility in how each person in the company accesses the data needed to make the best decisions for his or her area of responsibility.

For more specific use cases, tools such as statistical process control (SPC) correlate process data with quality control test data, doing the heavy lifting of analysis to identify and reduce the causes of re-work or scrap material and to determine ideal process conditions. Computerized maintenance management systems (CMMS) do the same for equipment maintenance, providing asset management capabilities, reporting on recent and upcoming maintenance, and using actual equipment run-times and health data for condition-based scheduling.
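One common SPC building block is the individuals control chart, which flags samples falling outside 3-sigma control limits. A minimal sketch with hypothetical quality-sample values, estimating sigma from the average moving range (the standard individuals-chart method, with d2 = 1.128 for subgroups of two):

```python
# SPC sketch: an individuals control chart flags samples outside the
# 3-sigma control limits. Sigma is estimated from the average moving
# range divided by the d2 constant (1.128 for subgroups of two).
import pandas as pd

# Hypothetical quality-control sample results (e.g., % impurity).
samples = pd.Series([10.1, 10.3, 9.8, 10.0, 10.2, 9.9, 10.1, 12.5])

center = samples.mean()
sigma = samples.diff().abs().mean() / 1.128
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

out_of_control = samples[(samples > ucl) | (samples < lcl)]
print(f"Center: {center:.2f}  UCL: {ucl:.2f}  LCL: {lcl:.2f}")
print("Out-of-control samples:", out_of_control.to_dict())
```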

The idea of adding the right context for the person doing the analysis can be applied at any level of the organization. Using the same data available to operators via trends, to process engineers via reports and SPC, and to maintenance via CMMS, the cost per barrel of oil processed by a refinery can be calculated in real time and displayed on a dashboard, giving anyone an easy way to understand operations at any given time.
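The arithmetic behind such a KPI can be simple; what matters is pulling the inputs together automatically. A minimal sketch with hypothetical hourly figures:

```python
# Cost-per-barrel KPI sketch: combine cost inputs with throughput
# over a time window. All figures are hypothetical.
energy_cost = 1850.00       # $ for power and steam in the window
maintenance_cost = 320.00   # $ for maintenance labor and parts
labor_cost = 540.00         # $ for operations labor
barrels_processed = 2400.0  # barrels through the unit in the window

cost_per_barrel = (energy_cost + maintenance_cost + labor_cost) / barrels_processed
print(f"Cost per barrel (last hour): ${cost_per_barrel:.2f}")  # $1.13
```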

Segue to the fifth level

The important takeaway is that a process can generate a lot of data. To get the most value from it, the data needs to be delivered to the right people in the right format so they can use it to make the right decisions. There is an ever-increasing number of ways to accomplish this. The first step should be to understand how the data will be most useful, then use that as a starting point to build the right tools and improve them over time.

Next part: Bridging the gap between the shop floor and the top floor by integrating process control systems with enterprise resource planning and other business systems.

Alex Marcy, P.E., is the co-owner and president of Corso Systems, a system integration firm based in Chicago. Corso Systems is a CFE Media content partner.

Original content can be found at Oil and Gas Engineering.