Extracting full value from operational data

Technologies such as big data and the Industrial Internet of Things (IIoT) offer oil and gas companies great opportunities as tools for real-time data analysis and performance improvement, provided the industry can conquer the accompanying data-integration challenges.

By Sidney Hill, August 20, 2015

Oil and gas industry analysts say emerging technologies such as cloud computing, big data analytics, and the IIoT offer excellent platforms for turning automation data into better operational performance. But they also point out that these new platforms cannot be used to full advantage until users figure out how to integrate data residing in multiple automation systems.

The myriad issues associated with data and systems integration were listed among the top factors preventing industrial companies from fully exploiting big data analytics in a 2014 survey conducted by GE and Accenture. The survey, which polled executives from 150 companies across eight industries, formed the basis for a report titled "Industrial Internet Insights for 2015."

A majority of executives responding to the survey expressed the belief that big data can dramatically alter the competitive landscape of their respective industries, and they appear ready to invest accordingly. In fact, more than 80% of these executives listed big data among their top three investment priorities.

Why the sudden fervent interest in big data within industrial companies? Because, according to the GE-Accenture report, "data created by industrial equipment . . . holds more potential business value than any other type, including data generated on the social Web and the consumer Internet."

Jeff Miers, a managing director in Accenture’s energy business, said that the data generated by automated equipment in oil and gas operations carries significant value because it’s tied directly to an organization’s profitability.

"Data produced by oilfield equipment is the principal source of information used to enhance productivity," Miers said. "In the current environment, marked by tremendous price pressure, there’s a greater emphasis on things that boost productivity and lower costs because they have a direct impact on the bottom line."

Given those factors, it would seem oil and gas executives would be especially eager to launch big data projects. Yet only 31% of the oil and gas executives responding to the GE-Accenture survey named big data as their top choice for a technology investment.

Miers believes that 31% figure reflects realism about the difficulties of implementing big data projects rather than a lack of belief in the technology’s potential to make oil and gas operations more efficient.

Technical and organizational challenges

In addition to problems related to data and systems integration, Miers pointed out, implementing big data or industrial Internet solutions requires overcoming technical challenges, such as network security, and organizational issues, such as getting individuals from various departments to agree on how data will be managed and shared.

Even with these issues at play, Miers said, "Over the past few months, we have begun to see a greater sense of urgency with regard to big data and IIoT projects within the oil and gas industry. More companies are launching pilot projects. As those pilots start to show positive results, the numbers we saw in the survey related to investments in these technologies will improve."

To date, most oil and gas data integration projects have emphasized monitoring how well equipment is performing within a given set of parameters. Projects like these can help companies enact preventive maintenance programs by developing schedules to replace parts before their presumed end of life, thus avoiding costly, unplanned downtime.

In many preventive maintenance programs, machine technicians receive alarms on desktop or mobile devices when a piece of equipment fails, allowing them to respond immediately and shorten equipment downtime. Big data and IIoT technologies are expected to help eliminate equipment downtime altogether by providing the means to develop predictive, as opposed to preventive, maintenance strategies.
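To make that distinction concrete, the sketch below contrasts a calendar-based preventive rule with a simple data-driven predictive check. It is illustrative only; the thresholds, service interval, and pump readings are hypothetical and not drawn from any vendor's product.

```python
# Illustrative contrast between preventive and predictive maintenance rules.
# All thresholds and readings below are hypothetical.
from datetime import datetime, timedelta

PREVENTIVE_INTERVAL = timedelta(days=90)   # assumed fixed service interval
VIBRATION_LIMIT = 7.1                      # mm/s, assumed alert threshold
TREND_WINDOW = 24                          # number of recent readings to examine

def preventive_due(last_service, now):
    """Calendar-based rule: service the pump when the schedule says so."""
    return now - last_service >= PREVENTIVE_INTERVAL

def predictive_alert(vibration_history):
    """Data-driven rule: flag a pump whose vibration is high or climbing steadily."""
    recent = vibration_history[-TREND_WINDOW:]
    if len(recent) < 2:
        return False
    rising = all(b >= a for a, b in zip(recent, recent[1:]))
    return max(recent) > VIBRATION_LIMIT or (rising and recent[-1] > 0.8 * VIBRATION_LIMIT)

# Hourly readings that creep upward trigger the predictive alert before failure.
readings = [3.0 + 0.15 * i for i in range(30)]
print(predictive_alert(readings))                                    # True
print(preventive_due(datetime(2015, 5, 1), datetime(2015, 8, 20)))   # True (111 days elapsed)
```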

Miers indicated that predictive strategies will become commonplace when producers learn to seamlessly blend data from their information technology (IT) and operations technology (OT) systems onto a single platform that allows for comprehensive, real-time data analysis.

In general, IT systems handle data generated in office environments. They include enterprise resource planning (ERP) and other business management tools, such as financial and human resources applications. In the oil and gas industry, OT systems are associated with field operations and the gathering of data at the wellhead, along the pipeline, or in a refinery.

Complete information on a single screen

Integrating data from these divergent sources should make it possible for users to view a single screen and get a clear picture of how the overall business is performing. It also should allow companies to pinpoint the specific parts of the operation most in need of improvement.
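As a rough illustration of that single-screen view, the sketch below joins OT sensor readings with IT asset records. Every table, column name, and threshold here is invented for the example; it is a minimal sketch, not a description of any particular product.

```python
# A minimal sketch of merging OT and IT data for one combined view.
# All asset IDs, columns, and values are hypothetical.
import pandas as pd

# OT side: latest wellhead sensor readings keyed by asset ID
ot = pd.DataFrame({
    "asset_id": ["PUMP-101", "PUMP-102", "PUMP-103"],
    "flow_bbl_per_day": [480, 150, 510],
    "vibration_mm_s": [3.2, 7.6, 2.9],
})

# IT side: business context for the same assets (from an ERP-style system)
it = pd.DataFrame({
    "asset_id": ["PUMP-101", "PUMP-102", "PUMP-103"],
    "field": ["Field A", "Field A", "Field B"],
    "maintenance_cost_ytd_usd": [12000, 41000, 9000],
})

# One joined view: assets that are both underperforming and expensive to keep running
dashboard = ot.merge(it, on="asset_id")
flagged = dashboard[(dashboard["flow_bbl_per_day"] < 200) &
                    (dashboard["maintenance_cost_ytd_usd"] > 30000)]
print(flagged)
```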

Miers called this "situational awareness," which he said is the first step on the road to predictive analytics. Apache Corporation, a Houston-based oil and gas exploration and production company, followed this road to develop an application that helps minimize lost production time by enabling operators to predict when pumps are likely to fail.

Apache executives started this project after uncovering research that showed the global oil and gas industry could increase oil production by 500,000 barrels per day, and earn an additional $19 billion per year, by improving pump performance by only 1%.

Larry O’Brien, research director with ARC Advisory Group, pointed out that oil and gas producers have been seeking to merge IT and OT data for years, but the advent of big data and IIoT has brought this topic to the forefront.

In O’Brien’s view, enterprise-wide data integration could make data generated by field-level automation networks even more valuable. "If you can turn data from all of your wells and plants around the world into useful information, it becomes easier to optimize your operations and actually use all of that automation and control technology with an eye toward increasing equipment reliability, improving business processes, and ultimately increasing profitability," O’Brien said.

He also said that oil and gas companies may never extract full value from their field-level data unless they find a better approach to system integration.

Historically, oil and gas producers have used custom interfaces, developed either in-house or by consultants, to pass data between field-level devices from different manufacturers. They also have tended to follow this strategy when attempting to pass data from the field to higher-level systems.

For the most part, these custom interfaces work well, but they generally are expensive to create and maintain. That expense limits the number of integration projects a company can undertake, which, in turn, limits the value that can be derived from field-level data.

Off-the-shelf solutions

Unsurprisingly, some technology vendors are actively addressing these issues with off-the-shelf solutions. Accenture is, in effect, one of those vendors via a partnership with GE that resulted in an application called the "Intelligent Pipeline Solution."

This package was built on GE’s cloud-based Predix application development platform.

GE engineers developed the pipeline management functionality while Accenture built the interfaces for pulling data from pipeline sensors and preparing it for display by a cloud-based analytics application. 

Houston-based Columbia Pipeline Group is using this application to enhance management of its pipelines in the Marcellus and Utica shale plays in southwestern Pennsylvania and Ohio. 

"We’re using standard ETL (extract, transform, and load) capabilities," Miers said of the method Accenture employed for moving data from the pipeline to the cloud. "We’re extracting data from legacy systems, transforming it for the new cloud-based environment, and then loading it into that environment."

Other vendors are relying on an updated version of the Open Platform Communications (OPC) standard to address OT-to-IT integration.

Development of the OPC standard is managed by an industry consortium known as the OPC Foundation, which counts more than 400 automation and control hardware and software vendors among its membership. The first version of OPC was developed in the mid-1990s with the goal of creating an environment where assembling automation networks composed of devices from multiple manufacturers would be as simple as connecting computers and printers to an office network.

"We were looking at Microsoft Windows and it was plug and play," said Russ Agrusa, CEO of HMI and SCADA software supplier ICONICS, who also is an OPC Foundation board member. "We wanted to do the same thing for manufacturers and ultimately the oil and gas industry."

Before OPC, connecting devices from different manufacturers required the use of custom drivers that acted as device translators. Initially, most users wrote their own drivers. Then HMI and SCADA software vendors, such as ICONICS and Kepware, started writing drivers for multiple devices and selling them as off-the-shelf connection tools. But as the number of automation devices proliferated, the HMI and SCADA vendors found themselves devoting more of their resources to writing and maintaining drivers than to developing their core software products.

After OPC was developed, vendors could embed the standard in their products and save themselves from having to write custom drivers, as long as their customers only wanted to pass raw data from machines to an HMI or SCADA system. Eventually, however, users wanted to do things like trigger alarms when machines required maintenance and pass historical data to HMIs for trend analysis. That prompted the need for additional standards.

A unified architecture

Now, the OPC Foundation has more than a half dozen standards for packaging and moving various types of data. It has placed those standards into a framework called OPC UA, for Unified Architecture, and software vendors have developed various strategies for incorporating them into products.
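For a sense of what consuming OPC UA data can look like in practice, the sketch below reads a single tag with the open-source python-opcua client. The server endpoint and node ID are placeholders; a real deployment would use its own address space and security settings.

```python
# Reading one tag over OPC UA with the open-source python-opcua client
# (pip install opcua). Endpoint address and node ID are placeholders.
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")   # hypothetical server endpoint
client.connect()
try:
    # Node IDs are server-specific; "ns=2;s=Wellhead1.Pressure" is illustrative only.
    node = client.get_node("ns=2;s=Wellhead1.Pressure")
    value = node.get_value()
    print("Wellhead pressure:", value)
finally:
    client.disconnect()
```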

Agrusa views OPC as the natural vehicle for moving automation data to the cloud, given the millions of devices that already are connected via that standard, although he noted the standard requires some alterations to be fully cloud compatible.

"OPC was designed for closed corporate networks," Agrusa said. "When you move it to the Internet, it proves to be a bit bulky and it doesn’t move data quite fast enough yet for real-time analytics."

According to Agrusa, the OPC Foundation is working to make the standard more lightweight, while adding a level of security that will allow data to travel securely across corporate firewalls.

Steve Sponseller, business development director for oil and gas with Kepware, argues that OPC is already becoming cloud friendly. He says Kepware is supporting customers’ efforts to implement big data and IIoT projects by moving field-level data that complies with the OPC standard to the Splunk Enterprise platform. Splunk Enterprise is a cloud-hosted platform that stores and manages large amounts of data, including unstructured data such as videos and large files from the geologic and geographic systems used in the oil and gas industry.
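The article does not describe the internals of Kepware's Splunk interface, but one common way to push a reading into Splunk Enterprise is its HTTP Event Collector (HEC), sketched below with placeholder host, token, and event fields.

```python
# Sending one tag reading to Splunk's HTTP Event Collector (HEC).
# Host, token, and event fields are placeholders, not Kepware's actual interface.
import json
import urllib.request

SPLUNK_HEC_URL = "https://splunk.example.invalid:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

event = {
    "sourcetype": "opc:tag",
    "event": {"tag": "Wellhead1.Pressure", "value": 1450.2, "units": "kPa"},
}

req = urllib.request.Request(
    SPLUNK_HEC_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Authorization": f"Splunk {HEC_TOKEN}",
             "Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```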

More interfaces coming

"Currently, we use Splunk interfaces to get data out of our server and into Industrial Internet platforms," Sponseller said. "As the Internet of Things evolves and more databases of this type become available, we expect to develop additional interfaces."

Tim Shea, senior analyst at ARC Advisory Group, believes oil and gas executives have an interest in exploring all options for solving the integration issues that are preventing widespread adoption of big data and IIoT technologies. These technologies can lower the cost of making operations more efficient, Shea said, which is important in an industry that’s constantly adjusting to price fluctuations.

"It will enable companies to fully embrace automation," he concluded, "so they make a decent profit even at lower prices."

– Sidney Hill is a graduate of the Medill School of Journalism at Northwestern University. He has been writing about the convergence of business and technology for more than 20 years. Edited by Eric R. Eissler, editor-in-chief, Oil & Gas Engineering, eeissler@cfemedia.com.
