Harnessing the power of predictive analytics

The oil and gas industry has plenty of data; now it needs to put it to better use.

By Karen Field April 2, 2016

In its June 2015 report, "Unlocking the potential of the Internet of Things," the McKinsey Global Institute concluded that most Internet of Things (IoT) data today go unused. To illustrate the point, the authors cited an oil rig with 30,000 sensors on which only 1% of the data are ever examined.

According to Cisco’s 2015 report "A New Reality for Oil and Gas," a typical offshore oil platform generates as much as 2 terabytes of data each day. Only a small slice of that data, mostly what relates to detecting and controlling out-of-normal conditions, is actually used.

It is ironic that the oil and gas industry was called out in the McKinsey report, given that 35 years ago it led all industries through an earlier big data challenge: the transition to 3-D seismic data analysis.

Back to the future

Thirty years ago, most oil and gas exploration work was done using 2-D seismic data. "A geoscientist would come in with colored pencils and draw and scribble on them. Interpretation was a manual exercise. There was so much paper it would take a couple of guys to cart it away when they were done," said Mark Lochmann, a consultant with 40 years of information technology experience in the oil and gas industry.

Keith R. Holdaway remembers the day, too. A former geophysicist, he is a principal industry consultant and principal solutions architect at SAS working with innovative oil and gas technologies.

"It was a slow plod. Even though we were not close to the terabytes of data we have today, we didn’t have the hardware or the full scope of advanced analytics to handle it more efficiently," Holdaway said. "It was all empirical interpretation based on experience."

What spurred the development of 3-D seismic data was that companies had a huge motivation to throw whatever money was necessary at finding a better way to discover reserves. In 1978, the Financial Accounting Standards Board announced a financial reporting program, FASB No. 19, termed Reserve Recognition Accounting, which allowed a company’s reserves to be recognized as an asset.

When the industry moved to 3-D modeling of the Earth’s subsurface, the volume of data expanded exponentially. Lochmann explained that companies needed to have technical software to understand these huge sets of data (called "blobs"). "So the industry invested heavily in the tools and technology, not only from a corporate standpoint in terms of archiving, storing, and supporting it, but also by working with the vendor community in learning how to analyze that data, including the visualization piece." 

Big data is not the same problem today

One reason the industry cannot simply replicate what it did in the past is that big data today is far more complex. Drilling and production require real-time decision making and, consequently, real-time big data, which is fundamentally different from blobs. There are also data integrity and aggregation challenges, as well as internal and cultural issues, that pose barriers.

Although the volumes of seismic data generated are huge, the data are of limited types. Geophysicists analyze P-waves, S-waves, and full azimuthal and anisotropic rock properties in the seismic profiles. Wavelet analyses can be convoluted, as noise is ever-present in seismic data.

That’s not the case in drilling and production. Instead of a single process to acquire seismic data, several thousand, maybe even 100,000, sensors in a field are collecting potentially hundreds of different data types. And the velocity of the data is much faster.

"In the past, we had a big data dump every three to four weeks, now it’s coming through in minutes or even seconds," said Holdaway.

Lochmann likens the gap between big data of the past and big data of the present to a great French writer who comes to the U.S. to lecture but doesn’t speak English. "It’s two totally different worlds," he said.

Another difference is that Wall Street isn’t the big driver anymore, although Holdaway said that low oil prices might force changes like those he has seen in the past. "In the early 1980s, when low oil prices forced massive layoffs, companies realized they had to work smarter, and so they adopted new technology even faster."

Unfortunately, accidents such as the Deepwater Horizon oil spill have also placed emphasis on big data. "What the Macondo well did was demonstrate that, perhaps, relying on humans to process all of this data and come to a decision might not be the safest way to operate," said Lochmann.

The sweet spots

"There are two things that big data brings to the table that your conventional technologies don’t," said Dave Lafferty, an oil and gas industry consultant who was formerly with the BP chief technology office. "Number one, you have the ability to put this stuff into context. If you look at a SCADA system, it plots a variable against time. If I get real fancy, I might cross-plot pressure and temperature."

What big data can do is put an event, such as a well being shut in, into context. Lafferty continued, "Did you change the choke, for example? Big data allows you to look at all these other factors and take a two-dimensional analysis and make it multidimensional. Ultimately, it helps you answer the question ‘What is normal?’ and points you in a direction on what to do next."
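As a rough illustration of that multidimensional view, the short Python sketch below fits a model of "normal" across several related channels at once instead of trending each tag against time. The tag names, readings, and parameters are hypothetical, and the isolation-forest approach is just one of several techniques that could play this role.

```python
# A minimal sketch of multidimensional anomaly screening (hypothetical tags/values).
import pandas as pd
from sklearn.ensemble import IsolationForest

# Assume a table of time-stamped well readings, one row per sample.
readings = pd.DataFrame({
    "choke_position_pct": [42, 41, 43, 42, 80, 42],
    "tubing_pressure_psi": [2150, 2140, 2160, 2155, 1700, 2150],
    "wellhead_temp_degC": [71, 70, 72, 71, 55, 71],
    "flow_rate_bbl_d": [950, 945, 955, 948, 120, 949],
})

# Learn what "normal" looks like across all channels together,
# then score each sample; -1 marks a multivariate outlier.
model = IsolationForest(contamination=0.2, random_state=0)
readings["anomaly"] = model.fit_predict(readings)

print(readings[readings["anomaly"] == -1])  # e.g., the shut-in-like row stands out
```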

There are hundreds of examples of successful applications of big data in oil and gas today, many with remarkable results. Lochmann points to a Chevron big data project in offshore Nigeria. "They reduced engineers’ data-related nonproductive time by 98%. That’s like handing themselves additional engineers to work on value-added activities," he said.

Lafferty points to the Prudhoe Bay project in Alaska. Here, BP applied predictive analytics for equipment health to all its major rotating equipment. "They struggled a bit with the data and quality issues, but they went in focused, solved the big data problem, applied the analytics, and put it into actionable form. They’re saving $10 million/year in unplanned shutdowns now," he said.

A significant opportunity for big data is in well integrity applications, particularly as many wells age. It presents a tricky, multidimensional data challenge. "The cement gets older, there is corrosion of the casing, all kinds of things. And it’s not really affordable to send people out to inspect several hundred wells. Old stripper wells, which can stop working due to gas interference, can also benefit," Lafferty said.

By continuously monitoring the cement barrier and acoustic logs, companies can ensure that fluids such as water and gas in one zone do not mix with oil in another.
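One way such continuous screening could be implemented is sketched below, assuming periodic cement-bond-log amplitudes are available for each casing interval; the function name, review threshold, and sample values are purely illustrative.

```python
# A minimal sketch of well-integrity screening on bond-log amplitude trends.
# Assumption: higher bond-log amplitude is read here as poorer cement bonding,
# so a sustained upward drift flags the interval for closer inspection.
import pandas as pd

def flag_integrity_risk(bond_amplitude: pd.Series, window: int = 5,
                        threshold: float = 15.0) -> bool:
    """Return True if the rolling average amplitude exceeds a review threshold."""
    rolling = bond_amplitude.rolling(window=window, min_periods=window).mean()
    return bool((rolling > threshold).any())

# Example: quarterly amplitudes (mV) for one casing interval of an aging well.
history = pd.Series([8, 9, 9, 11, 14, 18, 22, 25])
if flag_integrity_risk(history):
    print("Schedule a well-integrity review for this interval.")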

Steam traps, which act as relief valves when the pressure gets too high, are another example. "You can lose a lot of energy when these steam traps get stuck open. Monitoring a steam trap is one of those things like doing the dishes: you don’t like it, but it has to be done," said Lafferty.

Using an acoustic sensor that hears the steam escaping, along with predictive analytics, companies now know, based on what a particular valve is doing, whether it needs attention.
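A simple version of that logic might look like the sketch below, which assumes a healthy trap cycles between quiet and loud while a stuck-open trap leaks steam continuously; the sensor values, thresholds, and function name are hypothetical.

```python
# A minimal sketch of acoustic steam-trap screening (illustrative thresholds/data).
import numpy as np

def trap_needs_attention(acoustic_windows: np.ndarray, rms_threshold: float = 0.6,
                         duty_cycle_limit: float = 0.9) -> bool:
    """Flag a trap whose acoustic level stays above threshold almost all the time."""
    rms_per_window = np.sqrt(np.mean(np.square(acoustic_windows), axis=1))
    fraction_loud = np.mean(rms_per_window > rms_threshold)
    return fraction_loud > duty_cycle_limit

# Example: 60 one-second windows of normalized ultrasonic sensor output.
windows = np.random.default_rng(0).uniform(0.7, 1.0, size=(60, 100))  # continuously loud
print(trap_needs_attention(windows))  # True -> likely stuck open, worth a field check
```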

Implementation obstacles

Lafferty said that a big disconnect today is that many vendors in the industry want to talk about their big data technology or analytics, but they don’t want to talk about the data quality or sensor health issues, which are paramount. "I think it’s because we tend to treat all this stuff as silos, rather than as systems," he said.

Lafferty likens it to a package distribution center. "The warehouse guy doesn’t know where all the packages come from, but he knows where to stack them. The shipping guy understands the loading dock and where the package is going, but he doesn’t understand the contents of the box. If you treat a process that way, there is no accountability. All the different teams need to work together."

Holdaway agrees that culture is an impediment. "I think the industry is having a hard time and has always had a hard time. One of the main problems in the oil and gas industry is that it digs its heels in, trying to retain the old, traditional ways of working. Until you get that crew change and get people such as myself, baby boomers, out of the industry, there is going to remain too much dependency on that siloed engineering data," he said.

He also asserted that the operations and IT departments are still not converging as quickly as they should internally. "In some companies, IT is still like the curator of the data, and they do not work very well internally with the engineering department. Unless you have a strategy that integrates them, you won’t get very far with a big data project."

The need to combine data streams that were never meant to be combined can be a nightmare. But Lochmann said there’s no point in trying to aggregate all that data by moving it into a single database. That’s especially true in drilling and production, where there can be huge numbers of vendors, each with their own data structures and assumptions.

He advocates handling data by federation, which means keeping all data in its original database, rather than trying to move it all into one and define it perfectly. "What you do essentially is create a workflow and define the data you need based on that workflow," said Lochmann. "Then you create a map that tells you which database that data lives in, access it when it’s needed, and deliver it to where it’s needed."

This is also a way to address cross-disciplinary work processes. 
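A bare-bones sketch of the federation pattern Lochmann describes might look like the following, where the data stays in the systems it came from and a per-workflow map records where each item lives and how to fetch it on demand. The class, source names, and fetchers are hypothetical examples, not any particular vendor product.

```python
# A minimal sketch of workflow-driven data federation (hypothetical sources).
from typing import Callable, Dict

class FederatedCatalog:
    def __init__(self):
        self._sources: Dict[str, Callable[[str], object]] = {}
        self._map: Dict[str, str] = {}  # data item -> name of the source that owns it

    def register_source(self, name: str, fetcher: Callable[[str], object]) -> None:
        self._sources[name] = fetcher

    def map_item(self, item: str, source: str) -> None:
        self._map[item] = source

    def fetch(self, item: str):
        """Pull the item from whichever system owns it, only when it is needed."""
        return self._sources[self._map[item]](item)

catalog = FederatedCatalog()
catalog.register_source("drilling_historian", lambda item: f"<{item} from historian>")
catalog.register_source("vendor_lims", lambda item: f"<{item} from LIMS>")
catalog.map_item("hookload", "drilling_historian")
catalog.map_item("mud_assay", "vendor_lims")

print(catalog.fetch("hookload"))   # resolved from the drilling historian
print(catalog.fetch("mud_assay"))  # resolved from the vendor LIMS
```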

Path to success

The most successful big data projects Lafferty has seen start with a strong business case that clearly defines the objectives. "What you want to do is boil the stuff down into actionable items. Rather than simply taking vibration data, you want to collect the data that helps answer a question such as ‘We want to know one month before this bearing is going to fail.’ Then work backwards to develop the data that will give you that answer."
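Working backwards from that question might look something like the sketch below: label historical readings by whether the bearing failed within 30 days, derive a few condition features, and fit a classifier. The features, labels, and data are all hypothetical stand-ins, not the method used on any particular project.

```python
# A minimal sketch of a "fails within 30 days" predictor (synthetic data for illustration).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Hypothetical feature table: RMS vibration, peak amplitude, bearing temperature.
features = rng.normal(size=(200, 3))
# Hypothetical label: 1 if the bearing failed within 30 days of the reading, else 0.
fails_within_30d = (features[:, 0] + 0.5 * features[:, 2]
                    + rng.normal(scale=0.5, size=200) > 1.2).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, fails_within_30d)

# New reading -> estimated probability the bearing fails within the next month.
new_reading = np.array([[1.8, 0.2, 1.1]])
print(model.predict_proba(new_reading)[0, 1])
```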

The teams working on successful big data projects also ensure that they can trust the data they are collecting by building checks and balances into the process. "Normalization of data is really basic stuff, but it is amazing how easy it is to overlook something like whether the temperature data is in Fahrenheit or Celsius," said Lafferty. "In fact, the Mars Climate Orbiter was lost in 1999 because of that very problem with data."
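A basic normalization check along those lines, with a hypothetical record format and function name, could be as simple as the sketch below: every incoming temperature is converted to one agreed unit before it enters the analytics store.

```python
# A minimal sketch of unit normalization at ingest (hypothetical record format).
def to_celsius(value: float, unit: str) -> float:
    """Convert a temperature reading to Celsius, rejecting unknown units outright."""
    unit = unit.strip().upper()
    if unit in ("C", "CELSIUS"):
        return value
    if unit in ("F", "FAHRENHEIT"):
        return (value - 32.0) * 5.0 / 9.0
    raise ValueError(f"Unknown temperature unit: {unit!r}")

readings = [
    {"tag": "wellhead_temp", "value": 160.0, "unit": "F"},
    {"tag": "wellhead_temp", "value": 71.0, "unit": "C"},
]
normalized = [to_celsius(r["value"], r["unit"]) for r in readings]
print(normalized)  # roughly [71.1, 71.0]: the same physical condition, now comparable
```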

Karen Field, a former mechanical design engineer, has more than two decades of experience covering the electronics and automation industries.

Original content can be found at Oil and Gas Engineering.