Chevron Phillips addresses enterprise demand for real-time process data
Engineering disciplines, operations and maintenance among those clamoring for better insights via a data highway.
A rapidly growing appetite for time-series data among engineers and business analysts at Chevron Phillips Chemical Co. motivated the company's IT and operations functions to tackle the challenges of accessing and sharing real-time process data across the enterprise.
Headquartered in The Woodlands, TX, Chevron Phillips Chemical is a petrochemical company jointly owned by Chevron Corp. and Phillips 66, with 31 manufacturing and research centers around the world.
The company’s Project Hermes led to the deployment of a data highway – already implemented at eight Chevron Phillips sites – that eased data sharing across the process network, business network, and the company’s Microsoft Azure cloud computing platform.
“We call it Project Hermes because Hermes is the Greek deity that delivered messages from the celestial sphere to the folks on Earth. That is what we’re doing, except the information is shared by means of our cloud environment,” said Zachary Kaspar, a Chevron Phillips data engineer.
As at many companies, especially those formed through joint ventures or mergers-and-acquisitions activity, automation systems at Chevron Phillips span many different operating environments and a diverse mix of distributed control systems (DCS), data historians, and SCADA toolsets. Because information sharing proved so arduous in the past, little of it happened above the site level, and what did relied on Microsoft Excel and email.
A need to analyze
What got the information-sharing ball rolling was that in 2020, Chevron Phillips started working with Seattle, WA-based Seeq and its eponymously named software application, a solution for process manufacturing companies seeking insight from data. Seeq made it easy to connect to systems across the enterprise, Kaspar said. Furnished with these capabilities, folks at Chevron Phillips sites started building dashboards and using analytics as a tool.
Clearly, demand existed within the company for data to support those seeking to better understand or optimize real-time production processes. Two early use cases were automating environmental and sustainability reporting and improving predictive and preventive maintenance.
“We looked for a better way to share data, not just within the analytic platform, but throughout our own cloud environment,” Kaspar said. “We knew our Azure environment would be a one-stop shop to access that data but needed a mechanism to move data quickly and securely throughout our networks. That’s what using OPC with Matrikon Data Broker does.”
Matrikon Data Broker (MDB) makes OT data available enterprise-wide and simplifies the onboarding of new data sources. Edmonton, Canada-based Matrikon is a vendor-neutral supplier of OPC UA and OPC-based data interoperability solutions for sharing control automation data across the enterprise.
Other Matrikon tools used by Chevron Phillips Chemical include OPC UA Tunneller (UAT), which enables a phased migration approach to OPC UA adoption by seamlessly integrating OPC UA clients and servers with OPC Classic components. UAT also eliminates the use of DCOM between OPC Classic components, making their communications more secure and reliable.
“We’re targeting all systems that have time-series data in sites globally, regardless of make or model,” Kaspar said. “We had eight sites done before Thanksgiving last year. From a focus on deployment, we’re now beginning to leverage the data while completing our international deployments by June.”
Standards and solutions
As is well known, Open Platform Communications, or OPC, is a series of standards and specifications for industrial telecommunication. An industrial automation task force developed the original standard in 1996. In the 21st century, as digital transformation has taken hold in industrial enterprises around the world, the OPC Foundation developed the OPC Unified Architecture, a cross-platform, open-source IEC 62541 standard for data exchange from sensors to cloud applications. The first version was released in 2006. Recent versions of OPC UA have added publish/subscribe in addition to a client/server communications infrastructure.
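The two communication patterns mentioned above differ in who initiates the data flow: a client/server read is pulled on demand, while publish/subscribe pushes value changes to registered consumers. A minimal, library-free Python sketch illustrates the distinction (the class and tag names here are illustrative, not part of the OPC UA API):

```python
class TagServer:
    """Toy stand-in for a server holding process values."""
    def __init__(self):
        self._values = {}
        self._subscribers = []

    # Client/server style: the consumer asks for the value on demand.
    def read(self, tag):
        return self._values.get(tag)

    # Publish/subscribe style: consumers register once and are
    # notified automatically whenever a value changes.
    def subscribe(self, callback):
        self._subscribers.append(callback)

    def write(self, tag, value):
        self._values[tag] = value
        for notify in self._subscribers:
            notify(tag, value)


server = TagServer()
received = []
server.subscribe(lambda tag, value: received.append((tag, value)))

server.write("Reactor1.Temperature", 182.5)
print(server.read("Reactor1.Temperature"))  # client/server read: 182.5
print(received)  # pub/sub notification delivered on write
```

In a real OPC UA deployment the subscription machinery (sampling intervals, monitored items, network transport) is far richer, but the division of responsibility is the same.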
“Most of the tools we connect to still talk in OPC Classic,” Kaspar said. “To move forward with the Hermes project, we had to demonstrate that we could move the data securely and that we could manage the entire footprint. You have an application on every single node. The biggest gain is having a standardized tool set to transport time-series data from the shop floor to the cloud. MQTT translates OPC UA into our Azure environment.
“What gave us confidence in OPC UA was our partner Matrikon. Matrikon has demonstrated the expertise to solve the challenges we face. We’re standardizing our path upward with OPC UA. It’s been a big win for us.”
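Kaspar's point about MQTT carrying OPC UA data into Azure can be pictured as a simple flattening step: an OPC UA-style data change becomes an MQTT topic plus a small JSON payload. The topic layout and field names below are assumptions made for the sketch, not Chevron Phillips' or Matrikon's actual schema:

```python
import json
from datetime import datetime, timezone

def ua_update_to_mqtt(node_id, value, status="Good", timestamp=None):
    """Flatten an OPC UA-style data change into an MQTT topic/payload pair.

    Field names and the topic hierarchy are illustrative assumptions.
    """
    ts = timestamp or datetime.now(timezone.utc).isoformat()
    # OPC UA string node ids such as "ns=2;s=Reactor1.Temperature"
    # map naturally onto a slash-delimited MQTT topic hierarchy.
    tag = node_id.split(";s=", 1)[-1]
    topic = "plant/" + tag.replace(".", "/")
    payload = json.dumps({"value": value, "status": status, "timestamp": ts})
    return topic, payload

topic, payload = ua_update_to_mqtt("ns=2;s=Reactor1.Temperature", 182.5)
print(topic)  # plant/Reactor1/Temperature
print(payload)
```

A cloud-side consumer subscribed to `plant/#` could then route each reading into a time-series store without knowing which DCS or historian originally produced it.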
Besides needing to access and share real-time data, data consumers within Chevron Phillips need source data that is enhanced contextually. Across industry, most integrations today are narrowly conceived. A unified OT data layer depends on third-party data-source federation, as well as context preservation and enhancement, achieved primarily by information modeling and mapping. OPC standards and Matrikon technology support this evolution.
OPC UA defines an open, standards-based way to create and discover data models between OPC UA components. It does so without requiring code to be written for one UA component to ‘ingest’ the data from another third-party UA component. Matrikon Data Broker converts the potential value of the flexibility and robustness of the OPC UA standard into functionality end-users can use in real-world applications. MDB does this by giving end-users control over data modeling (via UA companion specifications) and OT data source mapping. This lets users create meaningful data views from their heterogeneous OT data sources, such as older components with no context (like Modbus-based units), proprietary formats, and new UA components, which often have fixed, vendor-defined UA models.
Such customized data views serve as meaningful OT data input for on-prem and cloud applications (like dashboards, analytics, and report generators), which is essential in phased migration projects where older components lack the data context needed by modern applications.
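The mapping step described above can be sketched in a few lines: a bare source reading gains asset and unit context from a lookup table. The table entries and source addresses are invented for illustration; in a real deployment the model fields would come from a UA companion specification and the mapping from site engineering data:

```python
# Illustrative mapping from contextless source tags to a unified view.
# Source addresses and model fields are assumptions for this sketch.
TAG_MAP = {
    "modbus://plc7/40001": {
        "asset": "Reactor1", "measure": "Temperature", "unit": "degC",
    },
    "pi://siteA/FIC-101.PV": {
        "asset": "Reactor1", "measure": "FeedFlow", "unit": "kg/h",
    },
}

def contextualize(source_tag, raw_value):
    """Attach asset/measurement context to a bare source reading."""
    ctx = TAG_MAP[source_tag]
    return {**ctx, "value": raw_value}

# A raw Modbus register value becomes a self-describing reading.
reading = contextualize("modbus://plc7/40001", 182.5)
print(reading)
```

Once every source is published through such a view, a dashboard or analytics tool can ask for "Reactor1 temperature" without caring whether the number originated in a Modbus register, a PI tag, or a native UA node.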
Implementation contingencies
To manage standard technology deployments at highly diverse locations, Chevron Phillips first created a deployment template used to tackle each site.
“The idea behind it was simple enough,” said Kaspar. “We knew we had to install Data Broker, we knew we had to install a component, and we knew we had to install on a given number of networks. Variability depended on what systems each site had in production. Further, integration to one instance of, for example, the PI Historian doesn’t mean the next instance of PI will be a mirror image. But when it came to having a plan to deploy and a process for doing the work, it was cookie cutter. We were able to iterate multiple sites at one time. We deployed eight sites in less than 12 weeks.
“Current focus is on stress and load testing. We need to meet the demand for real-time data not just today but over the next five to 10 years. Another priority is system stability. We must have processes to monitor what’s been built and have analytics within the process to help us identify when and where there is a possibility of something breaking,” Kaspar said.
Finally, Kaspar observed that information technology and operations technology (IT/OT) convergence will define the future of production environments. IT folks and OT folks are still learning to work together. Sometimes the biggest challenge is the different mindsets of the two groups, but the right technology can help create common ground.
“Folks from the IT space like to focus on leading-edge technology, to exploit IT for new value, whereas the OT mindset on the plant floor, where the data originates, focuses on system stability. That’s the biggest challenge,” Kaspar continued. “What’s needed is a bridge. That’s where Matrikon Data Broker helped. Matrikon defines MDB as a new category of software called Data Technology, or DT, which represents software that fulfills OT data-sharing needs across the enterprise by addressing IT/OT gap-related issues ‘under the hood,’ eliminating the traditional loggerheads IT and OT reach when working with applications that address only IT or OT needs but not both. It’s a step in the right direction.”