A look at digital transformation using the cloud
Cloud architecture, when examined in its entirety, can be very complex. Focusing on its interface with operations technology makes matters clearer and more manageable.
Digital Transformation Insights
- Digital transformation (DX) and Industrial Internet of Things (IIoT) can help simplify cloud architectures, which comprise six layers. The layers go beyond information technology/operational technology (IT/OT) convergence and help create the foundation for a business enterprise. Much of the transformation happens at level 1.
- A good cloud platform helps users manage data along its lifecycle, supporting curation and utilization so that effective data analytics and machine learning (ML) can generate insights that improve manufacturing operations.
- Digital transformation and IIoT are part of a larger effort to use all the many data streams in manufacturing to gain a better understanding of how everything works. ML and artificial intelligence (AI) make the process easier, and the cloud makes it possible to store large amounts of data.
Digital transformation (DX) reorients how companies do business, applying digital technologies supported by cloud infrastructure to all areas of operation, including value-chain optimization, enterprise planning, and asset management. DX even extends to health, safety, and environmental management. DX and Industrial Internet of Things (IIoT) discussions include operations technology (OT) and information technology (IT) convergence, the cloud, and networking strategies. The expanse of DX and IIoT topics can be overwhelming, even to someone experienced on the OT side.
Comprehensive cloud architecture comprises six layers (Figure 1), spanning everything from an individual field instrument to enterprise-wide management networks. For someone on the OT side, the upper layers are far removed, and there is little occasion to interact with them. The discussion here therefore works from the bottom up, emphasizing three areas:
- The core objectives of the cloud shift and its practical benefits.
- What is necessary to connect existing production equipment and networks to the cloud.
- A real-world example where DX is making a major difference.
To help contextualize the analysis, consider a hypothetical company where level 0 is already working well. The field networks and automation host—distributed control system (DCS), programmable logic controller (PLC), or other automation devices—keep the process under control, and there is basic data retention using a process historian. This is the OT side of the operation. How does it connect to IT and the cloud? This is where the edge factors in.
Level 1, Edge: Bridge and unifier
The term “edge” is misleading because it is at the center of so much. The edge system (Figure 2) serves not only as a bridge between traditional OT systems and IT, but also between the local (on-premises) systems and the cloud. In addition to transporting local data to the cloud, edge configurations also might be engineered to support unstructured data, and even run time-sensitive logic and artificial intelligence (AI). This makes it a very strategic link in the larger architecture.
The edge system must have an effective data collection mechanism for the OT systems so it can move the data to the cloud without problems. This requires a combination of hardware and software, and the edge system must be sized according to the nature of the application, with consideration for factors such as data volumes, protocol support, and local processing requirements.
The primary role of an edge controller is to regulate the flow of information from the OT and IT floor to the data center or cloud (and vice versa), providing storage, buffering, and bulk upload of data wherever applicable. It also may provide intelligence to perform data filtering and execution of business logic—often using machine learning (ML) or transfer-learning algorithms—close to the plant floor.
The edge acquires OT data from process systems or historians using interfaces conforming to open industry standards like OPC and OPC UA. There also are native interfaces that support a host of other industry standard protocols such as Modbus, Foundation Fieldbus and Profibus.
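The store-and-forward role described above—buffering readings locally and uploading them to the cloud in bulk—can be sketched in a few lines. This is a minimal illustration, not a vendor's API; the `Reading` and `EdgeBuffer` names, the batch size, and the upload callback are all assumptions.

```python
from collections import deque
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Reading:
    tag: str          # OT tag name, e.g. an OPC UA node identifier
    value: float
    timestamp: float

class EdgeBuffer:
    """Minimal sketch of an edge gateway's store-and-forward logic:
    readings are buffered locally and uploaded to the cloud in batches,
    so a dropped connection does not lose data."""

    def __init__(self, upload: Callable[[List[Reading]], bool], batch_size: int = 100):
        self.upload = upload        # returns True on a successful cloud upload
        self.batch_size = batch_size
        self.buffer: deque = deque()

    def ingest(self, reading: Reading) -> None:
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        batch = list(self.buffer)
        # Keep the batch buffered if the upload fails, so it can be retried.
        if batch and self.upload(batch):
            self.buffer.clear()
```

The key design point is that the buffer is only cleared after the upload callback confirms success, which is what makes the edge resilient to intermittent connectivity.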
Standard security practices deploy the edge in a demilitarized zone (DMZ) configuration, which is a physical or logical subnetwork that contains and exposes an organization’s external-facing services. Other options may involve servers with data diodes. Edge gateways also can be configured and managed from an external network, like the cloud, based on organizational security policies.
Edge strategy in three tiers
Details of edge strategies vary among companies, their production processes, and plant locations. A good way to begin the design process is using a three-tier edge strategy (Figure 3):
- The “edge gateway” focuses on providing secure transport of data from the plant floor to the cloud, but with restrictive OT protocol support.
- The “light edge” provides all the functionalities of the first tier, but also connectivity across a broader range of OT protocols, along with buffering, filtering, payload transformations, and some IT data exchange.
- The “comprehensive edge” builds on the first two, while also providing a mechanism to run applications (including AI apps), and to provide application orchestration, device management, support for robotics, and remote engineering from the cloud.
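The filtering and payload-transformation duties of the light edge tier can be sketched as follows. The deadband value, the JSON field names, and the "edge-01" source identifier are illustrative assumptions, not a specific product's format.

```python
import json
from typing import Optional

def deadband_filter(last_sent: Optional[float], value: float, deadband: float = 0.5) -> bool:
    """Forward a value only if it has moved outside the deadband since the
    last transmitted value -- a common light-edge filtering step that cuts
    cloud traffic without losing meaningful process changes."""
    return last_sent is None or abs(value - last_sent) >= deadband

def to_cloud_payload(tag: str, value: float, timestamp: float) -> str:
    """Payload transformation: reshape a raw OT reading into the JSON
    structure a (hypothetical) cloud ingestion endpoint expects."""
    return json.dumps({"source": "edge-01", "tag": tag,
                       "value": round(value, 3), "ts": timestamp})
```

In practice the deadband would be tuned per tag, and the payload schema would follow whatever the chosen cloud platform's ingestion service defines.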
The comprehensive edge combines components of its collaborative information server with components for video and image analytics. This end-state vision of the edge supports ingestion of both structured data (from sensors, process data, etc.) and unstructured data (video, images, files, etc.). The comprehensive edge also ensures the edge can process and bridge all kinds of operations data (OT and IT) between the plant floor and the cloud.
The ability to execute AI applications that combine one or more of these data types can unlock new value through applications like virtual/smart workers, integrated remote operations, robotic applications and more. This third tier of edge strategy helps a facility take a step closer to realizing industrial autonomy. By applying distributed ML, it is possible to distribute the AI workload between the cloud and edge, enabling a new paradigm of smart manufacturing.
How the cloud benefits manufacturing
The cloud can host a wide variety of software applications while providing secure, bulk data handling and other functionality, so what can be done with it depends on the application. Some companies approach it as a blank slate and build all needed elements. Others prefer to use a third-party platform-as-a-service (PaaS) so they can have a quick start, along with access to sophisticated tools while they develop their own solutions. A third approach is subscribing to a fully managed cloud application service.
For those choosing the third approach, a cloud platform provides a full range of digital applications to deliver true DX capability, while greatly reducing the need for end-user support and investment at the IT level.
A cloud platform empowers users to manage data along its lifecycle using an established sequence of activities and processes. This supports data curation and utilization to enable effective analytics and ML to drive insights and innovation.
The cloud platform is organized into five layers (Figure 4), each representing a set of tools and services available to applications built on the platform. Common services (3-1 in Figure 4) include cybersecurity and identity management services that provide secure access to data, but only to valid users. The layered architecture of the platform maps to the key phases of data management: data enablement (3-2), data curation and processing (3-3), and data analytics (3-4). It also includes an application programming interface (API) (3-5) through which the applications consume the services provided by the platform.
How does it work to provide practical process improvements? This is where applications come into the picture. Applications are software programs built to solve specific business problems such as asset management, production optimization, or health and safety. They’re built to use the available process data by applying analytics, AI, ML, visualization and more.
Since all the applications are on a common platform, they can connect with each other and securely exchange or reuse data as needed. Once available on the platform, data can be reused by other applications, thereby eliminating data duplication throughout the lifecycle. Standard applications are configured for quick deployment.
With all the data and communication infrastructure available, the cloud platform is an ideal place to host optimization and process autonomy applications since they can access all necessary historical data and current process conditions, while providing brute number-crunching power.
Wastewater treatment optimization using DDMO integration
The number of applications designed to provide process autonomy and optimization via a cloud platform is growing, but given the naturally cautious attitude of process manufacturers, many are anxious to see working applications in actual operations. Yokogawa recently completed a proof of concept (PoC) for optimizing operations at a U.S. wastewater reclamation facility producing potable water. For this initiative, a data-driven modeling for optimization (DDMO) application was delivered. It is powered by the cloud platform and can use historical data to improve operations. It has been used in a complex and critical water treatment application at the Tapia Water Reclamation Facility of the Las Virgenes Municipal Water District (LVMWD), located in Los Angeles County.
Producing enough potable water to supply 20 million people in the state calls for a variety of methods, including recycling wastewater. This requires advanced water treatment (AWT) methods (Figure 5), including ultrafiltration (UF), reverse osmosis (RO), and ultraviolet advanced oxidation processes (UV AOP). The thought of recycling wastewater has caused concern among consumers, but the processes have proven safe and practical, provided the water can be treated to achieve the required log10 reduction values for viruses, Giardia, and Cryptosporidium.
One major challenge of the project was verifying the water treatment was effective while avoiding the cost of overtreatment. The DDMO software suite was applied to model and then predict setpoints to optimize operations and support operator decision making, while maintaining the target water quality at water treatment facilities. With advanced data analysis through cloud software and secure remote connectivity, the automation system can make operational adjustments to optimize the process. Specialized scanners that evaluate treatment are equipped with built-in data cleansing, curation, analysis, and modeling tools for continuous process optimization.
For this initiative, the project used the cloud platform (Figure 6) to minimize the required on-site activities and establish a secure data reference between the site and systems. The DDMO software used real-time operational data to derive control setpoints, which were sent back to operators at the Tapia Water Reclamation Facility.
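As a purely illustrative sketch of the data-driven idea (not the DDMO suite itself), a simple model fit to hypothetical historical dose-versus-quality data could suggest the lowest setpoint that still meets a treatment target, which is the essence of optimizing while avoiding overtreatment. All numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical historical operating data: UV dose setpoint vs. achieved
# log-reduction of pathogens. A real data-driven model would use many
# more variables and a more sophisticated model form.
dose = np.array([20., 30., 40., 50., 60.])           # mJ/cm^2
log_reduction = np.array([2.1, 3.0, 4.2, 5.1, 6.0])  # log10 units

# Fit a simple linear data-driven model: log_reduction ~ a*dose + b
a, b = np.polyfit(dose, log_reduction, 1)

def min_dose_for_target(target_lr: float, margin: float = 0.2) -> float:
    """Lowest dose predicted to meet the quality target plus a safety
    margin -- minimizing energy while staying compliant."""
    return (target_lr + margin - b) / a
```

The derived value would be presented to operators as a recommended setpoint, keeping a human in the loop, which matches how the PoC supported operator decision making rather than closing the loop automatically.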
After a multifaceted evaluation verified their effectiveness, the new techniques, coupled with the DDMO software suite, delivered operational efficiency improvements yielding more than a 10% reduction in power consumption while meeting all water quality standards. Impressed with these results, the WateReuse Association presented Yokogawa and its partners with a 2022 Transformational Innovation award.
Hand valve position monitoring
A fertilizer company used the cloud platform to monitor hand valve positions at one of its plants. Panel operators are often blind to manual valve positions and rely on field operator feedback from operator rounds. To address human errors causing incidents, process downtime, and production losses, a cloud-based monitoring solution covering hand valves was delivered.
The cloud platform provides:
- Connectivity to field-based IoT sensors using edge gateways
- Secure data flow, storage, and continuous monitoring
- Visualization of valve positions in dashboards, with alerts
- Easy deployment, maintenance, and scale-up.
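The lineup check behind such dashboard alerts can be sketched as a simple comparison of reported positions against the expected valve lineup. The valve IDs, position strings, and data structures here are hypothetical illustrations, not the delivered solution's design.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ValveReading:
    valve_id: str
    position: str  # "open" or "closed", from a field IoT position sensor

def check_lineup(expected: Dict[str, str], readings: List[ValveReading]) -> List[str]:
    """Compare reported hand-valve positions against the expected lineup
    and return alert messages for any mismatches -- the kind of check a
    cloud dashboard would run continuously."""
    alerts = []
    for r in readings:
        want = expected.get(r.valve_id)
        if want is not None and r.position != want:
            alerts.append(f"ALERT: {r.valve_id} is {r.position}, expected {want}")
    return alerts
```

In a deployed system the expected lineup would come from the operating procedure for the current plant state, and alerts would be pushed to the panel operator's dashboard.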
This solution limits the probability of mistakes, faulty operations, and environmental spills, leading to safer plant operation.
Asset performance management of geothermal power plants
Another example of an application is an asset performance management system installed at a geothermal power plant. This application is an add-on system applicable to any geothermal plant, and it connects to an existing plant control system, with additional points of measurement provided as needed.
The solution is deployed on the cloud platform. Connectivity to one or multiple production sites is established using standard OT protocols supported by an edge adaptor hosted on the cloud platform.
The cloud platform provides:
- Integration of plant data
- Secure data flow, storage, and continuous monitoring
- Calculation of geothermal key performance indicators (KPIs), with comparison to baselines
- Visualization of geothermal plant performance KPIs to help plant personnel identify performance degradation, and then act as required.
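As an illustrative sketch of the KPI-versus-baseline idea (not the actual application's KPI set), a geothermal metric such as specific steam consumption and its drift from baseline could be computed as follows; the function names and example figures are assumptions.

```python
def specific_steam_consumption(steam_flow_tph: float, power_mw: float) -> float:
    """Specific steam consumption (tonnes/hour of steam per MW generated),
    a common geothermal KPI; lower is better."""
    return steam_flow_tph / power_mw

def deviation_from_baseline(current: float, baseline: float) -> float:
    """Percent deviation of a KPI from its baseline. A sustained positive
    drift in specific steam consumption flags performance degradation,
    prompting plant personnel to investigate and act."""
    return 100.0 * (current - baseline) / baseline
```

Trending this deviation over time, rather than the raw KPI alone, is what lets personnel separate gradual degradation from normal operating variation.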
Digital transformation concepts at work
Industrial digital platforms, such as a cloud platform, provide a collection of common reusable services for data management, algorithmic execution, and visualization. These platforms enable the development of a variety of applications and integrated solutions with a consistent user experience, thereby accelerating and simplifying the value creation process. Business values unlocked by DX include improved effectiveness, efficiency, optimization, organization-wide collaboration, and progress toward industrial autonomy.
Original content can be found at Control Engineering.