An overview of industrial IoT, from edge to cloud

Next generation distributed I/O brings users one step closer to seamless connectivity

By Josh Eastburn April 7, 2020

By now, most anyone working in a role involving industrial automation has heard about digital transformation, the Internet of Things (IoT), and Industrial IoT (IIoT). These initiatives involve ever smarter devices installed progressively closer to the “edge,” perhaps connected to an internet “cloud,” or even connected through something called the “fog.” Even if we consolidate these terms under the umbrella of IIoT, for most folks a simple question remains: what is the goal of the IIoT?

Simply put, end users would like the IIoT to create a cohesive system of devices and applications able to share data seamlessly across machines, sites, and the enterprise to help them optimize production and discover new cost-saving opportunities.

This has always been a goal of industrial automation, but traditional operational technology (OT) architectures scale poorly, are prohibitively priced, and demand complex configuration and support. So what is changing?

Much as consumer hardware and software technologies have shifted to improve ease-of-use and connectivity, industrial products and methods are following the same trend by adopting information technology (IT) capabilities. This article discusses how a more distributed global architecture is enabling connectivity from the field to the cloud for sensors and actuators, and for the input/output (I/O) systems and controllers linked to them.

Up and down the architecture

Classical industrial automation architectures generally address data processing from a hierarchical standpoint. One good feature of this hierarchy is the clarity it provides with regard to where data can originate, be stored, undergo processing, and be delivered. However, transporting data and processing it in context is often quite difficult because so many layers of equipment are required to connect devices and applications.

The lowest level of an automation architecture is generally considered to be the physical devices residing on process and machinery equipment: sensors, valve actuators, motor starters and so on. These are connected to the I/O points of control system programmable logic controllers (PLCs) and human-machine interfaces (HMIs). Both PLCs and HMIs are well suited for local control and visualization, but less useful for advanced calculations and processing. Fortunately, using industrial communications protocols, they can send data to upstream supervisory control and data acquisition (SCADA) systems where it might be historized and made available to corporate level analytical software. Sharing data within multi-vendor systems, however, often requires additional middleware such as an OPC server.

More advanced software, such as site-level manufacturing execution systems (MES) and enterprise resource planning (ERP) systems, also resides at higher levels of the architecture, hosted on PCs or servers on site or in the cloud, where the cloud is defined as providing large-scale, internet-based, shared computing and storage. Raw information generally flows up to higher levels to be analyzed and used to optimize operations.

Developments over the past decade are significantly altering this traditional hierarchy, flattening and simplifying it to a great extent.

Spanning edge, fog, and cloud

Computing capability and networking bandwidth used to be far less available at the lower levels of the architecture. Each step up the hierarchy, from a basic hardwired sensor to cloud computing systems, was therefore necessary to gain access to greater computing resources and networking capability (Figure 1).

Figure 2: Modern edge devices, such as the Opto 22 groov RIO, flatten and simplify the architecture required to connect field I/O signals to business and control applications. Courtesy: Opto 22

Many other factors besides advancing technology are driving this shift to a flatter architecture. The most straightforward motivation is to balance computing and networking demand between the edge and higher-level systems. Edge computing offloads central processing, preserves data fidelity, improves local responsiveness, and increases data transfer efficiency to the cloud.
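
For example, a simple report-by-exception scheme at the edge can cut the volume of raw data forwarded upstream. The sketch below is illustrative only: the sensor read, the publish call, and the deadband value are hypothetical placeholders for whatever a given edge device and application would actually use.

```python
import random
import time

DEADBAND = 0.5        # hypothetical threshold in engineering units; tune per signal
SCAN_INTERVAL = 1.0   # seconds between local scans

def read_sensor() -> float:
    # Stand-in for reading a local analog input channel on the edge device.
    return 72.0 + random.uniform(-1.0, 1.0)

def publish(value: float) -> None:
    # Stand-in for a report to a higher-level system (MQTT, REST, historian).
    print(f"reported {value:.2f}")

last_reported = None
while True:
    value = read_sensor()
    # Report by exception: forward only meaningful changes, so the network
    # and central systems carry less redundant traffic.
    if last_reported is None or abs(value - last_reported) >= DEADBAND:
        publish(value)
        last_reported = value
    time.sleep(SCAN_INTERVAL)
```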

Ultimately, however, this new edge-to-cloud architecture depends on having new options at the edge for acquiring and processing field data.

Distributed I/O evolves

Field data can be raw I/O points connected at the edge or derived calculation values. Either way, the problem with traditional architectures is the amount of work it takes to design, physically connect, configure, digitally map, communicate, and then maintain these data points. Adding even one point at a later date may require revisiting all these steps. To create more scalable, distributed systems, some vendors are making it possible to bypass these layers between the real world and intermediate or top-level analytics systems.

Classic I/O hardware, for example, is not very intelligent and must be mastered by some supervisory controller or system. But with enough computing power, all the necessary software for enabling communications can be embedded directly in an I/O device. Instead of requiring a control engine to configure and communicate I/O data to higher levels, I/O devices can transmit information on their own.

This kind of edge data processing is also becoming possible thanks to the proliferation of IIoT tools in recent years, for example:

  • MQTT with Sparkplug B: A secure, lightweight publish/subscribe protocol designed for machine-to-machine communication, with a payload specification suited to mission-critical industrial applications
  • OPC UA: A platform-independent OPC specification, useful for machine-to-machine communication with legacy devices
  • Node-RED: A low-code, open-source programming tool for managing data transfer across many protocols and web APIs

Combined with standard IT technologies like VPNs and DHCP for secure remote connection and automatic addressing, these tools give today’s I/O hardware the ability to act as a first-class participant in a distributed system, rather than requiring layers of supporting middleware (Figure 3).
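
As an illustration of how little machinery this requires, the following sketch shows an edge device publishing a tag value over MQTT using the open-source Eclipse Paho client for Python. The broker address, topic naming, and JSON payload are hypothetical; a production system using Sparkplug B would follow its standardized topic namespace and binary payload format instead.

```python
import json
import time

import paho.mqtt.client as mqtt   # open-source Eclipse Paho MQTT client

BROKER = "broker.example.com"           # hypothetical MQTT server address
TOPIC = "plant1/line3/rio1/tank_temp"   # illustrative topic naming

client = mqtt.Client(client_id="edge-io-rio1")
client.connect(BROKER, 1883)            # a real deployment would use TLS on 8883
client.loop_start()

while True:
    payload = json.dumps({
        "value": 72.4,                  # would come from a local I/O channel
        "units": "degF",
        "timestamp": time.time(),
    })
    # QoS 1 asks the broker to acknowledge delivery of each report.
    client.publish(TOPIC, payload, qos=1)
    time.sleep(5)
```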

Another obstacle to scalability for IIoT systems based on classic I/O hardware is the work required to provide power, network connections, and the right I/O module types. To address these issues, vendors are taking advantage of new technologies to make distributed remote I/O more feasible.

One example is power over Ethernet (PoE) capability, which uses a network cable to simultaneously supply low-voltage power and network connectivity. When PoE is embedded into a remote I/O device, it can even supply I/O loop power, simplifying electrical panel design and saving money on additional components and labor.

To make it easier for designers to specify the right I/O interface types, some new I/O devices also include more flexible configuration options, like mixed-signal I/O channels. These provide extensive options to mix and match I/O signal types as needed on one device, reducing front-end engineering work and spares management.

The combination of these features within distributed I/O devices makes it possible for implementers to easily add I/O points anywhere they are needed, starting with a few points and scaling up as much as necessary at any time. Wiring needs are minimized, so long as networking infrastructure is accessible.

For more comprehensive control and calculation, of course, any number of edge controllers can also be integrated. The combination of edge I/O and edge control leads to a new distributed data architecture.

Architecture options

So what new architectural possibilities are available to industrial automation designers using modern distributed I/O and edge computing? The logical hierarchy is flattened even as the geographical distribution is expanded, with edge devices making local data directly available to computing resources at the edge or at higher organizational levels (Figure 4).

Here are some examples of new information architectures that are becoming possible for use in places like commercial facilities, campuses, laboratories and industrial plants:

Shared Multi-Site Infrastructure: Where field signals are distributed over large geographic areas or multiple sites, edge devices can facilitate data transmission to networked applications and databases, improving the efficiency and security of local infrastructure or replacing high-maintenance middleware such as Windows PCs.

Brownfield Site Integration: Edge I/O can form a basic data processing fabric for existing equipment I/O in brownfield sites and work in combination with more powerful edge controllers and gateways using OPC UA to integrate data from legacy RTUs, PLCs, and PACs. This approach improves security and connectivity without interfering with existing control systems.
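
As a sketch of the OPC UA side of this pattern, the snippet below reads a single value from a legacy controller's OPC UA server using the open-source python-opcua library. The endpoint address and node ID are hypothetical and depend entirely on the device being integrated.

```python
from opcua import Client   # open-source python-opcua library

# Hypothetical endpoint exposed by a legacy PLC or a protocol gateway.
ENDPOINT = "opc.tcp://192.168.1.10:4840"

client = Client(ENDPOINT)
client.connect()
try:
    # Node IDs depend on how the legacy device's address space is
    # organized; this one is illustrative only.
    node = client.get_node("ns=2;s=Line1.Tank1.Level")
    level = node.get_value()
    print(f"Tank1 level: {level}")
    # From here the value could be republished over MQTT or logged locally,
    # without touching the existing control program.
finally:
    client.disconnect()
```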

Direct Field-to-Cloud Integration: Engineers can design simple, flat, data processing networks using only edge I/O devices (without controllers or gateways), expanding as needed to monitor additional field signals. A distributed I/O system like this can process and report data directly to cloud-based supervisory systems, predictive maintenance databases, or MQTT servers.

Many-to-Many Data Distribution: Edge devices with embedded MQTT clients can publish field data directly to a shared MQTT server or redundant MQTT server group located anywhere the network reaches: on premises, in the cloud, or as part of regional fog computing resources. The server can then share that data with any number of interested network clients across the organization, including control systems, web services, and other edge devices.
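
On the consuming side, any authorized client can subscribe to the same data. The sketch below, again using the Eclipse Paho Python client with a hypothetical broker and topic namespace, shows how a subscriber receives everything the edge devices publish without polling field hardware directly.

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"   # same hypothetical server the edge devices publish to

def on_message(client, userdata, msg):
    # Any subscriber -- an HMI, a web service, or another edge device --
    # receives the same published data as it arrives.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(client_id="analytics-consumer")
client.on_message = on_message
client.connect(BROKER, 1883)
# The '#' wildcard subscribes to every topic under plant1/, so new edge
# devices appear automatically as they begin publishing.
client.subscribe("plant1/#", qos=1)
client.loop_forever()
```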

Seamless connectivity

Seamless connectivity is now a reality thanks to technologies that make ubiquitous data exchange possible. New hardware and software products enable interconnectivity among physical locations in the field, at the local control room, in the front office, across geographic regions, and up to global data centers.

Distributed edge I/O, edge computing, and associated networking technologies support data transfer through the edge, fog, and cloud portions of an industrial architecture. End users can erase the former boundaries between IT and OT domains and get the data they need to optimize operations.


Author Bio: Josh Eastburn is director of technical marketing, Opto 22. After 12 years as an automation engineer working in the semiconductor, petrochemical, food and beverage, and life sciences industries, Eastburn works with the engineers at Opto 22 to understand the needs of tomorrow's customers. He is a contributing writer at blog.opto22.com.