Core technologies make edge-intelligence possible

Development environments combine device management, connectivity, cloud, and analytics.


An IIoT software platform includes IIoT nodes, an edge-intelligence server, and cloud services. Courtesy: Advantech

As the Industrial Internet of Things (IIoT) evolves, it faces the same integration challenges as previous automation generations. Requirements keep changing, and many different hardware and software technologies and applications apply. Today, however, open standards allow these diverse elements to be melded together into solutions.

In IIoT product and application development, developers typically aim to: 

  • Support heterogeneous sensors and actuators via the Internet
  • Integrate heterogeneous wired and wireless connectivity protocols, including Modbus, LoRa, Sigfox, Wi-Fi, Bluetooth, and others
  • Port original software to different hardware, including MCU, x86/ARM CPU, GPU, and others, and operating systems that include Microsoft Windows, Linux Distributions, mbed OS, Android, and others
  • Connect cloud services that might include WISE-PaaS, Microsoft Azure, ARM mbed Cloud, IBM Bluemix, and others
  • Maintain data ownership and integrity, along with their implications for security and privacy
  • Quickly develop robust applications
  • Deploy, update, upgrade, and maintain large numbers of devices and services
  • Transform Big Data into valuable business information.

Thus, an IIoT product or solution must meet challenges related to sensors, connectivity, security, cloud services, storage, device hardware, device maintenance, edge/cloud analytics, system integration, and application development. The first challenge many companies face is migrating to an IoT application while balancing design time, time to market, and risk. 

Anatomy of a network

IoT data can be large in volume, and applications typically have real-time requirements. Transmitting massive amounts of raw data puts a load on network resources, so it is often more efficient to process data near its source and send only the valuable fraction to a cloud center.
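As a minimal sketch of that idea (function names, data, and the threshold are invented for illustration), an edge node can collapse a window of raw readings into one compact summary and forward only out-of-range samples individually:

```python
"""Sketch of edge-side data reduction: aggregate raw sensor readings
locally and forward only a compact summary upstream. All names here
are illustrative, not part of any specific product."""

from statistics import mean

def summarize_window(readings, threshold):
    """Reduce a window of raw readings to one summary record.

    Only out-of-range samples are forwarded individually; the rest
    are represented by min/mean/max statistics.
    """
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "min": min(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,   # the "valuable fraction" sent to the cloud
    }

# 1,000 raw samples collapse into one small record for the cloud.
window = [20.0 + (i % 7) * 0.1 for i in range(1000)]
summary = summarize_window(window, threshold=25.0)
```

Instead of 1,000 raw values crossing the network, only the five summary fields travel upstream; only when a reading breaches the threshold does an individual sample leave the edge.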

Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. Time-sensitive data in edge computing may be processed at the point of origin by an intelligent device or sent to an intermediary server located in close geographical proximity. Data that is less time-sensitive can be sent to the cloud for historical analysis, Big Data analytics, and long-term storage.

IIoT software platform services are based on three key components: the IIoT node, the edge-intelligence server, and cloud services. The following describes some of the technology choices any supplier or user enterprise must make in developing its platform.

For edge-device development, "southbound" sensing-device connectivity must handle diverse sensing protocols, such as Modbus, OPC, BACnet, and wireless IP/non-IP. These protocols can be handled by plug-in modules that take care of sensor data acquisition, data normalization, and communications.
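One hypothetical way to structure such plug-ins (the class and field names are invented here, not taken from any vendor SDK) is a common interface that each protocol handler implements, normalizing raw readings into a single record format before they move northbound:

```python
"""Illustrative southbound plug-in sketch: each protocol handler
converts its raw readings into one normalized record format.
Class and field names are hypothetical."""

import time

class SensorPlugin:
    """Base interface every protocol plug-in implements."""
    protocol = "generic"

    def read_raw(self):
        raise NotImplementedError

    def normalize(self, raw):
        raise NotImplementedError

    def poll(self):
        record = self.normalize(self.read_raw())
        record["protocol"] = self.protocol
        record["ts"] = int(time.time())
        return record

class ModbusTempPlugin(SensorPlugin):
    """Pretend Modbus handler: registers hold tenths of a degree C."""
    protocol = "modbus"

    def __init__(self, registers):
        self.registers = registers  # stand-in for a real Modbus read

    def read_raw(self):
        return self.registers

    def normalize(self, raw):
        return {"sensor": "temperature",
                "values": [r / 10.0 for r in raw],
                "unit": "degC"}

plugin = ModbusTempPlugin(registers=[215, 217, 220])
record = plugin.poll()
```

A BACnet or OPC plug-in would implement the same two methods, so the rest of the platform never sees protocol-specific quirks.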

The solution then handles "northbound" cloud connectivity and intelligence facilities, using the microservice container paradigm to modularize the different cloud connections and enable device management. The intelligence facilities likewise adopt the microservice container architecture to support data-ingestion workloads such as data pre-processing and cleaning.

Perhaps most valuable of all is the on-demand, real-time analytics service that extracts pre-set data features as data is generated. A predictive maintenance and quality capability serves as a proof-of-concept for edge field prediction. Companies using the platform extend this framework to develop analytic or predictive-maintenance modules via the architecture's open standard, based on the ubiquitous MQTT communications protocol and Docker container technology for modularization.

Other technologies, such as the RESTful API, MQTT, and Node-RED, facilitate drag-and-drop application development. Node-RED and the configuration utility make it easy to implement custom applications. Moreover, well-documented SDK-with-MQTT sample code and the RESTful API interface allow advanced developers to meet more demanding requirements.

The last component is cloud services, with SSL/TLS communications and Intel Security protection on both the edge device and the cloud. The data service provides PostgreSQL and MongoDB (NoSQL) databases as standard offerings and supports a standard integration interface with a wide range of data-processing and storage products. The dashboard website serves as the IoT application's user interface, displaying information in a browser or on a mobile device through visualization facilities such as Azure Power BI or Tableau.

Ultimately, a platform provides a marketplace for sourcing diverse IoT software utilities, including pure cloud solutions such as database, dashboard, and machine-learning tools. 

Edge computing is a distributed IT architecture in which client data is processed at the network periphery, as close to the originating source as possible. Courtesy: Advantech

Further explication

Now let's look a little more closely at some of the technologies mentioned.

MQTT is a simple, lightweight publish/subscribe messaging protocol used for constrained devices and low-bandwidth, high-latency, or unreliable networks. The service publishes its capability and data to an MQTT broker and subscribes to specific topics for input interfaces.
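As a rough illustration of those publish/subscribe semantics, the following in-process toy broker implements MQTT-style topic filters ('+' matches one level, '#' matches all remaining levels). A real deployment would use a networked broker and a client library such as paho-mqtt; everything here is a simplified sketch of the matching rules only.

```python
"""Toy in-process publish/subscribe bus with MQTT-style topic filters.
Mimics broker matching semantics only; not a real MQTT implementation."""

def topic_matches(pattern, topic):
    """Return True if an MQTT-style filter matches a topic string."""
    p, t = pattern.split("/"), topic.split("/")
    for i, seg in enumerate(p):
        if seg == "#":
            return True               # '#' swallows all remaining levels
        if i >= len(t):
            return False
        if seg != "+" and seg != t[i]:
            return False              # literal segment mismatch
    return len(p) == len(t)

class ToyBroker:
    def __init__(self):
        self.subs = []                # (pattern, callback) pairs

    def subscribe(self, pattern, callback):
        self.subs.append((pattern, callback))

    def publish(self, topic, payload):
        for pattern, cb in self.subs:
            if topic_matches(pattern, topic):
                cb(topic, payload)

broker = ToyBroker()
seen = []
broker.subscribe("factory/+/temperature", lambda t, p: seen.append((t, p)))
broker.publish("factory/line1/temperature", 21.5)   # matches the filter
broker.publish("factory/line1/pressure", 3.2)       # does not match
```

The wildcard rules are what let one analytics subscriber receive temperature data from every production line without knowing the lines in advance.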

A RESTful API defines a set of functions that developers use to perform requests and receive responses over HTTP, using methods such as GET and POST. Because RESTful APIs use HTTP as a transport, they can be used by practically any programming language and are easy to test. A RESTful API requires that the client and server be loosely coupled and independent of each other, allowing either to be coded in any language and improved at will, which promotes system longevity and ease of evolution.

The RESTful API specifies what it can provide and how it can be used, and requires that details such as query parameters, response format, request limitations, public use/API keys, methods (GET/POST/PUT/DELETE), language support, callback usage, HTTPS support, and resource representations all be self-descriptive.
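To make the method/resource mapping concrete, here is a minimal, hypothetical routing sketch: a table maps (method, path) pairs to handlers, the way a device-management API might expose GET and POST on a device resource. The endpoints, handlers, and data are invented for illustration.

```python
"""Hypothetical sketch of RESTful routing: (method, path) pairs map to
handler functions returning (status, body). Endpoint names are made up."""

devices = {"d1": {"status": "online"}}

def get_device(device_id):
    if device_id in devices:
        return (200, devices[device_id])
    return (404, {"error": "not found"})

def create_device(device_id):
    devices[device_id] = {"status": "offline"}
    return (201, devices[device_id])

ROUTES = {
    ("GET", "/devices"): lambda _id: (200, list(devices)),
    ("GET", "/devices/{id}"): get_device,
    ("POST", "/devices/{id}"): create_device,
}

def dispatch(method, path):
    """Resolve a request to a handler; '{id}' captures the last segment."""
    if (method, path) in ROUTES:
        return ROUTES[(method, path)](None)
    base, _, device_id = path.rpartition("/")
    handler = ROUTES.get((method, base + "/{id}"))
    return handler(device_id) if handler else (404, {"error": "no route"})

status, body = dispatch("GET", "/devices/d1")
```

In a real service the dispatch table sits behind an HTTP server and the bodies are serialized as JSON, but the loose coupling is the same: the client only needs to know the resource paths and methods, never the handler code.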

The properties impacted by the constraints of the RESTful architectural style include: 

  • Component interactions can be the dominant factor in user-perceived performance and network efficiency
  • Scalability to support large numbers of components and interactions among components
  • The simplicity of a uniform interface
  • Modifiability of components to meet changing needs, even while the application is running
  • Visibility of communication between components by service agents
  • Portability of components by moving program code with the data
  • Resistance to failure at the system level despite failures of components, connectors, or data.

A microservice architecture pattern splits the application into sets of smaller, interconnected services rather than a single monolithic application. A service typically implements a distinct feature or functionality, such as connectivity management or a vertical application. Each microservice is a mini-application with its own architecture, including business logic along with various adapters.

Containerization is an OS-level virtualization method for deploying and running distributed applications without launching an entire virtual machine (VM) for each application. Instead, multiple isolated subsystems, called containers, run on one control host and access one kernel. Containers share the same OS kernel as the host and are usually more efficient than VMs, each of which requires a separate OS instance.

A Docker container wraps up a piece of software in an independent subsystem, complete with file system and everything it needs to run: code, runtime, system tools, system libraries, and anything that may be installed on a server. This guarantees that it always runs the same, regardless of environment.

Containers hold the components necessary to run the desired application, such as files, environment variables, and libraries. The host OS also constrains the container's access to physical resources—such as CPU and memory—so one container cannot consume all of a host's physical resources.
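As a hedged sketch of how one such edge microservice might be packaged, a Dockerfile bundles the code with its runtime and libraries; the base image, file names, and service below are illustrative, not from any specific product.

```dockerfile
# Hypothetical Dockerfile for a single edge microservice.
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY service.py .

# The container carries everything the service needs, so it runs the
# same on a development laptop as on an edge gateway.
CMD ["python", "service.py"]
```

Resource limits of the kind described above are applied at launch time, for example `docker run --memory=256m --cpus=0.5 edge-service`, which caps the container's share of host memory and CPU so no single service can starve the others.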

Node-RED is open source, originally developed by the IBM Emerging Technology organization. It includes a browser-based flow editor that easily wires flows together using the wide range of nodes in the palette; flows can then be deployed to the runtime with a single click. Flows created in Node-RED are stored as JSON and easily can be imported and exported for sharing with others. Node-RED can run at the edge of the network or in the cloud, and the node package manager (npm) ecosystem is used to extend the palette of available nodes, enabling connections to new devices and services. Advantech offers a wide range of Node-RED add-on nodes.
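That JSON storage format is simply an array of nodes wired together by id. The fragment below is trimmed for readability (a real export also carries editor positions and broker-configuration fields); the topic and conversion are illustrative. It subscribes to an MQTT topic, converts Celsius to Fahrenheit in a function node, and prints the result to the debug sidebar.

```json
[
  { "id": "in1",  "type": "mqtt in",
    "topic": "factory/+/temperature", "wires": [["fn1"]] },
  { "id": "fn1",  "type": "function",
    "func": "msg.payload = Number(msg.payload) * 1.8 + 32; return msg;",
    "wires": [["out1"]] },
  { "id": "out1", "type": "debug", "wires": [[]] }
]
```

The `wires` arrays are what the drag-and-drop editor draws as connecting lines; sharing a flow is just a matter of exporting and importing this JSON.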

Freeboard provides simple, real-time visualization of key performance indicators. This tool opens up many possibilities for IoT projects because it's simple, affordable, open source, and ready for extension. Customers can get started for free and then when it is time to ramp up, they can select a plan that's right for them. 

Architecture alignment

An architecture of the type under discussion can be classified into five layers. Each is implemented as its own microservice, with an MQTT broker as the communication bus, and all microservices interface with other microservices or clients. At runtime, each instance is a Docker container, which makes it easy to deploy distinct experiences for specific users, devices, or special-use cases. 
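Assuming a Docker-based deployment of the kind described, a docker-compose sketch might wire such services together around an MQTT broker. The service and build-path names are illustrative (`eclipse-mosquitto` is a commonly used open-source broker image):

```yaml
# Hypothetical docker-compose sketch: each layer runs as its own
# container, all joined by an MQTT broker acting as the message bus.
services:
  broker:
    image: eclipse-mosquitto
    ports:
      - "1883:1883"
  sensor-gateway:          # southbound connectivity layer
    build: ./gateway
    depends_on:
      - broker
  analytics:               # edge-intelligence service
    build: ./analytics
    depends_on:
      - broker
```

Because every service talks only to the broker, any one container can be updated, scaled, or replaced without touching the others.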


  1. The bottom layer of the architecture is the sensor-network connectivity layer, which supports various wired sensor types, including supervisory control and data acquisition (SCADA), Modbus, and OPC-UA. This layer collects data, manages sensor hubs, translates sensor protocols to the MQTT protocol, then passes data to the MQTT communication bus.
  2. The SDK layer provides software services such as the EIS RESTful API, an HDD fault-prediction algorithm service, and so on. Developers call these services through the RESTful API or MQTT, and users can add their own services, such as a machine-learning platform or a database engine.
  3. A flow-based layer has Node-RED as the data-flow design engine, plus add-ons such as SUSI API, WSN, and HDD prediction nodes. Users design logic paths via simple drag-and-drop operations in a graphical environment.
  4. The management and presentation layer provides a Webmin interface for system administration and IoT connection configuration, and uses the Node-RED-UI for presenting IoT/sensor data.
  5. The cloud layer may be pre-installed, as for example, with the WISE-Agent connected to WISE-PaaS/RMM Cloud Server.

A flexible and scalable hardware/software architecture helps companies develop complex IoT infrastructure in an integrated ecosystem that serves different vertical markets.

Such an architecture can be customized, combining several software services. It is then installed on different hardware depending on requirements.

Kurt Au is a product manager in the Advantech Embedded IoT Group.
