Core technologies make edge intelligence possible

Development environments combine device management, connectivity, cloud, and analytics.

By Kurt Au, Advantech, November 9, 2017

As the Industrial Internet of Things (IIoT) evolves, it faces the same integration challenges as previous generations of automation. Requirements keep changing, and many different hardware and software technologies and applications must work together. Today, however, open standards allow these diverse elements to be melded into complete solutions.

In IIoT product and application development, developers typically aim to:

  • Support heterogeneous sensors and actuators via the Internet
  • Integrate heterogeneous wired and wireless connectivity protocols, including Modbus, LoRa, Sigfox, Wi-Fi, Bluetooth, and others
  • Port original software to different hardware, including MCU, x86/ARM CPU, GPU, and others, and operating systems that include Microsoft Windows, Linux distributions, mbed OS, Android, and others
  • Connect cloud services that might include WISE-PaaS, Microsoft Azure, ARM mbed Cloud, IBM Bluemix, and others
  • Maintain data ownership and integrity, and address the resulting security and privacy implications
  • Quickly develop robust applications
  • Deploy, update, upgrade, and maintain large numbers of devices and services
  • Transform Big Data into valuable business information.

Thus, an IIoT product or solution must meet challenges related to sensors, connectivity, security, cloud services, storage, device hardware, device maintenance, edge/cloud analytics, system integration, application development, and so on. The first challenge many companies face is migrating to an IoT application while balancing design time, time-to-market, and risk.

Anatomy of a network

IoT data can be large in volume, and applications typically have real-time requirements. Transmitting massive amounts of raw data places a heavy load on network resources, so it is often more efficient to process data near its source and send only the valuable fraction to a cloud data center.

Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. Time-sensitive data in edge computing may be processed at the point of origin by an intelligent device or sent to an intermediary server located in close geographical proximity. Data that is less time-sensitive can be sent to the cloud for historical analysis, Big Data analytics, and long-term storage.

IIoT software platform services are based on three key components: the IIoT node, the edge-intelligence server, and cloud services. The following describes some technology choices any supplier or user enterprise must make in developing its platform.

For edge-device development, "southbound" sensing-device connectivity must handle diverse sensing protocols, such as Modbus, OPC, BACnet, and wireless IP/non-IP. These protocols can be handled by plug-in modules that take care of sensor-data acquisition, data normalization, and communications.
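
As a rough sketch of what such a plug-in might do, the Python code below normalizes a raw register value into a common record and forwards it to a local broker. It assumes the paho-mqtt library; the register read, scaling, sensor naming, and topic layout are hypothetical stand-ins rather than the platform's actual interfaces.

    # Sketch of a southbound plug-in step: read a raw value, normalize it
    # into a common record, and publish it to the local bus.
    # Assumes paho-mqtt; the register read and topic layout are hypothetical.
    import json
    import time
    import paho.mqtt.publish as publish

    def read_temperature_register():
        # Placeholder for a real Modbus/OPC/BACnet read.
        return 2375                              # raw value, hundredths of a degree C

    def normalize(raw):
        # Convert the protocol-specific raw value into a common record.
        return {
            "sensor_id": "boiler-01/temperature",   # assumed naming scheme
            "value": raw / 100.0,                   # degrees Celsius
            "unit": "degC",
            "timestamp": time.time(),
        }

    record = normalize(read_temperature_register())
    publish.single("sensors/boiler-01/temperature",  # assumed topic layout
                   json.dumps(record), hostname="localhost")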

The solution then handles the "northbound" cloud connectivity and intelligence facilities, using the microservice container paradigm to modularize the different cloud connections and enable device management. The intelligence facilities likewise adopt the microservice container architecture to support data-ingestion workloads such as data pre-processing and cleaning.
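
The pre-processing step of such an ingestion workload can be as simple as the filter sketched below; the field names and the valid range are assumptions chosen purely for illustration.

    # Illustrative cleaning step for an ingestion microservice: drop malformed
    # or physically implausible readings before they reach analytics.
    def clean(readings, low=-40.0, high=125.0):
        cleaned = []
        for r in readings:
            value = r.get("value")
            if value is None:
                continue                         # malformed record
            if not (low <= value <= high):
                continue                         # out-of-range reading
            cleaned.append(r)
        return cleaned

    samples = [{"value": 21.7}, {"value": None}, {"value": 999.0}]
    print(clean(samples))                        # -> [{'value': 21.7}]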

Perhaps most valuable of all is the on-demand, real-time analytics service that extracts pre-set data features as the data is generated. A predictive maintenance and quality capability serves as a proof of concept for field prediction at the edge. User companies extend this framework to develop their own analytics or predictive-maintenance modules via the architecture's open standard, which is based on the ubiquitous MQTT communications protocol and on Docker container technology for modularization.
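
As a hedged illustration of such a pre-set feature extractor, the sketch below computes rolling features over a window of readings. The window size and the chosen features (mean and peak) are assumptions; in a deployment of the kind described, the extractor would run in its own container and receive its input from an MQTT subscription.

    # Sketch of a pre-set feature extractor that runs as data is generated.
    # Window size and the chosen features are illustrative assumptions.
    from collections import deque

    class FeatureExtractor:
        def __init__(self, window=60):
            self.samples = deque(maxlen=window)  # rolling window of readings

        def update(self, value):
            self.samples.append(value)
            return {
                "mean": sum(self.samples) / len(self.samples),
                "peak": max(self.samples),
                "count": len(self.samples),
            }

    extractor = FeatureExtractor(window=5)
    for v in [0.9, 1.1, 1.0, 4.8, 1.2]:          # e.g., vibration readings
        features = extractor.update(v)
    print(features)                              # latest rolling features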

Other technologies, such as RESTful APIs, MQTT, and Node-RED, facilitate rapid application development. Node-RED and the configuration utility make it easy to implement custom applications through drag-and-drop operations. Moreover, well-documented SDK sample code for MQTT and the RESTful API interface allow advanced developers to address more demanding requirements.

The last component is cloud services, with SSL/TLS communications and Intel Security protection both on the edge device and in the cloud. The data service can provide PostgreSQL and MongoDB NoSQL databases as standard offerings, and supports a standard integration interface with a wide range of data-processing and storage products. The dashboard website serves as the IoT application user interface and displays information via browser or mobile device through visualization facilities such as Azure Power BI or Tableau.
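
As one illustration of how these pieces fit together, the sketch below subscribes to the edge bus over a TLS connection and stores readings in PostgreSQL. It assumes the paho-mqtt (1.x-style callbacks) and psycopg2 libraries; the broker address, credentials, table, and topic are placeholders, not product defaults.

    # Sketch: subscribe to the bus over TLS and store readings in PostgreSQL.
    # Broker, credentials, table, and topic are placeholders.
    import json
    import paho.mqtt.client as mqtt
    import psycopg2

    db = psycopg2.connect(host="cloud-db.example.com", dbname="iot",
                          user="edge", password="secret")

    def on_message(client, userdata, msg):
        record = json.loads(msg.payload)
        with db, db.cursor() as cur:             # commit one insert per message
            cur.execute(
                "INSERT INTO readings (sensor_id, value, ts) "
                "VALUES (%s, %s, to_timestamp(%s))",
                (record["sensor_id"], record["value"], record["timestamp"]),
            )

    client = mqtt.Client()
    client.tls_set(ca_certs="ca.crt")            # SSL/TLS to the broker
    client.username_pw_set("edge-gateway", "secret")
    client.on_message = on_message
    client.connect("broker.example.com", 8883)
    client.subscribe("sensors/#")
    client.loop_forever()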

At the end of the day, a platform provides a marketplace for sourcing diverse IoT software utilities, including pure cloud solutions such as database, dashboard, and machine-learning tools.

Further explication

Now let’s look a little more closely at some of the technologies mentioned.

MQTT is a simple, lightweight publish/subscribe messaging protocol designed for constrained devices and for low-bandwidth, high-latency, or unreliable networks. Each service publishes its capabilities and data to an MQTT broker and subscribes to specific topics for its inputs.
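
A minimal example of this pattern, assuming the paho-mqtt client library (1.x-style callbacks) and a broker on localhost; the topic names and the capability payload are illustrative.

    # Minimal publish/subscribe example: announce a capability, listen for data.
    # Broker address, topics, and payload are illustrative assumptions.
    import paho.mqtt.client as mqtt

    def on_connect(client, userdata, flags, rc):
        client.subscribe("devices/+/telemetry")  # input interface (all devices)
        client.publish("services/analytics/capability",
                       '{"features": ["mean", "peak"]}', retain=True)

    def on_message(client, userdata, msg):
        print(msg.topic, msg.payload.decode())

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("localhost", 1883, keepalive=60)
    client.loop_forever()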

A RESTful API defines a set of functions that developers use to make requests and receive responses via HTTP methods such as GET and POST. Because RESTful APIs use HTTP as a transport, they can be used from practically any programming language and are easy to test. A RESTful API requires that the client and server be loosely coupled and independent of each other, so either side can be written in any language and improved at will, which leads to system longevity and ease of evolution.

The RESTful API specifies what it can provide and how it can be used, and requires that details such as query parameters, response format, request limitations, public use/API keys, methods (GET/POST/PUT/DELETE), language support, callback usage, HTTPS support, and resource representations all be self-descriptive.
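
As an illustration only, the snippet below exercises a hypothetical device-management endpoint with Python's requests library; the URL, query parameters, and API key are assumptions, not a documented interface.

    # Illustrative RESTful calls with the requests library.
    # The endpoint, query parameters, and API key are hypothetical.
    import requests

    BASE = "https://gateway.example.com/api/v1"
    HEADERS = {"Authorization": "Bearer <api-key>"}   # public use / API key

    # GET: retrieve the latest readings for one device
    resp = requests.get(f"{BASE}/devices/boiler-01/readings",
                        params={"limit": 10}, headers=HEADERS, timeout=5)
    resp.raise_for_status()
    print(resp.json())                                # JSON response format

    # POST: register a new device
    resp = requests.post(f"{BASE}/devices",
                         json={"id": "pump-07", "type": "vibration"},
                         headers=HEADERS, timeout=5)
    print(resp.status_code)                           # e.g., 201 Created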

The properties impacted by the constraints of the RESTful architectural style include: 

  • Performance of component interactions, which can be the dominant factor in user-perceived performance and network efficiency
  • Scalability to support large numbers of components and interactions among components
  • The simplicity of a uniform interface
  • Modifiability of components to meet changing needs, even while the application is running
  • Visibility of communication between components by service agents
  • Portability of components by moving program code with the data
  • Resistance to failure at the system level despite failures of components, connectors, or data.

A microservice architecture pattern allows a designer to split an application into sets of smaller, interconnected services instead of building a single monolithic application. A service typically implements a distinct feature or function, such as connectivity management or a vertical application. Each microservice is a mini-application with its own architecture, including business logic along with various adapters.
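
The toy sketch below shows the shape of one such mini-application, keeping the business logic separate from the adapter that connects it to the rest of the system. The service name, threshold, and print-based "publish" are assumptions; a real deployment would receive input from the MQTT bus rather than a direct function call.

    # Toy microservice skeleton: business logic plus an adapter.
    # Names and threshold are assumptions; the bus interaction is simulated.
    class OverheatDetector:
        """Business logic: decide whether a reading needs attention."""
        def __init__(self, limit=80.0):
            self.limit = limit

        def evaluate(self, reading):
            return {"sensor_id": reading["sensor_id"],
                    "alarm": reading["value"] > self.limit}

    class BusAdapter:
        """Adapter: translates bus messages to and from the business logic."""
        def __init__(self, logic):
            self.logic = logic

        def handle(self, message):
            result = self.logic.evaluate(message)
            print("publish alerts/%s -> %s" % (result["sensor_id"], result))

    service = BusAdapter(OverheatDetector(limit=80.0))
    service.handle({"sensor_id": "boiler-01/temperature", "value": 92.4})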

Containerization is an OS-level virtualization method for deploying and running distributed applications without launching an entire virtual machine (VM) for each application. Instead, multiple isolated subsystems, called containers, run on one control host and access one kernel. Containers share the same OS kernel as the host and are usually more efficient than VMs, each of which requires a separate OS instance.

A Docker container wraps up a piece of software in an independent subsystem, complete with file system and everything it needs to run: code, runtime, system tools, system libraries, and anything that may be installed on a server. This guarantees that it always runs the same, regardless of environment.

Containers hold the components necessary to run the desired application, such as files, environment variables, and libraries. The host OS also constrains the container’s access to physical resources—such as CPU and memory—so one container cannot consume all of a host’s physical resources.
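
As one example, the docker Python SDK can start such an isolated, resource-constrained container; the image, port mapping, and limits below are illustrative assumptions rather than a recommended configuration.

    # Start an isolated, resource-constrained container with the docker SDK.
    # Image, port mapping, and limits are illustrative assumptions.
    import docker

    client = docker.from_env()
    container = client.containers.run(
        "eclipse-mosquitto:2",                   # packaged broker image
        name="edge-broker",
        detach=True,
        ports={"1883/tcp": 1883},                # expose the MQTT port
        mem_limit="256m",                        # cap memory for this container
        restart_policy={"Name": "unless-stopped"},
    )
    print(container.status)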

Node-RED is available as open source and was developed by IBM's Emerging Technology organization. It includes a browser-based flow editor that makes it easy to wire flows together using the wide range of nodes in the palette. Flows can then be deployed to the runtime with a single click. The flows created in Node-RED are stored as JSON and can easily be imported and exported for sharing with others. Node-RED can run at the edge of the network or in the cloud. The npm package ecosystem is used to extend the palette of available nodes, enabling connections to new devices and services. Advantech offers a wide range of Node-RED add-on nodes.
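
Because flows are stored as JSON, they can also be exported programmatically. The sketch below assumes the Node-RED Admin HTTP API is reachable on its default port 1880 with admin security disabled; a secured instance would need authentication.

    # Export the current Node-RED flows as JSON via the Admin HTTP API.
    # Assumes a local Node-RED on port 1880 with admin security disabled.
    import json
    import requests

    resp = requests.get("http://localhost:1880/flows", timeout=5)
    resp.raise_for_status()
    flows = resp.json()                          # list of node/flow objects

    with open("flows-backup.json", "w") as f:
        json.dump(flows, f, indent=2)            # share or version-control flows

    print("exported %d nodes" % len(flows))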

Freeboard provides simple, real-time visualization of key performance indicators. This tool opens up many possibilities for IoT projects because it is simple, affordable, open source, and ready for extension. Customers can get started for free and then, when it is time to ramp up, select a plan that fits their needs.

Architecture alignment

An architecture of the type under discussion can be classified into five layers. Each is implemented as its own microservice, using an MQTT broker as the communication bus. All microservices interface with other microservices or with clients. At runtime, each instance is a Docker container, which makes it easy to deploy distinct experiences for specific users, devices, or special use cases.


  1. The bottom layer of the architecture is the sensor network connectivity layer. It supports various wired sensor types and protocols, including supervisory control and data acquisition (SCADA), Modbus, and OPC-UA. This layer collects data, manages sensor hubs, translates sensor protocols to the MQTT protocol, then passes data to the MQTT communication bus.
  2. The SDK layer provides software services such as the EIS RESTful API, the HDD Fault Prediction Algorithm Service, and so on. Developers call these services through the RESTful API or MQTT (see the sketch after this list). Users can add their own services, such as a machine-learning platform, database engine, and so on.
  3. A flow-based layer has Node-RED as the data-flow design engine, plus add-ons such as SUSI API, WSN, and HDD prediction nodes. Users design logic paths via simple drag-and-drop operations in a graphical environment.
  4. The management and presentation layer provides the user interface: Webmin handles system administration and IoT connection configuration, while the Node-RED UI presents IoT/sensor data.
  5. The cloud layer may be pre-installed; for example, with the WISE-Agent connected to the WISE-PaaS/RMM cloud server.
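
As referenced in item 2 above, the toy sketch below shows how an application might call an SDK-layer service over the MQTT communication bus. The topic names and payload fields are assumptions, not the platform's documented interfaces.

    # Toy request/response exchange with an SDK-layer service over the MQTT bus.
    # Topic names and payload fields are assumptions.
    import json
    import paho.mqtt.client as mqtt

    def on_connect(client, userdata, flags, rc):
        client.subscribe("sdk/hdd_prediction/response")
        client.publish("sdk/hdd_prediction/request",
                       json.dumps({"device": "hdd-3", "horizon_days": 7}))

    def on_message(client, userdata, msg):
        print("prediction:", json.loads(msg.payload))
        client.disconnect()                      # one-shot request for the example

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.loop_forever()                        # returns after disconnect()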

A flexible and scalable hardware/software architecture helps companies develop complex IoT infrastructure in an integrated ecosystem that serves different vertical markets.

Such an architecture can be customized by combining several software services, and then installed on different hardware depending on requirements.

Kurt Au is a product manager in the Advantech Embedded IoT Group.