What industrial analytics platforms offer manufacturers

Challenges and benefits of analytics applications for the Industrial Internet of Things (IIoT) are highlighted, as is the value analytics can provide.

By Shi-Wan Lin, Alexander Lukichev February 25, 2017

At the core of the Industrial Internet of Things (IIoT) transformation, industrial analytics is the engine that turns machine data into actionable insights, driving intelligent industrial operations and business processes.

Whether applied to discrete manufacturing or process production, an industrial analytics platform can provide a solid foundation for building this powerful engine and can ease the convergence of operations technology (OT) and information technology (IT) by adapting the requisite information technologies and innovating based on operational requirements.

IIoT seeks to connect machines, equipment, and industrial control systems (ICSs) to enterprise-information systems, business processes, and people. By applying analytics to the large volume of data collected from connected machines, we gain insights into their operations and the ability to use these insights to drive intelligent operations of the machines and business processes. Data, analytics, and applications are key elements in the intelligent lifecycles that turn data into insights, and insights into actions (Figure 1). They are applicable to the control, operations, and business loops. At its core, analytics is the engine that powers each of these intelligent loops and drives value-creation in IIoT.

The value of analytics

Manufacturing equipment in a typical production-industry environment today can best be described as digital-control automation systems built with microcontrollers (MCUs) and programmable logic controllers (PLCs). Many of these systems are connected to supervisory control and data acquisition (SCADA) or distributed control systems (DCSs) and are monitored and controlled remotely.

Equipment operational states are monitored by human operators, who in some cases are aided by simple analytic algorithms such as threshold-based alerts. By and large, most of these systems have not benefited from the advanced analytics capabilities developed over the past decade. On the other hand, these industrial control systems connect to many sensors and have sophisticated data-collection capabilities that provide a wealth of information about their instantaneous operational states. There is substantial value hidden in these data. By connecting to the manufacturing equipment, SCADA and DCS systems are able to collect data from it and then apply advanced analytics to gain valuable insights into its operations.

This will enable us to:

  • Detect anomalies, diagnose faults, raise alerts and prescribe actions for speedy repair of machine failures, thereby increasing uptime;
  • Perform smart-monitoring of machine-usage patterns to optimize work plans and increase utilization;
  • Improve quality-control and correlate it with the manufacturing-process metrics to optimize operation parameters;
  • Predict maintenance needs so machines can be repaired before unexpected breakdowns, thereby avoiding interruptions and reducing unnecessary routine services;
  • Detect and eliminate wasteful-usage patterns to reduce energy and material consumption; and
  • Optimize a fleet of machines by dynamically adjusting the operating level of individual devices based on resource availability, operating cost, and production demand.
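
To make the contrast with the simple threshold-based alerts mentioned earlier concrete, the following minimal Python sketch (with illustrative names and values, not drawn from any particular product) compares a static threshold alert with a basic rolling-statistics check that can flag abnormal drift before a hard limit is ever crossed:

# Hypothetical sketch: a static threshold alert versus a simple rolling-statistics check.
from collections import deque

THRESHOLD_C = 90.0      # static limit typical of threshold-based alerts
WINDOW = 200            # number of recent samples used to establish a norm
recent = deque(maxlen=WINDOW)

def check_reading(value_c):
    """Return the alerts raised for one temperature reading."""
    alerts = []
    if value_c > THRESHOLD_C:                 # classic threshold-based alert
        alerts.append("over-temperature")
    if len(recent) == WINDOW:                 # statistical check against recent behavior
        mean = sum(recent) / WINDOW
        std = (sum((x - mean) ** 2 for x in recent) / WINDOW) ** 0.5
        if std > 0 and abs(value_c - mean) > 3 * std:
            alerts.append("abnormal deviation from recent norm")
    recent.append(value_c)
    return alerts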

By integrating with enterprise-information systems, operational intelligence from machine-data analytics can be combined with business insights to enhance business processes and planning in supply chain and resource planning, work scheduling, and customer relationship management, as well as in engineering design and processes. All of these increase productivity and operational efficiency, enhance customer experience, improve worker safety, and even facilitate the emergence of new applications, products, and services. They ultimately strengthen competitiveness, create new business value, and can bring transformational business outcomes.

The use of analytics in the production environment reduces the reliance and burden on human operators for detecting data patterns and anomalies. Equipped with advanced analytic algorithms and techniques, a solution can monitor and detect patterns in live streaming data more effectively and often more reliably. This is especially true for complex pattern recognition that requires correlating high-dimensional data over long spans of time; such patterns may not be easily detectable by the human eye.

With the latest machine-learning technologies, analytic models can even improve themselves by learning from accumulated experience. Analytics can also monitor large amounts of equipment around the clock with equal effectiveness. Human operators are informed via alerts only when important patterns are detected, especially those requiring human input or intervention. This makes human operators responsible for mission control and for monitoring quality and productivity, freeing them from repetitive tasks.

Analytics requirements

To meet the needs of the production industries, an industrial-analytics solution should demonstrate a few important capabilities. The first is to deliver correct results and to "do no harm," which requires strong analytics and safeguards in their application. Further, as discussed above, continuous application of analytics must be possible. Continuous analysis, however, often requires substantial amounts of data to be transferred from the point of data collection to the point of analysis, the decision point.

Therefore, the analytics solution must support distributed deployment, whether at the edge in IoT gateways next to the equipment, within a server cluster in a facility, or in a remote data center or the Cloud. Different deployment tiers may be required depending on the scope of the data being analyzed. For example, analytics comparing the performance of several factories may be performed better at an enterprise data center. Analytics for local supervisory monitoring may be performed better at the edge, enabling higher reliability, shorter latency, smaller data-transfer volumes, and better control over the data.

Another often-overlooked characteristic of an analytics solution is the overall complexity. The analytics solution must be easy to set up, configure, and maintain. Reducing implementation and operating complexity of the system helps to accelerate the success of IIoT by reducing its development cost, risks, and time-to-value.

Industrial analytics platform

An industrial analytics platform, compared with a custom-built solution, can simplify and optimize IIoT deployments, making them effective, reliable, and scalable. It can offer the power of machine learning, Big Data, cloud computing, and other emerging technologies without requiring adopters to directly address their complexity and demand for expertise.

To meet the requirements discussed above, an industrial-analytics platform should have the following capabilities:

  • Streaming analytics to generate continuous, near-real-time information flows from live machine data;
  • Distributed analytics in the Cloud, at the factory-floor edge, and in IoT gateways for data processing;
  • Actionable analytics to turn data into insights and insights into actions;
  • Multi-modal analytics with multi-dimensional statistical aggregation, complex-event processing (CEP) and machine-learning-based pattern recognition for powerful and efficient analysis of behavior of individual assets, as well as groups of devices;
  • Adaptive data flow for protocol adaptation, data normalization, policy-based validation and filtering, transformation, and data enrichment to enable easy integration;
  • Simple customization with code-less configurations for data injection, processing and analytics; and
  • Security by rigorous design, implementation and validation in accordance with security best practices.

In the following sections, these key characteristics are considered in more detail.

Streaming and distributed analytics

An industrial-analytics solution should handle live data streams from machines, equipment, and systems on the fly to generate continuous information flows at low latency, in some cases meeting hard timing requirements. In contrast, a common approach in many IIoT analytics solutions is based on passive queries, which are more suitable for generating business-intelligence reports than for producing active analytics results. Nevertheless, traditional batch-oriented, query-based analytics remain useful for building or improving analytic models and for human decision-making, such as identifying macroscopic process patterns and trends.
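
As a rough illustration of the difference, the hypothetical Python sketch below processes a live stream of readings in tumbling time windows and emits per-machine averages as each window closes, rather than waiting for a periodic batch query:

# Minimal stream-processing sketch: results are emitted continuously per time window.
from collections import defaultdict

WINDOW_S = 10  # tumbling-window length in seconds (illustrative)

def windowed_averages(stream):
    """Consume (timestamp, machine_id, value) tuples; yield per-machine averages per window."""
    window_start, sums, counts = None, defaultdict(float), defaultdict(int)
    for ts, machine_id, value in stream:
        if window_start is None:
            window_start = ts
        if ts - window_start >= WINDOW_S:
            yield {m: sums[m] / counts[m] for m in sums}   # emit the closed window immediately
            window_start, sums, counts = ts, defaultdict(float), defaultdict(int)
        sums[machine_id] += value
        counts[machine_id] += 1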

In complex, multi-tier distributed industrial systems, the analytics solution must be distributed as well. That means it can perform analytics close to data sources and decision points where the analytic outcome is needed. In a typical IIoT architecture (Figure 2), analytics can be deployed in IIoT gateways in the control tier, at the edge in the operations tier, in enterprise data centers, or up in the Cloud.

If the same underlying platform is deployable in different architectural tiers, the industrial-analytics platform can readily enable dynamic workload orchestration and distribution of analytics across these tiers, balancing the needs of decision-making against the available data, computational, and networking resources.

With distributed analytics, an industrial-analytics platform enables edge analytics to:

  • Ensure quick response by avoiding long network latency;
  • Provide high resiliency by avoiding operations disruptions due to network interruptions or failures in a centralized system;
  • Enforce stronger security and privacy protection by keeping data within safe domains; and
  • Lower network costs by reducing data-transfer volume across the network.  

To take full advantage of IIoT analytics, a solution must effect an automatic, dynamic, and continuous process of transforming streaming machine data into insights, converting the insights into actions, and applying the actions back to the machines, operations, and business processes.

The solution analyzes data streams from the control systems, including PLCs and SCADA, and, via domain-specific applications, delivers continuous intelligent feedback to these systems, for example by adjusting control set points or operating modes. Operational insights derived from the analytics are available to business applications as well.

Analyzing large volumes of high-fidelity data in IIoT gateways ensures more accurate, lower-latency feedback to the local control systems, even when network connections to upper tiers become unavailable. Summary information from the local analytics can be sent to the central component in the operations tier for further aggregation and other high-level analysis.
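
A simplified sketch of this pattern, with hypothetical callbacks standing in for a platform's actual interfaces, might look as follows:

# Illustrative gateway-side processing: act locally on full-fidelity data,
# forward only a compact summary upstream. send_upstream() and
# apply_local_action() are invented placeholders, not a real API.
def process_at_gateway(samples, send_upstream, apply_local_action):
    total, count, alarms = 0.0, 0, 0
    for value in samples:
        total += value
        count += 1
        if value > 95.0:                      # local rule evaluated at the edge
            apply_local_action("throttle")    # feedback works even if the uplink is down
            alarms += 1
    if count:
        send_upstream({"mean": total / count, "samples": count, "alarms": alarms})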

This distributed streaming-analytics functionality can and should be implemented at the platform level, thus shielding the analytics application developer and user from its inherent complexity. 

Platform innovation

An industrial analytics platform should provide tools to address key challenges in industrial operations, including data adaptability, analytics capability, and continuous improvement. An example of such a platform (from Thingswise) is shown in Figure 3.

One big challenge in IIoT engagements is data interoperability. This is especially true when deploying IIoT systems in brownfield environments, where legacy controls and machines of different types and models from different manufacturers co-exist and operate. Data collected from these machines come in many types and formats and at varying levels of quality.

To meet these data-interoperability challenges, an industrial-analytics platform should provide a powerful yet easy-to-use data-processing engine for the necessary data transformations. The configured data-processing flows handle protocol adaptation, syntactic transformation, semantic assignment, and policy-based data-quality processing, including validation, filtering, and de-duplication. Processing flows also handle data enrichment (joining additional metadata with the streaming machine data) and other processing required for quality data analytics. To meet the goal of code-less (or near code-less) design, these data-processing flows can be configured using a declarative domain-specific language (DDSL).
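
The sketch below shows, in plain Python and with invented field names, the kind of transformation such a configured flow performs on a single raw record: validation and filtering, unit normalization, and enrichment with static metadata:

# Hypothetical ingestion step: validate, normalize, and enrich one raw machine record.
ASSET_METADATA = {"press-07": {"line": "A", "model": "HP-2000"}}  # illustrative lookup table

def ingest(raw):
    """Return a cleaned, enriched record, or None if the record should be dropped."""
    if raw.get("temp_f") is None or not (-40 <= raw["temp_f"] <= 500):
        return None                                   # policy-based validation and filtering
    record = {
        "asset_id": str(raw["id"]),
        "temp_c": (raw["temp_f"] - 32) * 5.0 / 9.0,   # unit normalization
        "ts": raw["timestamp"],
    }
    record.update(ASSET_METADATA.get(record["asset_id"], {}))   # enrichment via metadata join
    return record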

Detecting anomalies, capturing meaningful patterns, and predicting trends from live streaming machine data is another challenge for analytics in production environments. The analytics engine must provide multi-modal, event-driven streaming analytics to meet these demanding requirements. These modes may include traditional statistical analysis, complex-event processing, and machine-learning-based time-series pattern recognition and classification. Taken together, these three types of analysis produce a strong synergistic effect.

Conventional statistical analysis of machine data summarizes results over temporal, spatial, and logical spans. It also establishes norms by which to identify outliers in the performance of a fleet of machines.
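
For example, a fleet norm can be as simple as the mean and standard deviation of a per-machine summary metric; the hypothetical sketch below flags machines whose average cycle time deviates from that norm:

# Sketch: establish a fleet norm and flag outlier machines (illustrative only).
import statistics

def fleet_outliers(cycle_time_by_machine, k=2.0):
    """Return ids of machines whose average cycle time is more than k sigma from the fleet mean."""
    values = list(cycle_time_by_machine.values())
    mean, std = statistics.mean(values), statistics.pstdev(values)
    if std == 0:
        return []
    return [m for m, v in cycle_time_by_machine.items() if abs(v - mean) > k * std]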

Complex-event processing correlates events captured from machine-data streams across temporal, spatial, and logical domains to identify event root causes and trigger actions when appropriate.
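
A minimal example of such a rule, with invented event names, correlates a pressure spike followed by a flow drop on the same line within a short window and raises a single root-cause alert:

# Sketch of a simple complex-event rule (event names are invented for illustration).
def correlate(events, window_s=30):
    """events: iterable of (ts, line_id, event_type); yield (line_id, ts) for correlated matches."""
    last_spike = {}                                   # line_id -> time of most recent pressure spike
    for ts, line_id, event_type in events:
        if event_type == "pressure_spike":
            last_spike[line_id] = ts
        elif event_type == "flow_drop":
            spike_ts = last_spike.get(line_id)
            if spike_ts is not None and ts - spike_ts <= window_s:
                yield (line_id, ts)                   # probable blockage: raise one correlated alert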

Machine-learning-based pattern recognition and classification uses models trained with machine-learning algorithms to identify specific patterns in machine-data streams. It works well at correlating multiple physical measurements from machine sensors as they change over time to recognize important features of machine behavior. A trained model can be deployed into the analytics solution to automatically detect meaningful features across hundreds of thousands of machines in near-real time. This is particularly useful for automatic anomaly detection, fault diagnostics, and predictive maintenance.
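
As one possible (not prescribed) implementation, a model such as scikit-learn's IsolationForest can be trained offline on feature vectors summarizing sensor windows and then applied to new windows in near-real time; the feature values below are purely illustrative:

# Illustrative anomaly-detection model; scikit-learn is only one possible choice.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row summarizes one time window, e.g. [mean_vibration, max_temp_c, rms_current].
training_features = np.array([[0.21, 61.0, 4.9], [0.19, 60.5, 5.1], [0.22, 62.0, 5.0]])
model = IsolationForest(random_state=0).fit(training_features)

def is_anomalous(window_features):
    """Return True if the trained model flags this window as anomalous."""
    return model.predict(np.array([window_features]))[0] == -1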

While each of these three approaches can be implemented at the generic platform level, there is often a need to apply custom analysis functions, such as those based on physical modeling, to designated machines as part of an analytic flow. The platform should allow the use of such extensions.

Simple customization

A proven way to simplify configuration is code-less design based on declarative configurations, which can be used to customize data injection, processing, and analytics for specific use cases. This makes it easy to adapt to various data protocols, formats, and processing and analytics specifics without code development. It allows quick setup of initial analytics applications and iterative enhancement of their capabilities; for example, new data streams or analytic models can be added without affecting existing operations. It also allows the developer to quickly see the outcome of changes, enabling the experimentation that industrial analytics often requires.
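
What such a declarative description might look like is sketched below; the schema and field names are invented purely for illustration and are shown here as a Python dictionary rather than any particular DDSL:

# Hypothetical declarative flow description; every field name here is invented.
FLOW_CONFIG = {
    "source":    {"protocol": "opc-ua", "endpoint": "opc.tcp://gateway-01:4840"},
    "validate":  {"temp_c": {"min": -40, "max": 200}},
    "derive":    {"temp_c": "(temp_f - 32) * 5 / 9"},
    "enrich":    {"join": "asset_registry", "on": "asset_id"},
    "analytics": [
        {"type": "rolling_mean", "field": "temp_c", "window": "5m"},
        {"type": "anomaly", "model": "bearing_wear_v2"},
    ],
    "sink":      {"alert_topic": "ops/alerts", "summary_interval": "1m"},
}

Adding a new data stream or analytic model then amounts to editing such a description rather than writing and redeploying code.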

A carefully designed configuration language allows the underlying industrial-analytics platform to scale transparently as data volumes grow with the addition of new machines and as computational complexity grows with the implementation of more sophisticated analytics.

An industrial-analytics solution delivers substantial value in manufacturing, process, and hybrid industries by allowing smart monitoring and automated supervision of equipment. The insights obtained from the real-time analysis of machine data can be translated into automatic or semi-automatic actions that improve the overall efficiency of production and minimize possible losses due to breakdowns and downtime.

At the same time, no matter how attractive the deployment of an analytics solution may seem, the technical challenges outlined above may prevent many businesses from following this route effectively. The role of the industrial-analytics platform is to remove major technical hurdles and reduce the complexity, effort, and risk of IIoT implementations. Such a platform, as a "turn-key" solution built with expertise in the latest technologies and enjoying the benefits of economies of scale, can clearly be more economical than a custom-built solution, especially considering the fast pace at which the technologies are evolving. Overall, it helps to get results and create value quickly and cost-effectively.

About the authors

Shi-Wan Lin brings 20+ years of broad technology and business experience from Intel, Sarvega, Lucent Technologies and Motorola to Thingswise. Prior to founding the company, he was a chief technologist in the IoT strategy and technology office at Intel. Dr. Lin co-chairs various technical groups for the Industrial Internet Consortium, the National Institute of Standards and Technology Cyber-Physical Systems Public Working Group, and the Joint Task Group between Platform Industrie 4.0 and the Industrial Internet Consortium.

Alexander Lukichev has substantial software architecture and engineering expertise in such areas as network infrastructure, distributed computing, enterprise software, Big Data platforms, cloud services and system security design. Before co-founding the company, he most recently was employed by Intel, where he spent more than a decade in various software engineering positions.

For more information, go to www.thingswise.com.