Modelling software and machine learning improve production efficiency

Advances in automated data processing and machine learning algorithms enhance the value of models and simulation

By Alessandra Da Silva September 13, 2021

Modern manufacturing processes produce a vast volume of data, and as sensors become more numerous on industrial shop floors with the spin-up of the Industrial Internet of Things (IIoT), even more data is now available.

In the past, data was typically used in a very conservative manner—primarily to let users know, in retrospect, what happened on the shop floor. The data could then be used in regulatory reports, and in some cases, to discover trends and identify issues.

However, as artificial intelligence (AI) and machine learning (ML) gain prominence, the sky is the limit for what can be accomplished using shop floor data, most notably identifying patterns, and then using this information to speed time to market and optimize production.

Computers with intuition

Experts once believed computers could only compute algorithms understood by humans. They also assumed there was no way to teach software how to exercise intuition—such as identifying whether a picture was of an adult or a young child, a cat or a dog—but data scientists now know otherwise.

With ML, humans have succeeded in teaching computers to do intuitive tasks, without being able to explicitly explain the tasks to the computer. Simply by feeding the machine enough different examples—such as pictures of dogs and cats, and then identifying each accordingly—the machine can learn to determine the subject of new pictures by identifying patterns within the examples. Similarly, computers can now tackle industrial issues once considered unsolvable by using ML, without complex coding.
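The idea of learning from labeled examples can be illustrated with a minimal sketch: a 1-nearest-neighbor classifier that labels a new sample by finding the most similar training example. The features (ear pointiness, snout length) and their values are invented for illustration.

```python
# Toy illustration of learning from labeled examples: 1-nearest-neighbor.
# Feature values and labels below are invented for demonstration only.

def nearest_neighbor(examples, query):
    """Return the label of the training example closest to the query."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(examples, key=lambda ex: dist(ex[0], query))[1]

# Hypothetical features: (ear_pointiness, snout_length), labeled by a human.
training = [
    ((0.9, 0.2), "cat"),
    ((0.8, 0.3), "cat"),
    ((0.3, 0.8), "dog"),
    ((0.2, 0.9), "dog"),
]

print(nearest_neighbor(training, (0.85, 0.25)))  # prints "cat"
print(nearest_neighbor(training, (0.25, 0.85)))  # prints "dog"
```

No rule for "cat" versus "dog" is ever written down; the decision emerges entirely from the labeled examples, which is the essence of the approach described above.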

Devices on a plant floor can feed massive amounts of production process data into a computer’s ML algorithm. For example, in quality assurance for a printing process, the ML algorithm can detect use of an incorrect paint color, inaccurate position, missing details and other defects. The machine can learn to test for these and other qualities over time without the need for a programmer to predefine all test parameters. The analyzed details of the final printed product determine whether the subject passes or fails an overall quality test.

As a computer becomes more familiar with production procedures, it can correlate in-process manufacturing data with the eventual results of quality testing. This information can be used to predict, in near real-time, whether a particular production batch will pass quality tests. Oftentimes, process adjustments can be made to save a batch otherwise heading for quality test failure. By identifying the main contributors to quality within the collected dataset, the computer can quickly identify patterns humans cannot, and then automatically adjust processes or alert operators so they can act.
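A hypothetical sketch of such a near-real-time batch check: a crude linear scoring function (the weights and field names are invented, standing in for a trained model) maps process deviations to a pass probability and raises an operator alert when a batch looks at risk.

```python
# Sketch of near-real-time batch quality prediction. The weights would come
# from a trained model; the values and field names here are hypothetical.

WEIGHTS = {"temperature_dev": -0.8, "pressure_dev": -0.5, "speed_dev": -0.3}
BIAS = 1.0
PASS_THRESHOLD = 0.5

def predict_pass_probability(readings):
    """Crude linear score clamped into [0, 1]; stands in for a real model."""
    score = BIAS + sum(WEIGHTS[k] * abs(v) for k, v in readings.items())
    return max(0.0, min(1.0, score))

def check_batch(readings):
    """Return the predicted pass probability and a recommended action."""
    p = predict_pass_probability(readings)
    if p < PASS_THRESHOLD:
        return p, "alert operator: batch at risk, adjust process"
    return p, "on track"

# A batch running close to setpoints vs. one drifting away from them.
print(check_batch({"temperature_dev": 0.1, "pressure_dev": 0.1, "speed_dev": 0.1}))
print(check_batch({"temperature_dev": 1.2, "pressure_dev": 0.8, "speed_dev": 0.5}))
```

The key point is the timing: the prediction is available while the batch is still in production, leaving room for a corrective adjustment before the real quality test runs.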

Machine learning challenges

Unfortunately, ML implementation is not simply plug-and-play. Like a student, a machine requires proper instruction to execute its tasks with excellence.

Collecting high-quality data is the greatest initial challenge to the ML implementation process, and this step is critical because a machine cannot learn from corrupted, noisy, distorted or misclassified data. Proper collection includes recording data in a standardized format, sometimes requiring translation from its raw format.
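As a rough illustration of that translation step, the hypothetical function below converts a raw sensor record to a standard unit and rejects records that fail basic sanity checks; the field names and plausible-range limits are assumptions made for the example.

```python
# Sketch of translating raw, inconsistently formatted sensor records into a
# standardized form before training. Field names and ranges are hypothetical.

def standardize(record):
    """Normalize temperature to Celsius; reject unusable records."""
    temp = record.get("temp_f")
    if temp is not None:
        temp_c = (temp - 32.0) * 5.0 / 9.0  # Fahrenheit -> Celsius
    else:
        temp_c = record.get("temp_c")
    # Corrupted, missing or implausible readings cannot be learned from.
    if temp_c is None or not (-50.0 <= temp_c <= 500.0):
        return None
    return {"temp_c": round(temp_c, 2), "line": str(record.get("line", "unknown"))}

print(standardize({"temp_f": 212.0, "line": 3}))  # {'temp_c': 100.0, 'line': '3'}
print(standardize({"temp_c": 9999.0}))            # None (implausible reading)
```

Records that come back as `None` are exactly the corrupted or distorted data the paragraph above warns about; filtering them out before training is part of "proper collection."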

Following data collection, an ML model requires a labeled dataset for learning. This dataset is categorized by class—such as OK or not OK, pass or fail—but many organizations do not have the experts and resources available to identify and label these patterns.

ML models often rely on static historical extracts of generally dynamic data for training. However, production data may change rapidly in an unknown and undetected manner after models are implemented because of variable process conditions, dynamic human intervention and other causes. This typically causes a model's predictive performance and reliability to decrease over time (see Figure 1).
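This decay can be watched for explicitly. A minimal sketch, assuming accuracy is measured against quality-test results over successive production windows, flags the first window where the model falls below an acceptable threshold:

```python
# Sketch of detecting model drift: track accuracy per production window and
# flag the model for retraining once it degrades. Numbers are illustrative.

def drift_monitor(window_accuracies, threshold=0.9):
    """Return the index of the first window below the threshold, or None."""
    for i, acc in enumerate(window_accuracies):
        if acc < threshold:
            return i
    return None

# Hypothetical accuracy measured over successive weeks after deployment,
# declining the way the Model 2 and Model 3 trendlines in Figure 1 do.
weekly = [0.96, 0.95, 0.93, 0.91, 0.88, 0.84]
print(drift_monitor(weekly))  # prints 4: retraining needed from the fifth week
```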

Figure 1: The ideal Model 1 trendline is seldom realized because, left unmitigated, a model’s accuracy diminishes over time, like the Model 2 and Model 3 trendlines. Courtesy: Siemens


When a model misclassifies properties or presents results with low confidence, it is necessary to retrain it with a new or additional labeled dataset.

Monitoring productive systems

A typical ML application receives data input from sensors, such as cameras, microphones, thermometers and others. After importing and preprocessing, the ML model analyzes the data and infers a model output (see Figure 2).
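That input-preprocess-infer flow can be sketched in a few lines; the min-max scaling step and the threshold "model" below are hypothetical stand-ins for a real preprocessing chain and a trained model.

```python
# Minimal sketch of the data flow described above: sensor input is
# preprocessed, then passed to a model for inference. The "model" here is
# a hypothetical stand-in for a trained one.

def preprocess(raw_samples):
    """Min-max scale raw sensor readings into [0, 1] for the model."""
    lo, hi = min(raw_samples), max(raw_samples)
    span = (hi - lo) or 1.0  # avoid division by zero on constant input
    return [(x - lo) / span for x in raw_samples]

def infer(features):
    """Placeholder model: flags the input when its mean is unusually high."""
    mean = sum(features) / len(features)
    return "fail" if mean > 0.7 else "pass"

raw = [20.1, 20.4, 20.2, 20.3]  # e.g., thermometer readings
print(infer(preprocess(raw)))   # prints "pass"
```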

Figure 2: An ML model receives input data, performs a defined set of calculations and actions with it, and produces resultant data with which to tune the model over time. Courtesy: Siemens


For example, in a quality prediction use case, the model output is often a score related to product quality, or the probability of producing an acceptable versus a faulty part or batch. The ML application provides this quality prediction to various consumers, like a manufacturing execution system capable of adjusting process parameters for optimization, or simply for operator visualization.

To prevent model accuracy from slipping over time, the ML application must compare its predicted output to the actual results, adjusting future modeling in accordance with the model's deviation from reality. This deviation provides actionable insights for the next iteration of the model, as well as for incident handling and bug tracking. Through this refinement, the model can learn to recognize new patterns in data streams.
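One simple form of that comparison: collect the samples where the prediction and the actual quality-test result disagree, labeled with the true outcome, as candidate material for the next retraining round. The batch identifiers below are invented.

```python
# Sketch of the feedback loop: compare predictions to later quality-test
# results and queue the misclassified samples as new labeled training data.

def collect_retraining_data(predictions, actuals, samples):
    """Return the samples the model got wrong, labeled with the true outcome."""
    new_labels = []
    for pred, actual, sample in zip(predictions, actuals, samples):
        if pred != actual:
            new_labels.append((sample, actual))
    return new_labels

preds = ["pass", "pass", "fail", "pass"]   # model output
truth = ["pass", "fail", "fail", "pass"]   # actual quality-test results
data = ["batch_a", "batch_b", "batch_c", "batch_d"]  # hypothetical batch IDs
print(collect_retraining_data(preds, truth, data))  # [('batch_b', 'fail')]
```

The disagreement rate from such a comparison is also a natural input to the drift monitoring discussed earlier.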

The Swiss army knife

When sensors and other field equipment send signals to on-site computers and servers—and to the cloud in many cases—they help create an inventory of a facility's current state. This helps inform operators, and the information can be used to improve productivity, ensure safety and adjust to new requirements.

A digital twin facilitates continuous optimization like no other tool, and data collected from sensors on the shop floor can improve efficiency dramatically.

This type of tool not only reveals a defined product—created from specific materials or ingredients with a known weight, produced on a certain date with a specific quality—but can also reveal the humidity, temperature and many other factors impacting the final product, and then correlate these environmental factors with the product's outcome quality. This complete dataset enables creation of a digital twin of the process and product: a complete virtual representation of the product and how it was produced.
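The correlation step can be sketched with a plain Pearson coefficient computed from the standard library; the humidity and defect-rate readings below are invented for illustration.

```python
# Sketch of correlating an environmental factor with outcome quality using
# the Pearson correlation coefficient. Readings are invented for illustration.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

humidity = [30.0, 35.0, 40.0, 45.0, 50.0, 55.0]   # hypothetical %RH per run
defect_rate = [1.0, 1.2, 1.5, 2.1, 2.8, 3.5]      # hypothetical % defects

r = pearson(humidity, defect_rate)
print(f"humidity vs. defect rate: r = {r:.2f}")   # strongly positive here
```

A strong correlation like this one would flag humidity as a factor worth controlling, exactly the kind of insight the complete dataset makes possible.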

In cases where real data to train a model is difficult to generate, an engineer can create a digital twin, running simulations to create the necessary product dataset to train an ML model. Digital twins enable better and quicker product development because simulation technology accelerates design and testing long before any physical prototypes are produced. Digital twins also boost design efficiency because they enable developers to try out and compare more configurations than possible with physical models (see Figure 3).
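A minimal sketch of that idea: a toy process "simulation" (a stand-in for a real digital twin, with invented physics around a hypothetical 180 °C ideal temperature) generates a labeled OK/NOK dataset that could then train an ML model.

```python
# Sketch of using a simple process simulation (a stand-in for a digital twin)
# to generate a labeled training dataset when real data is scarce.
# The "physics" and the 180 degree C setpoint are invented for illustration.
import random

def simulate_run(temp_setpoint, rng):
    """Toy model: quality degrades as temperature drifts from the ideal 180 C."""
    actual_temp = temp_setpoint + rng.gauss(0.0, 2.0)  # process noise
    quality_ok = abs(actual_temp - 180.0) < 8.0
    return {"temp": round(actual_temp, 1), "label": "OK" if quality_ok else "NOK"}

rng = random.Random(42)  # fixed seed so the generated dataset is reproducible
setpoints = [170, 175, 180, 185, 190]
dataset = [simulate_run(sp, rng) for sp in setpoints for _ in range(20)]
print(sum(1 for d in dataset if d["label"] == "OK"), "of", len(dataset), "runs OK")
```

Each simulated run arrives pre-labeled by the simulation itself, sidestepping the expensive manual labeling problem described earlier.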

Figure 3: The Siemens Plant Simulation Tool can be used to create a digital twin for optimizing design of a new facility, or for experimenting with methods to improve operations in an existing facility. Courtesy: Siemens


For example, a digital twin can be used to increase the energy efficiency of a new building prior to construction. In addition to incorporating visualizations of the building's geometric elements, the digital twin can include project schedules and budgets, plus data regarding the building's energy supply, lighting, fire protection and operations. As a result, engineers can optimize the building's future climatic impact before even breaking ground.

Additionally, a digital twin continues collecting data throughout its lifecycle. This can include information about physical stresses, components that have failed or how an object—such as a milling machine, an aircraft or a building—is operating. Such information supports optimization during operations, and it aids designers, architects and engineers in preparing the next generation of a product.

Learning the “ant method”

To create accurate digital twins of real technical systems, developers must understand the system's material values, design data, functional workflows and governing laws of nature. The digital twin must also record discrepancies between modelled and real performance values of the system to maintain accuracy over time.

Using ML and a computer's inherently fast processing of large datasets, the model can reveal highly complex connections a human could not determine, and the computer can invoke the "ant method" to efficiently optimize the model.

In nature, colonies of ants use a scent to mark their path on their way to a food source. Because ants taking the shortest path cross it more often, marking it more frequently than those on longer paths, the shortest route becomes more strongly scented than all others over time.

Similarly, ML-based modeling utilizes this method to optimize production processes over time, because simulating all conceivable procedures and comparing them against each other all at once would exhaust computer processing resources. Instead, the machine adjusts the production procedure bit by bit as correlations are drawn between operational methods and quality test results. This methodology enables an ML model to determine the most efficient way of performing operations.
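A deterministic, mean-field sketch of the ant method on a toy route choice: ant traffic splits in proportion to pheromone, each route receives new pheromone inversely proportional to its length, and pheromone evaporates each step, so the shortest route ends up the most strongly marked. All parameters here are illustrative, not from any production system.

```python
# Mean-field sketch of the "ant method" (ant colony optimization) on a toy
# choice between routes. All parameters are illustrative.

def ant_colony(lengths, iterations=200, evaporation=0.1):
    """Return the index of the route with the most pheromone at the end.

    Each iteration, a unit of ant traffic splits across routes in proportion
    to pheromone; each route then gains pheromone equal to its traffic share
    divided by its length, while old pheromone partially evaporates.
    """
    pheromone = [1.0] * len(lengths)
    for _ in range(iterations):
        total = sum(pheromone)
        shares = [p / total for p in pheromone]  # traffic follows pheromone
        pheromone = [p * (1.0 - evaporation) + share / length
                     for p, share, length in zip(pheromone, shares, lengths)]
    return pheromone.index(max(pheromone))

# Two candidate routes to the "food source," with lengths 5 and 3.
print(ant_colony([5.0, 3.0]))  # prints 1: the shorter route wins
```

Note the bit-by-bit character: no route is ever evaluated exhaustively; small reinforcements accumulate until the best option dominates, mirroring how the model incrementally adjusts the production procedure.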

A leading manufacturer of gasoline, diesel and electric automotive powertrains began using Siemens SimCenter Amesim software and services to develop and couple virtual sensors with AI. The simulation software enabled them to create virtual models (see Figure 4) of their powertrains to determine ideal design parameters prior to production, reducing the time and costs of failed physical prototypes.

Figure 4: Siemens SimCenter Amesim software is used to create AI-equipped virtual models for simulating and optimizing mechanical system components, speeding up physical production and testing. Courtesy: Siemens


In addition to hardware design, the manufacturer used the virtual model to optimize control strategies. By also translating the model to a format for deployment to the powertrains' onboard electronic control units, they equipped the controllers with AI-based ML capabilities. This provided powertrains the ability to automatically adapt their output based on the current driving cycle and application.

Pattern recognition empowers

Over time and with properly labeled datasets, ML algorithms are improving, and digital twins are becoming standard in the development of products, plants and other automated systems. As a result, these technologies are increasingly accepted for certifications, such as compliance with security and environmental regulations.

As digital twins become more commonplace, they will be coupled with the delivery of physical systems and products, empowering users to hypothesize and test the results of modifications to their designs and processes prior to production.

With insightful ML and pattern recognition, simulated data from digital twins and actual data from field sensors can be processed to create accurate and time-tested production models. These models accelerate process and machine optimization, increasing productivity and speeding time to market, while reducing maintenance costs and downtime.


Author Bio: Alessandra Da Silva is product marketing manager, Siemens Industry Inc.