PLM and connectivity reach a crucial point
Keep focused on the goal of improving plant information flow.
We are seeing technology developments with far-reaching effects in the manufacturing world. For product lifecycle management (PLM) vendors, this is manifesting itself in an upsurge of research and development activity, acquisitions and partnering. For those in industry considering future investment, from both a technology and a business-improvement perspective, the decision-making just got harder.
The driving force for this is a combination of three areas:
- Hardware and devices: New industrial devices are becoming more powerful and capable, enabling direct communication with business systems. Scanners and autonomous vehicles are providing new ways to capture the as-built, as-produced, as-operated environment.
- Smart devices: Products, buildings and even cities are becoming ever smarter as automation devices and networks provide comprehensive interconnectivity.
- Cloud infrastructure: The proliferation of mobile technologies enables access to information anywhere. Cloud-based infrastructure affords a way of enabling this access in a highly connected environment.
Here are a few examples of how PLM is coping with these drivers and the changes occurring across multiple industries.
PLM for asset-intensive industries
For some industries, managing the complete lifecycle has been a way of life because of the long-term nature of the assets. These include highly capital-intensive and regulated industries such as nuclear power, marine and civil construction.
The construction, process and mining industries have been monitoring, collecting and processing operational data for years. Facilities, asset management and location tracking solutions capture significant volumes of data on a daily basis. Does PLM technology have a role to play in this in a future connected world? In this context PLM can provide a wrapper around the lifecycle of individual assets. It can provide the definition and operational parameters of individual equipment items, whether they be pumps and actuators, conveyors, HVAC or heavy equipment, and can respond to in-service issues.
Performance data, whether flow and temperature in process plant, deformation of structures or environmental control in inhabited spaces, can be processed not just for short-term corrective action but fed back for re-simulation as part of long-term continuous improvement, and to provide data for next-generation design.
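The feedback loop described here can be sketched in code: raw in-service readings are aggregated into the summary values a simulation model could consume. This is a minimal, hypothetical illustration; the class and function names are the author's of this sketch, not from any particular PLM or historian product.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    """A single in-service measurement from a monitored asset."""
    asset_id: str
    parameter: str   # e.g. "flow", "temperature", "deformation"
    value: float

def summarise_for_resimulation(readings, asset_id, parameter):
    """Aggregate raw in-service data into the summary statistics
    that a re-simulation would take as boundary conditions."""
    values = [r.value for r in readings
              if r.asset_id == asset_id and r.parameter == parameter]
    return {"min": min(values), "max": max(values), "mean": mean(values)}

readings = [
    Reading("pump-07", "temperature", 70.0),
    Reading("pump-07", "temperature", 74.0),
    Reading("pump-07", "flow", 12.5),
]
print(summarise_for_resimulation(readings, "pump-07", "temperature"))
# {'min': 70.0, 'max': 74.0, 'mean': 72.0}
```

In practice the aggregation would be far richer (trends, exceedance counts, load histograms), but the principle is the same: operational data is condensed into a form the design and simulation environment can reuse.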
One consequence of the continuous quest for more intensive asset use in these industries is the adoption of mobile technology, and the trend is towards the delivery of live information in the form of 3D models, animations and service data. This kind of delivery is common in some manufacturing industries using PLM as the backbone, providing production instructions, simulations, inspection information and exception reporting directly at the point of production.
As the plant, building and construction industries move more towards modular, fabricated structures and even ‘printed’ buildings, PLM may have an increasing role to play. The processes involved are more in the realms of traditional manufacturing and assembly than in construction.
But we must be careful not to get carried away with the notion of PLM as an all-encompassing technology. There are many other enterprise environments that already provide much of this capability, from traditional facilities and asset management solutions to the rapidly developing Building Information Modelling (BIM) sector.
As physical environments become more connected, so the underlying supporting infrastructures will need to be able to interact and connect seamlessly and reliably. This is as true in the PLM world as it is in the plant world, where established protocols are also being challenged to support new levels of connectivity.
The PLM contribution
To put this in context it is worth taking a step back to look at some core capabilities associated with PLM. The scope is potentially very wide so let’s focus in on two key areas particularly relevant to this discussion: product definition management and configuration management.
PLM product definition encompasses requirements, systems models, 3D models, tests, instructions, process plans, tooling, quality metrics, service information and more, all essential for plant maintenance and operation. These are areas which would traditionally have been in the form of documentation but are increasingly captured as part of a complete virtual definition.
It also includes the definition of product structures and, critically, the process trail that led to the definition. This latter capability is of vital importance when considering the potential increase in feedback from both production and in-service monitoring of smart products.
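A product structure with its process trail can be pictured as a tree of items, each carrying the audit history that led to its current revision. The sketch below is hypothetical (the `Item` and `ChangeRecord` names are illustrative, not drawn from any PLM system), but it shows the two capabilities the text describes: the structure itself, and the trail behind each definition.

```python
from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    """One step in the process trail behind a definition."""
    change_id: str
    description: str
    approved_by: str

@dataclass
class Item:
    """A node in a product structure: a definition plus the
    audit trail of changes that produced it."""
    part_number: str
    revision: str
    children: list = field(default_factory=list)
    history: list = field(default_factory=list)

def trace(item, depth=0):
    """Walk the structure top-down, yielding each item with its depth."""
    yield depth, item
    for child in item.children:
        yield from trace(child, depth + 1)

pump = Item("PUMP-100", "B", history=[
    ChangeRecord("ECN-042", "Impeller material change", "j.smith")])
pump.children.append(Item("SEAL-12", "A"))

for depth, item in trace(pump):
    print("  " * depth + item.part_number, item.revision)
```

When in-service feedback arrives, it is this trail that lets an organisation answer not just "what is the current definition?" but "why is it so, and what did we change to get here?".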
However, the scope of product definition is changing. PLM is now found in service industries such as telecoms, finance and fashion. It has to manage products with hardware, software, electrical and electronic components. Increasingly, software affords the opportunity to provide incremental improvements to products in service. This, combined with in-service monitoring, is moving the definition of products towards service provision, where the physical item is only part of the product.
As the move towards service accelerates, what will be the impact on providers of the wealth of monitored data from in-service products? With access to more defect- and risk-related information, it is clear that the full audit trail from definition to delivery will become increasingly important.
Configuration management is the second key capability. This isn’t just version control, it is the management of multiple complex configurations of multiple items, maintaining not just bill of material definitions but all the associated definition data. As products and facilities become more customizable the management of this complexity will become more important and will include the need to manage updates in embedded control software.
Coordinating all of the information involved in developing, building and maintaining plants involving mixed technology, mixed materials and locally produced items in an increasingly customised world requires configuration management capabilities of considerable power, coupled with rigorous traceability.
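The configuration management described above, one master definition resolving into many effective bills of material, including embedded software revisions, can be illustrated with a small sketch. The part numbers, configuration names and rule format are invented for this example, not taken from any real system.

```python
# Master bill of materials: one product, its baseline parts list.
master_bom = {
    "CONVEYOR-500": ["MOTOR-20", "BELT-3", "CONTROLLER-9"],
}

# Configuration rules: per-variant substitutions, each giving an
# optional firmware revision and the replacement part to use.
configurations = {
    "standard":   {"CONTROLLER-9": ("FW-2.1", "CTRL-STD")},
    "heavy-duty": {"MOTOR-20": (None, "MOTOR-45"),
                   "CONTROLLER-9": ("FW-2.3", "CTRL-HD")},
}

def resolve(product, config_name):
    """Apply a configuration's substitutions to the master BOM,
    returning (part, firmware) pairs for the effective build."""
    rules = configurations[config_name]
    effective = []
    for part in master_bom[product]:
        if part in rules:
            firmware, replacement = rules[part]
            effective.append((replacement, firmware))
        else:
            effective.append((part, None))
    return effective

print(resolve("CONVEYOR-500", "heavy-duty"))
# [('MOTOR-45', None), ('BELT-3', None), ('CTRL-HD', 'FW-2.3')]
```

Real PLM configuration management adds effectivity dates, options and variants, and links from each line to the full definition data, but the core idea is this resolution from one managed master into many controlled configurations.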
Improving access to plant data
Lifecycle management environments, supported by enterprise software, have been chipping away for some time at the organisational and functional silos associated with building and operating complex plants. But the technologies that have aided this are themselves in danger of creating their own silos.
As vendors jockey for position, the danger lies in creating new silos. These are not the traditional functional and organisational silos of engineering, operations and supply; software, hardware and electronics; architecture, engineering and construction; utility production and network planning. They are data silos, and they hold the potential for enormous value.
Is there an answer as to where this data should reside? The notion of a single source of truth is often mooted in the PLM world but what does this really mean in practice and how can this work in a future yottabyte world?
This is not a job for a single source of truth. This requires a highly connected solution stack that includes at least PLM, enterprise resource planning (ERP) and real-time control capabilities and extends to facilities and asset management, BIM and other industry specific environments.
The answer surely must be openness driven by protocols, standards and defined architectures. In the same way that protocols are essential to communication between automation components, similar protocols are required to connect the data service platforms that are currently delineated in the realms of enterprise systems.
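What such openness looks like at the data level can be sketched simply: a neutral, self-describing message that a plant system publishes and any enterprise platform can consume. The schema below is purely illustrative (real deployments would use an agreed standard such as OPC UA or an industry data model), but it shows why shared protocols matter: once the format is agreed, no consumer needs to know which vendor produced the data.

```python
import json

# Hypothetical neutral message a plant system might publish;
# the schema name and fields are invented for this sketch.
message = {
    "schema": "asset-telemetry/1.0",
    "asset_id": "pump-07",
    "source": "plant-historian",
    "readings": [
        {"parameter": "temperature", "value": 72.4, "unit": "degC"},
    ],
}

payload = json.dumps(message)    # serialise for transport
received = json.loads(payload)   # any consumer can parse it back

assert received["schema"] == "asset-telemetry/1.0"
print(received["asset_id"], received["readings"][0]["value"])
```

The technical barrier here is low; as the article argues, the real obstacles to this kind of exchange are commercial rather than architectural.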
But with statements such as ‘data is the new oil’, the stakes may be too high to allow for such openness. Without it, the smart, connected future and everything that goes with it will not achieve its true potential.
Simon Hailstone is principal consultant for Cambashi (www.cambashi.com).