Best practices in mature engineering information management
Mature engineering information management can save millions in operational costs.
With the year 2020 behind us, many companies have adjusted to different operational models. Remote working is a prime example of how companies have changed. In a survey conducted by PwC, 22% of respondents indicated that remote work was not an option because their work methods had not been digitized. This realization has led companies to further accelerate digital projects. To reap the full benefits of these projects, automating end-to-end processes is key. This is where questions arise concerning focus, organizational and cultural changes, technology systems and more. To help establish a roadmap for improvement, maturity assessments can be a great management tool.
What are maturity assessments?
Maturity assessments are born of the belief that if processes are well defined and executed, the result of those processes will also be of the desired quality. The concept originates from the software development world of the late 1980s. Software projects were extremely costly and rarely delivered the desired results. The Software Engineering Institute, with support from Mitre Corp., developed the first Capability Maturity Model to measure the quality of the software development process.
The Capability Maturity Model (CMM) describes a series of increasing capabilities that can be measured (see Figure 1). Translated to the engineering information management world, consider the example of assigning new equipment numbers for a project executed in the facility.
For example, at the initial level, the organization is not even aware of the need for a process (see Figure 1). Equipment numbers are assigned at will by either an engineering, procurement and construction (EPC) contractor or the engineer working on the project. The result is duplicated numbers, no consistency and no alignment between the different systems that use this information. It becomes difficult to find information or to establish whether it is accurate. People do not trust the information and often do a field check to make sure what they have is correct.
Consider the opposite scenario: a process description at the optimizing level would, for example, state that a software solution controls the assignment of equipment numbers, a procedure documents the process, the system is aligned with industry standards and the quality of the information is reviewed yearly (see Figure 2).
As a result, the information is accurate, unique, consistent, concise and clear. Through open application programming interfaces (APIs), the information is also made available to other systems, removing duplication and the errors that come with manually entering equipment tags into other systems.
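The optimizing-level behavior described above can be sketched in code. The following is a minimal illustration, not any vendor's implementation: all names are hypothetical, and the naming convention in the regular expression is an assumed example. The point is that a single controlled register enforces uniqueness and a consistent pattern before a tag is exposed to other systems.

```python
import re


class EquipmentTagRegister:
    """Hypothetical register that controls equipment tag assignment.

    Enforces one naming convention (here an assumed area-type-sequence
    pattern such as '100-PU-001') and guarantees uniqueness, so that
    downstream systems can rely on the tag without a field check.
    """

    TAG_PATTERN = re.compile(r"^\d{3}-[A-Z]{2}-\d{3}$")  # assumed convention

    def __init__(self):
        self._tags = {}  # tag -> description

    def assign(self, tag: str, description: str) -> str:
        # Reject tags that break the convention or already exist
        if not self.TAG_PATTERN.match(tag):
            raise ValueError(f"Tag '{tag}' violates the naming convention")
        if tag in self._tags:
            raise ValueError(f"Tag '{tag}' is already assigned")
        self._tags[tag] = description
        return tag

    def lookup(self, tag: str) -> str:
        """The kind of read access an open API would expose to other systems."""
        return self._tags[tag]


register = EquipmentTagRegister()
register.assign("100-PU-001", "Cooling water pump A")
print(register.lookup("100-PU-001"))  # Cooling water pump A
```

Because every system reads the same register instead of re-keying the tag, the duplication and transcription errors described above cannot occur by construction.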
How equipment tags are managed is only one example. Professional maturity models can have hundreds of these types of indicators across many different areas. The case for engineering information is no different. One should not think only about the systems, but also about how these systems are managed, how the processes are defined, how people are trained in these systems, how the company assigns budgets, how the initiative is supported, how strategies are set up and much more. The input for such an assessment is gathered across various stakeholders in the organization. For each indicator, every stakeholder selects a rating (initial, repeatable, etc.) for how the company is performing. This builds thousands of data points that can reveal the underlying barriers that, once addressed, resolve structural information management issues.
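The mechanics of turning stakeholder ratings into area scores can be sketched as follows. This is a simplified illustration with made-up sample data; the level names follow the CMM scale mentioned above, while the areas, indicators and stakeholders are hypothetical.

```python
from statistics import mean

# CMM maturity levels mapped to numeric scores
LEVELS = {"initial": 1, "repeatable": 2, "defined": 3, "managed": 4, "optimizing": 5}

# Hypothetical sample responses: (stakeholder, area, indicator, rating)
responses = [
    ("operations",  "organization", "training",       "initial"),
    ("engineering", "organization", "training",       "repeatable"),
    ("operations",  "process",      "tag assignment", "managed"),
    ("engineering", "process",      "tag assignment", "optimizing"),
    ("operations",  "system",       "api coverage",   "managed"),
]


def area_scores(responses):
    """Average the numeric ratings per assessment area."""
    by_area = {}
    for _, area, _, rating in responses:
        by_area.setdefault(area, []).append(LEVELS[rating])
    return {area: round(mean(scores), 1) for area, scores in by_area.items()}


print(area_scores(responses))
```

With enough indicators and stakeholders, the same aggregation yields the kind of per-area picture shown in Figure 2, where a low "organization" score stands out against higher "process" and "system" scores.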
Figure 2 shows a condensed, completed maturity model for a sample company. This model has seven different areas to build a holistic picture of engineering information management. In Figure 2, note that the column “organization” has the lowest score, whereas the columns “process” and “system” score highest. Engineering information management professionals often hear that “our system is not working,” but in the sample case from Figure 2, the problem is at least partly an organizational issue. Perhaps people are not properly trained, or there is insufficient documentation, and when individuals leave the company, managing information quickly turns into chaos.
Based on the results of an assessment, an organization can start to think about the areas in which it wants to improve. Models like the one above provide guidance and stepping stones to raise the level of maturity.
How improved engineering information can save cost
When looking at the cost savings aspect, direct and indirect factors emerge. One of the challenges with this topic is that most of the savings come from indirect sources, such as productivity gains. The research performed on this topic, specifically toward engineering information management, is limited. To help quantify the savings, consider these aspects, based on research and experience:
- Trust — 50% or worse: in a limited poll during a live event among process manufacturing companies, 50% of the attendees rated the trust level of their information at 50% or worse, meaning that, at a minimum, a field check is required and multiple systems must be verified before committing changes to critical information.
- Perceived reliability — 60%: in a study conducted in 2016 among nine petroleum production facilities, spanning 133 interviews, 60% of the group described that communication and access to information, combined with the efficiency of the available tools, have a large negative influence on the perceived reliability of the equipment.
- Safety/incidents — 86%: when an incident (or near miss) happens, 86% of the attendees believed that poor, missing and/or untimely information was a large contributing factor to the event. This question was asked during a live event among process manufacturing companies.
- Cost — 10% of the operating budget: a study from 2019 concluded that up to 10% of an annual operating budget can be lost because of poor engineering information. Many variables are involved and, depending on the specific situation, these numbers will vary considerably. However, the range of potential cost is significant for any asset and provides a solid start for a business case.
- Cost — 1.5% of annual revenue: research performed by ARC Advisory Group concluded that the cost of poor asset information can be up to 1.5% of annual revenue for an organization. Throughout the study, these numbers were validated against other known studies to contrast, compare and validate the outcome.
- Cost — $15.8 billion per year: a well-known report produced by the National Institute of Standards and Technology (NIST) estimated that inadequate interoperability of information costs the capital facilities industry a staggering amount yearly. Although this study was performed in 2004, the more recent research shown above suggests the presented number is still relevant.
Based on the above findings, building a business case should be straightforward. In reality, it is challenging: because most of the savings are soft, organizations tend to focus on investments that have a direct operational impact.
Performing an assessment can help in this sense. A well-thought-out assessment provides a holistic picture of how an organization performs. These assessments can also inspire the organization: many topics that were not previously considered may turn out to be valuable areas for improvement.
Translating results to actual goals
With the outcome of an assessment, the next step is to determine where to go next. An assessment provides insights into how the organization performs today. The goal should be to raise the level on the maturity scale and implement improvements step by step across all the different areas. If, for example, the organization acquires a new software solution to manage engineering records, this system must be embedded in the corporate framework. Without appropriate resources for continued training, the solution will quickly end up in the digital landfill.
While a maturity assessment will validate and provide direction, it is key to identify practical goals that can be achieved within a foreseeable timeframe to build success. Organizations currently use the phrase “business agility,” meaning there is a desire to address problems quickly as the business evolves in an ever-changing landscape. An assessment provides a roadmap by which identified improvements can be executed, contributing to the goals of the company.
One methodology is to discuss the different areas of the assessment and rate the low-scoring areas on a priority scale, considering company goals, budgets, available resources and so on. Based on the selected areas (e.g., deliverables management), the organization then uses SMART goals to define the improvement project.
The SMART criteria are defined as follows:
- Specific: This describes what the project will accomplish with a specific goal.
- Measurable: This describes how the success of the goal can be measured.
- Assignable: This describes who in the organization (internally or externally) will be responsible for this goal.
- Realistic: This describes why this goal is realistic.
- Time-related (time-bound): This describes the timeframe for when the goal is to be completed.
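The priority-scale-plus-SMART method described above can be captured in a simple data structure. The following sketch is purely illustrative; the area name, scores and goal wording are hypothetical examples, not prescribed values.

```python
from dataclasses import dataclass, field


@dataclass
class SmartGoal:
    """One improvement project defined with the SMART criteria."""
    specific: str     # what the project will accomplish
    measurable: str   # how success is measured
    assignable: str   # who (internally or externally) is responsible
    realistic: str    # why the goal is achievable
    time_bound: str   # when the goal is to be completed


@dataclass
class ImprovementArea:
    """A low-scoring assessment area rated on a priority scale."""
    name: str
    maturity_score: float  # from the assessment (1 = initial .. 5 = optimizing)
    priority: int          # 1 = highest, set against company goals and budgets
    goals: list = field(default_factory=list)


# Hypothetical example for a deliverables-management improvement
area = ImprovementArea(name="deliverables management",
                       maturity_score=1.5, priority=1)
area.goals.append(SmartGoal(
    specific="Introduce a controlled register for project deliverables",
    measurable="100% of new deliverables registered with a unique number",
    assignable="Document control lead (internal)",
    realistic="Builds on the existing document management system",
    time_bound="End of Q3",
))

# Work highest-priority, lowest-maturity areas first
backlog = sorted([area], key=lambda a: (a.priority, a.maturity_score))
```

Keeping the assessment score and the SMART fields on the same record is what lets management track both current status and future progression, as described below.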
SMART goals, combined with the outcome of the assessment, provide management with the instruments to measure current status as well as future progression. The defined goals provide a method to execute specific projects that will raise the level of maturity on the topic of data quality.