The common denominator that analytics hinge on
Identifying efficiencies in exploration and production is paramount for recovery and competitiveness
Digitalization is well suited to illuminating those efficiencies and enabling companies to act on them. This has spurred growing interest in digital tools and in advances in analytics, machine learning (ML) and artificial intelligence (AI). While these advances hold great promise for oil & gas companies, their effectiveness hinges on a rarely discussed common denominator: the quality of the data inputs.
For operators and oilfield service companies to receive accurate insights from such powerful digital tools, the vast amounts of data fed into analytics and other digital platforms must first be evaluated to ensure they are relevant and of high quality. A two-pronged approach is strongly recommended to ensure the data is trustworthy and valuable for decision making: weighing the data against the business's objectives, and applying quality control measures.
These high-level checks are crucial for removing data that does not contribute to a company's goals or that is duplicative or incomplete. The remaining data must then be filtered through a quality control lens to determine whether its values are reliable and whether it has been processed appropriately. Traditionally, much of this work has been performed by specialist subject matter experts, but as operators look to increase the speed, scale and accessibility of data, many of these workflows are being automated.
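As a simple illustration, the two prongs can be expressed as a lightweight screening pass. The sketch below uses Python with pandas; the column names, thresholds and rules are entirely hypothetical and would, in practice, be defined with the business and its subject matter experts.

```python
import pandas as pd

# Hypothetical well-log table; column names and thresholds are illustrative only.
logs = pd.DataFrame({
    "well_id":   ["W-1", "W-1", "W-2", "W-3"],
    "depth_m":   [1500.0, 1500.0, 1620.5, None],
    "porosity":  [0.18, 0.18, 1.40, 0.22],    # fraction, so values above ~0.6 are suspect
    "gamma_api": [85.0, 85.0, 60.0, 110.0],
})

# Prong 1: relevance and completeness. Drop exact duplicates and records
# missing the fields the business actually needs.
logs = logs.drop_duplicates()
logs = logs.dropna(subset=["well_id", "depth_m"])

# Prong 2: quality control on the values themselves. Flag, rather than
# silently delete, readings outside plausible physical ranges.
range_rules = {"porosity": (0.0, 0.6), "gamma_api": (0.0, 300.0)}
for col, (lo, hi) in range_rules.items():
    logs[f"{col}_suspect"] = ~logs[col].between(lo, hi)

print(logs)
```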
Quality control
When designing automated quality control (QC) processes, it is highly valuable to engage data consultants to assist with scoping and implementation. Drawing on 20 years of experience working with operators of all sizes, experienced data consultants work with internal stakeholders to identify the key workflows, rules and processes that allow end users to extract the most value from their data as efficiently as possible. Automated isolation, ranking and categorization of data at scale is imperative to maximize the accuracy and depth of the insights yielded for optimal bottom-line results.
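One way to picture that ranking and categorization step is to score each dataset on completeness and rule compliance, then route it into a processing tier. The function below is a minimal sketch; the weights, thresholds and tier names are assumptions for illustration, not a description of any particular product.

```python
import pandas as pd

def quality_score(df: pd.DataFrame, range_rules: dict) -> float:
    """Crude 0-1 score combining completeness with range-rule compliance."""
    completeness = 1.0 - df.isna().mean().mean()
    checks = [df[col].between(lo, hi) for col, (lo, hi) in range_rules.items() if col in df]
    compliance = float(pd.concat(checks, axis=1).mean().mean()) if checks else 1.0
    return 0.5 * completeness + 0.5 * compliance

def categorize(score: float) -> str:
    # Hypothetical tiers; a real workflow would tune these with end users.
    if score >= 0.9:
        return "analysis-ready"
    if score >= 0.7:
        return "needs review"
    return "quarantine"

# Example: score and categorize one dataset against simple porosity/gamma rules.
df = pd.DataFrame({"porosity": [0.18, 1.40, 0.22], "gamma_api": [85.0, 60.0, None]})
rules = {"porosity": (0.0, 0.6), "gamma_api": (0.0, 300.0)}
print(categorize(quality_score(df, rules)))
```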
The importance of the quality of an energy company's source data cannot be overstated, as the experience of a large crude oil producer demonstrates. In this case, the operator was seeking to maximize production from a series of highly stacked but thin turbidite sand beds, which had previously been drilled but remained untapped.
These thin but valuable target beds left little margin for error when completing the well. Prior to running casing and perforating, a suite of wireline logs was run and used to create a perforation plan. The data was compiled into Ikon Science's knowledge management solution, Curate, and compared against data previously obtained from the borehole, including whole core images.
Comparing these data types in a single platform enabled engineers to identify that the completion plan was subject to small but important depth errors. The perforation zones were remapped, and the zones successfully flowed commercial levels of hydrocarbons. Dependable access to clean, vetted, high-quality data in Curate enabled the resolution of a crucial issue ahead of an expensive perforation job, earning the producer an estimated $5 million on the well. The analytics generated enabled the producer to avoid a costly mistake, saving potentially tens of millions of dollars across the entire field.
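A depth mismatch of this kind can often be spotted by cross-correlating a log curve against a reference curve derived from the core data. The following is a minimal sketch of that idea, assuming both curves have been resampled onto the same regular depth grid; it only estimates a constant bulk shift, whereas real depth matching also handles stretch and squeeze, and it is not a description of the Curate workflow.

```python
import numpy as np

def estimate_depth_shift(log_curve, reference_curve, sample_m=0.1, max_shift_m=2.0):
    """Estimate a constant depth shift (meters) between two curves sampled on
    the same regular depth grid, via normalized cross-correlation."""
    log = (np.asarray(log_curve) - np.mean(log_curve)) / np.std(log_curve)
    ref = (np.asarray(reference_curve) - np.mean(reference_curve)) / np.std(reference_curve)
    max_lag = int(max_shift_m / sample_m)

    def corr(lag):
        # Overlap the two curves at the given lag and score the match.
        a, b = (log[lag:], ref[:len(ref) - lag]) if lag >= 0 else (log[:lag], ref[-lag:])
        return float(np.dot(a, b) / len(a))

    best_lag = max(range(-max_lag, max_lag + 1), key=corr)
    return best_lag * sample_m

# Synthetic check: a 0.5 m bulk shift is recovered from a noisy curve.
depths = np.arange(0.0, 50.0, 0.1)
core_curve = np.sin(depths)
log_curve = np.sin(depths - 0.5) + 0.05 * np.random.randn(len(depths))
print(estimate_depth_shift(log_curve, core_curve))   # approximately 0.5
```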
This is the kind of bottom-line impact that analytics, powered by clean data, can deliver for energy companies. Analytics tools can alert companies to anomalies and unusual geology, indicating whether rock characteristics are consistent with the subsurface model and whether drilling and production plans are optimal. Such insights prevent companies from wasting valuable financial and human resources.
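At its simplest, that kind of alerting can be a residual check between what was measured and what the subsurface model predicts. The sketch below uses an assumed z-score threshold and a hypothetical density log purely for illustration.

```python
import numpy as np

def flag_model_mismatch(measured, predicted, z_threshold=3.0):
    """Flag samples where a measured rock property deviates strongly from the
    subsurface model's prediction (simple residual z-score test)."""
    residuals = np.asarray(measured, dtype=float) - np.asarray(predicted, dtype=float)
    z = (residuals - residuals.mean()) / residuals.std()
    return np.abs(z) > z_threshold

# Hypothetical density log (g/cc) versus model predictions; only the 2.65 reading is flagged.
measured  = np.array([2.30, 2.31, 2.29, 2.30, 2.32, 2.28, 2.65, 2.30])
predicted = np.full(8, 2.30)
print(flag_model_mismatch(measured, predicted, z_threshold=2.0))
```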
Analytics value
Analytics and ML can amplify the rich experience of a company's personnel to drill better, safer and more profitable wells. These forms of digitalization require a symbiotic relationship between the modern technology that can process and integrate the huge volumes of data inherent in the energy industry and the human interpretation required to QC that data and act on the resulting insights.
Analytics are especially needed in today's climate, as skilled workers and capital can be difficult to secure. Given current challenges, companies can no longer afford to drill well after well while ignoring the data. Doing so puts them at risk of being unable to capitalize on the true potential of a field, a costly mistake.
As the industry evolves, analytics can leverage the subsurface asset data that companies already possess to help pivot and refine business activities. Some major companies are using analytics fueled by clean data not only to streamline operations and improve ROI, but also to evaluate ways to mitigate climate change, such as geothermal or carbon capture and sequestration initiatives. The ability to run the scenarios needed to determine the viability of such endeavors demonstrates that analytics are a fundamental necessity for navigating the industry's dynamics.
Analytics, ML and AI are pathways to make energy businesses smarter, more efficient and more profitable. However, those pathways rely upon quality data to yield such outcomes. Future growth and survival lie in ensuring that the appropriate data flows into analytics solutions that enable companies to identify opportunities, maximize productivity and control rising costs to achieve long-term business viability.
Original content can be found at Oil and Gas Engineering.