Four Steps to Advanced Data Science in the Oil and Gas Industry

When it comes to advanced data science – including machine learning and AI – there’s a perception that energy is lagging behind other industries, like retail or technology. While understandable, especially given the sheer visibility of AI solutions like online recommendations or ride-sharing apps, it’s also a bit unfair.

While “fail fast and fail often” is a common mantra in the tech industry, the amount of data and AI experimentation that energy companies can pursue is restricted: in energy, AI must be deployed into highly sophisticated systems with many variables at play, so trial and error is risky.

Also, many activities within oil and gas happen relatively infrequently, such as developing a well or field, so obtaining data at the required scale – crucial for a number of algorithms, such as deep learning – can be difficult. At the same time, available data is often highly commercially valuable, so there’s great incentive not to share it.

Therefore, the number of opportunities in which AI can be applied in oil and gas may be more limited than in other industries. However, when AI is applied to the appropriate areas, the impact can be considerable, even game-changing.

Oil and gas players understand the potential of advanced data science, and their level of investment in digital technologies reflects this. Since 2011, oil and gas startups have raised over $1 billion in seed and venture funding. In 2018, more than 35 percent of this funding was allocated to software, analytics and AI products. Between 2011 and 2018, over 700 U.S. oil and gas software patents were granted.

Frequently operating at the cutting edge of science and engineering, the oil and gas industry stands to benefit considerably from data-driven analytics. But to do so, it must get four key areas right: good problem formulation, data readiness, expertise availability and organizational enablement.

Addressing the Right Problem

Not all problems can – or should – be addressed using advanced analytical techniques. In general, AI-driven solutions are appropriate for two broad classes of problems: 1) complex business decisions that hinge on predictions inferred from data patterns and 2) automation of processes with complex but discernible underlying patterns.

For example, GE determined that it could improve the effectiveness of its equipment maintenance by applying predictive algorithms to heat loss data. By handling anomalies proactively, operators can avoid unplanned, costly downtime. However, certain critical components lack sensors and cannot easily be monitored by service engineers. In response, GE developed a heat-monitoring smartphone app that uses an iPhone equipped with a thermal camera to provide noninvasive monitoring. Thermal images can then be classified as normal or irregular based on engineers’ domain knowledge, providing a labeled dataset. This labeled data trains an image recognition algorithm, derived through machine learning, which then identifies when equipment needs repair.
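As a rough illustration of this labeled-image workflow, the sketch below trains a classifier on synthetic “thermal images” in Python. Everything in it is hypothetical: the data is randomly generated, and the simple linear model stands in for GE’s unpublished pipeline, which would more likely use a convolutional network on real imagery.

```python
# Hypothetical sketch of a labeled thermal-image classifier.
# The data is synthetic; this is not GE's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)

# Stand-in for engineer-labeled thermal images: 64x64 temperature grids
# flattened into feature vectors. Label 1 = irregular heat signature.
n_images = 400
images = rng.normal(loc=60.0, scale=5.0, size=(n_images, 64 * 64))
labels = rng.integers(0, 2, size=n_images)
images[labels == 1] += 15.0  # irregular equipment runs hotter on average

X_train, X_test, y_train, y_test = train_test_split(
    images, labels, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```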

Gathering the Right Data

AI and machine learning algorithms almost always require significant amounts of data, not least because separate training and testing datasets are needed to build a model and then validate it on examples it has not seen. The data must also be of sufficient quality and granularity, and representative of what’s being modeled.
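In practice, checking those three properties is often the first step of any modeling effort. The short Python sketch below shows the kind of readiness checks implied here; the telemetry table and its column names are invented for illustration.

```python
# Hypothetical data-readiness checks on invented well telemetry.
import numpy as np
import pandas as pd

telemetry = pd.DataFrame({
    "well_id": ["W1", "W1", "W2", "W2", "W3"],
    "pressure_psi": [2100.0, np.nan, 1900.0, 2500.0, 2200.0],
    "reading_time": pd.to_datetime(
        ["2018-01-01", "2018-01-02", "2018-01-01", "2018-01-03", "2018-01-01"]
    ),
})

# Quality: what fraction of each column is missing?
print(telemetry.isna().mean())

# Granularity: how far apart are successive readings at each well?
gaps = telemetry.sort_values("reading_time").groupby("well_id")["reading_time"].diff()
print(gaps.median())

# Representativeness: are all assets actually contributing data?
print(telemetry["well_id"].value_counts())
```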

BP was seeking to reduce fugitive emissions (emissions resulting from leaks or gases unintentionally released during industrial activities), which were significant in many of its mature fields. While engineers believed that machine learning could be effective in reducing fugitive emissions, they still needed data to develop and test appropriate models. But outfitting all of BP’s wells with sensors to gather this data would have been costly and hard to justify.

BP came up with an inexpensive way to gather data and test its hypothesis: it fixed Android phones to a selection of beam pumps, then combined the data gathered with historical maintenance logs and weather recordings. This allowed the team to test the algorithmic approach and prove the business case.
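The joining of those three sources is the crux of the approach. As a hedged sketch (the column names and values below are invented, not BP’s actual schema), the combination might look like this in Python:

```python
# Hypothetical merge of phone telemetry, maintenance logs and weather.
import pandas as pd

phone_vibration = pd.DataFrame({
    "pump_id": ["P1", "P2"],
    "date": pd.to_datetime(["2018-06-01", "2018-06-01"]),
    "rms_vibration": [0.42, 0.91],
})
maintenance = pd.DataFrame({
    "pump_id": ["P1", "P2"],
    "date": pd.to_datetime(["2018-06-01", "2018-06-01"]),
    "days_since_service": [12, 340],
})
weather = pd.DataFrame({
    "date": pd.to_datetime(["2018-06-01"]),
    "ambient_temp_f": [94.0],
})

# Join the three signals into one feature table a model can learn from.
features = (
    phone_vibration
    .merge(maintenance, on=["pump_id", "date"])
    .merge(weather, on="date")
)
print(features)
```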

Following this successful pilot, permanent sensors were installed, yielding large amounts of data on equipment telemetry and well conditions. Armed with this new data, the algorithm now provides recommendations to engineers, allowing them to make the necessary changes at each well to minimize fugitive emissions.

Assembling the Right Expertise

Of course, effective application of AI requires analytics expertise to ensure the right tools and technologies are implemented. But that alone is rarely enough. For the complex problems faced by the oil and gas industry, other capabilities are also critical to closing the gap between technical skills and commercial understanding.

To optimize its energy portfolio, Exelon wanted to accurately dispatch excess power generated by its wind turbines, but it needed a five-minute forecasting capability to predict when wind speed would change suddenly. The company was looking for an OEM-agnostic data aggregation and analytics solution, but didn’t have all the required capabilities and didn’t want the risk and cost of in-house development.

So, Exelon decided to partner with GE’s Renewables Data Science Team. Exelon provided the team with access to a year’s worth of turbine data to use in building and training machine learning models for wind ramp prediction.
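As a purely illustrative sketch of what short-horizon wind-ramp forecasting involves, the Python below trains a model to predict wind speed five minutes ahead from recent readings. The synthetic series, the lag features and the random forest are all assumptions made for the example; they are not GE’s Predix implementation.

```python
# Hypothetical five-minute-ahead wind speed forecast on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(seed=1)

# Synthetic wind speed (m/s), sampled once per minute for a week,
# with a daily cycle plus noise.
minutes = 7 * 24 * 60
t = np.arange(minutes)
wind = 8 + 3 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 0.8, minutes)

# Features: the last 10 one-minute readings; target: speed 5 minutes ahead.
lags, horizon = 10, 5
X = np.stack([wind[i : i + lags] for i in range(minutes - lags - horizon)])
y = wind[lags + horizon :]

# Train on the first 80 percent of the week, evaluate on the rest.
split = int(0.8 * len(X))
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X[:split], y[:split])
print("Test MAE (m/s):", np.abs(model.predict(X[split:]) - y[split:]).mean())
```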

GE used its Predix industrial IoT software within Exelon’s IT infrastructure for a purely software-based machine learning solution. The result was an increase in annual energy production of around three percent and a reduction in operating costs of 25 percent. The real-time forecasting model was also applied to longer-term forecasts, resulting in improved overall accuracy.

Ensuring the Right Organizational Configuration

A receptive organization is key to scaling up AI solutions. Senior leadership needs to be willing to step up, take ownership of the process and facilitate organizational buy-in so that personnel at all levels make the most of the new technologies.

Rio Tinto sought to combine its in-house mining and analytics expertise with the specialties of various partner companies (including Komatsu, Caterpillar and Amazon) to develop automation solutions for use in drilling, extraction and ore transportation.

To do this, Rio Tinto both leveraged specific partner strengths and focused on designing supportive organizational structures. It created a dedicated data science unit within a centralized innovation function to foster the spread of ideas across business units.

Rio Tinto succeeded in embedding cutting-edge automation as a central part of operations. Since 2014, it has been growing its use of automated haulage system trucks, which now make up about 20 percent of the fleet. The trucks lowered costs by 15 percent, and automated drills improved productivity by 10 percent.

It’s a daunting prospect to start an advanced analytics and machine learning initiative, especially in an industry as complex as oil and gas. Often, it makes most sense to think in terms of manageable short-term efforts (such as focusing on one or two problems of high value to the business, running pilots first, making the most efficient use of data and partnering when possible) that can be broadened into more ambitious longer-term initiatives (like building in-house capabilities, focusing on innovations with tangible and immediate benefits, providing stakeholder incentives and incorporating data analytics into core business activities).

One thing’s clear: Advanced data science applications have a place in the oil and gas industry, and the potential to yield tangible benefits is considerable. Innovation has always been at the core of the oil and gas industry – and many companies are already finding creative ways to implement data science solutions.

Author Profile

Stuart Robertson is a Senior Manager in L.E.K. Consulting’s London office, and leads L.E.K.’s Disruptive Analytics initiative. He has extensive experience across both public and private sectors, and has provided strategy and transaction support to clients in numerous industries.

Author Profile

Nilesh Dayal is a Managing Director and Partner in L.E.K. Consulting’s Houston office, and is head of the firm’s Oil & Gas practice. He has more than 20 years of experience advising clients on growth strategies related to acquisitions, new business ventures, corporate restructuring, supply chain management, profitability and operations improvement, and more.

Author Profile

Franco Ciulla is a Principal in L.E.K. Consulting’s Houston office. He has 25 years of experience working in the oil and gas industry in technical, operational, commercial and strategic roles, with a focus on upstream activities and oilfield supply chain strategies.

Author Profile

Amar Gujral is a Senior Manager in L.E.K. Consulting’s Houston office. He is focused on growth and commercial strategy, M&A, and due diligence in the energy sector.
