Halliburton's Octiv Auto Frac pairs automated frac control with the insights from AI-driven sensing of fracture propagation.
Idle time for a drilling rig costs about $250,000 an hour. AI is being embraced across the industry to keep that downtime from happening.
A while back, an energy major had an unfathomable, intermittent problem with its drilling operations. It was beginning to cost big money, and experts from IBM were called in. After an extensive AI and human examination of historical sensor data, the problem was traced to whale song reverberating in the drill pipe during the mating season. As a classic demonstration of the power of AI, the situation brought together the key components of data science: data, models, processing speed and the human mind.
The aim is never to take the human out of the loop, says Carol Lee Anderson, IBM's technology GM for the oil and gas industry, just to rid them of laborious and repetitive tasks and provide them with real-time decision support. "You still need somebody that can interpret the data and make the right decisions."
According to IBM, AI is technology that enables computers and machines to simulate human learning, comprehension, problem solving, decision making, creativity and autonomy. For oil and gas companies, IBM often puts it to use for asset management, operational efficiency and safety. It's an industry with big money and big risk. If things go wrong, people could die. "When the stakes are high, so are the benefits," says Anderson.
Directly underneath AI is machine learning (ML), which involves creating models by training an algorithm to make predictions or decisions based on data. Here it's important to guard against hallucinations - when a model perceives patterns or objects that are nonexistent and produces nonsensical or inaccurate outputs. "IBM is about risk reduction, and we back our governance in legal terms," she says.
"When the stakes are high, so are the benefits."
- Carol Lee Anderson, Technology GM for the oil and gas industry, IBM
Woodside and IBM have worked together to implement solutions that can extract meaningful insights from 30 years of dense and complex engineering data. Woodside finds that AI enables faster models that can explore larger solution spaces and identify optimal operating conditions more efficiently. The results can be validated with traditional chemical process simulation, leading to more efficient production processes, lower costs and reduced carbon emissions - for example, advanced analytics to optimize the mixed refrigerant used in the liquefaction process.
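The workflow Woodside describes - train a fast model, search a large solution space with it, then validate the candidate against a traditional simulation - can be illustrated with a minimal sketch. The "simulator" below is an invented toy stand-in for a full chemical-process model, and the flow-ratio numbers are made up for illustration:

```python
import numpy as np

# Hypothetical "expensive" process simulator: specific power (energy per
# tonne of LNG) as a function of mixed-refrigerant flow ratio. A real
# simulator would be a full chemical-process model taking minutes per run.
def simulate(ratio):
    return 280 + 120 * (ratio - 0.62) ** 2  # toy curve, minimum near 0.62

# 1. Sample the slow simulator sparsely.
ratios = np.linspace(0.4, 0.9, 8)
power = np.array([simulate(r) for r in ratios])

# 2. Fit a cheap surrogate model (here just a quadratic polynomial;
#    in practice this could be a neural network or Gaussian process).
surrogate = np.polynomial.Polynomial.fit(ratios, power, deg=2)

# 3. Explore a much larger solution space on the fast surrogate.
grid = np.linspace(0.4, 0.9, 10_000)
best = grid[np.argmin(surrogate(grid))]

# 4. Validate the candidate operating point against the traditional simulation.
print(f"surrogate optimum: ratio={best:.3f}, simulated power={simulate(best):.1f}")
```

The point of the pattern is that the expensive model is run only a handful of times, while the surrogate absorbs the thousands of evaluations needed to search the space.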
Woodside is also using generative AI - deep-learning models that can generate high-quality text, images and other content based on the data they are trained on. With safety a top priority, the company uses generative AI to review scopes of work on its assets to help identify lessons learned and relevant training. For example, it is developing an AI project to count birds at its operations in Trinidad and Tobago after a number of bird-related incidents involving helicopters. The solution uses CCTV and AI vision models to provide status updates on the number of birds near the offshore landing facility, protecting both workers and infrastructure.
Halliburton is using natural language processing (NLP) in the design of oil and gas wells. NLP is AI that uses ML to enable computers to understand and communicate in human language. Milos Milosevic, Halliburton's Senior Director, Digital Well Construction, says: "We use NLP algorithms to help read many text entries captured by experts from previous wells to suggest optimal design features. Likewise, we use NLP to read industry standards and documentation to extract relevant sections for consideration in the current well."
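The retrieval step Milosevic describes - matching a current well's problem against free-text entries from previous wells - can be sketched with classic TF-IDF similarity. The well entries and query below are invented examples, and this is a bare-bones illustration, not Halliburton's method:

```python
import math
from collections import Counter

# Toy corpus of text entries from previous wells (invented examples).
entries = [
    "Severe vibration at 9000 ft; switched to a stiffer BHA design",
    "Lost circulation in the depleted sand; lowered mud weight",
    "Bit balling in reactive shale; raised flow rate and changed bit",
]

def tf_idf_vectors(docs):
    """Build simple TF-IDF vectors for a list of documents."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter(word for doc in tokenized for word in set(doc))
    idf = {w: math.log(n / df[w]) + 1 for w in df}
    vecs = [{w: c * idf[w] for w, c in Counter(doc).items()} for doc in tokenized]
    return vecs, idf

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vectors, idf = tf_idf_vectors(entries)

# Rank past entries by relevance to the current well's problem description.
query = "high vibration while drilling, considering bha changes"
q_vec = {w: c * idf.get(w, 1.0) for w, c in Counter(query.split()).items()}
ranked = sorted(range(len(entries)), key=lambda i: cosine(q_vec, vectors[i]),
                reverse=True)
print(entries[ranked[0]])
```

Production systems would use trained language models rather than raw word counts, but the shape of the task - score every historical entry against the current situation and surface the best matches - is the same.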
Halliburton has introduced the next generation of its LOGIX automation and remote operations platform that responds to changes in geological formations with the analysis of data from adjacent wells and updates the drilling plan with live data. Image courtesy Halliburton
Halliburton has also introduced the next generation of its LOGIX automation and remote operations platform, which leverages downhole data to assist with autonomous drilling. LOGIX responds to changes in geological formations with the analysis of data from adjacent wells and updates the drilling plan with live data. The platform's latest developments use ML to boost drilling efficiency, fine-tune shoe-to-shoe performance and predict bit wear with more accuracy, and it can be coupled with Halliburton's iCruise Force intelligent, high-performance motorized rotary steerable system.
The company has also developed the first-ever automation service that enables customers to execute their fracture design without human intervention. Octiv Auto Frac pairs automated frac control with the insights from AI-driven sensing of fracture propagation. This automates thousands of decisions while pumping, based on job designs and pre-job control inputs, with constant response to dynamic stimulation conditions.
Milosevic sees cloud computing as seminal for the industry. "Many problems could either not be solved before due to the lack of processing power and attached storage or could be solved partially by very few with large resources. In parallel with the increase in cloud processing capabilities, the industry has realized that we need to liberate data from various compartmentalized databases.
"We are also able to deploy economically more processing at the rig site and in downhole tools to automate corrective actions based on sensor inputs. This is fueling a revolution in upstream oil and gas automation and AI-based decisions."
This year, SLB launched its Lumi data and AI platform, which will be available on all major cloud service providers as well as on-premises. It includes large language models - models trained on vast amounts of data that can understand and generate natural language and other content to perform a wide range of tasks. These models help contextualize data across domains so customers can scale advanced AI workflows using generative AI. "As we navigate the delicate balance between energy production and decarbonization, generative AI is emerging as a crucial catalyst for change," says Olivier Le Peuch, chief executive officer, SLB.
To come is agentic AI - a system or program that is capable of autonomously performing tasks on behalf of a user or another system by designing its workflow and using available tools. The system has "agency" to make decisions, take actions, solve complex problems and interact with external environments. Software developer eDrilling is developing a drilling agent it says would behave like an experienced engineer to free up human engineers for more strategic activities.
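The "agency" described above boils down to a loop: observe, decide on a tool, act, repeat until the goal is met. The sketch below illustrates that loop with invented tool names and a hard-coded rule in place of the planner; a real drilling agent like eDrilling's would plan with an LLM and act on live rig data:

```python
# Minimal agentic-AI loop: the agent picks tools to reach a goal.
# Tool names, readings and the rule-based "policy" are invented for
# illustration only.

def read_sensors():
    return {"torque": 9.5, "rop": 42}           # fake downhole readings

def adjust_weight_on_bit(delta):
    return f"WOB adjusted by {delta} klbs"

TOOLS = {"read_sensors": read_sensors,
         "adjust_weight_on_bit": adjust_weight_on_bit}

def agent_step(state):
    """Decide the next action from current state (stand-in for an LLM planner)."""
    if "readings" not in state:
        return ("read_sensors", ())
    if state["readings"]["torque"] > 9.0:       # high torque: ease off the bit
        return ("adjust_weight_on_bit", (-2,))
    return None                                  # goal reached, stop

state, log = {}, []
while (action := agent_step(state)) is not None:
    name, args = action
    result = TOOLS[name](*args)
    log.append((name, result))
    if name == "read_sensors":
        state["readings"] = result
    else:
        state["readings"]["torque"] -= 1.0       # pretend the action helped

print(log)
```

What distinguishes this from plain automation is that the sequence of actions is not scripted in advance: the agent chooses its next tool from the state it observes, which is why such systems can take on open-ended engineering tasks.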
Beyond that, quantum computing is expected to dwarf even the gains AI brings to data processing and interpretation. Where a supercomputer might take a year to crunch a load of data, a quantum computer could take just a few hours. IBM, ExxonMobil, Woodside and others are already involved.
SLB's Lumi data and AI platform will be available on all major cloud service providers as well as on-premises. Image courtesy SLB