[Editor’s note: A version of this story appears in the April 2019 edition of E&P magazine.]
A once quiet debate is nearing critical mass, with repercussions across the oil and gas space. Indeed, the arrival of sub-$60 oil suggests that threshold may be at hand.
The debate stems from the vexatious issue of parent/child well interaction and the related phenomenon of frac hits, in which stimulation of a new well interferes with nearby producing wells. Well interference implies limits on recoverable reserves and threatens reserve-based lending, the financial model that underwrites capital-intensive oil and gas development.
Essentially, if the first well drilled on a section is the best the industry will ever see from that section, then each infill well underperforms it: interference diminishes recovery, strands reserves and threatens sustainable profitability.
The rising volume of trade presentations and the growing mention of well interference in energy company earnings calls indicate the problem is expanding, especially as E&P companies move inexorably toward full-field development in tight formation oil and gas.
At its root, the debate involves whether the old paradigm of physics-based modeling will yield to a new paradigm via data-driven artificial intelligence (AI) and machine learning. Intertwined in the debate is whether the way forward should be dominated by domain experts or whether the industry needs to go “open source” by making room for data experts who may have only a passing knowledge of oil and gas.
Traditional reservoir simulation and decline curve projections rest on decades-old, physics-based methods that worked admirably in conventional settings, where each model earned its own “street cred.”
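For context, the decline-curve side of that toolkit is typified by the Arps relations, a standard textbook formulation rather than anything specific to this article. The production rate q(t) falls from an initial rate q_i according to an initial decline rate D_i and a hyperbolic exponent b:

q(t) = \frac{q_i}{\left(1 + b\,D_i\,t\right)^{1/b}}

The limit b = 0 recovers exponential decline, q(t) = q_i e^{-D_i t}, and b = 1 gives harmonic decline. Hand-picking b and D_i from early production history is precisely where assumptions, and thus bias, can enter a type curve.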
The problem has been adapting both models to tight formation oil and gas. This is not the first time such issues have vexed a discipline: Albert Einstein’s general relativity produced equations about the universe that withstand the test of time at cosmic scale but run into trouble at the quantum level.
The analogy loosely applies to oil and gas. Reservoir simulation is based, in part, on differential equations of fluid flow whose coefficients are treated as constants. What works in a conventional reservoir runs into trouble at the nanodarcy scale associated with tight formation oil and gas.
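For a sense of what those equations look like, the classical single-phase workhorse, again a textbook form rather than anything cited in this article, is the pressure diffusivity equation derived from Darcy’s law and conservation of mass:

\frac{\partial p}{\partial t} = \frac{k}{\phi \mu c_t} \nabla^2 p

Here p is pressure, k permeability, \phi porosity, \mu fluid viscosity and c_t total compressibility. Treating those coefficients as well-behaved constants is reasonable in a conventional reservoir; in nanodarcy rock with stress-dependent permeability and fracture-dominated flow, that assumption is exactly what breaks down.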
Think of it this way: The old paradigm for addressing reservoir development is “input + rules = output,” or “Mother Nature + man-devised rules = output.”
The problem is the middle term of the equation, which injects personal assumptions into the mix and adds bias, whether the result is overly inflated type curves for tight formation oil and gas or unanticipated well interference. Evidence of that bias lies in the fact that well interference and frac hits are getting worse, not better.
This implies domain experts have become captives of their own biases amid broader industry inertia.
The new paradigm is the potential embodied in data-driven AI, which postulates “input + output = rules” as an approach to the multivariate, asynchronous Rubik’s Cube of downhole challenges. Think of it as the “Mother Nature + output = rules” paradigm, which eliminates bias by treating each well, each pad and each 3-D cube of earth as unique but interrelated.
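To make the contrast concrete, here is a minimal sketch of the “input + output = rules” idea in Python. The well data and variable names are hypothetical, and the simple linear fit stands in for whatever machine-learning method an operator might actually deploy; the point is only that the coefficients (the “rules”) are inferred from observed inputs and outputs rather than assumed up front.

import numpy as np

# Hypothetical observations per well: [lateral length (ft), well spacing (ft)]
inputs = np.array([
    [7500.0,  660.0],
    [9800.0,  880.0],
    [10200.0, 1320.0],
    [8400.0,  990.0],
])

# Hypothetical outputs: 12-month cumulative oil per well (Mbbl)
outputs = np.array([210.0, 265.0, 310.0, 240.0])

# "Input + output = rules": infer model coefficients from the data alone,
# with no man-devised physics inserted in the middle of the equation.
design = np.column_stack([inputs, np.ones(len(inputs))])  # add intercept
rules, _, _, _ = np.linalg.lstsq(design, outputs, rcond=None)

# The learned rules can then score an unseen well/pad configuration.
candidate = np.array([9000.0, 1100.0, 1.0])  # lateral, spacing, intercept
print("inferred coefficients:", rules)
print("predicted 12-month cum (Mbbl):", candidate @ rules)

In practice, the same framing scales to the full 3-D cube of earth described above, with spacing, stacking, completion design and parent-well depletion all entering as inputs.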
For the most part, domain experts remain passionately skeptical about the new data-driven paradigm. Unfortunately, a low commodity price environment does not accommodate passion as a solution.
While operational efficiency will help the industry adjust to lower prices, it alone will not solve the challenge of generating sustainable profitability. The path forward calls for collaboration between the data gurus and industry domain experts. Ford, GM and Toyota did great with cars, but Silicon Valley postulated that driving could be reframed as a computer science problem, which is why it leads the way on autonomous vehicles.
That analogy applies to oil and gas.