Refining Oil and Gas Discovery with Deep Learning

Over the last two years, we have highlighted deep learning use cases in enterprise areas including genomics, large-scale business analytics, and beyond, but many market areas are still building a profile for where such approaches fit into existing workflows. Even where model training and inference might be useful, areas with complex simulation-driven workflows stand to gain great efficiencies from deep neural nets, yet integrating those elements is difficult.

The oil and gas industry is one area where deep learning holds promise, at least in theory. For some steps in the resource discovery workflow, deep learning could lead to faster and more accurate results for potential discovery zones. Reservoir characterization is a critical step in this discovery process and is currently a hot area for explorations into how deep learning might be applied. Characterization uses seismic image data to analyze various features of potential resource sites. Making this process accurate is critical, hence a great deal of supercomputing power goes into resource discovery to ensure drilling investments are not wasted on sites that do not yield.

Machine learning methods have been applied to reservoir characterization for decades already, but these models are becoming more sophisticated and can benefit from algorithmic advances in deep learning. As one group of researchers notes in their analysis of existing machine learning methods for reservoir characterization, the very problem of reservoir uncertainty is better suited to machine learning and deep learning than traditional numerical approaches.

“Natural porosity, water saturation, permeability cannot be adequately estimated by linear relations. With relevant log measurements representing the dynamics of the subsurface, these techniques have proved to have the capability to use the available log-core pairs to predict the missing reservoir properties of the uncored but logged sections of the reservoir.”

They go on to note that these approaches get these results by “establishing non-linear relations between the log measurements and the core values for prediction—techniques that have been reported to outperform the statistical regression tools.”
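To make that concrete, the sketch below shows the general shape of such a non-linear regression: a small multilayer perceptron mapping well-log measurements onto a core-derived property such as porosity. The feature set, the synthetic data, and the scikit-learn model are our own illustrative assumptions, not the setup from the cited work.

```python
# A minimal sketch (not the researchers' actual pipeline) of learning a
# non-linear mapping from well-log measurements to core-derived porosity.
# Feature names and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Pretend log measurements for cored intervals: gamma ray, resistivity,
# bulk density, neutron porosity (one row per depth sample).
n = 2000
logs = rng.normal(size=(n, 4))
# Synthetic "core" porosity with a deliberately non-linear dependence on the logs.
porosity = 0.25 - 0.05 * np.tanh(logs[:, 0]) + 0.03 * logs[:, 1] * logs[:, 3] \
           + rng.normal(scale=0.01, size=n)

X_train, X_test, y_train, y_test = train_test_split(logs, porosity, random_state=0)

# Small multilayer perceptron standing in for the non-linear regression step.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Predict porosity for "uncored but logged" intervals (here, the held-out split).
print("R^2 on held-out logs:", round(model.score(X_test, y_test), 3))
```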

Existing machine learning approaches include support vector machines (SVM), self-organizing maps, decision trees, ensemble methods (such as random forests), case-based reasoning, and multivariate adaptive regression splines. As a group from the China University of Geosciences describes, other methods, including artificial neural networks, also play a role, but these use cases are still somewhat limited.
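For a rough sense of how a couple of these established methods line up against a simple baseline, here is a hedged comparison of linear regression, an SVM regressor, and a random forest on synthetic log/core pairs; the data and hyperparameters are assumptions chosen only to show the workflow, not results from any of the cited studies.

```python
# Hedged illustration of two of the methods named above (SVM regression and a
# random forest) against a linear baseline on synthetic log/core pairs; the
# data and feature count are assumptions, not from the cited work.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(1500, 5))                      # stand-in log measurements
y = np.sin(X[:, 0]) * X[:, 1] + 0.5 * X[:, 2] ** 2  # non-linear "core" property
y += rng.normal(scale=0.1, size=len(y))

for name, est in [
    ("linear regression", LinearRegression()),
    ("SVM (RBF kernel)", SVR(C=10.0)),
    ("random forest", RandomForestRegressor(n_estimators=200, random_state=1)),
]:
    score = cross_val_score(est, X, y, cv=5, scoring="r2").mean()
    print(f"{name:>18}: mean R^2 = {score:.3f}")
```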

“Currently, data analysis methods play a central role in geosciences. While gathering large collections of data is essential in the field, analyzing this information becomes more challenging….Considering the significant capabilities of machine learning, it seems to be an efficacious approach to handle this type of information.”

Of course, as this team, among others, points out, there is still the perception that newer methods for reservoir characterization and other oil and gas workloads create a “black box” problem. In short, more development is needed for neural networks in particular before they enter production. There are companies working towards this, including global oil and gas giant Chevron.

Although standard machine learning techniques have a long history in the reservoir characterization workflow, deep learning is something of a newcomer. Chevron is among several companies whose researchers are considering how deep learning might fit into existing reservoir characterization workflows. In 2016 a team from the company led an effort to employ deep neural networks and fuzzy kriging (a numerical method applied to imperfect data, common in this field) to analyze the viability of a reservoir in California’s San Joaquin Valley. This site proved a good candidate for the experimental approach because of the complexity of its geological features (diverse and heavy with clay, silt, sandstone, and shale).

The team found that these new models “better capture the uncertainty associated with the characteristics of these reservoirs,” and that “once successfully trained, the system was applied to generate a 3D earth model and estimate reservoir properties at any point in the field.” In addition to helping pinpoint valuable locations, the resolution of the model improved with a trained network versus using kriging estimates alone. “Additional capability includes generation of synthetic logs for wells anywhere in the reservoir, with significant application for infill drilling.” The team sees future application for this approach in cases where there are faults and other complex formations.
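The “estimate reservoir properties at any point in the field” idea can be sketched loosely as a model that takes field coordinates and returns a property value, which also yields synthetic logs when queried along a hypothetical wellbore. The toy below is our own assumption-heavy illustration of that querying pattern, not Chevron’s deep-network-plus-fuzzy-kriging system.

```python
# A loose sketch of querying a trained model at arbitrary field coordinates to
# produce synthetic logs; an assumption-laden toy, not the Chevron system.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Training data: (easting, northing, depth) samples at existing wells, with a
# measured property (say, porosity) at each sample.
xyz = rng.uniform(0, 1, size=(5000, 3))
prop = 0.2 + 0.1 * np.sin(3 * xyz[:, 0]) * np.cos(2 * xyz[:, 1]) - 0.05 * xyz[:, 2]
prop += rng.normal(scale=0.005, size=len(prop))

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=2)
net.fit(xyz, prop)

# "Synthetic log" for a hypothetical infill well: fix a surface location and
# sweep depth, asking the model for the property along the wellbore.
depths = np.linspace(0, 1, 50)
well = np.column_stack([np.full_like(depths, 0.4), np.full_like(depths, 0.7), depths])
synthetic_log = net.predict(well)
print(synthetic_log[:5])
```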

From another point of view, a team from the SAS Institute found that merging traditional seismic analysis methods and deep learning approaches led to more efficient resource discovery in an upstream exploration use case that sought patterns in seismic image data. During the preprocessing phase, “patches” are created from larger image sets that can “reduce the number of features required to represent an image and can decrease the training time needed by algorithms to learn from the images. If a training label is available, a classifier is trained to identify patches of interest.” In an unsupervised learning approach, a network can create a series of representative patches that can help seismic analysts home in on promising locations.
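The patch workflow described above can be sketched in a few lines: tile a seismic section into small patches, then group them so analysts only need to review a handful of representative patches rather than the full volume. Patch size, cluster count, and the stand-in “image” below are assumptions for illustration; the SAS work is not limited to this simple clustering.

```python
# A minimal, hedged sketch of the patch idea: tile a seismic image into small
# patches and cluster them so a handful of representatives can be inspected.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
image = rng.normal(size=(256, 256))   # stand-in for one seismic section
patch = 16                            # patch edge length in pixels

# Cut the image into non-overlapping 16x16 patches and flatten each to a vector.
patches = (
    image.reshape(256 // patch, patch, 256 // patch, patch)
         .transpose(0, 2, 1, 3)
         .reshape(-1, patch * patch)
)

# Unsupervised grouping; the cluster centers act as "representative patches".
kmeans = KMeans(n_clusters=8, n_init=10, random_state=3).fit(patches)
representative = kmeans.cluster_centers_.reshape(-1, patch, patch)
print("patches:", patches.shape, "-> representative patches:", representative.shape)
```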

According to a team drawn from major oil and gas-focused academic institutions in Saudi Arabia, there is great value in adopting non-linear approaches to reservoir characterization in particular. Such a method would require a hybrid system that pairs traditional simulation with deep learning methods. Existing hybrid models “are mostly limited to genetic algorithm with neuro-fuzzy and artificial neural network coupled with fuzzy logic.” They note that “though some hybrid techniques may have been applied in petroleum reservoir characterization, there are still rooms for further exploration and investigation. A lot of hybrid learning possibilities are yet to be discovered.”
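One loose reading of “an artificial neural network coupled with fuzzy logic” is to fuzzify each log measurement into low/medium/high membership degrees and feed those degrees to a small network, as in the hedged sketch below. The membership functions, data, and architecture are our assumptions rather than a reconstruction of the hybrid models the team surveys.

```python
# A loosely sketched ANN-plus-fuzzy-logic hybrid: fuzzify each scaled log
# measurement into low/medium/high degrees, then train a small network on them.
# Membership functions, data, and architecture are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

def triangular_memberships(x):
    """Degrees of membership in 'low', 'medium', 'high' for values scaled to [0, 1]."""
    low = np.clip(1.0 - 2.0 * x, 0.0, 1.0)
    med = np.clip(1.0 - 2.0 * np.abs(x - 0.5), 0.0, 1.0)
    high = np.clip(2.0 * x - 1.0, 0.0, 1.0)
    return np.stack([low, med, high], axis=-1)

rng = np.random.default_rng(4)
logs = rng.uniform(0, 1, size=(1000, 3))                    # scaled log measurements
target = logs[:, 0] * (1 - logs[:, 2]) + 0.1 * logs[:, 1]   # synthetic property

# Fuzzify every column, then flatten the membership degrees into feature vectors.
fuzzy_features = triangular_memberships(logs).reshape(len(logs), -1)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=4)
net.fit(fuzzy_features, target)
print("training R^2:", round(net.score(fuzzy_features, target), 3))
```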

One of the advantages oil and gas companies have over some other market segments is that they are often already equipped with the HPC hardware required to train deep neural networks. GPUs, generally considered the best way to accelerate training, are already used for other parts of oil and gas workflows, particularly on the subsurface modeling side. With more research, and this appears to be just beginning, deep learning could make resource discovery more efficient and accurate. We expect this to be one of the emerging deep learning research areas in the next few years.
