Exploration efficiency
Prototyping future geoscience data organization and analytics tools for improved exploration workflows
Geoscience workflows applied in hydrocarbon exploration cover the acquisition, processing and interpretation of vast amounts of data, e.g. seismic and wellbore data (Big Data). These are extensive, time-consuming and costly processes. Most of these workflows are expert-driven, with geoscientists being closely involved in data interpretation and providing inputs to high-level investment decisions. The rapidly growing amount of available data (ever-increasing volumes of log data, multiple measurements of a single parameter, higher-resolution data, etc.) and the reduced time allocated for its analysis and for high-confidence decision making make these tasks even more challenging and call for new methods. The development of fully or partly automated, data-driven methods such as machine and deep learning, in combination with rapidly increasing computational power, opens novel opportunities for the geoscience community.
Challenges and opportunities
The oil and gas industry is acquiring an ever-increasing amount of costly subsurface data, both at the earth’s surface, through seismic (2D or 3D) and repeated seismic (4D) surveying, and in wellbores, where thousands of drilling and wireline logs are acquired. The amount of data acquired in, for example, drilling and wireline logs is nowadays far beyond the current standard from the NPD’s “Blue book”. Fast and efficient processing and interpretation of these vast amounts of data is still a challenge. This is partly because the geoscience workflows are still highly manual and, hence, limited by geoscientists’ capacity to digest and analyze data. The other challenge is the time and effort needed to access all acquired and processed seismic and wellbore data, which are vast, heterogeneous and poorly structured, and require quality control and subsequent corrections.
To address these challenges, one can leverage recent developments in computational power and data analytics methods, including methods like machine learning. These kinds of tools can explore hidden connections among different physical quantities through automated analysis of all acquired field measurements, seismic and wellbore data. Like any other automated technologies, these methods require high data quality and a data infrastructure that is optimized for their use.
Automated methods for data quality control as well as data organization suitable for streamlined application of data-driven/machine learning algorithms are prerequisites for automating the current geoscience workflows or even for completely replacing them with more efficient workflows enabled by machine learning tools.
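As a concrete illustration of such automated quality control, the sketch below flags individual wireline-log samples that are null markers or physically implausible. It assumes the common LAS null value −999.25 and an illustrative valid range for a gamma-ray curve; real QC rules would be curve-, tool- and vendor-specific.

```python
# Minimal sketch of automated QC for a single wireline-log curve.
# NULL_VALUE follows the common LAS convention; the valid range below
# is illustrative, not a standard.

NULL_VALUE = -999.25

def qc_log_curve(samples, valid_min, valid_max):
    """Classify each sample as 'ok', 'null', or 'out_of_range'."""
    flags = []
    for value in samples:
        if value == NULL_VALUE:
            flags.append("null")
        elif not (valid_min <= value <= valid_max):
            flags.append("out_of_range")
        else:
            flags.append("ok")
    return flags

# Example: gamma-ray samples in API units, assumed valid range 0-300 API.
gamma_ray = [45.2, 110.7, -999.25, 512.0, 88.3]
print(qc_log_curve(gamma_ray, valid_min=0.0, valid_max=300.0))
# ['ok', 'ok', 'null', 'out_of_range', 'ok']
```

In a streamlined workflow, such flags would feed directly into the pre-processing step rather than being inspected manually, which is what makes the downstream machine-learning application practical.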
Achieving the ambition of digitalized workflows will require intelligent integration of, and easy access to, “all” available subsurface data: well logs, drilling data, laboratory data, all kinds of metadata and geophysical field data, e.g. surface seismic. It is especially in the storage of drilling and wireline log data where better data organization, easy access, automated data quality control and tracking of updates to the metadata associated with each single log measurement can lead to significant value creation.
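One way to picture such metadata tracking is a per-curve record that carries its own audit trail, so every correction remains traceable. The sketch below is a minimal illustration; the field names (well, mnemonic, unit, history) and the well name are hypothetical, not a proposed standard.

```python
# Hypothetical sketch of a well-log record with per-curve metadata and an
# update history, so corrections to a measurement stay traceable.

from dataclasses import dataclass, field

@dataclass
class LogCurve:
    well: str          # wellbore identifier
    mnemonic: str      # curve name, e.g. "GR" for gamma ray
    unit: str          # measurement unit, e.g. "API"
    samples: list      # measured values, index-aligned with depth
    history: list = field(default_factory=list)  # audit trail of edits

    def update_unit(self, new_unit, reason):
        """Change the unit and record the edit in the curve's history."""
        self.history.append(f"unit: {self.unit} -> {new_unit} ({reason})")
        self.unit = new_unit

curve = LogCurve(well="25/8-1", mnemonic="GR", unit="GAPI", samples=[45.2, 88.3])
curve.update_unit("API", "harmonize units across wells")
print(curve.unit)     # API
print(curve.history)  # ['unit: GAPI -> API (harmonize units across wells)']
```

Keeping the history on the curve itself, rather than in a separate log, is one design choice that makes each measurement self-describing for downstream machine-learning code.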
Replacing humans with machine-learning algorithms in an expert-driven workflow is of course very difficult, since the experience and knowledge of geoscientists will always be a crucial ingredient in geoscience workflows. However, seamless and easier access to all acquired data, combined with machine learning-based approaches and fully automatic low-level subsurface data analysis, may deliver high-confidence results with enough speed and precision to make exploration and production assets more efficient.
If successful, the new data-driven workflows may unlock new opportunities in terms of more complete and comprehensive data analysis and, not least, may free the creativity and capacity of geologists and geophysicists to tackle new or more complicated subsurface challenges.
Research strategy
The BRU21 program area “Exploration Efficiency” focuses on the following main research directions:
1. Data organization: Establish the foundations for new well databases that machine learning code can easily access.
2. Automatic data quality analysis: Develop algorithms for automatic/automated quality assessment and pre-processing of geoscience data.
3. Data analytics: Develop data analytics methods for both well and seismic data in the quest to automatically detect and classify features within the data sets. This will be attempted with both supervised and unsupervised machine learning methods. Precision and accuracy metrics are to be developed to check and control the performance of the individual methods.
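The precision and accuracy metrics mentioned in direction 3 can be sketched as follows, here computed for a hypothetical facies classifier. The label names ("sand", "shale") and the label vectors are illustrative only.

```python
# Minimal sketch of accuracy and per-class precision for evaluating a
# feature classifier on well data. Labels below are illustrative.

def accuracy(y_true, y_pred):
    """Fraction of samples classified correctly."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def precision(y_true, y_pred, label):
    """Of the samples predicted as `label`, the fraction that truly are."""
    predicted = [t for t, p in zip(y_true, y_pred) if p == label]
    if not predicted:
        return 0.0
    return sum(t == label for t in predicted) / len(predicted)

y_true = ["sand", "shale", "sand", "shale", "sand"]
y_pred = ["sand", "sand", "sand", "shale", "shale"]

print(accuracy(y_true, y_pred))           # 0.6
print(precision(y_true, y_pred, "sand"))  # 0.666...
```

Tracking both metrics matters here: a classifier can reach high accuracy on imbalanced well data while its precision for a rare but economically important class remains poor.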