An innovative Fault Pattern Detection Methodology has been applied using a combination of Machine Learning techniques to produce a seismic volume suitable for fault interpretation in a structurally and stratigraphically complex field.
Through theory and results, the main objective was to demonstrate that a combination of ML tools can generate superior results in comparison with traditional attribute extraction and data manipulation through conventional algorithms.
The ML technologies applied are a supervised, deep-learning fault classification followed by an unsupervised, multi-attribute classification combining fault probability and instantaneous attributes.
The results are encouraging, showing a higher level of structural detail when compared with other interpretation techniques. Furthermore, enhanced visualization to better define stratigraphic relationships has also been partially achieved by combining the fault probability volumes obtained from the ML CNN fault detection procedure with a multi-attribute classification of seismic features using SOM (Self-Organizing Maps).
Machine Learning technology was applied to a reprocessed seismic dataset in the depth domain to generate a detailed, robust and reliable seismic fault attribute volume. The results are being used to construct a more confident structural framework in the area and to better understand stratigraphic trends and relationships that serve as input for static modeling.
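As a rough illustration of how the two ML steps can be chained, the sketch below assumes NumPy arrays for the fault-probability and instantaneous-attribute volumes and uses the open-source MiniSom package rather than the commercial implementation used in the study; the 8x8 map size and iteration count are illustrative assumptions.

```python
# Minimal sketch of the unsupervised step: combining a CNN fault-probability volume
# with instantaneous attributes in a Self-Organizing Map (SOM).
import numpy as np
from minisom import MiniSom  # pip install minisom

def som_classify(fault_prob, inst_attrs, map_rows=8, map_cols=8, iters=5000):
    """fault_prob: 3D ndarray; inst_attrs: list of 3D ndarrays of the same shape."""
    # One feature vector per seismic sample, each attribute standardized.
    feats = np.stack([fault_prob] + list(inst_attrs), axis=-1)
    feats = feats.reshape(-1, 1 + len(inst_attrs))
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)

    som = MiniSom(map_rows, map_cols, feats.shape[1], sigma=1.0, learning_rate=0.5)
    som.random_weights_init(feats)
    som.train_random(feats, iters)

    # Assign every sample to its winning neuron -> a discrete classification volume.
    # (Looped per sample for clarity, not speed.)
    winners = [som.winner(v) for v in feats]
    labels = np.array([i * map_cols + j for i, j in winners])
    return labels.reshape(fault_prob.shape)
```

Each neuron of the resulting map can then be inspected, colored, or isolated to highlight fault-related classes in the 3D volume.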
Bachelor’s Degree in Geophysical Engineering from Central University in Venezuela with a specialization in Reservoir Characterization from Simon Bolivar University.
Over 20 years of exploration and development geophysics experience, with extensive 2D and 3D seismic interpretation including acquisition and processing.
Aldrin spent his formative years working on exploration activity at PDVSA Venezuela, followed by a period working as a G&G consultant for a major international consulting company in the Gulf of Mexico (Landmark, Halliburton). Later he worked at Helix in Scotland, UK, on producing assets in the Central and South North Sea. From 2007 to 2021, he worked as a Senior Seismic Interpreter in Dubai, involved in different dedicated development projects in the Caspian Sea.
Dr. Elita De Abreu has Master's and Bachelor's degrees in Physics from the State University of Campinas in Brazil and a Ph.D. in Geophysics from the University of Houston. For the past 15 years, she has worked with the PETROBRAS geoscience exploration team, focused on applying and developing tools for Quantitative Interpretation, such as Rock Physics, AVO, Seismic Inversion, and Multi-Attribute analysis. Most recently, Dr. De Abreu has been leading digital transformation projects across different disciplines, integrating multi-scale data in a multi-disciplinary environment. She is motivated by challenging projects, where she can share her knowledge and learn through other people's experiences. She also has a great interest in basic sciences and supports educational causes, particularly STEM and women in science. In her free time, Dr. De Abreu enjoys contact with nature and reading about diverse topics.
Offset wells proximal to new hydraulically fractured wells often see intense pressure spikes that can damage the structure of the well, promote sand production and/or impact post-completion production. To mitigate these impacts, operators may shut in all offset wells within a radius of the new completion. Determination of the shut-in radius is often based on analogous operations and experience alone, and tends to be conservatively derived, potentially leading to the unnecessary shut-in of wells that might otherwise not experience any pressure event. Shutting in too many wells can be the largest expense incurred by a new completion, as operators not only work over the offset wells but also lose production for the entirety of the completion job. On the other hand, an underestimated shut-in radius might enhance fracture-driven interference during a completion job. An analytics and machine learning approach is presented to better understand the area of concern for offset wells and provide a data-driven recommendation for the shut-in radius. The approach has been applied to a large field data set and indicates that historical data can be used to quantify the zone of communication and provide recommendations for future operations.
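One minimal, hypothetical way to turn such historical data into a shut-in radius recommendation is sketched below; the inputs, the logistic model, and the 10% risk tolerance are illustrative assumptions, not the method presented in the talk.

```python
# Toy sketch: derive a data-driven shut-in radius from historical frac-hit records.
import numpy as np
from sklearn.linear_model import LogisticRegression

def shutin_radius(distances_m, hit_observed, risk_tolerance=0.10):
    """distances_m: offset-to-completion distances; hit_observed: 0/1 pressure-event flags."""
    X = np.asarray(distances_m, float).reshape(-1, 1)
    y = np.asarray(hit_observed, int)
    model = LogisticRegression().fit(X, y)

    # Probability of communication decreases with distance; find the smallest distance
    # at which the modeled probability drops below the chosen risk tolerance.
    grid = np.linspace(X.min(), X.max(), 1000)
    p_hit = model.predict_proba(grid.reshape(-1, 1))[:, 1]
    candidates = grid[p_hit <= risk_tolerance]
    return float(candidates[0]) if candidates.size else float(grid[-1])
```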
Carolan (Carrie) Laudon holds a PhD in geophysics from the University of Minnesota and a BS in geology from the University of Wisconsin Eau Claire. She has been Senior Geophysical Consultant with Geophysical Insights since 2017 working with Paradise®, their machine learning platform. Prior roles include Vice President of Consulting Services and Microseismic Technology for Global Geophysical Services and 17 years with Schlumberger in technical, management and sales, starting in Alaska and including Aberdeen, Scotland, Houston, TX, Denver, CO and Reading, England. She spent five years early in her career with ARCO Alaska as a seismic interpreter for the Central North Slope exploration team.
Oil and Gas upstream operations are a complex network of assets, processing plants, and inventory tank farms with continuous flows of material and products. The industry has decades of optimization experience and teams of professionals working to maximize production, reduce costs, and improve safety and reliability in these different business areas. The overarching challenge is optimizing the entire end-to-end operation involving multiple business areas in real time. The challenges arise due to the volume and variety of data, as well as the silos created by localized operating objectives.
By leveraging the power of cloud computing, machine learning, operations research, and AI, the Oil & Gas industry can make the most of its real-time IoT data to uncover optimization opportunities across end-to-end operations. This presentation focuses on how a prediction-optimization framework is enabling one of Canada’s leading integrated energy companies to realize production improvements in extraction, upgrading, and tank farm operations. This solution augments their current workflow with new capabilities to dynamically identify opportunities, better manage operations with early process-upset predictions, and generate operating plans in near real time.
Crystal is an Analytics & AI Manager at IBM’s Advanced Analytics & AI practice in Canada. She focuses on leading multi-disciplinary teams to help companies build digital transformative capabilities and on delivering AI solutions that empower businesses to make data-driven decisions. Prior to joining IBM, Crystal spent several years as an engineer in the Oil & Gas sector where she honed her technical skills and passion for analytics applications in industrial settings. She specializes in use case and program design, predictive maintenance, and production optimization. She holds a BASc. in Mineral Engineering and a Master’s degree in Business Analytics.
An overview of examples where IBM applied advanced cognitive computing to exploration and production processes in the oil & gas industry, starting with a Geoscience AI platform that rapidly digests and interprets geological information spread across geological papers, seismic images, and well logs and supports knowledge capture from a broad range of studies, and concluding with an AI Production Optimization solution applied to a multi-facility continuous production process. The Geoscience solutions have delivered a material reduction of exploration and production risks and avoided hundreds of millions in exploration costs. The Production Optimization solution demonstrated the efficacy of AI-based systems in predicting complex interdependent production process failures, reducing recovery time, and alerting operators to production improvement opportunities in a steady-state operations mode, materially reducing production costs per barrel.
Dariusz Piotrowski, Director, Global AI Solutions, IBM Natural Resources Industry Platform, leads the strategy and development of AI (Artificial Intelligence) solutions in natural resources (oil, gas, and mining). Dariusz specializes in large transformational projects focused on optimization, machine learning, and cognitive analytics. He has more than 20 years of technology and consulting experience working with senior leaders within some of the world’s largest natural resources companies. Dariusz helps these companies realize business value through fusing advanced technologies, data science, applied R&D, and agile methodologies and practices to transform business processes and performance. Currently, Dariusz leads the global natural resources AI development team within IBM Industry Platforms. Dariusz holds architecture and civil engineering degrees from Warsaw University of Technology in Poland and an M.B.A. from the Richard Ivey School of Business in Western Ontario.
In 2019, innovation, and in particular digital innovation, has taken center stage in enabling companies to re-orient their businesses to the new economic reality. Artificial Intelligence is seen by many as a key enabling technology, but companies often struggle to move their innovation pipeline into impactful production systems. This talk will discuss:
The role of Generation Z and Generation See-the-Beach in innovation
Vendors, Communities and Secret Sauce – how to balance investments in innovation
Building enterprise-grade digital innovation platforms
As Chief Technology Officer – Energy, David is responsible for Dell Technologies’ strategy for the Energy industry. He works with partners and clients to identify business needs and leads teams to develop strategies and architectures to support these requirements. David oversees the roll out of Dell Technologies’ Energy Industry strategy and solutions within the client base globally.
Prior to working for Dell Technologies, David established a highly respected position in the Petro-technical Computing and Information Management community, where he was described as “a leader in the global Information Management and Infrastructure community” and “highly respected for his ability to engage with clients, develop new solutions and master complex technical/business problems.”
As Information Management Practice Manager for Halliburton-Landmark, David was responsible for architecting, building and operating some of the world’s largest outsourced geotechnical information management and application hosting solutions including for Shell and PGS as well as the National Hydrocarbon Databanks for the UK, Norway and Oman.
Prior to joining Dell Technologies, David was Director of Operations for FUSE IM where he built a team to bring to market FUSE IM’s cloud-based workflow, collaboration and data management tools for petro-technical data. FUSE IM was acquired by Target Energy Solutions in 2014.
David is Dell Technologies’ representative on the OpenEarth Community Executive Committee and was recently elected to the Open Group’s Open Subsurface Data Universe Management Committee. He has previously served on a number of industry committees including the European ECIM Management Committee and SPE’s “Petabytes in Asset Management.” He has delivered numerous technical papers at conferences around the world and holds a patent for his work on the remote visualization of geotechnical applications.
Upstream processes for the petroleum industry are now data-driven. Throughout the history of exploration and production in unconventional environments, multidisciplinary engineering and geoscience data have been continuously collected. Now, using machine learning and AI, this data is being used in new ways to address fracture stimulation and frac-driven interference by enhancing frac modeling. Through the use of data-driven machine learning techniques, models and frameworks can be tailored to individual assets of interest to predict and analyze frac stimulation within a reservoir. However, it seems the data available is still not enough to accurately predict frac behavior.
This presentation will focus on fluid tracking using an electromagnetic technique that provides the missing link to improve machine learning frameworks for frac modeling without disrupting existing workflows. This data directly correlates to completions and production data by providing direct measurements taken during frac stages, which in turn generates relevant features for machine learning inputs. An outlook is also given on recent developments in real time fluid tracking and how it can be integrated with AI to prevent frac-driven interference and forecast performance on a per-stage and per-well basis.
David Moore is currently the President and CEO of Deep Imaging, a leading provider of onshore magnetic imaging. A seasoned energy executive, David spent over a decade at GE Power and GE Oil & Gas and then went on to lead a PE-backed upstream manufacturing company as President and CEO. David began his career as a Captain in the United States Air Force.
The process of statistically analyzing multiple seismic attributes using a SOM (Self-Organizing Map) algorithm has been around for several decades. However, advances in computing power, coupled with the many new attributes developed in the last 30 years, have made this type of analysis extremely powerful.
In the past, SOM has been used on only one attribute at a time, using the seismic wavelet as the basis for the neural analysis. The approach in this presentation uses SOM on multiple seismic attributes at one time, in a sample-based rather than wavelet-based format.
Studies done in the Meramec Formation in Central Oklahoma and the Woodbine Formation of East Texas will be highlighted to show the SOM process’s ability to find the best reservoir through the statistical analysis of seismic attributes. By converting the neural clusters into geobodies, calculations can be made to determine reservoir size and reserve estimates. A statistical tool is also included to show how the neural patterns can be compared to distinct petrophysical rock properties to confirm the presence of the reservoir.
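For reference, a volumetric calculation of the kind mentioned above can be sketched as follows; the neuron IDs, cell dimensions, and petrophysical inputs are placeholders rather than values from the Meramec or Woodbine studies.

```python
# Hedged sketch: turn a geobody built from selected SOM neurons into a volumetric estimate.
import numpy as np

def geobody_ooip_stb(class_volume, reservoir_neurons, dx_ft=110.0, dy_ft=110.0, dz_ft=2.0,
                     ntg=0.7, phi=0.08, sw=0.35, bo=1.3):
    """class_volume: 3D array of winning-neuron IDs; reservoir_neurons: IDs deemed pay."""
    pay_cells = np.isin(class_volume, list(reservoir_neurons)).sum()
    gross_rock_volume_ft3 = pay_cells * dx_ft * dy_ft * dz_ft
    # Standard volumetric equation: OOIP = GRV * NTG * phi * (1 - Sw) / Bo,
    # with 1 bbl = 5.615 ft^3 converting reservoir cubic feet to stock-tank barrels.
    hydrocarbon_pore_volume_ft3 = gross_rock_volume_ft3 * ntg * phi * (1.0 - sw)
    return hydrocarbon_pore_volume_ft3 / (5.615 * bo)
```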
Deborah is a geologist/geophysicist with 44 years of oil and gas exploration experience in Texas, Louisiana Gulf Coast and Mid-Continent areas of the US. She received her degree in Geology from the University of Oklahoma in 1976 and immediately started working for Gulf Oil in their Oklahoma City offices.
She started her own company, Auburn Energy, in 1990 and built her first geophysical workstation using Kingdom software in 1996. For 18 years she helped SMT/IHS develop and test the Kingdom software. She specializes in 2D and 3D interpretation for clients in the US and internationally. For the past nine years she has been part of a team studying and bringing the power of multi-attribute neural analysis of seismic data to the geoscience public, guided by Dr. Tom Smith, founder of SMT. She has become an expert in the use of Paradise software and has made seven discoveries for clients using multi-attribute neural analysis.
Deborah has been very active in the geological community. She is past national President of SIPES (Society of Independent Professional Earth Scientists), past President of the Division of Professional Affairs of AAPG (American Association of Petroleum Geologists), Past Treasurer of AAPG and Past President of the Houston Geological Society. She is also Past President of the Gulf Coast Association of Geological Societies and just ended a term as one of the GCAGS representatives on the AAPG Advisory Council. Deborah is also a DPA Certified Petroleum Geologist #4014 and DPA Certified Petroleum Geophysicist #2. She belongs to AAPG, SIPES, Houston Geological Society, South Texas Geological Society and the Oklahoma City Geological Society (OCGS).
Sr Field is a gas field located 100 km offshore in the west Nile Delta, within the West Delta Deep Marine (WDDM) concession of Shell (Rashid/El-Burullus joint venture). The water depth ranges from 500 to 1000 meters.
The field was penetrated by four wells: two exploration wells (So-1 & Sr-1) and two development wells (Sr-a & Sir-c). It is a stratigraphic Pliocene channel of 35-40 km length and 500-1000 meter width, with a clastic sand reservoir of 100-200 meter thickness. The field is covered by 3D seismic of good quality. One segment of this channel holds about 0.65 TCF of gas, and three of the four wells are located in this segment.
The channel was conventionally mapped using Hampson-Russell spectral decomposition. Unsupervised machine learning has recently been used to cluster the data and isolate the channel using the Paradise® software from Geophysical Insights.
Spectral magnitude, spectral voice, spectral phase, sweetness, energy ratio similarity, and envelope were identified by Principal Component Analysis (PCA) as having the highest energy in the region of interest. These attributes were then classified simultaneously over the reservoir zone using the Self-Organizing Map (SOM) application, a form of machine learning.
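A minimal sketch of this attribute-screening step, assuming the attribute volumes are available as NumPy arrays and using scikit-learn's PCA in place of the proprietary implementation, might look like this:

```python
# Rank which seismic attributes contribute most to the leading principal components
# within the zone of interest (attribute names and two-component cut-off are illustrative).
import numpy as np
from sklearn.decomposition import PCA

def rank_attributes(attr_volumes, attr_names, n_components=2):
    """attr_volumes: list of 3D ndarrays (same shape) windowed to the reservoir zone."""
    X = np.stack([a.ravel() for a in attr_volumes], axis=1)
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)      # standardize each attribute
    pca = PCA(n_components=n_components).fit(X)
    # Weight each attribute's loadings by the variance its components explain.
    scores = np.abs(pca.components_.T) @ pca.explained_variance_ratio_[:n_components]
    return sorted(zip(attr_names, scores), key=lambda t: -t[1])
```

The highest-ranked attributes are then the candidates carried into the SOM classification.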
Application of unsupervised machine learning using SOM clearly demonstrates the strike and geomorphology of the Pliocene marine turbidites. The southern segment of the channel, penetrated by the three wells, is very well defined after posting the wells. There is no significant difference in the neurons (hexagons) at the locations of the three wells, which reflects the similarity of the reservoir nature, thickness, sand content and pay thickness. Another significant channel, not detected using conventional spectral decomposition, is resolved to the east of the main channel.
Ali is the CEO of RockServ for petroleum services, based in Cairo. He has a PhD in Exploration Seismology from Cairo-Leister University and a B.Sc. in Geology from Alexandria University. He has worked at companies of many different scales, including Shell, Phillips, Apache, Phoenix, IPR and many joint ventures. He has 40 years of experience in oil and gas exploration and development and has led many true oil-finder groups in Egypt, Sudan, Syria, and Yemen. He is a teaching professor at the American University in Cairo and at Ain Shams, Alexandria and Cairo Universities.
Frac hits are a form of fracture-driven interference that occurs when newly drilled wells communicate with existing wells during a completion job. In most cases, frac hits have a negative production impact. Understanding the main causes of frac hits is complicated and at the same time crucial for optimizing the net profit value of a well pad. Frac hits happen due to a combination of different parameters such as depletion and stress history, inter-well spacing, completion design, and rock characteristics. The available physics-based diagnostic workflows produce outputs with a high degree of uncertainty. These approaches are also unable to handle a database beyond a single well or a few stages. We developed a data-driven approach based on the pattern recognition capabilities of machine learning techniques to characterize and aid understanding of the root causes of frac hits in a well pad during a completion job. The approach was applied to a field data set and indicated that frac hits can be quantitatively attributed to operational or subsurface parameters such as spacing or depletion. A better understanding of frac hits will help to optimize well spacing and completion design parameters and consequently improve hydrocarbon recovery and maximize the return on capital investment.
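A hedged sketch of one possible pattern-recognition formulation is shown below; the feature names and the random-forest/permutation-importance combination are illustrative assumptions, since the abstract does not specify the exact algorithm.

```python
# Sketch: attribute frac hits to operational and subsurface parameters via a classifier
# plus permutation importance (all column names below are hypothetical).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

def frac_hit_drivers(df: pd.DataFrame, target="frac_hit"):
    """df: one numeric row per stage/offset-well pair, with columns such as
    'well_spacing_ft', 'months_on_production', 'fluid_per_ft', 'proppant_per_ft',
    and a 0/1 target column named by `target`."""
    X, y = df.drop(columns=[target]), df[target]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    # Permutation importance on held-out data ranks which parameters drive frac hits.
    imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
    return pd.Series(imp.importances_mean, index=X.columns).sort_values(ascending=False)
```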
Dr. Ali Shahkarami is a Senior Engineer at Baker Hughes GE (BHGE), where he leads the Reservoir Analytics team, a group of subsurface domain experts and data scientists developing the next generation of data-driven solutions and workflows for the oil and gas and energy industries. He is based at the Energy Innovation Center in Oklahoma City. Prior to joining BHGE, he was an Assistant Professor of Petroleum and Natural Gas Engineering (PNGE) at Saint Francis University in Loretto, Pennsylvania. Dr. Shahkarami started the undergraduate PNGE program at Saint Francis University in 2014, served as the program head, and led the program accreditation process before joining BHGE in 2018. He holds Ph.D. and MSc degrees in Petroleum and Natural Gas Engineering from West Virginia University.
At one time or another, most O&G technicians have received a core analysis report, a geochemistry report, or a pressure-volume-temperature report with several pages of tabulated values and no choice but to retype the values into Excel to make them usable. This issue occurs not only in the subsurface domain but across the O&G sector as well as in other industries. With the support of TOTAL, Technip, Saipem, Schlumberger, Subsea7 and IFPen, Agile Data Decisions has developed an open source solution to automate the extraction of tabular data from documents. This solution combines probabilistic modeling of sequences with machine learning to detect and segment the tables. The approach is very similar to the human approach: a reader can localize a table in a page by looking at the text alignment, the spacing between strings, the ratio of numerals to letters, etc. All of these features are captured by machine learning and deep learning methods for training and detection of tabular data in this project. As required by our sponsors, the technology will be released to an open source platform in October 2020 under the MIT license for the benefit of the Data Management community. The presentation will describe the novel machine learning approach and provide examples of automatic table detection and segmentation.
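As a simplified illustration of the kind of line-level features described above (alignment, spacing between strings, ratio of numerals to letters), the sketch below computes features that could feed a sequence model or classifier for table detection; it is illustrative only and not the open-source code described in the abstract.

```python
# Toy feature extractor: per-line cues that hint whether a text line belongs to a table.
import re

def line_features(line: str) -> dict:
    digits = sum(c.isdigit() for c in line)
    letters = sum(c.isalpha() for c in line)
    gaps = re.findall(r" {2,}", line)           # runs of 2+ spaces separating columns
    tokens = line.split()
    return {
        "digit_letter_ratio": digits / (letters + 1),
        "n_column_gaps": len(gaps),
        "mean_gap_width": sum(len(g) for g in gaps) / (len(gaps) or 1),
        "n_tokens": len(tokens),
        "frac_numeric_tokens": sum(t.replace(".", "", 1).replace("-", "", 1).isdigit()
                                   for t in tokens) / (len(tokens) or 1),
        "leading_spaces": len(line) - len(line.lstrip(" ")),
    }
```

Features like these, computed for every line of a page, are the kind of observations a sequence model or classifier can label as "table" versus "prose" before segmenting rows and columns.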
Dr. Amit Juneja is an experienced machine learning scientist, developer and business leader. He is currently VP of Data Science at Agile Data Decisions LLC where he leads machine learning and software development projects. For the past 15 years he has led artificial intelligence (AI) and full stack projects in diverse industry sectors including US defense, oil and gas, and automotive.
At Agile Data Decisions LLC, he has led the development of an AI system for information extraction from documents that is being used by several major oil and gas companies. At IBM, Amit collaborated with oil and gas companies to build optimal 2-D and 3-D vision models with deep learning and enabled several US manufacturing plants with deep learning-based visual quality inspection of products in assembly lines. At Think A Move Ltd., Amit proposed and led development for $3 million of US Defense contracts to build speech recognition and natural language processing (NLP) applications. At Goodyear Tire and Rubber Company, he led agile and lean development for a major Internet of Things application and took the concept from prototype to production on a scalable cloud environment.
Amit pursued his love of learning by obtaining a PhD in Machine Learning/Automatic Speech Recognition from University of Maryland, College Park. He holds an MS in Electrical Engineering from Boston University and a BS in Electronics and Communication Engineering from Indian Institute of Technology.
I will share a broad overview of the forces behind AI and ML. Current challenges in E&P will be discussed, along with ways in which ML can address these challenges. Additionally, I will cover where ML can take the industry in the future and how companies can prepare now to maximize their benefit as technology in this area rapidly grows.
Dr. Arvind Sharma is the VP of Data & Analytics at TGS. In this role, he is responsible for Machine Learning initiatives as well as broader digital transformation. He has 10+ years of experience in various seismic and software-related work. Arvind has bachelor’s and master’s degrees in Applied Geology and Exploration Geophysics, respectively, from the Indian Institute of Technology (IIT) Kharagpur. He has a Ph.D. in Geophysics from Virginia Tech (VT).
Arvind has a broad background in the oil and gas industry as well as outside the industry. He has worked in jobs ranging from software engineering (Infosys) to efficient seismic acquisition design (PGS) to developing seismic image algorithms (BP) to prospecting and drilling exploration wells (BP). Most recently Arvind was Chief Geophysicist at PGS and held a similar role at TGS before his current position where he led the industry’s first crowdsourcing challenge “TGS-Kaggle Salt Identification Challenge.” Additionally, he holds several patents, has been the keynote speaker at major conferences and has been featured on several ML podcasts.
At TGS, his mission is to create a platform to integrate and analyze all available sub-surface information for risking and decision making. Arvind believes that data integration and machine learning will be pivotal to this industry’s future success.
In the Oil & Gas industry’s quest for efficiency and cost reduction, it is mandatory to significantly reduce the cycle time of conventional interpretation workflows while utilizing the maximum detail of the seismic information (Big Data) for all exploration projects. This talk will demonstrate our approach to enhancing and accelerating the seismic interpretation task based on technology democratization.
German Larrazabal has been an R&D Geophysics Advisor for Repsol since 2011. Dr. Larrazabal trained as an Applied Mathematician and Computer Scientist at the Universidad Central de Venezuela (UCV), Faculty of Sciences, School of Mathematics. He also holds a Master of Science in Computer Science from the Universidad Central de Venezuela, Faculty of Science, School of Computer Science, and a Ph.D. in Computer Science, awarded Cum Laude, from the Polytechnic University of Catalonia (UPC), Barcelona Supercomputing Center (BSC), Barcelona, Spain.
Larrazabal has been a Professor and Researcher at the University of Carabobo, Venezuela, San Diego State University, USA, and the University of Texas at El Paso, USA, as well as a Visiting Professor in the Computational Science Research Center (CSRC) at San Diego State University, USA. He has authored numerous papers and is a prominent speaker at digital transformation conferences.
Seismic fault detection is one of the critical steps in seismic interpretation. Identifying faults is crucial for characterizing and finding potential oil and gas reservoirs. Machine learning holds promise for eliminating some of the tedious and repetitive steps in fault interpretation. Seismic amplitude data serves as input for automatic fault detection, and deep learning Convolutional Neural Networks (CNN) perform well on fault detection without any human interactive work. This presentation shows an integrated CNN-based fault detection workflow which enhances the final fault detection volume by applying pre- and post-processing and an unsupervised seismic classification to ultimately isolate faults within a 3D volume. The pre- and post-processing objectives were to suppress noise or stratigraphic anomalies subparallel to reflector dip and to sharpen faults and other discontinuities that cut reflectors. To suppress cross-cutting noise as well as sharpen fault edges, a principal component edge-preserving structure-oriented filter is first applied. The conditioned amplitude volume is then fed to a pre-trained 3D synthetic CNN model to compute fault probability. Finally, a 3D Laplacian of Gaussian filter is applied to the CNN fault probability to enhance fault images. The resulting fault detection volumes (fault probability, fault dip magnitude and fault dip azimuth) compare favorably with traditional human interpretation and, in complex structural settings, provide a more complete and unbiased image of faults. Finally, the fault volume is input into an unsupervised machine learning seismic classification (SOM) to generate a 3D volume in which the faults are classified into discrete neurons with known values. This provides superior final results which can subsequently be used to generate geobodies of individual faults or used directly as input to other fault surface extraction tools.
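To make the post-processing step concrete, the sketch below applies a 3D Laplacian-of-Gaussian filter to a CNN fault-probability volume using SciPy; the sigma value and the simple rescaling are illustrative assumptions, not the parameters used in the published workflow.

```python
# Minimal sketch: sharpen a CNN fault-probability volume with a 3D Laplacian of Gaussian.
import numpy as np
from scipy.ndimage import gaussian_laplace

def sharpen_fault_probability(fault_prob, sigma=1.5):
    """fault_prob: 3D ndarray of CNN fault probabilities in [0, 1]."""
    # The negative LoG response is large on thin, ridge-like features such as faults.
    log_response = -gaussian_laplace(fault_prob.astype(np.float32), sigma=sigma)
    log_response = np.clip(log_response, 0.0, None)
    # Rescale to [0, 1] so the sharpened volume remains a probability-like attribute.
    return log_response / (log_response.max() + 1e-9)
```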
Jie Qi received a Ph.D. (2017) in geophysics from the University of Oklahoma, Norman. He was a postdoctoral research associate at the University of Oklahoma from 2017 to 2020. He is currently a research geophysicist at Geophysical Insights. His research interests include machine learning-based seismic interpretation, pattern recognition, image processing, seismic attribute development and interpretation, and seismic facies analysis.
Interpreters face two main challenges in computer-assisted seismic facies analysis. The first challenge is to define, or “label”, the facies of interest. The second challenge is to select a suite of attributes that can differentiate the target facies from each other and from the background reflectivity. Accurately defining the seismic expression of a given seismic facies requires an understanding of not only geologic processes but also the limits of seismic acquisition, processing, and imaging. Our goals are to provide a good classification model in terms of validation accuracy, to provide quantitative metrics (and ideally, geological insight) as to why a given attribute suite is chosen, and to minimize the computation and memory required.
In principle, a desirable attribute subset is built by retaining relevant attributes and discarding irrelevant ones. Relevant attributes are those that are highly correlated with the output classes, identified using a technique called univariate attribute analysis. In contrast, redundant attributes are highly correlated with each other. We hypothesize that the redundant and useless attributes that confuse human interpreters also pose problems in machine-learning classification.
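The relevance-versus-redundancy idea can be sketched as a generic greedy selector (an mRMR-style approach, not necessarily the authors' method), assuming an attribute matrix sampled at labeled facies locations:

```python
# Greedy attribute selection: high relevance to the facies labels, low redundancy with
# attributes already chosen.  Illustrative sketch only.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_attributes(X, y, names, n_keep=5):
    """X: (n_samples, n_attributes) attribute matrix at labeled locations; y: facies labels."""
    relevance = mutual_info_classif(X, y, random_state=0)
    corr = np.abs(np.corrcoef(X, rowvar=False))          # attribute-to-attribute redundancy
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_keep:
        remaining = [i for i in range(X.shape[1]) if i not in selected]
        # Criterion: relevance minus mean correlation with the already-selected set.
        scores = [relevance[i] - corr[i, selected].mean() for i in remaining]
        selected.append(remaining[int(np.argmax(scores))])
    return [names[i] for i in selected]
```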
Kurt J. Marfurt joined The University of Oklahoma in 2007 where he serves as the Frank and Henrietta Schultz Professor of Geophysics within the ConocoPhillips School of Geology and Geophysics. Marfurt’s primary research interest is in the development and calibration of new seismic attributes to aid in seismic processing, seismic interpretation, and reservoir characterization. Recent work has focused on applying coherence, spectral decomposition, structure-oriented filtering, and volumetric curvature to mapping fractures and karst with a particular focus on resource plays. Marfurt earned a Ph.D. in applied geophysics at Columbia University’s Henry Krumb School of Mines in New York in 1978, where he also taught as an Assistant Professor for four years. He worked 18 years in a wide range of research projects at Amoco’s Tulsa Research Center, after which he joined the University of Houston for eight years as a Professor of Geophysics and the Director of the Allied Geophysics Lab. He has received the SEG Best Paper award (for coherence) and the SEG Best Presentation award (for seismic modeling); as a coauthor with Satinder Chopra, Best SEG Poster awards (one on curvature, one on principal component analysis) and a Best AAPG Technical Presentation award; and, as a coauthor with Roderick Perez Altimar, the AAPG/SEG Interpretation Best Paper award (on brittleness). Marfurt also served as the SEG/EAGE Distinguished Short Course Instructor for 2006 (on seismic attributes). In addition to teaching and research duties at OU, Marfurt leads short courses on attributes for SEG and AAPG and currently serves as Editor-in-Chief of the AAPG/SEG journal Interpretation.
Lennart Johnsson is the Hugh Roy and Lillie Cranz Cullen Distinguished University Chair of Computer Science, Mathematics, and Electrical and Computer Engineering at the University of Houston and is Professor Emeritus at the Royal Institute of Technology, Stockholm, Sweden. Professor Johnsson has served on the faculties of the California Institute of Technology, Yale University, Harvard University, and the Royal Institute of Technology. He has served as Manager of Systems Engineering, Electrical Systems, ABB Corporate Research, Sweden and as Director of Computational Sciences at Thinking Machines Corp. (TMC).
Quantitative seismic reservoir characterization poses a mathematically ill-constrained inversion problem traditionally solved by methods relying on pre-stack seismic inversion and subsequent rock physics transforms. Alternatively, subsurface models can be matched to field seismic data by seismic forward modeling using wells as calibration points. Both these approaches face practical limitations in the sparsity of calibration data and severe non-linearity of the problem requiring multiple simplifying assumptions. Recent extensive developments in machine learning and data-driven model building can provide significant accuracy and efficiency uplift in solving this problem by streamlining seismic attribute analysis and avoiding the need to pass through the elastic domain. We present various approaches to seismic machine learning and their application to both static and dynamic reservoir characterization projects and discuss comparisons to conventional 3D and 4D quantitative interpretation workflows. Emphasis will be given to practical approaches enhancing cross-discipline integration and validation of data analytics methods using both geophysical and data science approaches. We highlight the advantages, challenges and systematic biases encountered in this type of analysis and discuss potential extensions of the data analytics approach using deep learning methods.
After receiving his Ph.D. in Physics from Florida State University, Mike worked as a researcher in particle physics and cosmology with a keen interest in phenomenological modeling and the interface between theory and experiment. He joined Shell in 2001 and focused on various aspects of geophysics, both seismic and non-seismic, most importantly the applications of inversion theory to quantitative interpretation and reservoir characterization. Since 2010, he has been working for ConocoPhillips in a similar capacity and, with the onset of data analytics and machine learning, has taken a strong interest in applications of data-driven model building approaches to solving geophysical problems.
Dr. Rahul Gajbhiye is an Assistant Professor in the Department of Petroleum Engineering at KFUPM. He earned his Ph.D. degree in petroleum engineering from Louisiana State University. Before joining KFUPM he worked as a post-doc research associate at the Tulsa University Drilling Research Project (TUDRP) in Tulsa, Oklahoma. His research areas include surface production facilities, optimization and automation, AI applications in the petroleum industry, multiphase flow in pipes, and EOR.
He has served as a faculty advisor for the SPE-KFUPM student chapter since 2013 and as a reviewer for several journals, including the Arabian Journal for Science and Engineering, Petroleum Science and Engineering, Colloids and Surfaces A: Physicochemical and Engineering Aspects, and Industrial and Engineering Chemistry. He is also the principal investigator of projects on EOR, multiphase flow, and GOSP optimization.
He is a member of SPE and AADE and received awards for presentations at the GOM Deepwater Technical Symposium, New Orleans (2009), and the AADE Premier Fluid Conference, Houston (2010). Recently he was awarded the 2020 SPE Regional Service Award for the Middle East and North Africa region.
Randall (Randy) W Gentry (Ph.D., Civil Engineering, University of Memphis, 1998) has over 20 years of experience leading research teams and organizations across academia and federal research organizations within federal agencies focused on energy and environmental issues. Dr. Gentry led many of these organizations through strategic planning and on-going organizational change while targeting mission critical programmatic research needs at the U.S. Department of Energy laboratories (National Energy Technology Laboratory and Argonne National Laboratory), U.S. EPA and served as a tenured faculty member at the University of Tennessee while also leading several research initiatives. Dr. Gentry joined the Petrolern team as R&D Director in early October 2021 and is excited to work with such a strong entrepreneurial group.
Prior to joining Petrolern, Dr. Gentry served as the Deputy Director and Chief Research Officer at the National Energy Technology Laboratory, one of the U.S. DOE’s National Laboratories. At the National Energy Technology Laboratory Dr. Gentry commissioned the strategic planning and inauguration of the Science-based Artificial Intelligence and Machine Learning Institute (SAMI) and served on its inaugural advisory board.
Dr. Gentry has pursued interdisciplinary research during his career and has built teams around complex subsurface behavior and phenomena to better understand aquifer mixing behavior, to characterize the temporal-scale response of those systems to various stress-induced states as predictors of risk for better management tools, and to better understand fundamental conceptual model development of reservoir systems. Dr. Gentry has co-authored papers with many of his team members in top tier international journals and is a recognized subject matter expert in multi-layer aquifer mixing behavior. Dr. Gentry was on two teams recognized in January 2021 by U.S. DOE Secretary’s Achievement Awards, the National Virtual Biotechnology Laboratory Team and the Science and Technology Risk Matrix Team, for their work performed in 2020 across the National Laboratory system.
Dr. Srikanta Mishra is Technical Director for Geo-energy Modeling & Analytics at Battelle Memorial Institute, the world’s largest not-for-profit private R&D organization. He is a recognized expert on integrating computational modeling and machine-learning assisted data-driven activities for various subsurface energy resource projects, and the recipient of the 2021 SPE International Award for Distinguished Membership. He was an SPE Distinguished Lecturer on Big Data Analytics during the 2018-19 season, visiting 16 countries to deliver 32 lectures. He is the author of ~200 refereed publications, conference papers and technical reports, and the book "Applied Statistical Modeling and Data Analytics for the Petroleum Geosciences" published by Elsevier. He is also a popular instructor of short courses on statistical modeling and data analytics for SPE as well as other organizations. He holds a PhD degree in Petroleum Engineering from Stanford University.
Dr. Tom Smith, the founder of Seismic Micro-Technology (SMT) and creator of the KINGDOM Software Suite, is the President and CEO of Geophysical Insights (geoinsights.com), where he leads a team of geophysicists, geologists and computer scientists in developing machine learning technologies for interpretation. Dr. Tom Smith received BS and MS degrees in Geology from Iowa State University and a Ph.D. in Geophysics from the University of Houston. Over a 50-year career, Dr. Smith has been recognized numerous times for his accomplishments in pioneering the science of geophysics. The Society of Exploration Geophysicists (SEG) recognized Dr. Smith’s work with the SEG Enterprise Award in 2000, and in 2010, the Geophysical Society of Houston (GSH) awarded him an Honorary Membership. Iowa State University (ISU) recognized Dr. Smith’s accomplished career with the Distinguished Alumnus Lecturer Award in 1996, the Citation of Merit for National and International Recognition in 2002, and the highest alumni honor in 2015, the Distinguished Alumni Award. The University of Houston College of Natural Sciences and Mathematics recognized Dr. Smith with the 2017 Distinguished Alumni Award.
Today, most of the focus in AI is on Machine Learning / Deep Learning, which requires a large amount of data that is not always available in the early phases of Exploration and Production. To address these data restrictions, I will discuss neuro-symbolic approaches that combine Machine Learning / Deep Learning with formal pre-existing domain knowledge and formal knowledge representation and reasoning. This knowledge-enhanced Machine Learning approach, coupled with transfer learning techniques, allows working with a smaller amount of data and reduces the effort necessary to train AI-based models. We will discuss how we can augment ML technologies with domain and contextual knowledge, and enable more effective transfer learning, in applications to real cases of AI-assisted seismic interpretation and well-log analysis.
Ulisses T. Mello is the director of IBM Research – Brazil with sites in São Paulo and Rio de Janeiro. Ulisses is also an IBM global research executive for the chemicals and petroleum Industry sector. He holds a Ph.D. (1994) and MA (1992) in geology from Columbia University, an M.Sc. in geology from Federal University of Ouro Preto (1987), and a B.Sc. (1983) in geology from the University of Sao Paulo, Brazil. His research interests are large-scale basin modeling, hydrogeological modeling, digital oil fields, integrated operation optimization, data assimilation, unstructured meshing, parallel computing, advanced water management, and computational geosciences.
In recent years we have seen very active development and application of machine learning/AI technologies in various upstream sectors, such as geophysics and geosciences in exploration, process and asset monitoring and optimization in development and production for both conventional and unconventional fields, as well as sustainability challenges such as carbon storage and sequestration. The progress is enabled by technology gains in powerful computing infrastructure, deep learning algorithms and models, and, more importantly, the understanding learned at the interface between, and integration of, subject matter knowledge and machine learning/AI techniques. The diversity and complexity of many applications in the industry, involving data from a wide range of temporal and spatial scales and measurement modalities, also pose interesting research challenges for machine learning.
In this talk, I will share a number of examples for machine learning in upstream applications, including the generalizability and risk of overfitting in deep learning based geophysical inversion, integrating surrogate physics constraints in machine learning based coherent seismic noise removal, cross-modality machine learning in quantitative microscale characterization, and fiber optic DAS based hydraulic fracturing monitoring using deep learning. While the results from these examples are encouraging, the goal is to highlight the challenges and opportunities for machine learning/AI research and development in upstream applications.
Weichang Li is the head of the machine learning group at Aramco Americas’ Houston Research Center, which he joined in 2015. His current research is focused on developing machine learning and signal processing algorithms/models for geophysics, geoscience and petroleum engineering applications. Prior to Aramco, he had been with ExxonMobil’s Corporate Strategic Research lab since 2008, where he led the machine learning team from 2011 to 2014. Weichang has co-organized the SEG machine learning post-convention workshop from 2018 to 2021 and the SIAM Data Mining workshop on Geoscience Applications in 2018, and is an associate editor for the Geophysics special section on Machine Learning and, recently, the IEEE Transactions on Neural Networks and Learning Systems special issue on Deep Learning for Earth Sciences and Planetary Geosciences. He is a member of the SEG research committee and the NSF IRIS working group on machine learning for fiber optic DAS. Weichang obtained his M.S. (dual) in Electrical Engineering and Computer Sciences, and Ocean Engineering (2002), and Ph.D. in Electrical and Oceanographic Engineering (2006), all from MIT. He was also an Office of Naval Research (ONR) postdoctoral fellow at Woods Hole Oceanographic Institution from 2006 to 2007.
The past few years mark the fastest development of machine learning in the seismic interpretation community we have ever seen. One of the major accelerators of this growth is the success of deep learning methods, which originated in the computer vision discipline. Over the past three years, we have witnessed the rapid adoption of deep learning techniques in seismic interpretation. However, most of this adoption is still limited to academia and research institutes, primarily because, although they provide high-quality results, deep learning methods usually require much more training data to work effectively. Preparing such training data is often time-consuming, if not impractical, for a seismic interpreter. This presentation is focused on deep learning applications that require limited or even no training data from a general seismic interpreter, which makes deep learning more accessible to general seismic interpreters. Examples of deep learning-based seismic facies classification and fault detection demonstrate that a general seismic interpreter can benefit from the high-quality results, with greatly improved efficiency.
Dustin Dewitt has spent the last 10 years in pursuit of geoscience excellence, integrating advanced seismic interpretation techniques with general geoscience workflows to develop methods that enhance geologic interpretations. After service in the United States Navy, Dustin moved into the private sector, attaining both a B.S. and an M.S. in Geophysics and Seismology from the University of Oklahoma. Upon graduating, he gained experience with BHP Billiton as a QI Geophysicist and subsequently as an Exploration Geophysicist. Currently, he is a Product Manager for Geophysical Insights, contributing to the development of Paradise, the AI workbench.
Waves of elastic energy travel through the Earth according to the same physical principles as waves traveling through other media. This video focuses on the physics of traveling waves and why energy absorption is important to an understanding of seismic data. Starting with a linear second-order vibrating system as a mathematical model, the 50-minute short course presents the classes of waves and their associated measurements. The basics of energy loss in traveling waves are described, as well as the relationship between vibration and energy loss. A simplified model of the seismic geophone is used to describe attenuation and damping ratio.
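For reference, the linear second-order system and damping ratio the course builds on are the standard textbook forms shown below (a simplified single-degree-of-freedom geophone model):

```latex
% Damped single-degree-of-freedom oscillator, a simplified geophone model
m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) = f(t),
\qquad
\omega_n = \sqrt{\frac{k}{m}},
\qquad
\zeta = \frac{c}{2\sqrt{km}} .
```

For the underdamped case the free response decays as $e^{-\zeta\omega_n t}$, which is the link between vibration and energy loss that the course develops.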
Instructor: Dr. Tom Smith, President and CEO, Geophysical Insights
Certification Available: No
Total classroom time: 1 hour
Cost: $20
Despite its time-honored use of technologies, the upstream oil and gas sector has been slower than other industries to embrace the breakthrough solutions that have transformed several sectors over the past decades. That may be about to change. Much has happened, and COVID-19 has had multiple impacts on the oil and gas industry, including plunging prices and government revenues, significantly lowered demand, and inflated stockpiles of crude oil. As the industry seeks ways to return to profitability, Next-Generation technologies are emerging as part of the answer, presenting the possibility of a radically more efficient new reality.
The development of Next-Generation technologies has the potential to transform the upstream oil and gas industry’s fortunes by sustainably improving the exploration portfolio and capital efficiency, lowering exploration and development costs, and reducing time to first oil. These new technologies bring significant changes to the upstream oil industry’s ecosystem, disrupting the traditional value chain and redefining business models. We took a Next-Generation Technology Stack Lens approach to explore how these breakthrough solutions will alter the upstream industry ecosystem by examining Digital Programs, Workflow efficiency and Innovative Projects. This approach offers the structure that the upstream industry needs to understand Next-Generation Technologies.
To this end, this presentation will dive deep into PETRONAS’s NexTGEN Technology plan for exploration, which is converting data to insights and taking actions based on those insights. We believe that the NexTGEN process will be unique to each step, because each step in upstream exploration involves a discrete set of technologies such as Machine Learning, Data Analytics, and Cloud-based Solutions, a distinct flow of data, a particular set of challenges, and a different digital maturity, and the nature of the data generated and utilized in each step differs from that of the others. NexTGEN is set to have a profound impact on PETRONAS’ exploration delivery mandates. A wave of Next-Generation Technologies is bringing greater technological innovation that powers value creation in Oil and Gas.
The paper describes the combined use of machine learning and statistics to correlate seismic attribute classifications to reservoir properties, thereby discriminating between the presence or absence of a reservoir. The results, presented using multiple statistical techniques, are repeatable and dependable in positive reservoir identification, including determination of the reservoir's extent.
The first stage of the proposed statistical method has proven to be very useful in the health and social sciences for testing whether or not there is a relationship between two qualitative variables (nominal or ordinal) or categorical quantitative variables. Its application in the oil industry allows geoscientists not only to test dependence between discrete variables but also to measure their degree of correlation (weak, moderate or strong). The talk shows its application to reveal the relationship between a SOM classification volume of a set of nine seismic attributes (whose vertical sampling interval is three meters) and different well data (sedimentary facies, Net Reservoir, and effective porosity grouped by ranges). The data were prepared to construct the contingency tables, where the dependent (response) variable and independent (explanatory) variable were defined, the observed frequencies were obtained, the frequencies that would be expected if the variables were independent were calculated, and then the difference between the two magnitudes was studied using the contrast statistic called Chi-Square. The second stage involves the calibration of the SOM volume extracted along the wellbore path through statistical analysis of the petrophysical properties VCL, PHIE, and SW for each neuron, which allowed the identification of the neurons with the best petrophysical values in a carbonate reservoir.
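A compact sketch of this first stage, assuming the SOM neuron classes and a categorical well property have been sampled at the same depths along the wellbore, could use SciPy's chi-square test together with Cramér's V to grade the association; the strength thresholds below are common rules of thumb, not values from the study.

```python
# Contingency-table test between SOM neuron classes and a categorical well property.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def som_vs_well_property(neuron_labels, well_property):
    """Both inputs are 1-D sequences sampled at the same depths along the wellbore."""
    table = pd.crosstab(pd.Series(neuron_labels, name="neuron"),
                        pd.Series(well_property, name="property"))
    chi2, p_value, dof, expected = chi2_contingency(table)
    n = table.values.sum()
    k = min(table.shape) - 1
    # Cramer's V normalizes chi-square to a 0-1 measure of association strength.
    cramers_v = np.sqrt(chi2 / (n * k)) if k > 0 else 0.0
    strength = "weak" if cramers_v < 0.2 else "moderate" if cramers_v < 0.6 else "strong"
    return {"chi2": chi2, "p_value": p_value, "cramers_v": cramers_v, "strength": strength}
```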
Fabian Rada joined Petroleum Oil and Gas Services, Inc (POGS) in January 2015 as Business Development Manager and Consultant to PEMEX. In Mexico, he has participated in several integrated oil and gas reservoir studies. He has consulted with PEMEX Activos and the G&G Technology group to apply the Paradise AI workbench and other tools. Since January 2015, he has been working with Geophysical Insights staff to provide and implement the multi-attribute analysis software Paradise in Petróleos Mexicanos (PEMEX), running a successful pilot test in Litoral Tabasco Tsimin Xux Asset. Mr. Rada began his career in the Venezuelan National Foundation for Seismological Research, where he participated in several geophysical projects, including seismic and gravity data for micro zonation surveys. He then joined China National Petroleum Corporation (CNPC) as QC Geophysicist until he became the Chief Geophysicist in the QA/QC Department. Then, he transitioned to a subsidiary of Petróleos de Venezuela (PDVSA), as a member of the QA/QC and Chief of Potential Field Methods section. Mr. Rada has also participated in processing land seismic data and marine seismic/gravity acquisition surveys. Mr. Rada earned a B.S. in Geophysics from the Central University of Venezuela.
Technology and innovation respond to the global and sectorial challenges that Energy Companies face. These challenges are mainly driven by regulatory, societal, and economic demands, as well as the emergence of new technology enablers. Artificial intelligence, a key enabler, has become the protagonist in the decision-making process in multiple industries.
This presentation will focus on two AI product releases that joined Repsol E&P's technology catalogue during 2021:
Federico Giannangeli is the E&P Director of Technology and Operating Model at Repsol. He currently leads a multidisciplinary organization with a strong focus on value-driven technology product development and its application to the upstream portfolio. Federico has spent over 18 years in the Energy Industry, with his primary contribution in delivering global Oil & Gas developments offshore (i.e., ultra-deep, deep and shallow water), onshore and in remote harsh environments. He graduated as a Mechanical Engineer in Venezuela and holds a business certificate from Georgetown University.
Gabriel Guerra is currently the leader of Shell's Exploration transformation journey, integrating the introduction of new business practices and ways of working with an established technology, digital and data strategy, providing the platform for lasting and enhanced exploration performance. Gabriel is a geologist by background and brings a breadth of experience and energy after 18 years in various exploration positions with Enterprise Oil and Shell. He has also been involved from the start in shaping and leading exploration digitalisation and is deeply involved in leading the digital agenda for Shell. Gabriel is based in London but hails from Rio de Janeiro, Brazil, and is married with two kids.
There is much hype around all topics related to digitalization and IR 4.0 and their underlying technologies. At the same time, industries have already seen some real-life applications and benefits as well as some real disruption to their value chains, making it more urgent than ever for oil & gas companies to accelerate their digital transformations. Despite the hype and the promise of machine learning, adoption outside of the tech sector is still at an early, often experimental stage with few firms having deployed it at scale.
This presentation will discuss some of the successes and challenges of moving machine learning in Oil & Gas from hype to reality. There have been plenty of promising deployments across robotics and autonomous vehicles, computer vision, virtual agents, and machine learning, with the latter including deep learning and underpinning many recent advances in the other AI technologies. However, the fact that machine learning requires big data yet must often be trained on sparse, incomplete, and messy data, its tendency to cut across functional, geographic, and organizational silos, and its dependence on having a digital foundation and an upskilled workforce pose formidable challenges to scaling digital, and machine learning in particular.
Hani Elshahawi is Digitalization Lead – Deepwater Technologies at Shell, where he has spent the last 14 years. Earlier at Shell, he led FEAST, Shell’s Fluid Evaluation and Sampling Technologies center of excellence, before becoming Deepwater Technology Advisor. Prior to Shell, Hani spent 15 years with Schlumberger in over 10 countries in Africa, Asia, and North America, during which he held various positions in interpretation, consulting, operations, marketing, and technology development. He holds several patents and has authored over 130 technical papers in various areas of petroleum engineering and the geosciences. He was the 2009-2010 President of the SPWLA, a distinguished lecturer for the SPE and the SPWLA in 2010-2011 and 2013, and recipient of the SPWLA Distinguished Technical Achievement Award in 2012.
Compressor stations used to move natural gas are one of the largest sources of fugitive methane emissions in the midstream sector, accounting for approximately 50% of all fugitive emissions (Zimmerle et al., 2015). This problem is most widespread at reciprocating compressors (Subramanian et al., 2015), where faulty seals are a key contributor to methane emissions (Johnson et al., 2015). As such, there is a significant need for a robust technology that could provide an early indication of an unexpected emission. Equally important, the technology needs to be able to account for biogenic versus anthropogenic sources of methane. One means of indirectly making this determination is to leverage optical technologies that can autonomously pinpoint the source of such leaks.
This presentation discusses recent work funded by the U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL), focused on the development of an innovative remote sensing technology that can reliably and autonomously detect fugitive methane emissions in near real-time, using computer vision and deep learning. The technology called the Smart Methane Leak Detection (SLED/M) system was initially developed to monitor facilities such as compressor stations in a stationary, pan-tilt-zoom configuration.
The system has recently been adapted to monitor facilities from an unmanned aerial system (UAS). The speed and maneuverability of UAS platforms are attractive to leak detection and repair program operators but introduce several challenges. Many existing methane detection algorithms rely on mostly static backgrounds and become unusable with camera motion. In addition, top-down views of fugitive methane emissions present differently in Optical Gas Imagers (OGI) than views looking across the plume. Our work has focused on overcoming these challenges, enhancing operators' ability to detect methane emissions and pinpoint their sources. Another recent adaptation to SLED/M is the ability to quantify methane emissions using passive sensors (OGI, thermal camera), environmental conditions, plume modeling, and deep learning.
SLED/M advances the state of the art for methane emission detection and quantification by focusing on three critical criteria for effective methane emission mitigation: (1) autonomy (no need for a human in the loop), (2) high reliability (low false alarm rates), and (3) real-time performance. Results from this work will be presented.
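The snippet below is a purely illustrative sketch, not the SLED/M implementation: a toy PyTorch classifier (the FrameStackPlumeNet name, layer sizes, and frame count are all assumptions) that stacks a short sequence of infrared frames as input channels, so the decision can key on temporal change in the scene rather than on a static background model.

```python
# Hypothetical sketch only; not the SLED/M system described above.
import torch
import torch.nn as nn

class FrameStackPlumeNet(nn.Module):
    def __init__(self, n_frames=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_frames, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 1),          # plume / no-plume logit
        )

    def forward(self, x):              # x: (batch, n_frames, height, width) IR frames
        return torch.sigmoid(self.net(x))

# Forward pass on random data standing in for a stack of MWIR frames.
scores = FrameStackPlumeNet()(torch.randn(4, 8, 120, 160))   # -> (4, 1) plume scores
```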
Mr. Spidle is a research engineer in the Advanced Inspection Systems Section at Southwest Research Institute. He has a background in digital image processing, computer vision, and machine learning. Mr. Spidle is the Principal Investigator and lead developer on a project for the US Department of Energy focused on autonomous detection of methane leaks using only Midwave Infrared (MWIR) optical sensing and deep learning.
Every day our lives are intertwined with applications, services, orders, products, research, and objects that are incorporated, produced, or effected in some way by Artificial Intelligence and Machine Learning. Buzz words like Deep Learning, Big Data, Supervised and Unsupervised Learning are employed routinely to describe Machine Learning, but how do these applications relate to geoscience interpretation and finding oil and gas? More importantly, do these Machine Learning methods produce better results than conventional interpretation approaches? This course will initially wade through the vernacular of Machine Learning and Data Science as it relates to the geoscientist. An overview of how these methods are being employed, as well as interpretation case studies of different machine learning applications, will be presented. An overview of high-performance computing and the utilization of Cloud Services for Machine Learning will also be described. Machine Learning is a disruptive technology that holds great promise, and this course will be presented from an interpreter's perspective, not a data scientist's. This course will provide an understanding of how Machine Learning for interpretation is being utilized today and provide insights into future directions and trends.
Instructor: Rocky Roden, Sr. Consulting Geophysicist, Geophysical Insights
Certification Available: No
Total classroom time: 1 hour
Cost: $20
The aim of the Digital Democracy program is to scale digitalization and speed up time to value.
We will mobilize and empower the OMV Petrom workforce to use data visualization (Power BI), eSignatures, desktop automation, and advanced analytics to solve personal or departmental digitalization challenges and make our company more agile and efficient.
Empowering people (Citizen Developers) through access to the data relevant to their jobs, and providing them with the tools and skills to tackle local digitalization opportunities, will create insight- and efficiency-driven savings and help us identify and seize new opportunities faster.
Upskilling the workforce and moving them toward a digital mindset will also increase the speed of adoption of the corporate digital programs.
By combining top-down, large-scale digitalization projects with the bottom-up Digital Democracy approach, we expect to scale faster and be more agile.
Jaco is Chief Innovation at OMV Petrom, where he is driving the change towards a more innovative and digitally dexterous company with a healthy portfolio of new business options. He does so, among other things, by sponsoring a portfolio of proofs of concept to kick-start innovations, building a digital academy to develop skills, forging collaborations with external partners, and leading a cross-divisional innovation council to drive alignment and implementation throughout the company.
In his four years at Shell, he helped its transition to a more open-innovation approach. He re-invigorated initiatives like GameChanger and External Technology Collaborations, and set up internal tribes and learning programs to drive culture change. He also built Shell TechWorks, an innovative outfit that maximizes the use of external technologies in a 'skunkworks' approach.
Jaco also spent 23 years with Royal DSM in progressive roles across four different businesses, including Business manager Flavors, BU Director (super)Fibers, VP Business Incubator, and Director of the China Innovation Centre. As a member of the DSM Innovation Council, he drove the company change from operational excellence to innovation excellence.
Enterprises are now operating at the edge. On factory floors. In stores. On city streets. In urgent care facilities. On rigs. In refineries. On utility lines. In smart meters. At the edge, data flows from billions of IoT sensors to be processed by edge devices and servers, driving real-time decisions where they are needed. All of this, from smart retail, cities, manufacturing, and utilities to oil and gas, is made possible by bringing the power of AI to the edge.
The presentation discusses methods of leveraging a cloud-native, edge-first, and scalable software stack that enables quick and easy provisioning of infrastructure across a range of devices and servers. Come to the session to discuss the many opportunities to deliver the power of accelerated AI computing at the edge.
Ken Hester is a Solution Architect Manager for NVIDIA supporting the Energy / O&G Industry in HPC, AI Deep Learning and Machine Learning, and CUDA GPU compute. He is based out of Houston, Texas, and has been with NVIDIA for over 5 years. Prior to NVIDIA, Ken worked in Energy for 15+ years as an industry expert in data science, software architecture, software design and development.
For more information about Ken, visit LinkedIn (https://www.linkedin.com/in/kenhester).
Kurt Marfurt is an Emeritus Professor of Geophysics at the University of Oklahoma, where he mentors students and conducts research to aid seismic interpretation. Marfurt's experience includes 23 years as an academician, first at Columbia University, then later at the University of Houston and the University of Oklahoma. His career also includes 18 years in technology development at Amoco's Tulsa Research Center working on a wide range of topics. At OU, Marfurt contributes to the Attribute-Assisted Seismic Processing and Interpretation (AASPI) consortium with the goal of developing and calibrating new seismic attributes to aid in seismic processing, seismic interpretation, and data integration using both interactive and machine learning tools. He has served as an SEG Distinguished Instructor Short Course instructor, as Editor-in-Chief for the AAPG/SEG journal Interpretation, and is currently Director-at-Large for the SEG and the AAPG/SEG distinguished lecturer for 2021-2022.
Postgraduate degree in Geoscience from Science University in Tunisia, with 18 years' experience in geomodeling, reservoir management, and development de-risking. Prior to joining Dragon Oil, Lamia was in a leadership position with OMV, involved in many subsurface studies. She implements the latest advanced workflows for reservoir characterization, such as OBN, AI, and machine learning.
Mapping and extracting features of interest is one of the most important objectives in seismic data interpretation. Due to the complexity of seismic data, geologic features identified by interpreters on seismic data using visualization techniques are often challenging to extract. With the rapid development of GPU computing power and the success obtained in computer vision, deep learning techniques, represented by convolutional neural networks (CNN), have started to attract seismic interpreters in various applications. The main advantages of CNN over other supervised machine learning methods are its spatial awareness and automatic attribute extraction. The high flexibility in CNN architecture enables researchers to design different CNN models to identify different features of interest. A minimal illustrative sketch of such a network appears after the course details below.
Instructor: Dr. Tao Zhao
Certification Available: No
Total classroom time: 45 minutes
Cost: Free
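As a minimal, hypothetical illustration of the kind of network discussed in the course above (layer sizes, patch dimensions, and the PatchFaultCNN name are assumptions, not course material), the sketch below classifies 2-D seismic amplitude patches as fault versus non-fault, letting the convolutional layers learn the attributes directly from the data:

```python
# Illustrative sketch only: a small patch-based CNN for fault classification.
import torch
import torch.nn as nn

class PatchFaultCNN(nn.Module):
    def __init__(self, patch_size=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (patch_size // 4) ** 2, 64), nn.ReLU(),
            nn.Linear(64, 1),                      # fault-probability logit
        )

    def forward(self, x):                          # x: (batch, 1, patch, patch) amplitudes
        return torch.sigmoid(self.classifier(self.features(x)))

# Forward pass on random data standing in for labelled seismic patches.
model = PatchFaultCNN()
fault_prob = model(torch.randn(8, 1, 64, 64))      # -> (8, 1) probabilities
```

The same pattern extends to 3-D convolutions or encoder-decoder architectures for voxel-wise segmentation; the essential point is that no hand-crafted attributes are supplied.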
The course is ideal for geoscientists, engineers, and data analysts at all experience levels. Concepts are supported with ample illustrations and case studies, complemented by mathematical rigor where it benefits the subject. Aspects of supervised learning, unsupervised learning, classification, and reclassification are introduced to illustrate how these methods apply to seismic data. For this version of the class, assessments are given at intervals throughout the course to gauge comprehension. Upon completion with a passing total score, a certificate is issued, certified by Geophysical Insights.
Instructor: Dr. Tom Smith, President and CEO, Geophysical Insights
Certification Available: Yes
Total classroom time: 12 hours
Cost: $75 (with certification), $50 (without certification)
This course covers big-picture machine learning buzz words with both humor and unassailable frankness. The goal of the course is for every geoscientist to gain confidence in these important concepts and how they add to our well-established practices, particularly seismic interpretation. Presentation topics include a machine learning historical perspective, what makes it different, a fish factory, Shazam, comparison of supervised and unsupervised machine learning methods with examples, tuning thickness, deep learning, hard/soft attribute spaces, seismic wavelets and multi-attribute samples, and several interpretation examples. On conclusion, you may not know how to run machine learning algorithms, but you should be able to appreciate their value and some of their limitations.
Instructor: Dr. Tom Smith, President and CEO, Geophysical Insights
Time-lapse (4D) seismic analysis plays a vital role in reservoir management and reservoir simulation model updates. However, 4D seismic data are subject to interference and tuning effects. Being able to resolve and monitor thin reservoirs of different quality can aid in optimizing infill drilling or locating bypassed hydrocarbons. Using 4D seismic data from the Maui field in the offshore Taranaki basin of New Zealand, we generate typical seismic attributes sensitive to reservoir thickness and rock properties. We find that spectral instantaneous attributes extracted from time-lapse seismic data illuminate more detailed reservoir features compared to those same attributes computed on broadband seismic data. We develop an unsupervised machine learning workflow that enables us to combine eight spectral instantaneous seismic attributes into single classification volumes for the baseline and monitor surveys using self-organizing maps (SOM). Changes in the SOM natural clusters between the baseline and monitor surveys suggest production-related changes that are caused primarily by water replacing gas as the reservoir is being swept under a strong water drive. The classification volumes also facilitate monitoring water saturation changes within thin reservoirs (ranging from very good to poor quality) as well as illuminating thin baffles. Thus, these SOM classification volumes show internal reservoir heterogeneity that can be incorporated into reservoir simulation models. Using meaningful SOM clusters, geobodies are generated for the baseline and monitor SOM classifications. The recoverable gas reserves for those geobodies are then computed and compared to production data. The SOM classifications of the Maui 4D seismic data seem to be sensitive to water saturation change and subtle pressure depletion due to gas production under a strong water drive.
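As a schematic sketch of the general SOM idea described above (the toy volume dimensions, 8x8 map size, and use of the open-source MiniSom package are illustrative assumptions, not the authors' Maui workflow), the snippet below clusters voxel-wise multi-attribute vectors and writes the winning-node index back as a single classification volume:

```python
# Illustrative sketch only; random data stands in for eight spectral attributes.
import numpy as np
from minisom import MiniSom   # pip install minisom

n_il, n_xl, n_z, n_attr = 20, 20, 50, 8            # toy volume dimensions
attrs = np.random.rand(n_il, n_xl, n_z, n_attr)    # stand-in attribute volumes

X = attrs.reshape(-1, n_attr)
X = (X - X.mean(axis=0)) / X.std(axis=0)           # z-score each attribute

som = MiniSom(8, 8, n_attr, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(X)
som.train_random(X, 5000)

# Winning-node index per sample -> single classification volume (64 classes).
classes = np.array([np.ravel_multi_index(som.winner(v), (8, 8)) for v in X])
classification_volume = classes.reshape(n_il, n_xl, n_z)
```

Running the same trained map on both the baseline and monitor attribute volumes yields class volumes whose differences can then be inspected for production-related changes.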
Marwa Hussein received a B.S. (2010) in Geophysics from Ain Shams University and then worked as a teaching assistant in the Geophysics department at Ain Shams University for five years. She received her M.S. (2014) in applied Geophysics from Ain Shams University and later her Ph.D. in Geophysics (2020) from the University of Houston. She has carried out research projects and internships for companies such as BP, Shell, Schlumberger, and several joint ventures. Dr. Hussein is currently an assistant lecturer and researcher at Ain Shams University. Her research focuses primarily on advanced seismic interpretation techniques, seismic attribute analysis, reservoir characterization, machine learning, and time-lapse seismic studies.
Machine learning has evolved over several decades. Since the mid-2000s, neural networks have re-emerged along with various deep learning architectures. These advances have enabled successful applications of deep learning methods in many industries. However, these methods are not yet being fully exploited in the Oil and Gas industry. Today, many researchers and engineers have taken the initiative to actively research and develop modern machine learning applications in various domains of this industry, including geosciences, drilling, field development planning, completions, and production forecasting, to name a few. During the talk, I will demystify the Machine Learning journey and share examples of how machine learning is changing our industry for the years to come. I will also share lessons learned from our own company's experience in this domain.
Data, through its availability, volume, and diversity, along with advances in machine learning and computational power, has created unique opportunities not only to achieve significant improvements in both our own and our clients' performance, but also to transform how the oil and gas service industry operates today and in the future.
Maurice Nessim is the President of WesternGeco - Schlumberger, the world's leading geophysical exploration company, specialized in understanding complex subsurface challenges and designing integrated data-to-discovery solutions that provide better, faster, and cheaper answers using advanced processing and interpretation technologies. He has over 35 years of oil and gas industry experience in various management and technical positions. Today, Maurice is pioneering WesternGeco's progressive transformation into an asset-light business, built on WesternGeco's leading position in multiclient, data processing, and geophysical interpretation services. He has launched a digital subsurface platform using groundbreaking technology for one-of-a-kind access to data and technology, driving a vision that capitalizes on Schlumberger's digital technology and digital assets to accelerate energy discovery across the exploration, development, and production cycles through long-lasting strategic collaborations and partnerships.
Michael A. Dunn is an exploration executive with extensive global experience including the Gulf of Mexico, Central America, Australia, China, and North Africa. Mr. Dunn has a proven track record of successfully executing exploration strategies built on a foundation of new and innovative technologies. Currently, Michael serves as Senior Vice President of Business Development for Geophysical Insights.
He joined Shell in 1979 as an exploration geophysicist and party chief and held positions of increasing responsibility, including Manager of Interpretation Research. In 1997, he participated in the launch of Geokinetics, which completed an IPO on the AMEX in 2007. His extensive experience with oil companies (Shell and Woodside) and the service sector (Geokinetics and Halliburton) has given him a unique perspective on technology and applications in oil and gas. Michael received a B.S. in Geology from Rutgers University and an M.S. in Geophysics from the University of Chicago.
The energy industry is one of the few sectors of the economy that has been incubating new technologies for more than a century. Striving to enhance efficiency, mitigate risk, and ensure safety across workflows has resulted in significant discoveries and in deep knowledge accumulated by SMEs.
The field of artificial intelligence has evolved through four distinctive waves. The emergence of the fifth wave of AI coincides with a disruptive era of energy transition. The primary agents of change, largely driven by fundamentals, are not the primary players enabling the radical shift in defining energy futures. We review the waves of AI development with examples of applications to the energy industry in general, and to oil and gas in particular. We introduce the fifth wave of AI and highlight intertwined opportunities for addressing complex science and engineering innovations to achieve a natural transition to a carbon-neutral future.
Dr. Mohamed Sidahmed leads the Deep Learning and Artificial Intelligence R&D at Shell. Dr. Sidahmed does research in Petroleum Engineering, Machine Learning and Artificial Intelligence applications.
Dr. Nam Hoai Pham is currently principal geophysicist and QI (Quantitative Interpretation) specialist at Idemitsu Petroleum Norge AS. Nam has spent almost 20 years working both as a consultant and as an employee in various oil and gas companies, including Statoil (now called Equinor), Den Norske, Envision, and Idemitsu, with specialties in quantitative seismic interpretation, rock physics, AVO, and inversion. His academic background includes a master's degree in reservoir engineering from the University of Stavanger, Norway, and a Ph.D. in Geophysics and Rock Physics from NTNU (Norwegian University of Science and Technology).
3D seismic imaging revolutionized hydrocarbon exploration, providing a robust picture of the subsurface. Higher prices enabled expensive technologies and investments in the development of previously uneconomic deposits. The balance between development cost and the market value of the gas or oil is critical. Recent advances in 3D seismic allow interpreters to map areas of higher productivity and identify bypassed reserves. Microseismic mapping has made completions more efficient and safer. Geophysical data is now an accepted early development tool of successful oil and gas companies.
Nancy House, a member of SEG for nearly 40 years, joined in 1978 as a graduate student at CSM and has worked as a geophysicist for multinational corporations and small independent oil companies, primarily as an interpreter onshore and offshore the US, South America, Africa (West and East), and other areas. She is a second-generation geoscientist who grew up in South America and Singapore. She has a BA in Geology/Geophysics from the University of Wyoming (1976), an MSc in Geophysics from Colorado School of Mines (1979), and did additional postgraduate work at Colorado School of Mines in Reservoir Characterization, Economics, and Geophysics (2000-2002).
From the first SEG Annual Meeting Nancy attended, in San Francisco in 1978 as a student, she knew that SEG would play an essential part in her career. Early on, SEG provided valuable training, networking opportunities, and guidance in professional standards and ethics. Nancy has served on numerous SEG committees, including the GAC, the Women's Network Committee, the Finance Committee, and Membership Committees. She served as Denver Geophysical Society President/Past President from 2008-2010, General Chairman for the SEG Annual Meeting 2010, Secretary-Treasurer 2011-2012, Chairman of the SEG Women's Network Committee 2012-2013, and on the Finance Committee 2012-2014. Nancy has been a regular contributor to TLE, a presenter at meetings (Best Poster 1995), a reviewer for Geophysics, and a session chair for various meetings. She also served on several task forces to understand critical business issues around SEG's global activities. She has been a member of AAPG, Dallas GS, Den GS, RMAG, DivEnvirGeol (AAPG), AGU, AWG, and EAEG.
As SEG President 2017-2018, she focused on increasing diversity and inclusion in the profession of geophysics, continued strategies implemented by Dr. Bradford and Bill Abriel, and recognized the social contribution of geophysics and applied geophysics in areas beyond oil and gas.
Over the last decade, the O&G energy industry has slowly evolved to incorporate essential digitalization components from the so-called “Industry 4.0 Revolution”. Nevertheless, this digitalization process has led to an explosive increase in the magnitude and value of data generated from several interconnected systems, each with its own proprietary framework, standards, and analytical platforms, leaving the engineers in charge of critical decision making overwhelmed by an endless flow of segregated information. Companies are required to break an ongoing project into small individual tasks and develop the required digital interfaces between every single moving subsystem in-house. This leads to unnecessary complexity, lack of long-term support, and inefficiencies. Furthermore, progress is still lacking in artificial intelligence, visualization, and interoperability of the drilling/energy/business processes.
This presentation will focus on how we aim to solve these problems by developing a state-of-the-art energy analytics and visualization digital-twin platform that leverages photorealistic real-time graphics, high-fidelity numerical simulation, and trained physics-based machine learning algorithms to put the entire energy extraction process at the operator's fingertips through a revolutionary cloud-based platform.
Narendra Vishnumolakala (Vish) is a doctoral student at Texas A&M University, where he is working on a reinforcement-learning-based autonomous downhole drilling tool. Vish has a Master's in Petroleum Engineering and a Bachelor's in Electronics and Instrumentation Engineering. Before joining the Ph.D. program, Vish worked for ExxonMobil as a Subject Matter Expert in Automation for about 3 years. Prior to that, he worked in the downstream sector for Indian Oil Corporation Limited in India as an Operations Officer for 3 years. He is also co-founder of Teale, a tech startup focused on accelerating the digital transformation of the Energy Industry.
Neil is a Principal Data Scientist for McKinsey QuantumBlack. Neil started his career as a Field Engineer for Schlumberger in Drilling and Measurements, working in the Manifa and Shaybah fields in Saudi Arabia. Neil then joined Shell in Houston as an Automation Wells Engineer. As part of Shell's digital transformation, Neil moved into the role of Deep Learning Lead, where he led the Geodesic – Digital Accelerator product and founded the Shell AI Residency Programme.
Neil received his undergraduate degree in Astrophysics in 2007 from Peterhouse College at the University of Cambridge. He then went on to receive his Engineering Doctorate from the Department of Aerospace Sciences at Cranfield University in the UK on the topic of trajectory control for autonomous systems. Neil is currently an Adjunct Professor in the Practice at Rice University in the Statistics Department and holds eleven granted or pending patents in machine learning and control systems for oil and gas applications. Neil also co-organizes the Houston Machine Learning Meetup Group.
Rob is a Principal Program Manager on the Azure Global Energy Team at Microsoft, which he recently joined after 17 years at ExxonMobil. He brings industry experience and geoscience expertise to Microsoft, where he is focused on helping oil and gas customers find value through innovation. During his 17 years at ExxonMobil, he held 15 different technical and leadership positions in Exploration, Development, and Production, and worked with partners and governments in 10 different countries on 5 continents.
Most recently, Rob led the Upstream Innovation team within ExxonMobil. His team was focused on rapidly delivering solutions to help ExxonMobil increase profitability from exploration to production. They were also working to instill a culture of innovation by bringing an entrepreneurial mindset to all employees. Leveraging design thinking and lean startup, the team demonstrated they can challenge paradigms, empower others, and deliver value.
Rob earned a Bachelor of Science in Geology from Vanderbilt University and a Master of Science from the University of California Santa Cruz. He lives in the Woodlands, Texas with his wife and two sons (11 and 14 years old). Rob is a former Loaned Executive and Young Leaders Chair with United Way of Greater Houston. He enjoys snow skiing, traveling, and inspiring others to innovate.
Every day our lives are intertwined with applications, services, orders, products, research, and objects that are incorporated, produced, or effected in some way by Artificial Intelligence and Machine Learning. Buzz words like Deep Learning, Big Data, Supervised and Unsupervised Learning are employed routinely to describe Machine Learning, but how does this technology relate to geoscience interpretation and finding oil and gas? More importantly, do Machine Learning methods produce better results than conventional interpretation approaches, or are they simply a means of automating existing processes? The traditional interpretation approaches that geologists and geophysicists employ are physics-based, and Machine Learning now threatens to alter that accepted practice. Will the integration of machine learning improve our present interpretation workflows, provide moderate to no improvement (overhyped), or produce “profound” results that have not been identified previously? Machine Learning is a disruptive technology that holds great promise, and this presentation will explore that potential from a geoscience interpreter's perspective.
Rocky R. Roden owns Rocky Ridge Resources, a consulting practice, and works with several oil companies on technical and prospect evaluation issues. Mr. Roden advises Geophysical Insights in technology development direction and consulting engagements. He also is a principal in the Rose and Associates DHI Risk Analysis Consortium, where he works with producers worldwide. Rocky is a proven oil finder (36 years in the industry) with extensive knowledge of modern geoscience technical approaches and past Chairman of The Leading Edge Editorial Board. As Chief Geophysicist and Director of Applied Technology for Repsol-YPF, his role included advising corporate officers, geoscientists, and managers on interpretation, strategy, and technical analysis for exploration and development in offices in the U.S., Argentina, Spain, Egypt, Bolivia, Ecuador, Peru, Brazil, Venezuela, Malaysia, and Indonesia. He has been involved in the technical and economic evaluation of Gulf of Mexico lease sales, farmouts worldwide, and bid rounds in South America, Europe, and the Far East. His previous experience includes exploration and development at Maxus Energy, Pogo Producing, Decca Survey, and Texaco. Mr. Roden holds a B.S. in Oceanographic Technology-Geology from Lamar University and an M.S. in Geological and Geophysical Oceanography from
Tech companies in Silicon Valley have successfully reaped the benefits of integrating machine learning into their business models. The oil and gas industry is slowly but surely catching up to the trend of utilizing machine learning in different aspects of the business. Several oil companies and service providers have partnered with tech companies like Microsoft and others to reap the benefits of data analytics for their organizations. In this talk, I will briefly touch upon different applications of data analytics in reservoir engineering and how some of the recent work done in this space has benefited the reservoir engineering community with reduced cycle times. Finding the right models to fit your data is extremely important, and all engineers need to step up and become digitally fluent to fully leverage data science in mainstream workflows. Traditional reservoir engineering workflows are time- and labor-intensive. Integrating the multiple sources and scales of data coming in from a variety of surveillance operations requires an integrated approach to characterize reservoirs quickly and accurately. Three case studies from the published literature will be presented that show new ways of using machine learning techniques to improve reservoir simulation, reserves forecasting, and reservoir monitoring.
Sarath Ketineni currently works as a senior reservoir engineer at Chevron’s Mid-Continent Business Unit, working in an Asset Development role. He began his career 4 years ago with Chevron and has had two years of prior downstream experience. In his current role, he optimizes field development plans for conventional waterflood and tertiary floods within Chevron’s portfolio and unconventional EOR projects. Prior to this role, he worked as a reservoir simulation engineer at Chevron’s Energy Technology Company. He also serves as a Technical Editor for several SPE journals, the Journal of Natural Gas Science and Engineering, and the Journal of Petroleum Science and Engineering. Apart from these, he is also an SPE e-mentor, student paper contest judge, and virtual career pathways advisor at SPE. He holds a B.Tech in Chemical Engineering from IIT Madras and M.S., Ph.D. in Petroleum Engineering from Penn State. His broad research interests lie in artificial intelligence for oil and gas, advanced reservoir simulation techniques, 4D seismic data integration, and unconventional EOR. Sarath Ketineni currently serves on SPE GCS Young Professionals Board as Roughneck Camp co-Chair.
The evaluation of seismic attributes is a powerful tool in the interpretation of different geologic environments of deposition. Seismic attributes, specifically geometric and spectral decomposition attributes, provide a framework for interpreting geologic features that define depositional environments. This video course identifies the appropriate seismic attributes for various geologic settings and describes how these attributes are applied. Lectures and demonstrations cover the use of attributes in interpretation workflows and show how attribute parameters are manipulated to highlight geologic features. The last video segment of the course describes how sets of attributes are analyzed and classified using multi-attribute, Machine Learning processes to extract more information from the seismic response. A minimal sketch of one spectral decomposition attribute appears after the course details below.
Instructors: Dr. Kurt Marfurt, The University of Oklahoma | Dr. ChingWen Chen, Geophysical Insights | Rocky Roden, Geophysical Insights
Certification Available: No
Total classroom time: 5 hours
Cost: $40
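As referenced in the course description above, here is a minimal sketch (the sample interval, window length, and 30 Hz target frequency are arbitrary assumptions, not course material) of one spectral decomposition attribute: the magnitude of a seismic trace at a chosen frequency, computed with a short-time Fourier transform.

```python
# Illustrative sketch only; a random trace stands in for real seismic data.
import numpy as np
from scipy.signal import stft

dt = 0.004                                   # assumed 4 ms sample interval
trace = np.random.randn(1000)                # stand-in for a seismic trace

f, t, Zxx = stft(trace, fs=1.0 / dt, nperseg=64, noverlap=56)

target_hz = 30.0                             # iso-frequency of interest
idx = np.argmin(np.abs(f - target_hz))
amp_30hz = np.abs(Zxx[idx, :])               # 30 Hz spectral magnitude vs. time
```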
A multidisciplinary approach that maximizes information extraction from seismic data to predict lithofacies and reservoir properties is presented, based on the following steps:
Multi-attribute seismic analysis was applied using an unsupervised machine learning process called Self-Organizing Maps (SOMs) in Paradise software. The selection of input attributes was thoroughly tested and optimized, based on close cooperation between geophysicists and geologists, to extract more extensive and detailed geological features from the seismic data.
Using the information from nearby wells and knowledge of rock physics, the individual neural classes were quantified and validated, then reorganized and translated into formation properties such as lithofacies, porosity, and clay content (a schematic sketch of this calibration step follows the case-study summary below).
The study focused on the benefits and additional information that can be gained with this new approach compared to traditional quantitative interpretation approaches (i.e., a prediction from acoustic impedance alone). Multi-attribute classification using machine learning (SOM) gave a better representation of the seismic character, detecting the geologic trends in the field. A detailed quantitative interpretation of the SOM neural classes was established to validate and optimally translate formation-related classes for reservoir prediction, and to eliminate classes irrelevant to the formations (i.e., seismic noise).
The result from the Wisting case study shows that the new method gives the best match to the well data and extracts more reservoir-related information from seismic compared to the conventional quantitative interpretation (QI) approach. In the upper part of the Triassic, with fluvial sediments assigned to the RG2 unit (Fruholmen Fm.), the reservoir quality and extent of mud-clast-rich channel intervals are debated. In two of the wells (E and B), a thicker mud unit was observed, and it was debated whether this could act as a barrier to the overlying good reservoir. The result shows that the mud intervals are deposited locally and do not represent a regional mud layer. The additional information from seismic appears valuable when used as input to, and refinement of, the digital geological model.
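The sketch below is a hypothetical illustration of the calibration step referenced above, not the Wisting workflow: SOM class indices sampled along a well path are cross-tabulated against porosity and clay-content logs so each neural class can be translated into, or rejected as, a formation property. Column names, cut-offs, and data are placeholder assumptions.

```python
# Illustrative sketch only; random logs stand in for real well data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
well = pd.DataFrame({
    "som_class": rng.integers(0, 64, 500),   # SOM class sampled along the well path
    "phit": rng.uniform(0.05, 0.30, 500),    # porosity log
    "vclay": rng.uniform(0.0, 0.8, 500),     # clay-content log
})

# Mean property and sample count per neural class.
summary = well.groupby("som_class").agg(
    mean_phit=("phit", "mean"),
    mean_vclay=("vclay", "mean"),
    n=("phit", "size"),
)

# Keep only classes that look like reservoir; the cut-offs are arbitrary
# placeholders for rock-physics-based criteria.
reservoir_classes = summary.query("mean_vclay < 0.35 and mean_phit > 0.15").index
print(summary.loc[reservoir_classes])
```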
Sharareh is currently working as principal geophysicist for Idemitsu Petroleum Norge (IPN). Sharareh holds a B.Sc. in physics and an M.Sc. in geophysics. She has been working for 17 years as a geophysicist and quantitative interpretation specialist for Norsk Hydro, Statoil (currently known as Equinor), and Idemitsu Petroleum Norge. She has been engaged in various exploration projects in Brazil, the Gulf of Mexico, offshore Canada, Nigeria, Angola, Tanzania, Mozambique, and the Norwegian Continental Shelf.
The promise of Machine Learning applications in oil and gas is well-known, with most of the sector some way along the journey – particularly upstream. But the journey is not straightforward, with plenty of cautionary tales about high costs, low uptake, and dashed expectations.
Defining the right problem, and then developing the right solution, requires strong partnerships with technology companies and academia. Sophisticated and semi-automated data science tools are becoming increasingly accessible to technical staff, but the complementary skillsets of the data science discipline are vital to avoid common pitfalls like overtraining and poor data quality.
Woodside Energy has developed AI/ML solutions that impact almost everybody in the business. From natural language assistants capable of booking leave and retrieving purchase orders, to fast numerical simulations of complex physics, this presentation will reflect on Woodside’s experiences and learnings of the past five years, with insights into the importance of working effectively with people of all skill and awareness levels to fully unlock value.
Shaun Gregory has a Bachelor of Science (Hons) from the University of Western Australia in Mathematical Geophysics and a Master of Business and Technology from the University of New South Wales.
Shaun has over 25 years' industry experience and leads Woodside's Sustainability Division, including Exploration, Technology, Digital, New Energy, and Carbon management. He is passionate about technology innovation, the role it plays in enabling business outcomes, and the future skills needed to be successful.
Shaun is a member of Dean’s Council for the faculty of Engineering, Computing and Mathematics at UWA and is a Board member of Scitech WA.
This 49-minute short course focuses on single-trace seismic attributes, which include two general varieties: instantaneous and banded attributes, the latter sometimes called wavelet attributes. The material starts with an organization of seven principal groups or types of attributes and proceeds to set out five primary groups of single-trace attributes: instantaneous attributes, the 'Tool Kit' attributes, instantaneous layer attributes, banded attributes, and additional banded attributes on phase breaks. A minimal sketch of the classic instantaneous attributes appears after the course details below.
Instructor: Dr. Tom Smith, President and CEO, Geophysical Insights
Certification Available: Yes
Total classroom time: 1 hour
Cost: $30 (with certification), $20 (without certification)
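As referenced in the course description above, here is a minimal sketch (an assumed example, not course material) of three classic single-trace instantaneous attributes derived from the analytic signal of a trace:

```python
# Illustrative sketch only; a random trace stands in for real seismic data.
import numpy as np
from scipy.signal import hilbert

dt = 0.004                              # assumed 4 ms sample interval
trace = np.random.randn(1000)           # stand-in for a seismic trace

analytic = hilbert(trace)                                   # analytic (complex) trace
envelope = np.abs(analytic)                                 # instantaneous amplitude
phase = np.angle(analytic)                                  # instantaneous phase (rad)
inst_freq = np.diff(np.unwrap(phase)) / (2 * np.pi * dt)    # instantaneous frequency (Hz)
```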
Teresa Santana is currently Chief Geophysicist, Advisor, and Diversity Officer at YPF S.A., the national energy company in Argentina. Her role is to ensure the technical excellence of the discipline in projects seeking innovation and to certify seniority levels for the geoscience community.
With more than 30 years of experience in the energy industry, Teresa is among the pioneers in applied geoscience for quantitative and volume interpretation. She worked for domestic and international companies, such as Shell in Argentina, Europe, and the US for 20 years, before joining YPF. Teresa participated in and led multidisciplinary teams in numerous onshore/offshore basins around the world, for conventional and unconventional reservoirs at different scales. During her international career, Teresa mastered quantitative seismic interpretation and seismic characterization to predict reservoir properties and fluids from the subsurface. As a result, she was immersed in major global discoveries, such as fields offshore Guyana and offshore North West Australia. She then decided to return to Argentina, her home country, to give back her expertise to the local community.
Teresa's aspiration for equal business opportunities leads her to volunteer as Diversity Officer at YPF, as an executive member of the SEG Women's Network, and as an ambassador of the WomenTech network. In 2020, she received the Globant Women that Build award in the Technology Executive category in Argentina, an international recognition of global women leaders with STEM training who occupy leadership positions and are an inspiration for other women and the industry at large. Teresa is also an active mentor for young professionals and students at local and international associations (SEG, EAGE, Fundación YPF).
Machine learning has been used for several decades in applied geoscience within the energy industry. Machine learning and deep learning can provide a clearer and faster understanding of reservoirs through more efficient data analysis and integration. Artificial intelligence has evolved with the development of sophisticated techniques and software. Additionally, powerful CPUs/GPUs allow complex algorithms to increase their predictive power and reduce both uncertainty and run time. Lately, innovative learning algorithms allow computers to relearn from their own predictions.
With artificial intelligence, geoscientists are able to make reliable and faster predictions based on existing logs, seismic, and core data to support the numerous requirements at the different scales of the subsurface, from frontier exploration to development projects. In addition, geoscientists need a searchable, organized, diverse, validated, and unique database for all data types, with historical and current data, to generate new play concepts for conventional and unconventional reservoirs in onshore and offshore basins.
During my talk, I will reinforce the opportunities that innovation and diversity offer for the success of artificial intelligence in applied geoscience, in terms of prediction, uncertainty management, and efficiency in the years to come.
Born in 1953, he is presently Exploration Advisor at Idemitsu Petroleum Norway AS (IPN). He worked as an independent consultant for IPN from 2000 until he took up his present position as an employee in 2009.
He was educated as a sedimentologist and stratigrapher at the University of Oslo, with a 1980 thesis on the Late Permian Tempelfjorden Group on Svalbard, and worked there as a scientific assistant until 1983, when he joined the Saga Petroleum AS Geological Lab as a sedimentologist and worked on the depositional model for the Troll discovery. The earliest model was published in 1986, when he became a partner in READ Geology Services and continued to work on the Troll model. A sequence-stratigraphic model for this field was published in 1989, while he was working as an independent consultant for Saga Petroleum AS. In 1993 he became Senior Advisor in sedimentology at Saga and continued there until the company was bought by Norsk Hydro AS.
During the last 14 years he has been working in the Barents Sea area and has contributed to the discoveries of Wisting, Alta and Neiden.
He also has extensive experience in field geology, comprising eleven seasons on Svalbard and several fieldwork periods in Sinai (Egypt), Libya, several places in Europe, and the USA. He has led excursions to Svalbard, Sinai, and Luxembourg for the oil industry, and a proprietary excursion to Crete for IPN.