Join thought leaders and innovators using Machine Learning applications in oil and gas.
29 OCTOBER 2020 | A VIRTUAL EVENT
Revealing stratigraphic patterns in seismic data in the early stages of exploration and development can be very challenging without sufficient well information and a clear understanding of the reservoir's depositional environment. Machine learning technologies such as unsupervised classification methods and supervised convolutional neural networks enable interpreters to identify patterns, similarities, and heterogeneities in the data efficiently and effectively. Using machine learning and deep learning methods, interpreters are better positioned to forecast rock types and assess reservoir quality, thereby increasing the probability of success in wildcat wells and maximizing recovery efficiencies in future development projects.
The methodology and workflow built on these new tools successfully differentiated the main rock types in an exploratory project in a frontier basin in South America. The geology was a complex depositional environment with lateral discontinuity and compartmentalization of the reservoir, and the environment was challenging to drill. Together, these conditions made conventional interpretation workflows insufficient for proper exploration analysis and selection of future well locations.
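As an illustration of the kind of unsupervised classification described above, a minimal sketch with synthetic seismic attributes is shown below; the attribute names and data are assumptions for illustration, not the project's actual workflow:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for a seismic attribute table: one row per trace sample,
# with columns such as amplitude envelope, instantaneous frequency, and
# coherence (the attribute choice is illustrative, not from the talk).
n_samples = 1000
attributes = np.column_stack([
    rng.normal(0.0, 1.0, n_samples),   # amplitude envelope
    rng.normal(30.0, 5.0, n_samples),  # instantaneous frequency (Hz)
    rng.uniform(0.0, 1.0, n_samples),  # coherence
])

# Standardize so no single attribute dominates the distance metric, then
# group samples into a small number of candidate seismic facies.
scaled = StandardScaler().fit_transform(attributes)
model = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaled)

# Each sample now carries a facies label that an interpreter can map back
# onto the seismic volume for geological QC.
labels = model.labels_
print(labels.shape, sorted(set(labels)))
```

In a real project the rows would come from a flattened attribute volume, and the cluster labels would be validated against well control before being treated as rock types.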
Camilo Sierra Cardenas is Geology and Geophysics Exploration Manager, Colombia Asset, at Lewis Energy in San Antonio, Texas. Mr. Cardenas holds a B.Sc. in Geology (2005) from the National University of Colombia, Bogota, and has 14 years of experience in conventional reservoir exploration. He has participated in over 50 wildcat projects in different structural styles, complex tectonic settings, and depositional environments in Colombia.
An overview of examples where IBM applied advanced cognitive computing to exploration and production processes in the oil & gas industry, starting with a Geoscience AI platform that rapidly digests and interprets geological information spread across geological papers, seismic images, and well logs, and supports knowledge capture from a broad range of studies; and concluding with an AI Production Optimization solution applied to a multi-facility continuous production process. The Geoscience solutions have delivered a material reduction of exploration and production risks and avoided hundreds of millions in exploration costs. The Production Optimization solution demonstrated the efficacy of AI-based systems in predicting complex interdependent production process failures, reducing recovery time, and alerting operators to production improvement opportunities in steady-state operations, materially reducing production costs per barrel.
Dariusz Piotrowski, Director, Global AI Solutions, IBM Natural Resources Industry Platform, leads the strategy and development of AI (Artificial Intelligence) solutions in natural resources (oil, gas, and mining). Dariusz specializes in large transformational projects focused on optimization, machine learning, and cognitive analytics. He has more than 20 years of technology and consulting experience working with senior leaders within some of the world’s largest natural resources companies. Dariusz helps these companies realize business value through fusing advanced technologies, data science, applied R&D, and agile methodologies and practices to transform business processes and performance. Currently, Dariusz leads the global natural resources AI development team within IBM Industry Platforms. Dariusz holds architecture and civil engineering degrees from Warsaw University of Technology in Poland and an M.B.A. from the Richard Ivey School of Business at the University of Western Ontario.
In 2019, innovation, and in particular digital innovation, has taken center stage in enabling companies to re-orient their businesses to the new economic reality. Artificial Intelligence is seen by many as a key enabling technology, but companies often struggle to move their innovation pipeline into impactful production systems. This talk will discuss:
The role of Generation Z and Generation See-the-Beach in innovation
Vendors, Communities and Secret Sauce – how to balance investments in innovation
Building enterprise-grade digital innovation platforms
As Chief Technology Officer – Energy, David is responsible for Dell Technologies’ strategy for the Energy industry. He works with partners and clients to identify business needs and leads teams to develop strategies and architectures to support these requirements. David oversees the roll out of Dell Technologies’ Energy Industry strategy and solutions within the client base globally.
Prior to working for Dell Technologies, David established a highly respected position in the Petro-technical Computing and Information Management community, where he was described as “a leader in the global Information Management and Infrastructure community,” “highly respected for his ability to engage with clients, develop new solutions and master complex technical/business problems.”
As Information Management Practice Manager for Halliburton-Landmark, David was responsible for architecting, building and operating some of the world’s largest outsourced geotechnical information management and application hosting solutions including for Shell and PGS as well as the National Hydrocarbon Databanks for the UK, Norway and Oman.
Prior to joining Dell Technologies, David was Director of Operations for FUSE IM where he built a team to bring to market FUSE IM’s cloud-based workflow, collaboration and data management tools for petro-technical data. FUSE IM was acquired by Target Energy Solutions in 2014.
David is Dell Technologies’ representative on the OpenEarth Community Executive Committee and was recently elected to the Open Group’s Open Subsurface Data Universe Management Committee. He has previously served on a number of industry committees including the European ECIM Management Committee and SPE’s “Petabytes in Asset Management.” He has delivered numerous technical papers at conferences around the world and holds a patent for his work on the remote visualization of geotechnical applications.
Upstream processes for the petroleum industry are now data-driven. Throughout the history of exploration and production in unconventional environments, multidisciplinary engineering and geoscience data have been continuously collected. Now, using machine learning and AI, this data is being used in new ways to address fracture stimulation and frac-driven interference by enhancing frac modeling. Through the use of data-driven machine learning techniques, models and frameworks can be tailored to individual assets of interest to predict and analyze frac stimulation within a reservoir. However, it seems the data available is still not enough to accurately predict frac behavior.
This presentation will focus on fluid tracking using an electromagnetic technique that provides the missing link to improve machine learning frameworks for frac modeling without disrupting existing workflows. This data directly correlates to completions and production data by providing direct measurements taken during frac stages, which in turn generates relevant features for machine learning inputs. An outlook is also given on recent developments in real time fluid tracking and how it can be integrated with AI to prevent frac-driven interference and forecast performance on a per-stage and per-well basis.
David Moore is currently the President and CEO of Deep Imaging, a leading provider of onshore electromagnetic imaging. A seasoned energy executive, David spent over a decade at GE Power and GE Oil & Gas and then went on to lead a PE-backed upstream manufacturing company as President and CEO. David began his career as a Captain in the United States Air Force.
Frac hits are a form of fracture-driven interference that occurs when newly drilled wells communicate with existing wells during a completion job. In most cases, frac hits have a negative production impact. Understanding the main causes of frac hits is complicated yet crucial for optimizing the net present value of a well pad. Frac hits happen due to a combination of different parameters such as depletion and stress history, inter-well spacing, completion design, and rock characteristics. The available physics-based diagnostics workflows produce outputs with a high degree of uncertainty, and these approaches are also unable to scale beyond a single well or a few stages. We developed a data-driven approach based on the pattern recognition capabilities of machine learning techniques to characterize and aid understanding of the root causes of frac hits in a well pad during a completion job. The approach was applied to a field data set and indicated that frac hits can be quantitatively attributed to operational or subsurface parameters such as spacing or depletion. A better understanding of frac hits will help to optimize well spacing and completion design parameters and consequently improve hydrocarbon recovery and maximize the return on capital investment.
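A minimal sketch of such a data-driven attribution, with a random forest standing in for the pattern-recognition step; the per-stage records, feature names, and the "true" rule below are invented for illustration, not the field data set from the talk:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic per-stage records mirroring the parameter families named in the
# abstract: spacing, depletion, completion design, and rock characteristics.
n_stages = 2000
spacing_ft = rng.uniform(300, 1200, n_stages)    # inter-well spacing
depletion_psi = rng.uniform(0, 3000, n_stages)   # parent-well depletion
fluid_bbl = rng.uniform(5000, 15000, n_stages)   # completion design proxy
brittleness = rng.uniform(0.2, 0.8, n_stages)    # rock characteristic

# Assumed ground truth: tight spacing combined with heavy depletion drives
# frac hits (an invented rule, for demonstration only).
frac_hit = ((spacing_ft < 600) & (depletion_psi > 1500)).astype(int)

X = np.column_stack([spacing_ft, depletion_psi, fluid_bbl, brittleness])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, frac_hit)

# Feature importances quantitatively attribute frac hits to the inputs,
# echoing the attribution result described in the abstract.
names = ["spacing", "depletion", "fluid", "brittleness"]
ranked = sorted(zip(names, clf.feature_importances_), key=lambda p: -p[1])
print(ranked)
```

On this synthetic set the spacing and depletion features dominate the importance ranking, which is the kind of quantitative attribution the abstract describes.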
Dr. Ali Shahkarami is a Senior Engineer at Baker Hughes GE (BHGE), where he leads the Reservoir Analytics team, a group of subsurface domain experts and data scientists developing the next generation of data-driven solutions and workflows for the oil and gas and energy industries. He is based at the Energy Innovation Center in Oklahoma City. Prior to joining BHGE, he was an Assistant Professor of Petroleum and Natural Gas Engineering (PNGE) at Saint Francis University in Loretto, Pennsylvania, where he started the undergraduate PNGE program in 2014, served as program head, and led the program accreditation process before joining BHGE in 2018. He holds Ph.D. and M.Sc. degrees in Petroleum and Natural Gas Engineering from West Virginia University.
I will share a broad overview of the forces behind AI and ML, discuss current challenges in E&P and the ways in which ML can address them, and consider where ML can take the industry in the future and how companies can prepare now to maximize their benefit as technology in this area rapidly grows.
Dr. Arvind Sharma is the VP of Data & Analytics at TGS. In this role, he is responsible for Machine Learning initiatives as well as broader digital transformation. He has 10+ years of experience in various seismic and software-related work. Arvind has bachelor's and master's degrees in Applied Geology and Exploration Geophysics, respectively, from the Indian Institute of Technology (IIT) Kharagpur, and a Ph.D. in Geophysics from Virginia Tech (VT).
Arvind has a broad background in the oil and gas industry as well as outside the industry. He has worked in jobs ranging from software engineering (Infosys) to efficient seismic acquisition design (PGS) to developing seismic imaging algorithms (BP) to prospecting and drilling exploration wells (BP). Most recently, Arvind was Chief Geophysicist at PGS and held a similar role at TGS before his current position, where he led the industry’s first crowdsourcing challenge, the “TGS-Kaggle Salt Identification Challenge.” Additionally, he holds several patents, has been a keynote speaker at major conferences, and has been featured on several ML podcasts.
At TGS, his mission is to create a platform to integrate and analyze all available sub-surface information for risking and decision making. Arvind believes that data integration and machine learning will be pivotal to this industry’s future success.
In the oil & gas industry's quest for efficiency and cost reduction, significantly reducing the cycle time of conventional interpretation workflows while utilizing the full detail of the seismic information (Big Data) across all exploration projects is mandatory. This talk will demonstrate our approach to enhancing and accelerating the seismic interpretation task through technology democratization.
German Larrazabal has been an R&D Geophysics Advisor for Repsol since 2011. Dr. Larrazabal trained as an applied mathematician and computer scientist at the Universidad Central de Venezuela (UCV), Faculty of Sciences, School of Mathematics, and holds a Master of Science in Computer Science from UCV's School of Computer Science. He earned a Ph.D. in Computer Science, with a Cum Laude award, from the Polytechnic University of Catalonia (UPC) and the Barcelona Supercomputing Center (BSC), Barcelona, Spain.
Larrazabal has been a Professor and Researcher at the University of Carabobo, Venezuela; San Diego State University, USA; and the University of Texas at El Paso, USA. He has also been a Visiting Professor at the Computational Science Research Center (CSRC) at San Diego State University. He has authored numerous papers and is a prominent speaker at digital transformation conferences.
Interpreters face two main challenges in computer-assisted seismic facies analysis. The first challenge is to define, or “label”, the facies of interest. The second challenge is to select a suite of attributes that can differentiate target facies from each other and from the background reflectivity. Accurately defining the seismic expression of a given seismic facies requires an understanding of not only geologic processes but also the limits of seismic acquisition, processing, and imaging. Our goals are to provide a good classification model in terms of validation accuracy, provide quantitative metrics (and ideally, geological insight) as to why a given attribute suite is chosen, and to minimize the computational cost and memory required.
In principle, a desirable attribute subset is built by retaining relevant attributes and discarding irrelevant ones. Relevant attributes are those that are highly correlated with the output classes, as measured by univariate attribute analysis. In contrast, redundant attributes are highly correlated with each other. We hypothesize that the redundant and irrelevant attributes that confuse human interpreters also pose problems in machine-learning classification.
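The relevance-versus-redundancy idea above can be sketched as a small attribute-selection filter. The synthetic attributes and the greedy scoring rule below are illustrative assumptions, not the authors' implementation: attribute 0 carries the facies signal, attribute 1 is a near-copy of it (redundant), and attribute 2 is noise (irrelevant).

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(1)

# Synthetic attribute volume flattened to samples x attributes.
n = 1500
a0 = rng.normal(size=n)                # relevant attribute
a1 = a0 + 0.05 * rng.normal(size=n)    # redundant: nearly a copy of a0
a2 = rng.normal(size=n)                # irrelevant noise
facies = (a0 > 0).astype(int)
X = np.column_stack([a0, a1, a2])

# Relevance: univariate mutual information between each attribute and class.
relevance = mutual_info_classif(X, facies, random_state=0)

# Redundancy: absolute inter-attribute correlation.
corr = np.abs(np.corrcoef(X, rowvar=False))

# Greedy pick: start from the most relevant attribute, then prefer
# attributes that are relevant but not highly correlated with those chosen.
selected = [int(np.argmax(relevance))]
for _ in range(1):
    scores = [relevance[j] - corr[j, selected].max()
              if j not in selected else -np.inf for j in range(X.shape[1])]
    selected.append(int(np.argmax(scores)))
print(selected)
```

The filter keeps one of the two correlated attributes and then prefers the uncorrelated one over the redundant near-copy, which is the behavior the hypothesis above predicts a classifier should want.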
Kurt J. Marfurt joined The University of Oklahoma in 2007, where he serves as the Frank and Henrietta Schultz Professor of Geophysics within the ConocoPhillips School of Geology and Geophysics. Marfurt’s primary research interest is in the development and calibration of new seismic attributes to aid in seismic processing, seismic interpretation, and reservoir characterization. Recent work has focused on applying coherence, spectral decomposition, structure-oriented filtering, and volumetric curvature to mapping fractures and karst, with a particular focus on resource plays. Marfurt earned a Ph.D. in applied geophysics at Columbia University’s Henry Krumb School of Mines in New York in 1978, where he also taught as an Assistant Professor for four years. He worked 18 years on a wide range of research projects at Amoco’s Tulsa Research Center, after which he joined the University of Houston for eight years as a Professor of Geophysics and the Director of the Allied Geophysics Lab. He has received the SEG Best Paper award (for coherence) and Best Presentation award (for seismic modeling); as a coauthor with Satinder Chopra, Best SEG Poster awards (one on curvature, one on principal component analysis) and a Best AAPG Technical Presentation award; and, as a coauthor with Roderick Perez Altimar, the AAPG/SEG Interpretation Best Paper award (on brittleness). Marfurt also served as the SEG/EAGE Distinguished Short Course Instructor for 2006 (on seismic attributes). In addition to teaching and research duties at OU, Marfurt leads short courses on attributes for SEG and AAPG and currently serves as Editor-in-Chief of the AAPG/SEG journal Interpretation.
Lennart Johnsson is a Hugh Roy and Lillie Cranz Cullen Distinguished University Chair of Computer Science, Mathematics, and Electrical and Computer Engineering at the University of Houston and is Professor Emeritus at the Royal Institute of Technology, Stockholm, Sweden. Professor Johnsson has served on the faculties of the California Institute of Technology, Yale University, Harvard University, and the Royal Institute of Technology. He has served as Manager of Systems Engineering, Electrical Systems, ABB Corporate Research, Sweden and Director of Computational Sciences at Thinking Machines Corp. (TMC).
Quantitative seismic reservoir characterization poses a mathematically ill-constrained inversion problem traditionally solved by methods relying on pre-stack seismic inversion and subsequent rock physics transforms. Alternatively, subsurface models can be matched to field seismic data by seismic forward modeling using wells as calibration points. Both these approaches face practical limitations in the sparsity of calibration data and severe non-linearity of the problem requiring multiple simplifying assumptions. Recent extensive developments in machine learning and data-driven model building can provide significant accuracy and efficiency uplift in solving this problem by streamlining seismic attribute analysis and avoiding the need to pass through the elastic domain. We present various approaches to seismic machine learning and their application to both static and dynamic reservoir characterization projects and discuss comparisons to conventional 3D and 4D quantitative interpretation workflows. Emphasis will be given to practical approaches enhancing cross-discipline integration and validation of data analytics methods using both geophysical and data science approaches. We highlight the advantages, challenges and systematic biases encountered in this type of analysis and discuss potential extensions of the data analytics approach using deep learning methods.
After receiving his Ph.D. in Physics from Florida State University, Mike worked as a researcher in particle physics and cosmology with a keen interest in phenomenological modeling and the interface between theory and experiment. He joined Shell in 2001 and focused on various aspects of geophysics, both seismic and non-seismic, most importantly the applications of inversion theory to quantitative interpretation and reservoir characterization. Since 2010, he has been working for ConocoPhillips in a similar capacity and, with the onset of data analytics and machine learning, has taken a strong interest in applications of data-driven model building approaches to solving geophysical problems.
Dr. Tom Smith, the founder of Seismic Micro-Technology (SMT) and creator of the KINGDOM Software Suite, is the President and CEO of Geophysical Insights (geoinsights.com), where he leads a team of geophysicists, geologists and computer scientists in developing machine learning technologies for interpretation. Dr. Tom Smith received BS and MS degrees in Geology from Iowa State University and a Ph.D. in Geophysics from the University of Houston. Over a 50-year career, Dr. Smith has been recognized numerous times for his accomplishments in pioneering the science of geophysics. The Society of Exploration Geophysicists (SEG) recognized Dr. Smith’s work with the SEG Enterprise Award in 2000, and in 2010, the Geophysical Society of Houston (GSH) awarded him an Honorary Membership. Iowa State University (ISU) recognized Dr. Smith’s accomplished career with the Distinguished Alumnus Lecturer Award in 1996, the Citation of Merit for National and International Recognition in 2002, and the highest alumni honor in 2015, the Distinguished Alumni Award. The University of Houston College of Natural Sciences and Mathematics recognized Dr. Smith with the 2017 Distinguished Alumni Award.
Today, most AI work in the Machine Learning / Deep Learning industry requires large amounts of data that are not always available in the early phases of Exploration and Production. To address these data restrictions, I will discuss neuro-symbolic approaches that combine Machine Learning / Deep Learning with pre-existing domain knowledge and formal knowledge representation and reasoning. This knowledge-enhanced Machine Learning approach, coupled with transfer learning techniques, reduces the amount of data and effort necessary to train AI-based models. We will discuss how we can augment ML technologies with domain and contextual knowledge, and enable more effective transfer learning, in applications to real cases of AI-assisted seismic interpretation and well-log analysis.
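One way to picture the transfer-learning side of this approach is a frozen feature extractor with a small trainable head. In the sketch below a fixed random projection stands in for pretrained layers, and the tiny labeled set and toy target are purely illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Frozen "pretrained" feature extractor: a fixed random projection plus a
# ReLU. In practice this would be layers pretrained on a large related
# dataset; the random substitute is an assumption for illustration.
W_frozen = rng.normal(size=(16, 64))

def extract_features(x):
    # These weights are never updated; only the head below is trained.
    return np.maximum(x @ W_frozen, 0.0)

# A deliberately small labeled set, mimicking the scarce-data E&P setting.
n_labeled = 60
x_small = rng.normal(size=(n_labeled, 16))
y_small = (x_small[:, 0] > 0).astype(int)   # toy target for illustration

# Train only the lightweight classification head on the frozen features.
head = LogisticRegression(max_iter=1000).fit(extract_features(x_small), y_small)

# Score on held-out samples from the same assumed distribution.
x_test = rng.normal(size=(500, 16))
y_test = (x_test[:, 0] > 0).astype(int)
accuracy = head.score(extract_features(x_test), y_test)
print(round(accuracy, 2))
```

The point of the design is that only the small head's parameters are fit to the scarce labels; the representation comes from elsewhere, which is what lets the approach work with far less data.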
Ulisses T. Mello is the director of IBM Research – Brazil with sites in São Paulo and Rio de Janeiro. Ulisses is also an IBM global research executive for the chemicals and petroleum Industry sector. He holds a Ph.D. (1994) and MA (1992) in geology from Columbia University, an M.Sc. in geology from Federal University of Ouro Preto (1987), and a B.Sc. (1983) in geology from the University of Sao Paulo, Brazil. His research interests are large-scale basin modeling, hydrogeological modeling, digital oil fields, integrated operation optimization, data assimilation, unstructured meshing, parallel computing, advanced water management, and computational geosciences.
The past few years mark the fastest development of machine learning in the seismic interpretation community we have ever seen. One of the major accelerators of this growth is the success of deep learning methods, which originated in the computer vision discipline. Over the past three years, we have witnessed the rapid adoption of deep learning techniques in seismic interpretation. However, most of this adoption is still limited to academia and research institutes, primarily because, while providing high-quality results, deep learning methods usually require much more training data to work effectively. Preparing such training data is often time-consuming, if not impractical, for a seismic interpreter. This presentation focuses on deep learning applications that require limited or even no training data from a general seismic interpreter, making deep learning more accessible. Examples of deep learning-based seismic facies classification and fault detection demonstrate that a general seismic interpreter can benefit from the high-quality results, with greatly improved efficiency.
Dustin Dewitt has spent the last 10 years in pursuit of geoscience excellence, integrating advanced seismic interpretation techniques with general geoscience workflows to develop methods that enhance geologic interpretations. After service in the United States Navy, Dustin moved into the private sector, attaining both a B.S. and an M.S. in Geophysics and Seismology from the University of Oklahoma. Upon graduating, he gained experience with BHP Billiton as a QI Geophysicist and subsequently an Exploration Geophysicist. Currently, he is a Product Manager for Geophysical Insights, contributing to the development of Paradise, the AI workbench.
There is much hype around all topics related to digitalization and IR 4.0 and their underlying technologies. At the same time, industries have already seen some real-life applications and benefits as well as some real disruption to their value chains, making it more urgent than ever for oil & gas companies to accelerate their digital transformations. Despite the hype and the promise of machine learning, adoption outside of the tech sector is still at an early, often experimental stage with few firms having deployed it at scale.
This presentation will discuss some of the successes and challenges of moving machine learning in Oil & Gas from hype to reality. There have been plenty of promising successful deployments across robotics and autonomous vehicles, computer vision, virtual agents, and machine learning, with the latter including deep learning and underpinning many recent advances in the other AI technologies. However, the fact that it requires big data yet must often be trained on sparse, incomplete, and messy data, its tendency to cut across functional, geographic, and organizational silos, and its dependence on having a digital foundation and an upskilled workforce all pose formidable challenges to scaling digital, and machine learning in particular.
Hani Elshahawi is Digitalization Lead – Deepwater Technologies at Shell where he has spent the last 14 years. Before that, he led FEAST-Shell’s Fluid Evaluation and Sampling Technologies center of excellence, before becoming Deepwater Technology Advisor. Prior to Shell, Hani spent 15 years with Schlumberger in over 10 countries in Africa, Asia, and North America during which he has held various positions in interpretation, consulting, operations, marketing, and technology development. He holds several patents and has authored over 130 technical papers in various areas of petroleum engineering and the geosciences. He was the 2009-2010 President of the SPWLA, distinguished lecturer for the SPE and the SPWLA 2010-2011 and 2013, and recipient of the SPWLA Distinguished Technical Achievement Award in 2012.
Enterprises are now operating at the edge. On factory floors. In stores. On city streets. In urgent care facilities. On rigs. In refineries. On utility lines. In smart meters. At the edge, data flows from billions of IoT sensors to be processed by edge devices and servers, driving real-time decisions where they are needed. All of this, across smart retail, cities, manufacturing, utilities, and oil and gas, is made possible by bringing the power of AI to the edge.
The presentation discusses methods of leveraging a cloud-native, edge-first, and scalable software stack that enables quick and easy provisioning of infrastructure across a range of devices and servers. Come to the session to discuss the many opportunities to deliver the power of accelerated AI computing at the edge.
Ken Hester is a Solution Architect Manager for NVIDIA supporting the Energy / O&G Industry in HPC, AI Deep Learning and Machine Learning, and CUDA GPU compute. He is based out of Houston, Texas, and has been with NVIDIA for over 5 years. Prior to NVIDIA, Ken worked in Energy for 15+ years as an industry expert in data science, software architecture, software design and development.
For more information about Ken, visit LinkedIn (https://www.linkedin.com/in/kenhester).
3D seismic imaging revolutionized hydrocarbon exploration, providing a robust picture of the subsurface. Higher prices enabled expensive technologies and investments in the development of previously uneconomic deposits. The balance between development cost and the market value of the gas or oil is critical. Recent advances in 3D seismic allow interpreters to map areas of higher productivity and identify bypassed reserves. Microseismic mapping has made completions more efficient and safer. Geophysical data is now an accepted early development tool of successful oil and gas companies.
Nancy House, a member of SEG for nearly 40 years (joining in 1978 as a graduate student at CSM), has worked as a geophysicist for multinational corporations and small independent oil companies, primarily as an interpreter on- and offshore in the US, South America, Africa (West and East), and other areas. She is a second-generation geoscientist, having grown up in South America and Singapore. She has a BA in Geology/Geophysics from the University of Wyoming (1976), an MSc in Geophysics from Colorado School of Mines (1979), and did additional postgraduate work at Colorado School of Mines in Reservoir Characterization, Economics and Geophysics (2000-2002).
From the first SEG Annual Meeting Nancy attended in San Francisco in 1978 as a student, she knew that SEG would play an essential part in her career. Early on, SEG provided valuable training, networking opportunities and guidance in professional standards and ethics. Nancy has served on numerous SEG committees including GAC, the Women’s Network Committee, the Finance Committee, and Membership Committees. She served as Denver Geophysical Society President/Past President from 2008-2010, General Chairman for the SEG Annual Meeting 2010, Secretary-Treasurer 2011-2012, Chairman of the SEG Women’s Network Committee 2012-2013, and on the Finance Committee 2012-2014. Nancy has been a regular contributor to TLE, a presenter at meetings (Best Poster 1995), a reviewer for Geophysics, and a session chair for various meetings. She also served on several task forces to understand critical business issues around SEG's global activities. She has been a member of AAPG, Dallas GS, Den GS, RMAG, DivEnvirGeol (AAPG), AGU, AWG, and EAEG.
As SEG President 2017-2018, she focused on increasing diversity and inclusion in the profession of geophysics, continued strategies implemented by Dr. Bradford and Bill Abriel, and recognized the social contribution of geophysics and applied geophysics in areas beyond oil and gas.
Neil is a Principal Data Scientist for McKinsey QuantumBlack. Neil started his career as a Field Engineer for Schlumberger in Drilling and Measurements, working in the Manifa and Shaybah fields in Saudi Arabia. Neil then joined Shell in Houston as an Automation Wells Engineer. As part of Shell's digital transformation, Neil moved into the role of Deep Learning Lead, where he led the Geodesic – Digital Accelerator Product and founded the Shell AI Residency Programme.
Neil received his undergraduate degree in Astrophysics in 2007 from Peterhouse, University of Cambridge. Neil then went on to receive his Engineering Doctorate from the Department of Aerospace Sciences at Cranfield University in the UK on the topic of Trajectory Control for Autonomous Systems. Neil is currently an Adjunct Professor in the Practice in the Statistics Department at Rice University and holds eleven patents and pending patents in Machine Learning and Control Systems for oil and gas applications. Neil also co-organizes the Houston Machine Learning Meetup Group.
Rob is a Principal Program Manager on the Azure Global Energy Team at Microsoft, which he recently joined after 17 years at ExxonMobil. He brings industry experience and geoscience expertise to Microsoft, where he is focused on helping oil and gas customers find value through innovation. During his 17 years at ExxonMobil, he held 15 different technical and leadership positions in Exploration, Development, and Production, and worked with partners and governments in 10 different countries on 5 continents.
Most recently, Rob led the Upstream Innovation team within ExxonMobil. His team was focused on rapidly delivering solutions to help ExxonMobil increase profitability from exploration to production. They were also working to instill a culture of innovation by bringing an entrepreneurial mindset to all employees. Leveraging design thinking and lean startup, the team demonstrated they can challenge paradigms, empower others, and deliver value.
Rob earned a Bachelor of Science in Geology from Vanderbilt University and a Master of Science from the University of California Santa Cruz. He lives in The Woodlands, Texas with his wife and two sons (11 and 14 years old). Rob is a former Loaned Executive and Young Leaders Chair with United Way of Greater Houston. He enjoys snow skiing, traveling, and inspiring others to innovate.
Every day our lives are intertwined with applications, services, orders, products, research, and objects that incorporate, are produced by, or are affected in some way by Artificial Intelligence and Machine Learning. Buzzwords like Deep Learning, Big Data, Supervised and Unsupervised Learning are employed routinely to describe Machine Learning, but how does this technology relate to geoscience interpretation and finding oil and gas? More importantly, do Machine Learning methods produce better results than conventional interpretation approaches, or are they simply a means of automating existing processes? Traditional interpretation approaches that geologists and geophysicists employ are physics-based solutions, and now Machine Learning threatens to alter that accepted practice. Will the integration of machine learning improve our present interpretation workflows, provide moderate to no improvements (overhyped), or produce “profound” results that have not been identified previously? Machine Learning is a disruptive technology that holds great promise, and this presentation will explore that potential from a geoscience interpreter’s perspective.
Rocky R. Roden has been involved in the application, evaluation, testing, and development of geoscience technical approaches for the last 44 years (past Chairman, The Leading Edge Editorial Board). For the past 18 years at his consulting company, Rocky Ridge Resources, he has worked with numerous oil companies and geoscience software development companies on geoscience technology. As former Chief Geophysicist and Director of Applied Technology for Repsol-YPF, his role comprised advising corporate officers, geoscientists, and managers on interpretation, strategy, and technical analysis for exploration and development in offices in the U.S., Argentina, Spain, Egypt, Bolivia, Ecuador, Peru, Brazil, Venezuela, Malaysia, and Indonesia. He has been involved in the technical and economic evaluation of Gulf of Mexico lease sales, farmouts worldwide, and bid rounds in South America, Europe, and the Far East. Previous work experience includes exploration and development at Maxus Energy, Pogo Producing, and Texaco. He holds a B.S. in Oceanographic Technology-Geology from Lamar University and an M.S. in Geological and Geophysical Oceanography from Texas A&M.
Different tech industries in Silicon Valley have successfully reaped the benefits of integrating machine learning into their business models. The oil and gas industry is slowly but surely catching up to the trend of utilizing machine learning in different aspects of the business, and several oil companies and service providers have partnered with tech companies like Microsoft and others to bring data analytics to their organizations. In this talk, I will briefly touch upon different applications of data analytics in reservoir engineering and how some of the recent work in this space has benefited the reservoir engineering community with reduced cycle times. Finding the right models to fit your data is critically important, and engineers need to become digitally fluent to fully integrate data science into mainstream workflows. Traditional reservoir engineering workflows are time- and labor-intensive, and integrating data of varying sources and scales from a variety of surveillance operations requires an integrated approach to characterize reservoirs quickly and accurately. Three case studies from published literature will be presented that demonstrate new ways of using machine learning techniques to improve reservoir simulation, reserves forecasting, and reservoir monitoring.
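As a minimal illustration of the kind of data-driven reserves forecasting the abstract alludes to, the sketch below fits an exponential decline curve to synthetic well-production data by linear regression on the log of the rate. All numbers, variable names, and the workflow are hypothetical examples, not taken from the case studies in the talk.

```python
import numpy as np

# Hypothetical monthly production rates (bbl/d) for a single well:
# a synthetic exponential decline with a little multiplicative noise.
t = np.arange(24)                              # months on production
rate = 950.0 * np.exp(-0.035 * t)
rate *= 1 + 0.05 * np.random.default_rng(0).standard_normal(t.size)

# Fit q(t) = qi * exp(-D * t) by linear regression on log(rate):
# slope = -D, intercept = log(qi).
slope, intercept = np.polyfit(t, np.log(rate), 1)
qi, decline = np.exp(intercept), -slope

# Forecast the next 12 months from the fitted model.
t_future = np.arange(24, 36)
forecast = qi * np.exp(-decline * t_future)
print(f"qi = {qi:.0f} bbl/d, D = {decline:.3f} per month")
```

The same log-linear trick generalizes: once the decline model is fit, the forecast can be integrated over time to estimate remaining reserves, and richer models (hyperbolic decline, or the ML regressors the talk surveys) follow the same fit-then-forecast pattern.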
Sarath Ketineni currently works as a senior reservoir engineer in an Asset Development role at Chevron’s Mid-Continent Business Unit. He began his career at Chevron 4 years ago and has two years of prior downstream experience. In his current role, he optimizes field development plans for conventional waterfloods and tertiary floods within Chevron’s portfolio, as well as unconventional EOR projects. Prior to this role, he worked as a reservoir simulation engineer at Chevron’s Energy Technology Company. He also serves as a Technical Editor for several SPE journals, the Journal of Natural Gas Science and Engineering, and the Journal of Petroleum Science and Engineering. In addition, he is an SPE e-mentor, student paper contest judge, and virtual career pathways advisor. He holds a B.Tech in Chemical Engineering from IIT Madras and an M.S. and Ph.D. in Petroleum Engineering from Penn State. His broad research interests lie in artificial intelligence for oil and gas, advanced reservoir simulation techniques, 4D seismic data integration, and unconventional EOR. Sarath currently serves on the SPE GCS Young Professionals Board as Roughneck Camp co-Chair.