Interviews

We need integrative innovation

An interview with Maitri Erwin

Coordinated by: Satinder Chopra

Maitri Erwin is currently a geophysicist at Nexen Petroleum USA working on deepwater Gulf of Mexico assets. She previously worked in interpretation and subsurface information technology at Shell Exploration & Production Co. and as a geospatial technology executive. Maitri received a BS in geology from the University of Illinois at Urbana-Champaign, and an MS in structural geology and an interdisciplinary MS in computational sciences and geophysics from the University of Wisconsin at Madison. Maitri is also an advisor to Project Gutenberg, the oldest publisher of free electronic books. She is @Maitri on Twitter.

A big challenge for geophysicists and seismic interpreters is to do good work today and to make results available to tomorrow’s workers. With rapidly advancing subsurface data acquisition, analysis, storage, and access tools at our disposal, how do we maximize their use for accurate and efficient project execution and retain information during and after project completion? This requires a two-fold approach: geoscience-focused technological innovation, and a philosophy of interdisciplinary collaboration and communication.

Technology fit for purpose

Seismic analysis is a synthesis of many different data types – the seismic data itself, wells, velocity models, geology, cores, previous work – which in turn generates more data and parameters that are important to reservoir engineers, economists, and drillers. Furthermore, geoscientists are inherently interested in exchanging ideas and information, but not necessarily in the information-technology business models whereby that happens. What will be helpful, then, are not new software applications in which to create, analyse, and record, but versatile innovations that increase the usefulness of the existing data and tools required for interpretation, archival, and transmission. These solutions will acknowledge and fit the subsurface workflow – geophysical interpretation and model building, followed by dynamic well planning and drill-time model refinement – and not the other way around. The user should not be constrained by the tools.

Work together and bridge the gap

The geoscience community has the same problem as the intelligence community: each person on a project holds at least one crucial piece of information that no one else possesses. Analyses also create an immense wealth of knowledge that is not effectively transmitted through the organization and to the next generation of workers. Technical proficiency in geophysics and interpretation is traditionally rewarded, whereas collaborating with geologists, petrophysicists, reservoir engineers, economists, and drillers takes time. Such collaboration is invaluable as we move forward, however, especially in the areas of inversion and 4D interpretation, which involve reservoir architecture, volumetric properties, and fluid movement. Adopted at a management level, a culture of sharing within projects will encourage similar interdisciplinary associations across the organization.

As scientists in a cutting-edge industry, we know that technology is not just software but also the effective utilization and management of data and people. Flexible and light solutions that address our workflow requirements and boost the capabilities of existing tools as well as cross-disciplinary work and communication are two ways to get us there.

Q&A:

Maitri, you suggest a two-fold approach for maximizing project execution efficiency that entails geoscience-focused technological innovation and a philosophy of interdisciplinary collaboration and communication. Could you elaborate on the versatile innovations that you are suggesting, as well as the model you have in mind for interdisciplinary cooperation?

“Interdisciplinary” means more than a team containing one of each type of subsurface professional. It lies in first asking the right questions, assigning the right people to collaborate on them regardless of job title, and measuring success by the interdisciplinary results. As an example, a regional exploration team at Nexen recently made breakthroughs on problems of reservoir and source by first constructing a series of purpose-built projects and then bringing together the required geologists, geophysicists, petrophysicists, and technicians to investigate them.

Given that not all hydrocarbons are found and developed in the same way, companies need to empower themselves to develop tools that fit their workflows and timelines. The current over-reliance on commercial, off-the-shelf (COTS) interpretation products presents us with two big problems: juggling diverse outputs, with their incompatible file formats, growing size, and opaque quality control; and long waits for new and improved functionality. Recognizing that an everything-to-everyone platform is impossible, oil companies and their software providers can adopt the following steps to increase platform and tool interoperability:

  • Create dynamic workflows that translate different types of spatial data so that they are compatible in a single space (e.g. translate 3D rock properties to a 2D GIS environment for common-risk-segment mapping),
  • Write extensible programs that allow original and third-party programmers to move beyond the “wait for next year’s release” approach and quickly build on existing capabilities,
  • Build open platforms and fully documented external application programming interfaces (APIs) that make it easier for third parties to interact with those platforms,
  • Support a coordinated set of data-type and data-sharing standards in use by all shops, and
  • Find or invent the right people: On the software vendor side, it is obvious why a development team of geoscientists, coders, and hybrids with end-user experience will generate a superior product. There is also a huge space for independent programmers and small shops to generate lean custom plug-ins and standalone apps (does one really need an expensive platform with site licenses to generate a 1D forward model? See the sketch after this list). The more critical human-resource innovation is on the client side, however. The extensible programs and APIs described above need skilled workers who can help adapt COTS tools to in-house workflows. These can be coding-savvy geotechnicians or contractors who specialize in solving problems of geologic data and analysis. The need for these professionals is immense and will only grow.
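
To make the 1D forward-model aside concrete, here is a minimal sketch of the kind of lean standalone tool an independent programmer could write with nothing more than Python and NumPy: reflection coefficients computed from an acoustic impedance log are convolved with a Ricker wavelet to produce a synthetic trace. The layer impedances, wavelet frequency, and sample interval are illustrative assumptions, not data from any particular well or vendor workflow.

    import numpy as np

    # Blocky acoustic impedance log (velocity x density); illustrative values only.
    layers = np.array([4500.0, 6200.0, 5400.0, 7100.0])  # one impedance per layer
    impedance = np.repeat(layers, 50)                    # 50 samples per layer

    # Normal-incidence reflection coefficients at each interface: (Z2 - Z1) / (Z2 + Z1).
    rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

    def ricker(f_peak, dt, n):
        # Ricker wavelet with peak frequency f_peak (Hz), sample interval dt (s), n samples.
        t = (np.arange(n) - n // 2) * dt
        a = (np.pi * f_peak * t) ** 2
        return (1.0 - 2.0 * a) * np.exp(-a)

    # Synthetic trace: reflectivity convolved with a 30 Hz wavelet sampled at 2 ms.
    trace = np.convolve(rc, ricker(f_peak=30.0, dt=0.002, n=101), mode="same")
    print(f"{trace.size} samples, peak amplitude {trace.max():.3f}")

A few dozen lines like these, wrapped in a simple interface, cover a task that otherwise ties up an expensive platform seat.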

The industry has long been abuzz with the integration and calibration of seismic data with all the petrophysical and geological information. Do you think it is really being done in our industry?

Not yet, even setting aside the problem of seismic versus log/core scales. There is increasing emphasis on corroboration between seismic and petrophysical data, and less between seismic data and geological information. One can argue that seismic inversion and rock physics using well logs are predictive enough and generate geologic properties that feed directly into reservoir models, while coring and geology-focused formation evaluation are challenging and pricey. However, comparing the seismic image with the depositional model, investing in coring programs and analyses, studying seismic and field analogues, and closing the loop between the reservoir model and the seismic data are things our industry can do a lot more of to deepen understanding of the play/prospect and to avoid costly dry holes.

My doubts arise from the fact that, depending on the type of project, the relevant data are often not available, and if available, they may not be usable. Comments?

Use all the relevant data that are available and address the missing or questionable data with sensitivities and model-based approaches. Clearly delineating the unknown or highly uncertain is valuable analysis, too.

Would your innovative solutions also include accurate reservoir characterization using the latest techniques to extract information from seismic data in the form of attributes?

If reservoir presence and quality are key risks, it is very important to conduct characterization using the latest techniques. Again, the innovative solution here is the workflow more than the cutting-edge geophysical method: Frame the project with the right questions and value of information, ensure that the seismic-based reservoir characterization techniques inform and are informed by ongoing geologic and petrophysical work, and communicate results in time for decision-making.

I have heard of geoscientists providing a static reservoir model, which is then supplemented with production data as it becomes available, and from which a dynamic reservoir model evolves. Is this something that is done within or outside Nexen?

All of Nexen’s reservoir modeling and flow simulation are done in-house. Geophysicists, geologists, and reservoir engineers work together on both the static and dynamic modeling. The static reservoir model usually starts in Appraisal and is carried to Development, quite often along with the team. In Development, a dynamic reservoir model is then made based upon the static model. Once the field begins to produce, the dynamic model is refined with production data.

How is the uncertainty accounted for in such models?

Static uncertainty is handled within the static modeling software. A distribution is defined for each static variable (seismic uncertainty in picking a top-reservoir surface away from well control, porosity, acreage, net-to-gross, etc.). This is followed by hundreds or thousands of simulation runs using an appropriate sampling technique for the variables (Monte Carlo, for instance); a minimal sketch of this sampling step follows. Uncertainty pertaining to dynamic variables is handled on the flow-simulation side, where experimental-design methodology estimates the probabilistic production of the reservoir, which aids in decreasing the processing time needed on the simulator.
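
As an illustration of that sampling step, here is a minimal Monte Carlo sketch in Python: each static variable is drawn from an assumed distribution, a simple volumetric formula is evaluated for every realization, and percentiles summarize the spread. The distributions, the formula, and all numbers are illustrative assumptions, not Nexen's actual variables or software.

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n_runs = 10_000  # "hundreds or thousands" of realizations

    # Illustrative distributions for the static variables; a real project
    # would calibrate these to well control, seismic uncertainty, etc.
    area      = rng.triangular(8e6, 1.2e7, 2.0e7, n_runs)  # reservoir area, m^2
    thickness = rng.normal(25.0, 5.0, n_runs)              # gross thickness, m
    ntg       = rng.uniform(0.5, 0.8, n_runs)              # net-to-gross
    porosity  = rng.normal(0.22, 0.03, n_runs)             # fraction
    sw        = rng.uniform(0.2, 0.4, n_runs)              # water saturation

    # Oil in place for each realization (reservoir m^3, ignoring shrinkage).
    ooip = area * thickness * ntg * porosity * (1.0 - sw)

    # Industry convention: P90 is the volume exceeded 90% of the time,
    # i.e. the 10th percentile of the sampled distribution.
    p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
    print(f"P90 {p90:.3e}   P50 {p50:.3e}   P10 {p10:.3e} m^3")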

Meaningful progress in such versatile, integrated projects can be slow. Would you agree? You could argue that sometimes slowness is what it takes to create innovative opportunities. Right?

Not at all. All-encompassing, take-everything-into-account earth models have been replaced by purpose-built models designed to address a specific question or two. The collaborators need only agree upon endpoints and distributions. With this approach, roadblocks and innovative opportunities quickly reveal themselves.

On the lighter side, Maitri, during your educational years, what is one important thing you learned that has influenced your life?

In taking a variety of courses beyond geoscience, including higher mathematics, programming, 3D animation, art, language, history and political science, I learned how to learn new things. It also opened my mind to ideas that can influence scientific work and make it more relevant.

To sum up, the questions we ask in order to understand and predict the petroleum system are inherently interdisciplinary. So ought to be the people and teams that work on them. Furthermore, if companies discover and produce hydrocarbons differently, why are they all using the same tools? While being at the forefront of geoscience software and techniques is important, what the industry really needs are open and extensible pathways in which these tools talk to one another to better serve subsurface analysis and drilling decisions.

End
