Interviews

“Making a difference is how you are going to succeed.”

An interview with Bill Fahmy

Coordinated by: Satinder Chopra

Bill Fahmy of ExxonMobil Exploration Company presented the 2006 SEG Distinguished Lecture, ‘DHI/AVO best practices, methodology and applications’, in Calgary on March 12, 2007. His lecture was well received by the audience, and a webcast of his talk (recorded in Dallas) is available on the SEG website. This interview, in which Bill shared his thoughts and experiences on a wide range of topics, was recorded in the morning before Bill delivered his talk. Excerpts from the interview follow.

Bill, let’s begin by asking you about your educational background and your work experience.

I received my Bachelor’s Degree in Geophysics from the University of the Pacific in Stockton, California in December 1978, then went to work for Western Geophysical out in the field, in data acquisition, in the spring of 1979. I basically started from the bottom and worked my way up to Crew Chief. I remained with Western until 1986, when I decided to return to school to get my Masters in Geophysics.

I completed my Masters in Geophysics at the University of Wyoming in 1988. Exxon hired me in 1988 and I have worked there ever since.

What all do you do at Exxon these days?

I am currently a Geophysical Advisor in a group that looks at under-explored basins around the world in an attempt to generate new ideas or new opportunities for the Company.

What inspired you to take up a career in Geophysics?

That’s a tough question. I was very good in mathematics and I really didn’t have a direction on which way to go into applied mathematics. A very good friend of mine at the University of the Pacific was majoring in geophysics. At the time, the University of the Pacific didn’t have a formal Geophysics Department; there were only two other students studying geophysics. The curriculum was a combination of geology, physics, mathematics, chemistry and engineering, which was close enough to applied mathematics. He persuaded me to take a couple of geology courses and I liked them. Shortly afterwards, I talked to my advisor and he suggested that I major in geophysics. So I became the third Geophysics student in the Department.

During your building years, who were your mentors and why did you admire them?

I recall two colleagues at Western Geophysical who were mentors. The first was Jerry Patrick, the Supervisor of the first seismic crew I ever worked on, and the second was Buddy van Wagenen, my first Party Chief. From them, I learned not to be afraid to get your hands dirty even if you are the supervisor or manager; i.e., lead by example. For example, when Jerry was out in the field visiting the crew, he would go out and lay down geophones even though he was the Supervisor. That demonstrated leadership not just to me but to the rest of the crew. Anyone who saw him out there laying geophones like the rest of the crew followed him without any hesitation. That was really very impressive and set the model that I would follow later in my life.

When I got to ExxonMobil, I had that go-get-it kind of attitude, which helped because there was no new-hire training program. It was on-the-job training. You were thrown into the fire right away, unlike now, where we have a two-year new-hire training program. During that time, three people influenced my career, two of them technically. Their names were Harry Portwood and Roger Caviness. Harry and Roger were very experienced, with excellent technical backgrounds in geophysical applications and processing. Besides mentoring me, they gave me a great deal of good advice and fostered the attitude of always striving to seek out the best technical person within any group and not being afraid to ask for his/her advice. The third person was Karen Dittert, my Manager when I started at Exxon. Her goal was to make a well-rounded geophysicist, not a specialized one. She chose me to be that person and, in turn, exposed me to all different types of technology via different assignments. I owe her a lot for that, because that is what has shaped my thought process and creativity today in how I approach and solve tough geophysical problems.

After doing your Masters you joined Exxon, which later became ExxonMobil, and you have continued there for so long; tell us about some of your professional experiences that stand out in your mind.

With ExxonMobil?

You might want to say something about your Western tenure too, if you like.

I could tell you many stories – going that far back is hard to remember. However, from my Western tenure, the thing that stands out in my mind is my first managerial job. At the time, the crew was working in mountainous terrain just outside Ventura, California. On the morning of my first day, I was driving out to the field and I got a call on the radio from the Field Observer. As I recall, he said, “The crew doesn’t want to work any more – they quit; it’s too hard out here, and the terrain is just too rough”. I was shocked; I didn’t know how to act or what to do. Just then, when it seemed all was lost, I recalled my experience with Jerry Patrick and Buddy van Wagenen. So I got back on the radio and said, “Okay, well let me ask you something – if I go out there ahead of them and demonstrate that I can cut through the brush and go through this rough terrain, will they go along? If I can’t, then I will consider taking an alternative route. However, if I can get through this brush and they still won’t do it, then we will have to take whoever doesn’t want to work to town, pay them off, and hire new ones”. The Observer came back on the radio and said, “Well, they said that if you can come out here and lead the way, they will follow you.” So I did it. It was tough and it was quite an experience. But the bottom line is that we completed the survey, and in a timely fashion. That is something I will remember for the rest of my life.

With ExxonMobil, several things come to mind; the first was working in a team concept. It was a great opportunity for us to link the geophysical interpretation to the geology and to the business needs while working together in a team-like environment, in one work room. The team made all the decisions regarding our budget and our prospectivity. We met and discussed our tasks every day. We came up with several great, creative prospects that were technically sound, which made this one of the most fun experiences I have ever had. The most rewarding experience was contributing to a major discovery, especially when everyone was going to walk away from the area, because that is what we are paid to do.

Is that the one which you presented a couple of years ago?

No, this is a different one. In this case history, everyone was convinced that there was only gas in the area, which was uneconomical. We actually had to convince everyone, including our partners, to drill before walking away. Our technical challenge was to show the probability of oil, besides gas, in the main reservoir, and it turned out to be quite a success story. We were not only able to convince everyone to drill the prospect, but we were also able to predict both contacts accurately. The other rewarding experiences were being elevated to Technical Advisor, which was a great achievement for me, and being chosen as a Distinguished Lecturer for the SEG/AAPG, as well as representing ExxonMobil in this venue. That is quite an honor.

So now you are working as a Geophysical Advisor and exploring new opportunities; what are your aspirations for the future?

Well, I would like to become a Chief Geophysicist. I think I have the geophysics background for it. I have a very well-rounded background, including experience in acquisition, processing, and geophysical applications, not just AVO-based ones. I also have a good background in seismic interpretation and in geology, and a good understanding of the business needs. So I think I have a solid foundation to become a Chief Geophysicist, and that is my aspiration.

Apart from AVO, what other areas in geophysics do you like, and what are your contributions there?

Another area is pore pressure prediction, especially in areas where we have no well control and we are able to predict pore pressure just by using seismic data, or in areas where we have look-ahead VSPs and can use them in connection with seismic data to predict both pressure and lithology ahead of the bit. It is an exciting technical challenge for me to work in the realm of pore pressure prediction and look-ahead VSPs.

Also, frequency-based attribute analysis/interpretation and 3D visualization are emerging and exciting areas. I like to play with new technologies where I can get creative and innovative.

For pore pressure prediction you need a smooth, densely populated velocity volume and then a transform which would give us pore pressure information. What type of transform do you use: is it Eaton’s method, Bowers’ method, or something else?

We use Bowers’ method within the company. Bowers was working for Exxon Research when he published his work. Our pore pressure prediction program is based on his methodology, which works very well in clastic settings. Historically, the predictions have been accurate even within hard-pressure sections.
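
For readers who want a concrete sense of the approach, here is a minimal sketch of Bowers’ virgin-curve relation, v = v0 + A·σe^B, inverted for effective stress and subtracted from the overburden to give pore pressure. The calibration constants, units, and gradients below are illustrative placeholders, not ExxonMobil’s (or Bowers’ published) calibration.

```python
import numpy as np

def bowers_pore_pressure(v, depth, v0=5000.0, a=10.0, b=0.8, ob_grad=1.0):
    """Pore pressure from interval velocity via a Bowers-style virgin curve,
    v = v0 + a * sigma_e**b  (velocity in ft/s, stress in psi).

    v       : interval velocity (ft/s)
    depth   : depth below mudline (ft)
    v0      : velocity at zero effective stress (ft/s)
    a, b    : calibration constants -- placeholder values, basin-specific
    ob_grad : assumed average overburden gradient (psi/ft)
    """
    sigma_e = ((v - v0) / a) ** (1.0 / b)  # effective stress (psi)
    overburden = ob_grad * depth           # total vertical stress (psi)
    return overburden - sigma_e            # pore pressure (psi)

# A velocity reversal at depth shows up as pressure well above hydrostatic.
depth = np.array([4000.0, 8000.0, 12000.0])
vint = np.array([7000.0, 9500.0, 9000.0])
print(bowers_pore_pressure(vint, depth))  # predicted pore pressure (psi)
print(0.465 * depth)                      # hydrostatic reference (psi)
```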

For the look-ahead VSP, the VSP is acquired down to a certain depth and then quickly processed. The way I had done it several years ago was to invert the corridor-stack trace. The inverted trace would show the change in velocity or impedance at the top of the high-pressure formation, or something like that. Is that the same technique you use, or something different?

It is the same technique. We invert the VSP and look for changes in the reflectivity to discriminate high- from low-impedance events, and we also look at deviations from the background trend. Then we attempt to relate these changes to changes in pressure or lithology. Furthermore, we invert for velocities and compare those from the inversion to our seismic velocities. We flag the deviations from the overall compaction trends, or we see slope breaks, and again we try to relate them to pressure or lithology changes. Finally, we tie all the observations together to come up with our final interpretation.
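
The trend-deviation step he describes lends itself to a simple illustration: fit a normal-compaction trend to a shallow interval that is presumed hydrostatic, then flag velocities that fall well below it. The synthetic numbers and the threshold below are assumptions for illustration only; relating a flagged deviation to pressure rather than lithology still requires the geologic tie he mentions.

```python
import numpy as np

# Synthetic interval velocities, as if inverted from a look-ahead VSP.
depth = np.linspace(1000, 4000, 61)        # m below mudline
v_vsp = 1800.0 + 0.45 * depth              # normal compaction
v_vsp[45:] -= 300.0                        # velocity reversal at depth

# Fit the normal-compaction trend to the shallow, presumed-hydrostatic part.
shallow = depth < 2500
coef = np.polyfit(depth[shallow], v_vsp[shallow], 1)
residual = v_vsp - np.polyval(coef, depth)

# Flag samples well below trend: candidate overpressure (or lithology change).
flagged = depth[residual < -150.0]         # threshold is an assumption
print(f"first flagged deviation at {flagged.min():.0f} m")
```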

Ostrander introduced AVO in 1984; of course, people knew something about it before that, but he was the one who first published the details. Ever since, it has been a hot topic. How do you think the perception of, or the expectations from, AVO have changed over all these years? You could also give us the ExxonMobil perspective.

Well, for us, AVO was used heavily in the 80s, but it was used mainly as a stand-alone tool. It is the kind of tool that got a “bad rap” because it was being abused. So, in the Distinguished Lecture talk I will give today, AVO is not used as a stand-alone tool but as an integral component of the overall DHI analysis process. It is combined with other technologies to help us de-risk prospects. This is what we have learned over the years. There are certain basic elements that we look for before we even start looking at AVO, like the right impedance signature, and these elements have to make sense to us before we do any kind of AVO work. This is dictated by our rock physics work, which is done early in the process. To look for these attributes on the seismic data, the data have to be conditioned or processed appropriately to preserve amplitude and phase. Finally, all the attributes observed on the seismic not only have to match our rock physics models but also have to fit together with the geology and imply a fluid change with structure. The cumulative observations determine the risk of the prospect in terms of a probability for the presence of hydrocarbons, and that is the final step in the analysis. To assign this probability, we use a neural network approach which compares our observations to historical statistics in the basin. If no wells exist, then we use statistics from an analog basin. That probability, in turn, is reported to management when the prospect is reviewed. The bottom line is that we have learned a lot since the technology was developed; we have higher confidence in applying it now, and greater success since the initiation of Best Practices. So, in summary, developing Best Practices allowed us to apply the technology consistently and optimally across the organization. We keep it evergreen by learning from our drilling results and by adding in new technologies.
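
The neural network he mentions is not described in detail, so as a stand-in, the sketch below fits a simple logistic model to invented historical DHI “scorecards” (1 = discovery, 0 = dry hole) and maps a new prospect’s observation scores to a probability of hydrocarbons. It only illustrates the shape of the risking step, not the actual ExxonMobil tool.

```python
import numpy as np

# Invented history: three observation scores per drilled prospect
# (e.g., amplitude strength, AVO response, fit to structure) and outcomes.
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, (200, 3))
y = (X @ np.array([2.0, 1.5, 1.0]) + rng.normal(0, 0.5, 200) > 2.3).astype(float)

# Fit a logistic model by plain gradient descent (stand-in for the network).
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

# Score a new prospect's observations against the historical statistics.
prospect = np.array([0.8, 0.7, 0.9])
print(1.0 / (1.0 + np.exp(-(prospect @ w + b))))  # P(hydrocarbons)
```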

In the past we have talked about AVO as an anomaly-identification technique, more so than as a way to invert for elastic parameters. More recently, attempts have been made to invert for elastic parameters. Do you think we can actually invert for elastic parameters and get meaningful estimates out of that, or do you have reservations?

Well, if you have lots of wells, we do it. Our philosophy is that if you have lots of wells, like in a Development or a Production setting, then we try to derive these elastic parameters for reservoir characterization. Let’s take deriving Vshale, for example. We need a relationship to VP/VS, which is derived from conventional offset seismic data. In order for us to do this, the well ties have to be sound on all offsets: nears, mids, and fars. Today, we see lots of people invert the data without paying close attention to well ties, or they judge the ties by the quality of the correlation coefficients. In reality, this should be done by visual inspection of the character tie, especially at the reservoir level, and that is what we insist on doing before we proceed with the inversion. We also try not to utilize all the wells; i.e., we hold some wells back, what we call blind wells. That way, we can verify how robust the inversion prediction is by its ability to predict at these blind wells, where we already know the results. These kinds of elastic inversions are fit-for-purpose products. For Development or Production settings, when we have lots of wells, we utilize them. However, for Exploration, where we have one or no wells, it’s not really a good idea to do a density or Vshale inversion because you have little or no constraint.
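
A toy version of the Vshale example: calibrate a relationship between inverted VP/VS and log-derived Vshale at the wells, then apply it across the volume. The linear form and every number below are illustrative assumptions, and in practice the calibration would be preceded by the offset-well-tie checks he describes.

```python
import numpy as np

# Calibration pairs at wells: inverted Vp/Vs versus log-derived Vshale.
vpvs_wells = np.array([1.65, 1.75, 1.85, 1.95, 2.05])
vsh_wells = np.array([0.05, 0.20, 0.40, 0.60, 0.80])
coef = np.polyfit(vpvs_wells, vsh_wells, 1)        # least-squares fit

# Apply to an inverted Vp/Vs cube (random stand-in data here).
rng = np.random.default_rng(4)
vpvs_volume = 1.6 + 0.5 * rng.random((50, 50, 100))
vshale_volume = np.clip(np.polyval(coef, vpvs_volume), 0.0, 1.0)
print(vshale_volume.mean())
```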

So that’s the kind of philosophy that you have been following within your company?

Yes. One more thing: we have to understand the uncertainty and the error of the inversion product. For example, if we are inverting for density and trying to predict within 0.1 or 0.2 accuracy but our error is larger than that, which it normally is, then we really have to stop and question the kind of answer we are trying to obtain from the product. The best remedy is to do feasibility studies to understand what we are dealing with and get a handle on the uncertainty or error prior to generating any of these products. Also, we try to do things fit-for-purpose, depending on what business stage we are in. For example, in Exploration we do reconnaissance work. In Development or Production, we tend to do more detailed work; i.e., reservoir characterization.

At the general session of the EAGE some three or four years ago, or maybe a little more than that, Andy Wood, the head of global exploration at Shell International, mentioned publicly that AVO had not significantly altered their overall success rate; so I was going to ask you to comment on that from an ExxonMobil perspective.

Well, on a global portfolio, we find that prospects that were based on some kind of DHI have at least a 20% higher success rate, both geologically and economically, than those that were not associated with any kind of DHIs; i.e., hard rocks. By geologically, I mean that some kind of hydrocarbon accumulation was successfully found but it wasn’t commercial for us to produce. Economically means that we can commercially produce it. So in both cases, we see at least 20% or higher uplift. Furthermore, what I also hope to show is that the success rate has increased since the initiation of Best Practices. And that’s mainly due to the fact that we’re applying the technology consistently and optimally across the organization and continually learning from our drilling results.

One of the significant issues, and you alluded to it a few moments ago, is AVO reliability and how you condition the data. What would you like to share with our members? What do you do within ExxonMobil, and is there a preferred workflow that you like to follow?

Yes, one of the major issues in doing proper AVO analysis is having the right data; hence, the processing of the seismic data for AVO. Not having the right data leads to pitfalls in the AVO interpretation; this was one of our major learnings. ExxonMobil has a proprietary controlled amplitude and phase processing philosophy for both marine and land data. Controlling the amplitude and phase is very important for us, and it is verified by how robust our seismic-well ties are. Now the question becomes, “even though we were successful in controlling the amplitude and phase for stacked data, how do we condition the data for AVO?”. The major factor here is the realization that the seismic energy loses frequency from the near to the far offset, and hence the wavelet is different. What most people do to account for this effect is examine the seismic background amplitudes on the near and far offsets and apply an offset scalar to compensate for the observed amplitude loss. However, this is a frequency-dependent amplitude loss and cannot be accounted for by an amplitude-only correction. You still have different wavelets present, or a wavelet mismatch. Our trick is very simple, especially if we don’t have wells or any kind of theoretically based methodology in the basin we are working in. All we do is apply a filter to both the nears and the fars to match the frequency content of the fars. This methodology has worked almost 99% of the time.
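
A minimal version of that filtering step might look like the following, assuming the usable band of the far stack has already been picked from its spectrum (6-30 Hz is used purely as an example):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def match_far_bandwidth(near, far, dt, lowcut, highcut, order=4):
    """Band-limit both the near and far stacks to the far stack's usable
    band so their wavelets are comparable before AVO analysis.

    near, far       : 2-D stacks, shape (traces, samples)
    dt              : sample interval (s)
    lowcut, highcut : corner frequencies (Hz), picked from the far spectrum
    """
    nyq = 0.5 / dt
    b, a = butter(order, [lowcut / nyq, highcut / nyq], btype="band")
    return filtfilt(b, a, near, axis=-1), filtfilt(b, a, far, axis=-1)

# Example with 4 ms data and an assumed 6-30 Hz far-offset band.
rng = np.random.default_rng(1)
near, far = rng.normal(size=(100, 500)), rng.normal(size=(100, 500))
near_f, far_f = match_far_bandwidth(near, far, dt=0.004, lowcut=6, highcut=30)
```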

Is that something similar to what Chris Finn had demonstrated a couple of years ago at the SEG?

No, Chris Finn’s method is based on a theoretical wavelet replacement technique which is very powerful for detailed AVO analysis. We actually use his method when we have good calibration or many wells in an area. This type of application is very powerful for reservoir characterization work in Development and/or Production settings. For Exploration settings, or recon work on large data sets, we prefer to use the method I just described which is to filter the nears and fars.

Many different ways have been demonstrated for interpreting AVO data; you mentioned nears, mids, and fars, and there are other types as well. Do you have a favored methodology for the interpretation of AVO?

No, not really. All these products are fit-for-purpose, and their usage is quantified in our Best Practices. For routine work, we just use what is called the “Envelope Difference” product, which is the difference of the envelope of the near offset stack minus that of the far, multiplied by the envelope of the far stack. It is normally used for class 3 and weak class 2 type AVO reconnaissance work. In basins where this product is not so robust, especially where there are class 2P type AVO anomalies, we tend to use the Fluid Factor product. But no matter which stacked AVO product we use, Best Practices suggest that the final product should always be QC’d against how it was generated. This is normally done by examination and analysis of pre-stack gathers to verify the AVO response that is seen on the stacks and to ensure that it is not a function of the processing. No matter what your favorite final AVO product is, you always have to go back and look at the basic data, because if the gathers are not moved out correctly, or if there is something interfering with the signal, like some kind of noise or critical-angle effects that are not handled in the processing, then large errors will be introduced in the final AVO stack products, which in turn will lead to an erroneous interpretation. I don’t really have a favorite product; we do things fit-for-purpose. For example, for detailed work, like in a Development or Production setting with many wells, we will do AVO inversion for reservoir characterization. However, in an Exploration setting with no wells or one well, we will use the Fluid Factor or the Envelope Difference products.
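
The Envelope Difference product as he describes it translates almost directly into code. A sketch, assuming band-matched near and far stacks as input, with the envelope taken as the magnitude of the analytic signal:

```python
import numpy as np
from scipy.signal import hilbert

def envelope_difference(near, far):
    """(envelope(near) - envelope(far)) * envelope(far), per the description.

    near, far : band-matched near and far offset stacks, (traces, samples).
    Bright class 3 far offsets drive the attribute strongly negative.
    """
    env_near = np.abs(hilbert(near, axis=-1))
    env_far = np.abs(hilbert(far, axis=-1))
    return (env_near - env_far) * env_far
```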

Well, fair enough. There is a lot of discussion about getting reliable density information from AVO, and of course we need long-offset data for that; what is your opinion on this potentially controversial claim?

Again, the issue here deals with how many wells you have to calibrate with and the assumptions behind the product. Then, you have to realize that it is an amplitude-based inversion philosophy. The question one should ask is: what kind of assumptions do we have to deal with to generate density? To generate density from P-wave seismic data, we first need to calculate shear velocities, which are normally generated from some kind of AVO inversion product that is based on robust offset-well ties. Then we have to calculate the angle of incidence, which can be inferred from seismic velocity. For starters, there are just too many assumptions to deal with, especially in generating the shear velocity and the angle of incidence, before we even get to invoke the other assumptions needed to generate our final product, the density. Let us say we have great offset-well ties, so that we can generate a good AVO inversion product. We still have to worry about seismic resolution, especially when we are dealing with far offsets. As you recall, when we go out to very far offsets, we tend to lose frequency, and hence resolution becomes a problem. So you can see where I am going with this discussion, especially when we are trying to predict density for typically thin beds. What I mean is, the typical error and uncertainty are quite large compared to the kind of density changes we are looking for, which are normally very small, on the order of about 0.1. Even though many of these density sections are strikingly colorful, one has to think about what is behind the inversion and compare the magnitude of the error to the magnitude of the prediction. For example, the relationship used to calculate shear velocities is normally derived as a best fit through an associated scatter of data points. The degree of that scatter measures the error, or uncertainty, of the fit, and many times it is overlooked. By looking at something like that, one can see that the very small changes in density that we are trying to predict may actually fall within that first-order scatter, or error range. And that is why predicting density is a very difficult problem.
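
The scatter argument is easy to demonstrate numerically: fit a VP-VS trend to well data and compare the residual scatter of the fit to the size of the effect being predicted. Every number below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
vp = rng.uniform(2500.0, 4500.0, 300)              # m/s
vs = 0.80 * vp - 900.0 + rng.normal(0, 90.0, 300)  # trend plus scatter

coef = np.polyfit(vp, vs, 1)
residual_std = (vs - np.polyval(coef, vp)).std()
print(f"Vs residual scatter: {residual_std:.0f} m/s")
# If this scatter propagates into a density error comparable to or larger
# than the ~0.1 change being sought, the prediction is not meaningful.
```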

Within ExxonMobil, then, you don’t rely so much on density, do you?

It’s not that we don’t try. We try different technologies, but in a systematic approach and in areas where we have many wells, like in a Development or a Production setting. First, we do a feasibility study via test lines that cross several calibration wells. The well ties have to be excellent on all offsets, nears, mids, and fars, especially within the reservoir interval. The quality of the well ties also has to be inspected visually, not just by looking at correlation coefficients. Then we hold some wells blind, i.e., we don’t include them in the inversion, and we invert the test lines. Once the inversion is complete, we try to understand the degree of uncertainty and error by examining how the prediction fared against the results at the blind-well locations. In doing so, we get a good understanding of what we are dealing with instead of just taking a product as it is and trying to interpret it. I have to stress that. We try to tie the final interpretation to other attributes as well as to the geology. We never rely on a single technology by itself to give the answer, and that is the power of Best Practices.
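
Blind-well validation reduces to a hold-out test. The sketch below stands in a trivial linear predictor for the inversion, holds three wells blind, and scores the prediction against them; the data, the predictor, and the property being predicted are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
attribute = rng.uniform(0.0, 1.0, 12)       # one inverted attribute per well
porosity = 0.10 + 0.15 * attribute + rng.normal(0, 0.01, 12)  # "log" values

train, blind = np.arange(9), np.arange(9, 12)         # hold 3 wells blind
coef = np.polyfit(attribute[train], porosity[train], 1)
pred = np.polyval(coef, attribute[blind])
rmse = np.sqrt(np.mean((pred - porosity[blind]) ** 2))
print(f"blind-well RMSE: {rmse:.3f}")  # compare to the magnitude predicted
```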

Okay, my next question is on 3D AVO. The normal practice is still to process 3D data one line after the other, rather than putting it in a 3D perspective and doing the processing that way. What is your comment on that, and within ExxonMobil do you really do 3D AVO?

We do it in the same manner that everyone else does. What I mean is, it’s done basically from the 3D gathers to create near and far cubes. We can design an angle-based mute, which is calculated from smoothed seismic velocities. The gathers are then muted based on a specified near and far angle range of interest. Or we make pseudo-near and pseudo-far angle cubes by first designing mute patterns based on angle ranges. The mute patterns are then tied to the water bottom to mute the gathers in a consistent manner and generate the near and far offset cubes. Once the near and far cubes are generated, the desired final AVO product is created. But what we do not account for, or take into consideration, is azimuthal variation.
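
A bare-bones version of the angle-based mute, using a straight-ray approximation for the incidence angle from smoothed velocities (a real implementation would typically be ray-traced through the velocity field), might look like this:

```python
import numpy as np

def straight_ray_angle(offsets, t0, v):
    """Approximate incidence angle (degrees) from offset, zero-offset time
    t0 (s), and a smoothed velocity profile v (m/s), using
    tan(theta) = half-offset / depth, with depth = v * t0 / 2."""
    depth = v * t0 / 2.0
    return np.degrees(np.arctan((offsets / 2.0) / depth))

def angle_mute(gather, offsets, t0, v, ang_min, ang_max):
    """Zero samples of an (offsets, samples) gather whose straight-ray
    angle falls outside [ang_min, ang_max] degrees."""
    theta = straight_ray_angle(offsets[:, None], t0[None, :], v[None, :])
    return np.where((theta >= ang_min) & (theta <= ang_max), gather, 0.0)

# Example: extract a 20-35 degree "far" angle range from a dummy gather.
offsets = np.arange(100.0, 4100.0, 100.0)
t0 = np.arange(0.004, 4.0, 0.004)
v = 1500.0 + 400.0 * t0                      # smoothed velocity trend
fars = angle_mute(np.ones((offsets.size, t0.size)), offsets, t0, v, 20.0, 35.0)
```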

But what is preventing people from bringing in true 3D AVO? That is the correct way to go about it. It may be true that the way we are doing AVO works in 80 percent of the cases, but if we do a good job, it might work in the remaining 20 percent as well.

Yes, I totally agree, but you will need to account for the azimuthal variations to properly position the data in 3D. It’s mainly cost and time that drive our business. The problem is that the processing routines that account for azimuthal variations, such as pre-stack migration, DMO, etc., either cost a lot of money or take a lot of time to run. But I think that in the future, as technology and computing power advance, all these routines will become feasible to run. For example, look at the days when depth migration and pre-stack time migration used to be cost-prohibitive; now they are performed almost routinely. I am very sure we will get there as computing power advances, but we are just not there yet.

ExxonMobil is quite active in doing time-lapse surveys, and I know Dave Johnston and other groups within ExxonMobil are actively pursuing these types of studies. My question is: do you really follow AVO in time-lapse surveillance, or is it not something you consider? I would like you to comment on that.

No, we do it. Dave Johnston would be the better person to comment on this topic.

You do it? What about the success rate?

I can’t comment on the success rate; we haven’t really tracked it the way we track our DHI Best Practices. You would have to ask Dave Johnston that question; he is the leader of our time-lapse technology. Personally, I haven’t kept track of it myself, but I know his group is actively engaged in this area. You know as well as I do that for time-lapse we are not only dealing with class 3 type DHIs but also class 2s, and for class 2s there is a considerable AVO time-lapse effect. Dave is really the one to ask: he is the resident expert and is on top of his game. He would be the better person to answer this question.

Now, the last question on AVO is about the spectral decomposition application that I heard about in your presentation at the SEG in 2005. This is a neat technique, I think. I wanted to get your comments on it: have you used it on prospects since then, how did it fare, and do you have a preferred term over “monochromatic AVO”, which was coined earlier for such an application?

We are currently using it. I can’t really comment on its success rate right now; I wish I could present more material on the subject. Actually, we do something new, something that a colleague of mine came up with, instead of applying a filter to balance the frequency of the near and far stacks. Gianni Matteucci, who is one of our bright young Geoscientists, came up with a technique that is a by-product of the frequency decomposition to simulate the filtering, or balance the bandwidth. He figures out what the common frequency band on the nears and fars is; e.g., let’s say 6-30 Hz. Then he decomposes both near and far stacks over their entire bandwidth; e.g., 0-60 Hz. Next, he keeps the common frequencies he previously found on both, which we assumed to be 6-30 Hz, and throws out or omits the rest. After that, he sums up these frequency panels on the nears and the fars to produce simulated filtered offset stacks, or summed 6-30 Hz panels. Finally, he takes the envelopes of each and subtracts them to create a spectrally balanced AVO envelope difference product. That is a lot more powerful than just applying a filter or doing monochromatic AVO. Furthermore, it preserves the signal-to-noise ratio better.
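
A rough reconstruction of that workflow, based only on the description above and not on ExxonMobil’s actual code: build narrow-band panels on each stack, keep and sum only the panels inside the common band, then take the envelope difference of the summed results. The band edges and panel width are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def balanced_envelope_difference(near, far, dt, band=(6.0, 30.0), step=4.0):
    """Spectrally balanced envelope difference via frequency decomposition.

    near, far : (traces, samples) offset stacks
    dt        : sample interval (s)
    band      : common frequency band kept on both stacks (Hz)
    step      : width of each narrow-band panel (Hz)
    """
    nyq = 0.5 / dt

    def summed_panels(stack):
        total = np.zeros_like(stack)
        f = band[0]
        while f < band[1]:
            hi = min(f + step, band[1])
            sos = butter(4, [f / nyq, hi / nyq], btype="band", output="sos")
            total += sosfiltfilt(sos, stack, axis=-1)  # one panel, kept
            f = hi
        return total  # panels outside the common band are simply omitted

    n, fr = summed_panels(near), summed_panels(far)
    env_n, env_f = np.abs(hilbert(n, axis=-1)), np.abs(hilbert(fr, axis=-1))
    return (env_n - env_f) * env_f
```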

Bill, apart from science, what other interests do you pursue?

I like playing soccer, biking, and playing guitar. I mainly like sports in general. I have an 11-year-old and we do many things together. We like going to the beach in the summer and hanging out. We also do lots of outdoor activities. Lastly, I like to travel around the world experiencing other cultures.

What would be your message for fresh graduates entering our industry?

I see a lot of bright young people coming out of college who are very good with technology, operating computers, and generating many good-looking products. My message to them is: don’t forget the basics. Try to understand the basics, like contouring, for example. Start with hand contouring and understand what geologic structures look like. Don’t depend on machine contouring or take its output at face value. Try to understand what has been generated and make sure it makes sense geologically. Seek out experienced people and learn from them, as they have a lot of knowledge. Try to work in teams and work effectively. Work on your presentation skills and be able to present yourself as well as your work. Keep an open mind. Always try new technology, and try it in different ways: don’t just follow a prescribed routine. Making a difference is how you are going to succeed. Companies right now are scrambling for new opportunities, and the way you are going to make a difference for them is by taking technologies where nobody else has taken them before.

Good. The last question, then: is there any question that you had expected me to ask and I didn’t?

Yes. The question that everybody asks me, and I was surprised that you didn’t, is: “What about predicting oil versus gas?”

Okay, let’s hear that.

My answer to that is: for a geophysicist, it is the Holy Grail. If I could do it, why would I be working for anybody? I would be on my own, making big bucks consulting for everyone. In reality, it is a difficult problem. One time, a colleague of mine told me that his company required their people to always do an oil-versus-gas prediction on every DHI prospect. He went on to elaborate that their overall success rate had been about 30 to 40%, which is less than a flip of a coin, so he didn’t know why they even did it. That has basically been our view: it is hard to get a success rate above 50%. The difficulty in predicting oil versus gas is somewhat similar to that of density. The fact is, you have all these different variables which require lots of assumptions, like reservoir and fluid properties. But what bites you the most is seismic resolution, which is an inherent problem of seismic data. Things like bed thickness affect the seismic amplitude, which we rely on to make our prediction. However, if you could factor out the resolution problem, then that is one less variable to worry about. For example, Chris Finn’s method, or other routines like it, may help with the resolution problem. That is where we need to start, eliminating each unknown, so that we can concentrate on the real problem at hand. The bottom line is that it is a very difficult problem to crack.

Coupled with that problem is the fizz gas problem as well.

Oh yes, there is always fizz gas, which complicates things further. Now you have to worry about fizz gas as well as all the other fluid properties, not just the reservoir properties. So there are lots of variables that we need to worry about, or know: not only for the reservoir, like thickness, porosity, net/gross, etc., but also all of the fluid properties, such as saturation, oil API or density, GOR, etc. To me, that is just too many unknowns or assumptions to worry about.

Bill, thank you very much for giving us this time to sit down and ask you a few things about your views on the different topics and I am looking forward to your talk now.

End
