Interviews

“Look for better ways and try to be innovative…”

An interview with Malcolm Lansley

Coordinated by: Satinder Chopra | Photos courtesy: Joyce Au
Malcolm Lansley

Malcolm Lansley is a highly experienced geophysicist who is currently working as Vice-President of Geophysics for Sercel, Inc. in Houston. Glimpses of his wide-ranging experience in different facets of seismic exploration emerge as one listens to him sharing his practical knowledge on seismic acquisition methods, their impact on data processing and the interpretability of the resultant data.

Malcolm was in Calgary as the keynote luncheon speaker for the 2011 CSEG DoodleTrain, and the RECORDER did not miss the opportunity for an interview, to which he sportingly agreed.

Malcolm has been awarded many patents for his work on 3D survey design and evaluation, and has authored numerous papers. He is well travelled and his presentations usually carry catchy titles, which help in conveying important messages. Following are excerpts from the interview.

Malcolm, let’s begin by asking you about your educational background and your work experience.

Okay, well basically I had decided at a very young age, about 13 years old, that I wanted to be a geophysicist.

Well, that’s interesting.

During my childhood I had a desire to travel. I think this was partly due to the fact that my parents did not venture far from our home. One day I was browsing through an encyclopedia and found an article about these people called geophysicists or “Doodlebuggers”—they seemed to travel to all sorts of different places around the world. I didn’t really know what it was, but I decided I was going to be a geophysicist. Later, as I was approaching college age, I discovered that there were no undergraduate geophysics courses in England. The only one was an option within the Physics course at Imperial College. I therefore decided I was going to go to Imperial College, study physics and then take geophysics as an option in the third year. The Geophysics course included the study of gravity, electromagnetics, earthquake seismology and Earth models. I think we had probably three hours of seismic, but nothing about actual seismic acquisition and processing for oil and gas. The practical part of the education was more the mathematics and the physics, not the geophysics.

Good, now your work experience also?

Just before I graduated, we had the usual round of companies visiting to see prospective employees.

Oh, yes, Campus Interviews –

We had the Campus Interviews and I interviewed with Shell, Chevron and Norsk Hydro, as it was at the time, together with SSL and GSI. Most of the interviews went well but, to cut a long story short, the interview with Shell lasted thirty seconds.

What did they ask you in thirty seconds that you were able to—

I was told that because I was diabetic I would not be allowed to work for their company.

Oh, okay.

After these interviews I had offers from SSL, GSI and Norsk Hydro. Because of my wish to travel I decided the job with Norsk Hydro was the one I was going to take. But then when I went for a second interview the gentleman said, “Who else do you have an offer from?” I said, “Well, I’ve had offers from GSI and SSL as well.” He said, “Why don’t you go and take the job with GSI, go over there for a couple of years; they have an excellent training program, then come back to us? We will hire you and you’ll get a much higher salary after you’ve had some training with GSI.” I went to GSI and started there in September of ‘69 and I stayed with GSI until the end of GSI. Every time I would start to get bored with something another opportunity would come up and I would end up somewhere else.

So you were with GSI for how many years?

I was with GSI from ‘69 until its end, which I think was 1988.

And after that?

GSI was sold to Halliburton and then merged with GeoSource and we became Halliburton Geophysical Services. I stayed there until the end of HGS when it was sold to Western Geophysical, where I remained until 2000. I left Western Geophysical one and a half weeks before the formation of WesternGeco was announced.

So now you are with Sercel?

Yes, I’ve been with Sercel for six years. When I left Western I had had continuous employment for about 31 years even though the company names had changed. When I left I went to PGS Onshore and spent 5 years there before joining Sercel.

Tell us in three words about Malcolm the person.

I would say a traveler, an educator, and also innovative.

Good combination. It looks like it helped you all through your professional life?

Apart from a couple of years at the start of my career and one year I spent as manager of six crews in Alaska, I have been an Area Geophysicist. As a technical person in this position, I was given the opportunity to work on issues from around the world. When you see those challenges it gives you the opportunity to expand your understanding and knowledge. Solving problems provides the best training for a geophysicist.

So you have been a problem solver all your life and this bred knowledge, knowledge that has kept you in good stead.

I started in data processing in GSI and within the first few months I started to see things that didn’t make sense that were coming in from the field. GSI management encouraged me to visit the field crews to better understand the issues. Immediately, I started traveling to the field and talking with the operations people and interacting with the oil company clients. This has continued for the last 42 years.

Malcolm, I would like you to share with us some of your most memorable geophysical experiences. You’ve traveled such a lot, you have been in the field, you have been in the processing centers, you have worked with so many companies, I am sure you’ll have lots of good stuff to share with us.

I think one of the biggest challenges and one of the most memorable things was working on the Flexural Ice Wave. I have spent many winter seasons in the Arctic, 23 or 24 in total. I’ve worked in Greenland, in the Arctic in Russia, many years in Alaska, and to a small extent in the Canadian Arctic.

The Flexural Ice Wave is a very highly dispersive wave created when we put surface sources like vibrators on floating ice. We set up something like a plate wave within the ice. It has the opposite dispersion compared to normal ground roll: the high frequencies travel with high velocities, and the low frequencies travel with low velocities. It typically has somewhere in the order of a 60 dB noise-to-signal ratio and therefore can totally wipe out your shallow data.

Back in the late ‘70s and early ‘80s we started working with vibrators out on the Beaufort Sea ice in the Alaskan Arctic. Prior to 1976 the use of explosives was allowed offshore in the winter with the shots drilled through the ice and water with the charge below the sea floor. In the mid-1970s this was banned and we started using airguns lowered through the ice. In 1979 we tested vibrators on the ice and compared the data quality versus that acquired with airguns. The data quality for the target objective was good, therefore allowing us to use the same source both onshore and offshore. Unfortunately the shallow data was very poor. In those days a lot of the data was recorded with groups of many oil companies or “group shoots.” The problem of the poor shallow data was always a major issue with these oil company groups.

At first the actual cause was not understood but after a number of tests we determined that the cause was this flexural wave. I was working very closely with two colleagues, David Nyland and Pat Eilert. Dave is now with NOAA’s West Coast and Alaska Tsunami Warning Centre in Palmer, Alaska and is a brilliant mathematician. He modeled the wave motion and developed the theory of its relationship to the water depth, ice thickness, etc. This took probably one and a half years or two years to develop all the mathematics.

His theory was something like a ninth-order determinant and each element was a page of Bessel functions, Green’s functions and triple integrals. I couldn’t understand any of what he was telling me except for a few key words, which were – “this is related to the shear transmission through the ice.” We determined that if we were to cut the ice we should be able to destroy the shear wave transmission. We performed a test using a Ditch Witch to actually cut the ice and put the sources and receivers on opposite sides. For 2D you could offset the sources, put the receivers on one side of the cut and the sources on the other. We didn’t get the full 60 dB of attenuation; we saw approximately 25 dB. In a review of the mathematics Dave pointed out that there is also a pressure wave in the water associated with the motion of the ice, which is also coupled with something like a Scholte wave or a pseudo ground roll along the ocean floor. Because of these, with only about an 8-inch wide cut we do not completely isolate the ice motion and the pressure wave re-starts the motion on the other side of the cut.

Unfortunately the cutting rate of the largest commercially available Ditch Witch was inadequate for normal production. We created our own Ice Saw, which was enormous. It was basically a 16V92 turbo diesel engine (a V-16 with 92 cu. in. per cylinder or about 24 liters.) It was actually the engine used on one of our smaller boats and it could cut through five to six feet of ice at half a mile an hour with an 8-inch wide cut. We created two of them. The method was successful and was used for a couple of years up in the Arctic for 2D with considerable improvement in the data quality. We developed a second version for the second year, which had half the engine size with lower fuel consumption, etc.

There was a lot of teamwork on the analysis of the tests, designing and building the equipment, and then implementing it in the field. With everyone pulling together it was one of my most memorable experiences.

You know that is an experience that many people won’t know about! This is why these interviews are such a good idea and so popular!

There was actually an article on this in The Leading Edge by Dolores Proubasta back in 1985. I still receive frequent questions and requests for information about the old tests and data from people who are continuing to acquire seismic data on the ice in the Arctic and Antarctic.

Tell us about a disappointment.

For me the biggest disappointment and, I think also for many people, was the end of GSI as we knew it. There was a family spirit or family feeling that still exists today between the people who worked for GSI. It was a very sad time for all of us. When I attend conventions, the number of ex-GSI people continues to grow smaller, but there is always a common bond between the people who worked for GSI.

[John Fernando] I understand what you are saying because I felt the same way being an ex-GSIer. I had just returned from one year of training at the GSI headquarters in Dallas, Texas, and my aspirations of giving back to GSI for the tremendous training opportunity I had received were shattered.

Another disappointment is the quality and state of technical papers that we see being presented at conventions today. I find this very, very disappointing, very disturbing. When I started there was a generally accepted rule amongst all the major contractors that the company logo and name could be on your title slide, acknowledgement slide and nowhere else. There was no corporate logo on the slides, no advertising and no commercialization in the papers. Today, in 50 to 60 percent of the papers or more, we see just advertising, people trying to sell their product. For me, a technical paper should be technical. It shouldn’t be a cheap method of blatant advertising. If people are interested in the technology they know your name and company and that should be enough.

I am very, very happy you mentioned this. In my capacity here as Technical Chair, I have tried to enforce this, without luck. Most companies nowadays have a corporate policy that couples sponsorship or participation in conventions to logos on every slide, or at least that is what I hear. I am very happy you shared that with us.

A few years ago the SEG tried to separate the papers into two types, technical and commercial. They actually had an area that was set aside for commercial presentations to say to people – technical is technical, commercial is commercial. I know it was tried for one year but attendance at the commercial section was very low. I don’t know if it was tried for a second year; it doesn’t matter. But I still find the papers in general to be a great disappointment.

Malcolm, you always worked for service companies and I know you have given some reasons for this. Do you have any other reasons that you would like to give for working only for service companies, and not oil companies?

I enjoy the freedom and the challenge of seeing different aspects from survey design, through data acquisition and processing to the final interpretation volume. Although I know that there are some positions within oil companies that have that same sort of freedom in a technical position, I have always enjoyed the challenges I see in the contracting industry. I also hear different challenges from different oil companies asking, “How would you do this?” I enjoy sharing my experiences and teaching, so for me the contracting industry has given me more freedom and flexibility.

So you started off with data processing at GSI, then after a few years you went to the acquisition side, and finally to interpretation of data?

I started in data processing and then almost immediately started to move towards acquisition. After a few years I became Technical Coordinator for GSI’s Aramco Group in Croydon. Back in the early and mid-1970s Aramco (note that this was before the formation of Saudi Aramco – so basically at that time it was Standard, Texaco, Exxon and Mobil) had their interpretation group in Croydon in the GSI office. Most of the data processing was also being done there, although some processing was being done by Western Geophysical in Isleworth.

I think that at that time there were six field crews with GSI and a couple of Western crews. The Technical Coordinator position was an interface between the data acquisition, processing and the interpreters. I had to talk to the interpreters and find out what their problems were. I was able to work with the interpreters, then go down to Saudi Arabia to try a test, bring the data back and then work with the data processors to try and optimize the data quality. Finally we could take it over to the interpreters and see if it met their needs.

That’s what I thought – you had been addressing the challenges at the interpreter level, to drive acquisition and data processing improvements.

After I left the Aramco Group I moved to Dallas to work in the GSI office with many people, including Alistair Brown, on the early stages of the development of 3D. We started teaching courses on 3D and started the transition from using paper sections to interpreting on the predecessors of today’s workstations, Seiscrop Tables. From those we moved to the early workstations. Because I was interacting with the oil companies to help them design the surveys and acquire the data, I was able to follow through with the interpretation. Although I was never in a position of saying, “Okay, drill right here,” I was very closely involved following the data from start to finish.

Well, that’s very, very important because interpreters simply cannot sit down and keep interpreting if there are challenges – who is going to look into them?

One of the things I do when I am teaching courses is to try to emphasize integration – really express the need for close interaction, from design, to acquisition, to processing, to interpretation.

Malcolm, you have been a very successful professional in the seismic industry and are still engaged in teaching courses, giving seminars and presentations and all that. What would you say, briefly, is the secret of your success?

I’d really say that the shortest answer to that is, it is a love of the profession. Most of my explanations are not mathematical; I see things in pictures. I try to work with people to help them understand the way they should go forwards. Because I have quite a wide range of experience, when I teach I can provide illustrations of things that worked and those that have not. I think that this method allows people to gain a better understanding. I suppose that you can surmise that I really enjoy teaching, and for me it’s a true pleasure.

Malcolm, John and Satinder

[John Fernando] Over the years, Malcolm, I have always admired your ability to use all your experience in the teaching environment. What I want to ask you is, have you ever considered writing a book, as this would make all your valuable knowledge accessible to a much wider audience?

I have been asked on a number of occasions to write a book. The problem is really time. Maybe when I retire or slow down I will find the time, but you know things are also changing very rapidly, by the time you write a book it may already be out of date.

[John Fernando] But as you showed in your talk yesterday, a historical perspective can be of tremendous value.

That, in fact, is the way I tend to teach my courses for basic 3D survey design. Rather than just say, “This is the way we do it,” I say, “This is the way we tried it first.” Then I will show an example and ask, “What do you think you can see wrong with this?” I try to lead people through the thought processes that we went through over ten or fifteen years. I get really frustrated when I see people wasting time using survey design software in a trial-and-error mode. They try a geometry and then say, “I don’t like that, let’s try something else,” without thinking about what the changes will mean. Anyone should be able to sit down in 10 or 15 minutes with a piece of graph paper and do a basic survey design. You should be able to do 90% of the calculations in your head. Okay, you can’t produce all the nice offset, azimuth and colour fold distribution plots, but you can do most of the calculations in your head or with a calculator. It’s simple algebra and geometry; there is nothing else to it.
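As a rough illustration of the kind of graph-paper arithmetic Malcolm is describing, the sketch below uses common textbook rules of thumb for an orthogonal geometry (nominal inline fold from channels per line, receiver interval and source line interval; crossline fold from the number of live receiver lines). The formulas and geometry values are illustrative assumptions, not figures from the interview.

```python
# A minimal sketch of back-of-the-envelope 3D survey design arithmetic for an
# orthogonal geometry, using common rule-of-thumb nominal fold estimates.
# All geometry values below are hypothetical.

def nominal_fold(channels_per_line, receiver_interval_m,
                 source_line_interval_m, live_receiver_lines):
    """Return (inline fold, crossline fold, total nominal fold)."""
    inline_fold = channels_per_line * receiver_interval_m / (2.0 * source_line_interval_m)
    crossline_fold = live_receiver_lines / 2.0
    return inline_fold, crossline_fold, inline_fold * crossline_fold

# Hypothetical patch: 120 channels per line at 50 m stations,
# source lines every 300 m, 8 live receiver lines.
il, xl, total = nominal_fold(120, 50.0, 300.0, 8)
print(f"inline fold {il:.0f}, crossline fold {xl:.0f}, nominal fold {total:.0f}")
# -> inline fold 10, crossline fold 4, nominal fold 40

# Natural bins are half the station intervals (25 m x 25 m here), so the
# trace density is fold divided by the bin area expressed in km^2.
bin_x_km = bin_y_km = 0.025
print(f"trace density {total / (bin_x_km * bin_y_km):,.0f} traces/km^2")
# -> trace density 64,000 traces/km^2
```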

[John Fernando] Well, I don’t know if you would be interested, but if you would like, I would offer my help to get you started on writing your book. This is something I truly believe in – the experience a person has over the years cannot be replaced by technology and we should do our best to pass it on to the younger generation. So I hope you will think about it.

All right, well thank you John.

As you know, finding hydrocarbons is becoming increasingly difficult and requires the use of advanced technology. Some companies believe they can compete through the application of technology rather than the development of technology. They’ll pick up something from somewhere and just go ahead. What are your views on this?

In terms of technology advances there will always be an advantage to the inventor. This only lasts for maybe one or two years before alternatives are discovered. Therefore, while there is a short-term advantage to the inventing company, I think we all, the entire industry, benefit from this progress.

You have patented some of your techniques, correct? What are your views on patents?

Well, I am not a great believer in the patent system.

Even though you have six or seven patents?

No, I think I have three and I’ve got two or three more in the pipeline, but that’s typically because if you don’t patent something somebody else will try to. When I was with GSI we tried to patent the Ice Saw and we got a rejection letter that said, “It’s intuitively obvious to one skilled in the art.” My simple reaction to our patent attorney was – well, if it’s so intuitively obvious, why is it that the twenty-three oil companies with whom we have been working haven’t thought about it? So he sent a response and the patent office came back with another rejection. The patent attorney with whom I was working was in Texas Instruments’ patent department and they said, “Okay, we are protected.” Two rejections were okay; they were using patents as a defensive method so that nobody else could patent the idea and stop us using it.

That’s interesting! Give us an overview of how the technology for seismic data acquisition has changed over the last three or four decades.

You should have listened to my talk yesterday!

I know, but –

I started with 24 recording channels and the recording system had valves, not transistors. That was my first recording system. When we started 3D just a few years later, we still had a 24-channel system, which wasn’t enough to do 3D.

The first commercial 3D survey was in Lea County New Mexico in 1972. It used four master-slave 24 channel recording systems. There was a group of 17 or 20 oil companies, which had all committed to the project and the subsequent analysis, so it was sort of a group shoot, to evaluate the potential benefits of 3D and to begin to understand the processing. I still have a copy of the final report. This was my first experience with 3D.

What I find interesting is that nobody had thought about doing two-pass migrations in those days. The first migration was a one-pass Kirchhoff. It was much later, I think, that Bill Schneider published the fact that you could separate migration into two passes.

At the time it must have taken a huge amount of resources because of limitations in computer technology?

Even though the amount of data was very small in comparison with today, the processing continued at least two years after the data acquisition ended. When we submitted processing jobs back in those days it would be six weeks before you’d see the results.

People continued using master-slave operations for 3Ds, as I stated yesterday. I think the ultimate for that was actually done here in the Mackenzie Delta by Sonics Exploration in 1985 (Ron Reid and Earl Hale) when they used six 120-channel DFS V systems master-slaved for a 300 square mile survey. That was quite a large survey back in the mid ‘80s.

As time progressed, the channel counts became larger and we were able to meet the requirements of most 3Ds. Each time we increased the channel counts, companies would use the limits of what was available to record their surveys. In more recent times we have seen a dramatic divergence in the number of channels that are being used in different parts of the world. In some parts of the world, typically the Western Hemisphere, crews seem to have limited out somewhere in the range of three or four thousand to ten or twelve thousand channels. Even ten to twelve thousand is quite a large crew for the Western Hemisphere. There are a few crews that have used more. There was one in Colorado two years ago with ninety-six thousand channels, but that was for a three-component survey, so about thirty-two thousand stations live, with fifty-two receiver lines.

In the Middle East and China there have been forty-thousand-channel crews and next year there should be two crews with 120,000 live channels. So we are seeing a great divergence in system usage around the world, and system capabilities today are up to a million channels. You could record 2 ms data in real time with a million channels but nobody is currently doing it. What I think is interesting is the change from having to master-slave multiple systems to not even using the full system capability on the upper end. There are other technologies in marine where the modern solid streamer really helps to attenuate noise. Other technologies such as steerable streamers, and the capability of getting much longer offsets by using much longer streamers, are seeing increasing use.

When using so many thousands of channels, is the investment commensurate with the advantages gained?

There should always be a balance between the source effort and the receiver effort. One of the things that we see in the modern surveys in the Middle East and North Africa is the use of much lower source effort. A fairly high density of receivers, together with many more source locations but a much lower effort per source point, is increasingly being used. What is quite typical today for these very high-density surveys is essentially going to a point source and almost a point receiver, single vibrators and single sweeps. The raw data as collected appears to be very poor quality. However, with sufficient trace density and finer spatial sampling, the image reconstruction permits you to attenuate the noise and get higher frequency and resolution. That’s where we are seeing the big difference. As we continue moving towards point receivers with three-component digital sensors, we immediately get a multiplication of three in the channel count. As we move towards point source and point receiver, we must get the spatial sampling finer. By getting the spatial sampling smaller we will see future progress.

Very interesting. I think that is the answer to the next question I was going to ask you – what is required for acquiring good quality seismic data?

Yes, spatial sampling I think is critical.

What is the present scenario of R&D in our industry? Do you think enough is being done to keep our industry going, apart from advances on the acquisition side?

In general, if you look at the newer interpolation algorithms and the progression towards wavefield migrations, these are what we need to optimize our imaging with the newer data sets we acquire. As reserves decline, we are already seeing, and will continue to see, a change in the balance of seismic work – less exploration but more reservoir geophysics, time lapse, etc. Our industry will keep going; I don’t see this as being an issue.

In essence you are reaffirming that our jobs are safe for the next whatever number of years.

In the Gulf of Mexico we are seeing repeat surveys that are being acquired not for time lapse but just to improve the image. There may be two, three or four surveys over the same area. We’ve gone from the earlier very narrow azimuth surveys, to a newer survey that is still perhaps narrow azimuth but with longer offsets, and maybe recorded at a different azimuth, and then to wide azimuth.

When that many surveys have already been acquired, I believe that modeling can be done more reliably. I am not a great believer in modeling in the early stages of studying a field, because in most cases we don’t have good enough information to build a rigorous model, particularly in complex geology. Only in those areas where you have a reasonably good understanding of the reservoir to begin with and perhaps some well control as well, can you do accurate modeling. Then you can begin to understand the illumination issues and design the real survey that is going to meet your objectives.

What do you feel about the value of modeling?

To me, modeling is of limited use. You can use the model to determine the accuracy or the truth of how well the algorithms are performing. However, one of the biggest issues that we have in the real world is determining the velocity field, even today. If your data are lacking in illumination, how do you determine the velocity? I have been saying for a number of years that the biggest problem with using models is that we need to know the velocity field. For modeling to work, we must begin with a really accurate model with representative levels of noise and then start processing from the raw prestack data. We are then able to see how well we can determine the velocity field before trying our depth migration algorithms. I think that we, as an industry, are spending a little too much time in modeling. It is useful but I think we should expend more effort thinking about the sampling. The way we acquire surveys should benefit from more time and consideration.

We will leave it at that point. In your opinion, what are the challenges facing us in seismic data processing that need to be addressed, in particular what areas of processing do you think research should be focused on more?

I think the research that we are doing at the moment in terms of wave equation migration and interpolation should be extended. Obviously we will need to develop faster algorithms. PSDM works well to a certain extent, but I think we need to get to the stage where we can do it routinely and fast enough to be able to really evaluate changes in velocity models. I believe that we need to do pre-stack depth migration on all of our surveys, but particularly those with wide azimuth.

First we have to get the sampling correct and then we need to have the algorithms and our systems fast enough to interact with the data in real time. This will enable us to actually understand how changes in the velocity field affect the data.

What is your take on the types of reverse time migrations that are being talked about? There seems to be a lot of interest, with many people engaged in doing RTM.

At the last SEG I co-authored a workshop paper on Foothills acquisition and imaging with Sam Gray. When designing 3D surveys we typically look at common mid-point spacing for spatial sampling, but this is what we need for a post-stack migration. If we are doing pre-stack depth migration like RTM in the shot domain, what is the limiting factor? It is actually the receiver line spacing. I have to admit, I hadn’t really appreciated this before. In the cross-line direction the aliasing is not related to the bin size; the aliasing factor is the space between the receiver lines. In order to optimize our imaging, we need to strive to improve our spatial sampling and possibly use some form of interpolation. After my lunchtime presentation yesterday, I received comments from several people who were interested in what I had said about survey acquisition in rough terrain, because I said, “Use available access, don’t try to put everything rigidly in straight lines, use available access to try to increase your sampling.” They said, “That’s great for the new interpolation algorithms because some randomization is preferable to having everything in straight lines.” One of them said, “That’s great, may I use that information when discussing survey design with my clients?”
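To put a number on why the receiver line spacing, rather than the bin size, becomes the limiting factor, the widely used spatial anti-aliasing rule of thumb for dipping events, f_max = V / (4 Δx sin θ), can be evaluated for both spacings. The sketch below is an illustration of that general rule with hypothetical velocity, dip and spacing values; it is not a calculation from the workshop paper itself.

```python
import math

# Standard spatial anti-aliasing rule of thumb for migration:
#   f_max = V / (4 * dx * sin(dip))
# where dx is the effective spatial sampling in the direction considered.
# All values below are hypothetical.

def max_unaliased_frequency(velocity_mps, spacing_m, dip_degrees):
    return velocity_mps / (4.0 * spacing_m * math.sin(math.radians(dip_degrees)))

v, dip = 3000.0, 30.0  # m/s and degrees, hypothetical

# For a shot-domain prestack migration the cross-line sampling is set by the
# receiver line spacing, not by the CMP bin size.
for spacing, label in [(25.0, "25 m bin"), (300.0, "300 m receiver line spacing")]:
    print(f"{label}: f_max ~ {max_unaliased_frequency(v, spacing, dip):.0f} Hz")
# -> 25 m bin: f_max ~ 60 Hz
# -> 300 m receiver line spacing: f_max ~ 5 Hz
```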

I looked through some of the papers that you published and you have given some very catchy titles to some of your presentations. “The question of azimuths”, “High density 3D – How high is high enough?”, then “CMP fold: A meaningless number”. Could you elaborate a little bit on each of these please?

“The question of azimuths” was actually a workshop presentation from back in the early ‘90s. This was in the day when our 3D surveys were typically very low fold, or low trace density and so we always had this question about how do we use our existing recording channels? Do we spread them out over a wide area and have a wide azimuth survey? With this we have better azimuth sampling but poor offset sampling. Or, do we put them into a narrow azimuth, swath geometry and get better offset sampling but very poor azimuth sampling? This question had been proposed as a topic for the workshop and I was asked if I would be prepared to present my thoughts on the subject.

Much to my surprise a short time later I received an email from one of the organizers saying, “Well, you’re the only sucker that accepted the challenge, so we are going to abandon the whole idea of having a complete session,” because none of the other invited speakers wanted to stand up and talk about such a controversial subject. I told him that I was still prepared to do a discussion paper and so I entitled it “The Question of Azimuths.” What I tried to do was go through the issues of anisotropy, structure, cost of acquisition, spatial sampling, offsets vs. azimuth, etc., and explain the pros and the cons. I wanted to rationalize the decision-making process issue by issue.

That must have been really interesting.

I received a number of requests for additional presentations so it was presented quite a few times at local society and “special interest group” meetings.

“High density 3D – How high is high enough?” When I first started talking about high trace density in the late 1990s and early 2000s we were discussing and showing the benefits with a number of data examples from different places. The question people kept asking me was, “How high is high enough?” I didn’t really invent the title, it was actually answering a question that I had been asked.

The “CMP fold: A meaningless number” paper is again related to trace density. My point here is that a fold value by itself is a meaningless number. For example, if someone says “I’ve got a 96-fold survey,” people might respond with, “Oh, that’s pretty good.” But if your bin size is 100 m x 100 m you only have 9600 traces per square kilometre.

It defeats the purpose.

With this small number of traces you don’t have adequate wavefield sampling. The intent of the paper was to indicate that it’s the density of traces, and how well we’ve sampled the wavefield, that’s important. You know, I said this in my talk yesterday, I tell people I’d be quite happy to have a single-fold data set and they might say, “You are crazy.” But it’s not crazy if my bin size is 1 m x 1 m, or 50 cm x 50 cm; I’ve got one million or four million traces per square kilometre. With this I should have good wavefield sampling. I thought this title would grab people’s attention.
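The arithmetic behind those figures is simply trace density equals fold divided by bin area. A minimal check of the numbers quoted above (this calculation is an illustration added here, not part of the paper itself):

```python
# Trace density (traces per km^2) = fold / bin area (km^2).

def trace_density_per_km2(fold, bin_x_m, bin_y_m):
    return fold / ((bin_x_m / 1000.0) * (bin_y_m / 1000.0))

print(trace_density_per_km2(96, 100, 100))   # 96 fold, 100 m x 100 m bins ->     9,600
print(trace_density_per_km2(1, 1, 1))        # single fold, 1 m x 1 m bins -> 1,000,000
print(trace_density_per_km2(1, 0.5, 0.5))    # single fold, 50 cm bins     -> 4,000,000
```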

Yes, it is a clever way to make people look. Any ideas for future catchy titles?

The title of my paper yesterday was “Guns, Vibes and Channels” and again it was really just trying to catch people’s attention. Sometimes I may dismiss reading an abstract because the title is very long and it explains everything. What I try to do is to keep the title a little bit shorter and encourage people to at least read the introduction or summary and then decide if they want to read the rest of the paper. At the moment I don’t have any more “catchy titles” in mind.

All right, let’s ask you this. When you look back on your life and career, which spans more than 42 years of professional life, would you have done anything differently than what you did? If you were to start all over again?

I can’t really think of anything. I mean there are a few very minor things that I might change, but overall I think it would be much the same. I think I would probably have started to cut back on the travel a little earlier and spend more time with my family, but apart from that I have really enjoyed what I have done and continue to enjoy it.

So the feeling you have is that of a satisfied man?

Yes.

Very nice! Not many people have that – they have regrets: I should have done that differently, and I didn’t make that decision at the right time, things like that. But it is good to hear this kind of attitude based on your experience. So thank you.

What are your aspirations for the future now?

You know I am getting close to retirement age, but as I think that I can still contribute to the industry I hope to continue for quite a few more years. I would like to slow down a little bit, spend more time with the family and do some more fishing, which I really enjoy. But as long as I am still enjoying my work and contributing to the industry, I would like to continue working, teaching and helping wherever I can.

Very nice, well said. Apart from the professional work that you have done and have been doing, what are your other interests?

I enjoy being in the outdoors, gardening and I love hot peppers. Usually in the summer down in Texas we grow many hot peppers. Having fresh, homegrown fruits and vegetables is nice as well. I also enjoy traveling, but these days at a more relaxed pace.

One last question, the second last actually, what would be your message for the young geoscientists who are entering our industry?

I think my advice would be to keep their eyes open. Keep a wide perspective. Think about what you can do, don’t accept “this is the way it’s always been done” and just do the same thing. Look for better ways, try to be innovative, those would be my main comments.

All right, the last question I have for you Malcolm is after going through all these questions, did you expect me to ask you something that I didn’t ask you?

I can’t think of anything offhand.

All right. Malcolm, I thank you for giving us this opportunity to sit down and chat with you and get your perspectives on these different topics. It has been a pleasure.

Thank you, I enjoyed it.

End
