The calibration or AVO Walkaway technique, in which the downhole geophone array is placed just above the reservoir, is gaining acceptance as a tool for evaluating the full elastic response of the reservoir local to well locations. It may also be used to evaluate the velocity anisotropy of the layers in which the receivers reside. Provided appropriate well data are acquired, these data can be tightly integrated with current and future local surface seismic surveys.
Geco-Prakla are developing Survey Evaluation and Design (SED) and Quantitative Quality Assurance (QQA) procedures as part of their Total Quality surface seismic package. The integration of borehole and surface seismic data plays a vital role in this process.
The use of borehole measurements for refining and calibrating the log-based AVO model is documented in a paper by Armstrong, Chmela and Leaney (First Break, Aug. 1995). Geco-Prakla are now concentrating their efforts on using the borehole data as part of the SED and QQA processes.
Typically borehole seismic, in the form of a synthetic seismogram or the corridor stack, is used to extract the embedded wavelet from migrated surface seismic data at the well location. If the matching is done in a frequency-dependent way, establishing a percentage confidence from the ratio of correlatable to uncorrelatable power within different frequency bands, then the results are in fact a measure of the agreement between the underlying reflectivities described by the well data and by the surface seismic respectively. Thus, adaptations of the matching process can be used to test for surface seismic "quality". This is one of the fundamental calibration and evaluation procedures within the SED and QQA system. "Quality" parameters are quantified during the SED phase and monitored during the acquisition and processing phases.
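The frequency-dependent matching described above can be sketched in code. This is a minimal illustration only, assuming a simple per-band spectral coherence as the "percentage confidence" measure; the function name, band definitions and exact metric are assumptions, not the actual Geco-Prakla implementation:

```python
import numpy as np

def band_match(trace_a, trace_b, dt, bands):
    """Per-band agreement between two traces (e.g. synthetic vs. seismic).

    For each frequency band the magnitude-squared coherence of the two
    spectra is taken as a proxy for the fraction of 'correlatable' power,
    expressed as a percentage. Illustrative sketch only.
    """
    n = len(trace_a)
    freqs = np.fft.rfftfreq(n, dt)          # frequency axis in Hz
    A = np.fft.rfft(trace_a)
    B = np.fft.rfft(trace_b)
    scores = {}
    for f_lo, f_hi in bands:
        sel = (freqs >= f_lo) & (freqs < f_hi)
        cross = np.abs(np.sum(A[sel] * np.conj(B[sel]))) ** 2
        power = np.sum(np.abs(A[sel]) ** 2) * np.sum(np.abs(B[sel]) ** 2)
        scores[(f_lo, f_hi)] = 100.0 * cross / power if power > 0 else 0.0
    return scores
```

Identical traces score 100% in every band; uncorrelated traces score lower, and the decay of the score with frequency indicates where the two reflectivity estimates diverge.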
The main problem with such techniques is that, in the presence of even small amounts of dip, the matching procedures cannot be used on pre-migration data, so quality cannot be monitored until the processing has finished and the data have been migrated. However, VSP data are seismic data and contain the effects of dip. A method of extracting information from the VSP to provide data to match with pre-migration surface seismic data for QQA purposes has been devised and will be illustrated in the presentation.
In particular, pre- and post-migration VSP images are compared with the corresponding surface seismic images, showing how effective matching can be achieved pre-migration in areas of considerable and varying dip. A comparison of the pre- and post-migration matches with their respective models will show, for example, that the post-stack and migration processing has in fact marginally lowered the apparent bandwidth. These data examples come from a previously processed surface seismic data set. In the future, well-based QQA will test different processing procedures, migration velocities and algorithms in an attempt to maintain or improve the reflectivity match through the processing sequence.
Such intermediate testing and its potential for improving the "quality" of the data will be illustrated. For example, two different types of predictive deconvolution are evaluated. Although neither deconvolution increases the high-frequency content of the reflectivity match (both are gapped deconvolutions designed not to whiten the data above a certain frequency), the match across the complete bandwidth is markedly better using one of the methods, indicating better multiple suppression and/or better preservation of the primary reflectivity.
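A gapped predictive deconvolution of the kind compared above can be sketched as a single-channel Wiener prediction-error filter. The parameter names and this simple formulation are illustrative assumptions, not the specific algorithms evaluated in the presentation:

```python
import numpy as np

def gapped_predictive_decon(trace, gap, flen, eps=0.001):
    """Gapped predictive deconvolution sketch (Wiener prediction filter).

    Solves the normal equations R f = g, where R is the Toeplitz
    autocorrelation matrix and g holds autocorrelation lags
    gap..gap+flen-1, then subtracts the predicted (periodic multiple)
    energy from the input trace. Illustrative only.
    """
    n = len(trace)
    ac = np.correlate(trace, trace, mode="full")[n - 1:]   # lags 0..n-1
    R = np.array([[ac[abs(i - j)] for j in range(flen)] for i in range(flen)])
    R += eps * ac[0] * np.eye(flen)                        # prewhitening
    g = ac[gap:gap + flen]
    f = np.linalg.solve(R, g)
    pred = np.zeros(n)
    for k, fk in enumerate(f):                             # predicted trace
        lag = gap + k
        pred[lag:] += fk * trace[:n - lag]
    return trace - pred                                    # prediction error
```

The gap (`gap` samples) protects the short-lag primary reflectivity from whitening, while the filter removes energy that is predictable at longer lags, such as periodic multiples.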
In Geco-Prakla we have now applied the SED/QQA integrated approach to one 3D project in the North Sea as a test of the method and to learn more about its value. The integration of available borehole data into the sequence at all stages was key to the methodology. Comparison with the borehole control will illustrate how much improvement was gained from the old 3D survey to the new data set. Much of the improvement was achieved by using different acquisition parameters determined through the SED phase and by monitoring the acceptable variation of some of those parameters during the acquisition. In this respect, borehole data played a crucial role, particularly in determining the achievable bandwidth at the target, from which key acquisition parameters were derived.
During the processing of the newly acquired data, borehole seismic data, particularly in the form of VSPs, were used to monitor the effect of each process and parameter applied. As well as "testing" the accuracy of the reflectivity, deterministic estimates of the phase and bandwidth of the embedded wavelet derived from borehole data were used to calibrate other statistical measures that describe the "quality" of the embedded wavelet at the target level. The success of these approaches will be illustrated by comparing the results of a conventional processing decision-making route with a route that used quantitative well-based measures in the decision making.
The frequency-dependent match between the VSP data and the synthetic is always computed as part of our SED evaluation. Matching between the log-derived reflectivity, represented in the synthetic, and the measured reflectivity, represented in the VSP, leads to the concept of editing the log data in sympathy with the acoustic impedance obtained from the VSP, measured at seismic scale.
A very simplistic application of such a methodology is illustrated in the talk. The transposed, deconvolved VSP data is inverted to acoustic impedance using a sparse-spike scheme. The acoustic impedance log derived from the calibrated sonic and density logs is compared and then edited against the inverted VSP control. The synthetic seismogram is then recomputed and compared with the VSP seismic response. The improvement in match achieved by the editing process will be illustrated. The method is capable of considerable refinement, not least by the introduction of much more petrophysical data to aid the editing process. It may also offer an effective aid to calibrating the sonic log data against the VSP arrival times.
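The link between a reflection-coefficient series (such as that recovered by a sparse-spike scheme) and acoustic impedance is the standard layer recursion Z(i+1) = Z(i)·(1 + r(i))/(1 − r(i)). A minimal sketch of the round trip involved when comparing log-derived and VSP-derived impedance follows; the function names are assumptions:

```python
import numpy as np

def reflectivity_from_impedance(z):
    """Normal-incidence reflection coefficients from an impedance series:
    r_i = (Z_{i+1} - Z_i) / (Z_{i+1} + Z_i)."""
    z = np.asarray(z, dtype=float)
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def impedance_from_reflectivity(r, z0):
    """Recursive acoustic-impedance reconstruction from reflectivity,
    given a starting impedance z0: Z_{i+1} = Z_i * (1 + r_i) / (1 - r_i)."""
    z = np.empty(len(r) + 1)
    z[0] = z0
    for i, ri in enumerate(r):
        z[i + 1] = z[i] * (1.0 + ri) / (1.0 - ri)
    return z
```

In practice the recursion needs an absolute starting impedance (here `z0`, taken from the logs), which is one reason the inverted VSP impedance and the log-derived impedance curve are compared rather than equated.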
The value of borehole seismic data in improving the understanding of misties between log and surface seismic data should not be underestimated. When initially undertaking a visual or statistical comparison of log data with surface seismic data, the initial match is often poor. In such cases, it is unclear whether the mistie stems from errors in the log, the acquisition/processing of the surface seismic data or actual changes in lithology close to the borehole.
A zero-offset VSP provides a useful starting point when seeking to understand the origin of misties. A case will be illustrated in which there is a good match between the borehole-transposed VSP trace and the synthetic seismogram, although there are clearly some changes close to the borehole, as evidenced by variation across the transposed VSP. This variation may be due to the increasing Fresnel zone size as the borehole receiver moves further from a given reflector, but it does not account for the mismatch with the surface seismic, which still bears little resemblance to the VSP.
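The Fresnel zone growth mentioned above can be quantified with the standard small-wavelength approximation r ≈ sqrt(λd/2) for the first Fresnel zone radius at distance d from the reflector. A minimal sketch, assuming a constant velocity and a single dominant frequency:

```python
import math

def fresnel_radius(v, f_dom, distance):
    """First Fresnel zone radius r = sqrt(lambda * d / 2), where
    lambda = v / f_dom, valid when lambda << d. Illustrative sketch."""
    lam = v / f_dom            # dominant wavelength
    return math.sqrt(lam * distance / 2.0)
```

For example, at 2000 m/s and a 25 Hz dominant frequency, the radius at 1000 m from a reflector is 200 m, so a receiver array spanning even a few hundred metres of depth averages over noticeably different reflector patches.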
The Calibration Walkaway provides the "missing link" between the zero-offset VSP and the surface seismic response. It essentially provides an accurate estimate of the seismic response local to the well, including offset-dependent effects. By NMO-correcting the upgoing P-wavefield from the walkaway, we can then stack the walkaway in a manner equivalent to the surface seismic stack. In the case to be illustrated, it is worth noting the remarkable AVO differences between the various reflectors on the NMO-corrected walkaway. Such differences could not be observed directly on surface seismic gathers because of a severe multiple problem and high noise level. The final step is to see whether the difference between the near-offset walkaway trace and the seismic-equivalent walkaway stack explains the mismatch between the log-derived synthetic and the surface seismic. As will be observed, the walkaway stacks bridge the gap very successfully, leading to a much improved understanding of the relationship between log and surface seismic data. This knowledge can be used both to guide log editing (as described above) and, more importantly in this case, for controlled reprocessing of the surface seismic data. For example, there is no doubt that some events, which have significant amplitudes only at very short offsets, have been severely attenuated by the demultiple processing on the original surface seismic data.
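The hyperbolic NMO correction applied to the upgoing walkaway wavefield can be sketched for a single constant-velocity trace. A real implementation uses a time-variant velocity function and anti-stretch muting; the names and single-velocity simplification here are assumptions:

```python
import numpy as np

def nmo_correct(trace, dt, offset, v):
    """Hyperbolic NMO correction of one trace to zero-offset time.

    Each output sample at zero-offset two-way time t0 is read from the
    input at t(x) = sqrt(t0^2 + (offset/v)^2), using linear
    interpolation. Constant-velocity sketch only.
    """
    n = len(trace)
    t0 = np.arange(n) * dt
    tx = np.sqrt(t0 ** 2 + (offset / v) ** 2)   # moveout hyperbola
    idx = tx / dt
    lo = np.floor(idx).astype(int)
    w = idx - lo
    out = np.zeros(n)
    ok = lo < n - 1                             # discard out-of-range times
    out[ok] = (1 - w[ok]) * trace[lo[ok]] + w[ok] * trace[lo[ok] + 1]
    return out
```

Applying this trace by trace flattens primary reflections across offset, after which the walkaway traces can be summed into a stack directly comparable with the surface seismic stack at the well.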
In conclusion, the development work being undertaken on the integration of surface seismic, borehole seismic and other data is properly directed towards improving the information content of surface seismic data, particularly in ways that also make the process more efficient and less of an art, with reduced cost and turnaround.
The work of Philip Armstrong, Project Geophysicist in the Geosupport Group of Geco-Prakla in Gatwick UK, in supporting the preparation of this presentation is gratefully acknowledged.
About the Author(s)
Dick Ireson, who has 31 years of industry experience, is currently geoscience manager for Geco-Prakla within the corporate geosupport group in Gatwick, England. His principal area of responsibility is to develop the technical integration of borehole and surface seismic methods and products across the Schlumberger companies and to recommend the optimum way to develop this business within the company. He was originally with Seismograph Service (England) Ltd. and helped pioneer its development of borehole seismic technology. In 1990 he created a small team to develop and market 3D survey evaluation and design services and to develop methodologies integrating borehole and surface seismic measurements for improved reservoir description. This group became part of Geco-Prakla in 1992. Dick received a BS degree in mathematics and physics from the University of Sheffield, England, and is currently Associate Editor, responsible for borehole geophysics, for Geophysical Prospecting.