The VSP Corridor Stack - An Imperative Constraint in the Age of Machine Learning

I am currently (virtually) attending the 2020 CSEG GeoConvention and have sat in on many excellent talks. They have left me more convinced than ever of the need to raise awareness of the benefits of VSPs, and in particular of the corridor stack as a yardstick for processing parameter optimization and the QC of processing flows.

As an aside, and not for the first time, I would like to bang the VSP drum once more. Please, no more checkshots unless operational constraints truly demand them. This is not a concern for DAS surveys, but even for traditional geophone arrays the incremental rig time and cost of acquiring a VSP over a checkshot is minimal. Just don't do it. Acquire VSP data to as shallow a depth as you can; even if the benefits are not immediately obvious, someone in your organization will someday thank you.

Now back to the original theme of this week’s blog.


Today, when Machine Learning (ML) is being used for both seismic processing and interpretation, it is timely to reconsider the VSP corridor stack as an imperative constraint on ML algorithms: a constraint that provides both time and phase control points at a well and can be used to test and validate the output of those algorithms.

The Vertical Seismic Profile (VSP) has long been recognized as an invaluable tool for geophysicists working with surface seismic data, whether onshore or offshore. A VSP is a seismic survey that employs receivers in a wellbore and a seismic source at the surface. It is used to aid subsurface characterization, typically for hydrocarbon recovery, but it also has mining and engineering geophysics applications.

A zero-offset VSP provides a robust calibration point for both the phase and the time-depth relationship of the seismic response along a wellbore. The derived corridor stack represents the zero-phase, multiple-free seismic response with a hard time-depth tie at the well.
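To make the time-depth side of this concrete, here is a minimal sketch of turning zero-offset VSP first-break picks into a two-way time-depth relationship and interval velocities. The depths, pick times and array names are purely illustrative, and datum/static corrections are ignored.

```python
# Minimal sketch: time-depth relationship from zero-offset VSP first-break picks.
# Depths and pick times below are made up for illustration only.
import numpy as np

depth_m = np.array([500., 750., 1000., 1250., 1500.])      # receiver depths (m)
fb_time_s = np.array([0.280, 0.400, 0.510, 0.615, 0.715])  # one-way first-break times (s)

twt_s = 2.0 * fb_time_s                          # two-way time at each receiver
v_int = np.diff(depth_m) / np.diff(fb_time_s)    # interval velocities between receivers
v_avg = depth_m / fb_time_s                      # average velocity down to each receiver

for z, t, va in zip(depth_m, twt_s, v_avg):
    print(f"z = {z:6.0f} m   TWT = {t:5.3f} s   Vavg = {va:6.0f} m/s")
```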

One of the most significant features of the VSP geometry is that wave modes travelling upward and downward past the receivers can be distinguished by their moveout, which allows the recorded data to be separated into downgoing and upgoing wavefields at each receiver level. At each receiver depth we therefore have a direct measurement of how the wavelet has changed through spreading, transmission and attenuation as it has passed through the earth's minimum-phase filter. This is an invaluable feature of the VSP and is what allows us to extract the corridor stack. It stands in stark contrast to surface seismic acquisition, which records only upward-travelling energy at the surface.
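As an illustration of the moveout-based separation, here is a minimal sketch assuming the zero-offset VSP sits in a NumPy array vsp[receiver, sample] with first-break picks fb_time_s and sample interval dt (all illustrative names). It uses integer-sample flattening and a median filter across receivers; production codes use sub-sample statics and more sophisticated filters (f-k, parametric, etc.), but the principle is the same.

```python
# Sketch of up/down wavefield separation by moveout for a zero-offset VSP.
import numpy as np
from scipy.ndimage import median_filter

def separate_wavefields(vsp, fb_time_s, dt, med_len=11):
    """vsp: array [n_receivers, n_samples]; fb_time_s: first-break picks (s)."""
    shifts = np.round(fb_time_s / dt).astype(int)

    # 1. Flatten on the first breaks so the downgoing energy is horizontal
    #    (integer-sample shifts only; np.roll wrap-around is ignored here).
    flat = np.array([np.roll(tr, -s) for tr, s in zip(vsp, shifts)])

    # 2. A median filter along the receiver axis passes the flat (downgoing)
    #    energy and rejects the dipping upgoing events.
    down_flat = median_filter(flat, size=(med_len, 1), mode='nearest')

    # 3. Unflatten the downgoing estimate and subtract it from the total
    #    wavefield to leave the upgoing wavefield.
    down = np.array([np.roll(tr, s) for tr, s in zip(down_flat, shifts)])
    up = vsp - down
    return down, up
```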

 

Vertical-component zero-offset VSP showing the upgoing and downgoing wavefields.


Downgoing wavefield derived from full wavefield. Note multiple contamination and phase change with depth.


Upgoing wavefield obtained by subtracting the downgoing wavefield from the total wavefield. It still carries the multiples and phase changes present in the downgoing wavefield.

 

The downgoing wavefield on the left is deconvolved with a deterministic operator to remove multiples and collapse the trace to a band-limited, zero-phase wavelet (center panel). The same operator is applied to the upgoing wavefield (right).
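The two steps described in the captions above can be sketched as follows, under the same assumed data layout as before. A stabilised spectral division stands in here for the deterministic deconvolution operator, and the deconvolved upgoing wavefield is then shifted to two-way time and stacked over a corridor just below the first break. Real workflows add wavelet shaping, careful muting and QC at every level; this is only an outline of the idea.

```python
# Sketch of deterministic VSP deconvolution and a corridor stack.
import numpy as np

def decon_upgoing(up, down, eps=1e-3):
    """Deconvolve each upgoing trace with the downgoing trace at the same
    receiver by stabilised spectral division (arrays [receiver, sample])."""
    U = np.fft.rfft(up, axis=1)
    D = np.fft.rfft(down, axis=1)
    stab = eps * np.max(np.abs(D) ** 2, axis=1, keepdims=True)
    return np.fft.irfft(U * np.conj(D) / (np.abs(D) ** 2 + stab),
                        n=up.shape[1], axis=1)

def corridor_stack(up_decon, fb_time_s, dt, corridor_s=0.3):
    """Shift the deconvolved upgoing wavefield to two-way time, keep only a
    corridor just below the first break, and stack."""
    n_rec, n_samp = up_decon.shape
    fb_samp = np.round(fb_time_s / dt).astype(int)
    twt_samp = 2 * fb_samp                      # two-way time of each receiver
    corr_samp = int(corridor_s / dt)

    stacked = np.zeros(n_samp)
    fold = np.zeros(n_samp)
    for tr, fb, twt in zip(up_decon, fb_samp, twt_samp):
        shifted = np.roll(tr, fb)               # one-way to two-way time
        win = np.zeros(n_samp)
        win[twt: twt + corr_samp] = 1.0         # corridor just below the first break
        stacked += shifted * win
        fold += win
    return stacked / np.maximum(fold, 1)        # fold-normalised corridor stack
```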

 

(Image source: A Gentle Introduction to Machine Learning, Mark Dahl, P.Geo. CSEG Recorder Jan 2018, Vol 43 No. 1)

Machine Learning can be applied to a broad swath of geophysical problems, as summarized by Dahl (2018).

All of the machine-learning applications to these seismic challenges involve making predictions based on some set of assumptions, and the process is typically benchmarked against a held-back subset of the data to assess the result. One of the main problems with this approach is that the test data will have inherited many of the shortcomings of surface seismic: it may be of mixed phase, contaminated with multiples, or affected by datum problems that introduce time shifts, or by migration artefacts. Any of these issues will produce a machine-'learnt' result with, at best, the same shortcomings.

It seems to me that a better approach is to systematically use borehole seismic data (corridor stacks, multiple information, Q values, measured anisotropy) to calibrate the processing at each step, and to ensure that any subset used for benchmarking ML tests is tied in time and phase to the corridor stack at the well location. In other words, treat your in-situ measured, zero-offset, zero-phase, multiple-free seismic trace as the 'holy grail' at the well location. Any other manifestation of the seismic through that point in the earth should be required to match it, or at least to match it better than the data from which it was derived.

Several years ago I was involved in a surface seismic processing project offshore Eastern Canada (Kaderali et al., 2007). Although there was no mention of 'Machine Learning' at the time, in hindsight much of what was incorporated in that project follows the philosophy advocated here. Our client was a strong promoter of the value of VSP data as a constraint on the processing: to optimize parameterization, to guide velocity picking, to derive borehole-calibrated anisotropy values, and to directly measure the attenuation factor, Q. At each step of the (PSTM) processing, a seismic well trace (derived from a super gather) was produced for each trial parameterization and compared with the corridor stack over the two-way-time window of interest, and the resulting correlation coefficients were compared. The parameter that gave the best correlation was selected.
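For illustration, here is a minimal sketch of that kind of correlation-based selection; the function and variable names are mine, not taken from the Kaderali et al. workflow. Each candidate well trace is correlated with the corridor stack over a chosen two-way-time window and the best-scoring parameter is returned.

```python
# Sketch of selecting a processing parameter by correlation with the corridor stack.
import numpy as np

def window_corr(trace, corridor, dt, t_min, t_max):
    """Normalised zero-lag correlation coefficient over a two-way-time window."""
    i0, i1 = int(t_min / dt), int(t_max / dt)
    a, b = trace[i0:i1], corridor[i0:i1]
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def pick_best_parameter(candidates, corridor, dt, t_min, t_max):
    """candidates: dict mapping a trial parameter value to its well trace."""
    scores = {p: window_corr(tr, corridor, dt, t_min, t_max)
              for p, tr in candidates.items()}
    best = max(scores, key=scores.get)
    return best, scores
```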

Thus, the final seismic volumes were calibrated at the well locations and “demonstrate that it is essential to honor and incorporate well information in the processing of surface seismic data. If the seismic data are to be trusted away from the well locations, at the very least they must agree at the well locations.”

So, compress, constrain, extrapolate, interpolate or skeletonize your seismic data as much as you like, knock yourselves out, fill your boots, but don’t forget to calibrate, correlate and quantify misfits as well, and of course the best way to do this is with the corridor stack.

References:

Dahl, M., 2018. A Gentle Introduction to Machine Learning. CSEG Recorder, Vol. 43, No. 1.

Kaderali, A., et al., 2007. White Rose seismic with well data constraints: A case history. The Leading Edge, 26(6), 742-754.
