Topic: Offset regularisation, supergather, Radon and SRME
15 October 2019 at 12:33am
I'm trying to demultiple multichannel data from a 480-channel, 6 km streamer in water depths between 150 and 3000 m. The data were shot at a 37.5 m interval. The nominal CMP fold is 80, but it can reach 87 in some CMPs (geometry defined using a UKOOA P1/90 file). The group spacing is 12.5 m.
I would like to perform Radon demultiple on CMP gathers in order to obtain a better result and avoid aliasing, so I'm combining 6 CMPs into 1 supergather. My question concerns the need to perform offset regularization before supergathering, since the trace offsets are real (not constant). Am I right?
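To make the question concrete, here is a hypothetical sketch (plain Python, not Claritas; the function and trace layout are illustrative) of forming one supergather from 6 adjacent CMPs by merging the traces and sorting on each trace's true, irregular offset:

```python
# Hypothetical sketch: build one supergather from 6 adjacent CMP
# gathers by merging traces and sorting on the true (irregular)
# offset. A trace here is just (offset_m, label); not Claritas API.
def make_supergather(cmp_gathers):
    traces = [tr for gather in cmp_gathers for tr in gather]
    traces.sort(key=lambda tr: tr[0])   # sort by true offset (m)
    return traces

# toy geometry: 6 CMPs, each with slightly different true offsets
cmps = [[(91.0 + 12.5 * ch + 2.1 * c, (c, ch)) for ch in range(4)]
        for c in range(6)]
sg = make_supergather(cmps)
offsets = [tr[0] for tr in sg]
print(len(sg), offsets[0])   # 24 traces, smallest true offset first
```

The point of the sort is that, without offset regularisation, the combined offsets are close together but not on a regular grid.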
And on the other hand, if I wanted to test SRME, I'd first do offset regularization and shot interpolation (x3), so that the shot interval and the group interval are both 12.5 m, right?
15 October 2019 at 1:30pm
The PRT_DEMULT module can be run with the MODE parameter set to Land or Marine, so it doesn't require offset regularisation - but it will run faster if the data do have regular offsets (MODE=Marine). For land processing the offset distribution is expected to differ from gather to gather, so the parabolic Radon transform (PRT) operator must be recalculated for each gather; this is by far the slowest part of PRT filtering. For MODE=Land the offsets are taken from the trace headers. For marine processing, where the offset list is the same for every gather, it is much more efficient to calculate the operator for the first gather and store it for application to all subsequent gathers. This is much faster, but can use a lot of memory. For MODE=Marine the user specifies the first and last offsets, the offset increment, and the offset bin size in the OFFSETS parameter of the PRT_DEMULT module.
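A toy numpy sketch of why the marine case can cache the operator (this is a generic frequency-domain parabolic Radon operator, with an assumed normalisation; it is not the PRT_DEMULT implementation): the operator depends only on the offset list and the curvature (q) axis, so with a fixed marine offset list it can be built once per frequency and reused for every gather.

```python
import numpy as np

# Generic frequency-domain parabolic Radon operator (illustrative,
# not PRT_DEMULT): L[i, j] = exp(-2*pi*i * f * q_j * x_i^2), with
# offsets x normalised to the far offset and q in seconds.
def prt_operator(offsets_m, qs, freq_hz):
    x2 = (offsets_m / offsets_m.max()) ** 2
    return np.exp(-2j * np.pi * freq_hz * np.outer(x2, qs))

# the regular marine geometry from this thread: 480 channels,
# 12.5 m groups, 91 m near offset -> 6078.5 m far offset
offsets = np.arange(91.0, 91.0 + 480 * 12.5, 12.5)
qs = np.linspace(-0.1, 0.5, 64)          # curvature axis, s
L = prt_operator(offsets, qs, freq_hz=20.0)
print(L.shape)   # one such matrix per frequency, reused per gather
```

For MODE=Land the offsets change per gather, so `L` would have to be rebuilt every time - which is the slow path described above.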
The SRME method requires the input data to have the same shot and receiver spacing (shot interval equal to group interval). So, as you point out, you should apply offset regularisation (OFFREG) and shot interpolation (SHOTINT) to the input data prior to SRME. The data should also have had essentially no processing applied beforehand (although some basic filtering and swell-noise removal should not distort the model significantly), need to be extrapolated to zero offset, and should ideally be as free from spatial aliasing as possible.
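A hedged sketch of why the common grid matters (a textbook view of surface-multiple prediction, not the Claritas SRME code): per temporal frequency the multiple model is a matrix product of the data with itself, which is only defined when the receiver grid of one copy coincides with the shot grid of the other - exactly what OFFREG plus SHOTINT provide.

```python
import numpy as np

# Per-frequency surface-multiple prediction, M(f) = D(f) @ D(f):
# rows index shots, columns index receivers. The inner dimension of
# the product pairs receivers of one copy with shots of the other,
# so both must sit on one common spatial grid.
n = 8                               # toy common shot/receiver grid
rng = np.random.default_rng(0)
D = rng.standard_normal((n, n))     # data at one frequency (real here, for simplicity)
M = D @ D                           # predicted surface multiples
print(M.shape)                      # square: same grid in and out
```

With a 37.5 m shot interval and 12.5 m groups, the x3 shot interpolation you propose brings both to 12.5 m and makes this product well defined.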
15 October 2019 at 11:16pm Last edited: 15 October 2019 11:23pm
Thank you for the explanation.
I have one more question. I'd like to test flattening on offset planes for the shallow part of my profile, as in the example in your "Demultiple routes" presentation, but I'm not sure about the steps. Would the following job be correct if I apply this method pre-stack (on the shots)?
Beforehand, perform offset regularization on the data and write the shots in HDF5 format:
1. SEISREAD - Read the shots.hdf5 file using PKEY=offset and SKEY=shotid in order to sort the data into offset planes
2. FLATTEN - Parameters DATUM=1000, QORS=S, DIGFILE1=multiple_time.dig, HORIZON1=mult. Or, since the seafloor time is already stored in my data in the DELAY header, maybe I can just use HEADER1=DELAY, SCALAR1=2 in the FLATTEN module to define the time of the multiple
3. FK_FILT - Parameters: OFF1=91, OFF2=6078.5 m, DX=12.5 m, APPLY=yes, PDIP=5, NDIP=54, TAPER=0.35
4. FLATTEN - Same parameters but INVERSE=Yes
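The flatten / filter / unflatten idea in the steps above can be sketched in plain numpy (this is illustrative, not the Claritas modules; a simple trace-mean subtraction stands in for the FK dip reject):

```python
import numpy as np

# Flatten a dipping "multiple" to a fixed datum, remove what is flat,
# then undo the shift. Stand-in for FLATTEN / FK_FILT / FLATTEN(INVERSE).
def shift_traces(data, shifts):
    out = np.zeros_like(data)
    for i, s in enumerate(shifts):
        out[i] = np.roll(data[i], s)     # s > 0 shifts the trace down
    return out

nt, ntr, datum = 256, 16, 100
data = np.zeros((ntr, nt))
event = 40 + 2 * np.arange(ntr)          # dipping multiple, in samples
data[np.arange(ntr), event] = 1.0

flat = shift_traces(data, datum - event)      # multiple now flat at datum
resid = flat - flat.mean(axis=0)              # flat energy removed
demult = shift_traces(resid, event - datum)   # inverse flatten
print(np.abs(demult).max())                   # multiple is gone
```

In the real flow the middle step would be an FK (or dip) filter that rejects near-zero dips within a time window, rather than a mean subtraction.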
My main question is how to define the filter to be applied from 950 to 1050 ms, as you explain in your example, since the FK_FILT module has no related parameter.
As I explained before, the acquisition parameters are: 480 channels, nominal CMP fold = 80 (but up to 87 in some CMPs; geometry defined using a UKOOA P1/90 file). The group spacing is 12.5 m, so the CDP interval is 6.25 m. Near offset = 91 m.
16 October 2019 at 11:08am
In the FLATTEN module the user can specify either a digfile and horizon (using the DIGFILE1 and HORIZON1 parameters) OR a trace header (using HEADER1, a header containing the times of the event you want to flatten, and optionally SCALAR1, a scalar by which the HEADER1 values are multiplied). If the sea-floor time has been stored in the headers (for example in DELAY, in ms), a scalar of 2 would flatten the first water-bottom multiple, and so forth. A second horizon can also be used as a reference (along with a second scalar) to allow multiples of deeper events and peg-leg multiples to be flattened.
In the QFK module there is a TWIN_TIMES parameter if you want to time-window your data (i.e. apply FK filtering only between two specified times, with tapering at the boundaries). The TWIN_TIMES parameter takes four values in ms: the start time, end time, on-taper length and off-taper length. The output traces will contain fully FK-filtered data between the start and end times, and completely unfiltered data from 0 ms to (start time minus on-taper) ms and from (end time plus off-taper) ms to the end of the trace. Linear tapers are used.
Set the DIP parameter to a small value, for example +/-0.5 ms/trace, so that basically anything flat within the time window will be removed.
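The windowing described above can be sketched as a weight function (this is my reading of the TWIN_TIMES behaviour from the description, not the QFK source): the output blends filtered and unfiltered data, with linear tapers at the window edges.

```python
import numpy as np

# Weight per sample: 1 inside [start, end], 0 outside the tapers,
# linear ramps in between. Output = w*filtered + (1 - w)*unfiltered.
def twin_weights(nt, dt_ms, start, end, on_taper, off_taper):
    t = np.arange(nt) * dt_ms
    w = np.zeros(nt)
    w[(t >= start) & (t <= end)] = 1.0
    on = (t >= start - on_taper) & (t < start)
    w[on] = (t[on] - (start - on_taper)) / on_taper
    off = (t > end) & (t <= end + off_taper)
    w[off] = ((end + off_taper) - t[off]) / off_taper
    return w

# the 950-1050 ms window from the question, with 50 ms tapers
w = twin_weights(nt=2000, dt_ms=2, start=950, end=1050,
                 on_taper=50, off_taper=50)
print(w[475], w[400])   # 950 ms fully filtered, 800 ms untouched
```

The exact taper shape inside the module may differ; this just shows how the four TWIN_TIMES values carve up the trace.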
17 October 2019 at 3:19am
I'm getting some problems with the FLATTEN and QFK modules that I cannot solve.
The FLATTEN module returns the following error: "Data cannot be shifted above 0s, please increase datum". I tried increasing it to 5000 ms but got the same error. In fact, I don't understand the problem. If I understood the module's help correctly, the module just shifts each trace from the time defined for the multiple (via the corresponding horizon or header+scalar) to the time defined in the DATUM parameter. Is that right? So, if I define the datum as 1000 ms, the multiple should just shift to this position, not above zero seconds.
I attached the job and log files.
17 October 2019 at 9:34am
Hi Alejandra, could you send your jobflow, logfile, some example data, *.smu mute file to firstname.lastname@example.org - we can then take a look, try to have a bit of a play here, and get back to you with any suggestions/comments.
18 October 2019 at 2:38pm
Try setting the DATUM parameter in the FLATTEN module to a value greater than the time of the first water-bottom multiple, and increasing the trace length appropriately so that no data is lost off the top or bottom of the traces.
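The arithmetic behind this is simple to sketch (this is the static-shift behaviour I assume from the thread, not the module source): each trace is shifted by (DATUM minus event time), so the old 0 ms sample lands above 0 s whenever the event time exceeds DATUM - which, with SCALAR1=2 and up to 3000 m of water, is why 5000 ms was still too small.

```python
# Where does the old 0 ms sample land after flattening? The shift is
# (datum - event_time), so the trace top goes negative (above 0 s)
# whenever event_time > datum. Assumed behaviour, for illustration.
def top_of_trace_after_flatten(event_time_ms, datum_ms):
    return datum_ms - event_time_ms   # new time of the old 0 ms sample

# ~3000 m water depth -> waterbottom near 4 s, so with SCALAR1=2 the
# first multiple arrives near 8 s
print(top_of_trace_after_flatten(8000, 5000))   # negative: the error case
print(top_of_trace_after_flatten(8000, 9000))   # DATUM above the multiple: OK
```

Hence the advice above: pick a DATUM later than the deepest multiple time in the line, and extend the trace length to hold the shifted data.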
It looks like the FKMUTE module might actually be a better option than the QFK module. As with QFK, the FKMUTE module also has a TWIN_TIMES parameter if you want to time-window your data.