Reverse‐time migration has become an industry standard for imaging in complex geological areas. We present an approach for increasing its imaging resolution by employing time‐shift gathers. The method consists of two steps: (i) migrating seismic data with the extended imaging condition to obtain time‐shift gathers and (ii) accumulating the information from the time‐shift gathers after they are transformed to zero‐lag time‐shift by a post‐stack depth migration on a finer grid. The final image is generated on a grid denser than that of the original image, thus improving the resolution of the migrated images. Our method is based on the observation that non‐zero‐lag time‐shift images recorded on the regular computing grid contain information about the zero‐lag time‐shift image on a denser grid, and that this information can be continued to zero‐lag time‐shift and refocused at the correct locations on the denser grid. The extra computational cost of the proposed method amounts to that of a zero‐offset migration and is almost negligible compared with the cost of pre‐stack shot‐record reverse‐time migration. Numerical tests on synthetic models demonstrate that the method can effectively improve reverse‐time migration resolution. It can also be regarded as a way to improve the efficiency of reverse‐time migration by performing wavefield extrapolation on a coarse grid and generating the final image on the desired fine grid.
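The extended (time‐shift) imaging condition at the heart of step (i) correlates source and receiver wavefields over a range of time lags, the conventional image being the zero‐lag slice. A minimal sketch, assuming the two wavefields are already available as arrays of shape `(nt, nx, nz)` (the array layout and function name are illustrative, not the authors' implementation):

```python
import numpy as np

def time_shift_gathers(src, rec, lags):
    """Extended imaging condition: I(x, tau) = sum_t S(x, t + tau) * R(x, t - tau).

    src, rec : (nt, nx, nz) source- and receiver-side wavefields (assumed layout)
    lags     : integer time-shift lags tau, in samples
    Returns an array of shape (len(lags), nx, nz); tau = 0 gives the usual image.
    """
    nt = src.shape[0]
    gathers = np.zeros((len(lags),) + src.shape[1:])
    for i, tau in enumerate(lags):
        a = abs(tau)
        t = np.arange(a, nt - a)          # time samples where both shifts are valid
        gathers[i] = (src[t + tau] * rec[t - tau]).sum(axis=0)
    return gathers
```

The abstract's second step then treats each non‐zero‐lag slice as data to be continued back to zero lag by a post‐stack depth migration on the finer grid.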
A critical sampling grid can be defined for an earth‐related natural variable distributed in space, according to established theoretical results and under certain mathematical conditions. Sampling above this critical limit does not substantially improve mapping results, while the limit itself theoretically defines the ideal process for reproducing the original phenomenon. The aim of the present paper is, using an innovative approach, to investigate the validity of commonly used interpolation algorithms, both stochastic and deterministic, below and above this critical sampling limit. When sampling is dense, application to a simulated spatial random field shows that deterministic methods produce results as accurate as those derived with more sophisticated stochastic methods. On the other hand, when the sampling grid is sparse, deterministic methods produce less accurate results, and stochastic algorithms with minimum estimation error are therefore a much better option. To further demonstrate these points, the interpolation algorithms were applied at three different sampling grid densities in a contaminated waste disposal site in Russia.
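Inverse‐distance weighting is a typical representative of the deterministic family the study compares against stochastic (minimum‐estimation‐error) methods. A minimal sketch of such an interpolator (illustrative only, not the authors' code; the power exponent and tie‐breaking `eps` are conventional choices):

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance weighting: deterministic spatial interpolation.

    xy_known : (n, 2) sample coordinates;  z_known : (n,) sample values
    xy_query : (m, 2) points to estimate
    Weights decay as 1/d**power; eps avoids division by zero at data points.
    """
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)
    return (w * z_known).sum(axis=1) / w.sum(axis=1)
```

Because the weights depend only on geometry, such a method cannot adapt to the spatial correlation structure of the field, which is why its accuracy degrades relative to stochastic estimators once the sampling grid becomes sparse.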
The study examines the correlation function of tropical monsoon rainfall on monthly, seasonal and annual time scales and obtains the relationship between this function and distance. The area selected for study is Vidarbha, a meteorological sub‐division of the state of Maharashtra in India with a fairly dense network of rain gauges. Utilizing the relationship between the correlation function of the rainfall field and distance, the errors of optimum interpolation of rainfall at a point have been computed by applying the method of optimum interpolation of Gandin (1970). Relationships between the errors of interpolation and distance have been evaluated, and from these the maximum spacing allowed between rain gauges for a specified tolerable interpolation error has been estimated for each of the periods.
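In Gandin‐style optimum interpolation, the gauge weights solve a linear system built from the correlation function, and the resulting error variance falls out of the same quantities. A minimal one‐dimensional sketch, assuming an isotropic exponential correlation model (an illustrative choice; the paper estimates the correlation function empirically from rainfall data):

```python
import numpy as np

def oi_error_variance(x_gauges, x0, corr_len=50.0, var=1.0):
    """Optimum-interpolation error variance at a point x0 (Gandin-style sketch).

    Assumes rho(d) = exp(-d / corr_len), an illustrative correlation model.
    Weights w solve C w = c0; the error variance is var - c0 . w.
    """
    d = np.abs(x_gauges[:, None] - x_gauges[None, :])
    C = var * np.exp(-d / corr_len)                        # gauge-gauge covariances
    c0 = var * np.exp(-np.abs(x_gauges - x0) / corr_len)   # gauge-point covariances
    w = np.linalg.solve(C, c0)                             # optimal weights
    return var - c0 @ w
```

Evaluating this error as a function of distance from the nearest gauge is precisely how a maximum allowable gauge spacing can be read off for a specified tolerable error.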
Compressed sensing has recently proved itself a successful tool for addressing the challenges of acquiring and processing seismic data sets. Compressed sensing shows that the information contained in sparse signals can be recovered accurately from a small number of linear measurements using a sparsity‐promoting regularization. This paper investigates two aspects of compressed sensing in seismic exploration: (i) using a general non‐convex regularizer instead of the conventional one‐norm minimization for sparsity promotion and (ii) using a frequency mask to additionally subsample the acquired traces in the frequency–space (f–x) domain. The proposed non‐convex regularizer has better sparse recovery performance than one‐norm minimization, and the additional frequency mask allows us to incorporate a priori information about the events contained in the wavefields into the reconstruction. For example, (i) seismic data are band‐limited; one can therefore use only a partial set of frequency coefficients in the reflection band, where the signal‐to‐noise ratio is high and spatial aliasing is low, to reconstruct the original wavefield, and (ii) the low‐frequency character of the coherent ground roll allows its direct elimination during reconstruction by disregarding the corresponding frequency coefficients (usually below 10 Hz) via a frequency mask. The results of this paper show that some challenges of reconstruction and denoising in seismic exploration can be addressed under a unified formulation. It is illustrated numerically that the compressed sensing performance for seismic data interpolation improves significantly when an additional coherent subsampling is performed in the f–x domain compared with the t–x domain case. Numerical experiments on both simulated and real field data are included to illustrate the effectiveness of the presented method.
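The core recovery loop behind such schemes can be sketched with a simple projection‐onto‐convex‐sets (POCS) iteration: threshold in a transform domain to promote sparsity, optionally apply an a priori frequency mask, and re‐insert the acquired samples. This is a plain thresholding stand‐in for the paper's non‐convex regularizer, on a 1‐D signal for brevity (function name and parameters are illustrative):

```python
import numpy as np

def pocs_reconstruct(obs, acquired, n_iter=200, rel_thresh=0.1, freq_keep=None):
    """Sparsity-promoting reconstruction of subsampled data (POCS-style sketch).

    obs       : signal with zeros at the missing samples
    acquired  : boolean mask, True where samples were actually measured
    freq_keep : optional boolean mask over FFT bins (a priori frequency mask,
                e.g. the reflection band, or excluding ground-roll frequencies)
    """
    x = obs.copy()
    for _ in range(n_iter):
        X = np.fft.fft(x)
        if freq_keep is not None:
            X = X * freq_keep                               # impose prior knowledge
        X = X * (np.abs(X) > rel_thresh * np.abs(X).max())  # keep strong coefficients
        x = np.real(np.fft.ifft(X))
        x[acquired] = obs[acquired]                          # honor measured samples
    return x
```

The frequency mask in the abstract plays exactly the role of `freq_keep` here: coefficients outside the reflection band, or in the sub‐10 Hz ground‐roll range, are simply never allowed back into the estimate.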