Similar documents
 20 similar documents found (search time 15 ms)
1.
2.
Data interpolation is an important step in seismic data analysis because many processing tasks, such as multiple attenuation and migration, are based on regularly sampled seismic data. Failed interpolations may introduce artefacts and eventually lead to inaccurate final processing results. In this paper, we generalise seismic data interpolation as a basis-pursuit problem and propose an iterative framework for recovering missing data. The method is based on non-linear iteration and a sparse transform. A modified Bregman iteration is used to solve the constrained minimisation problem based on compressed sensing. The new iterative strategy guarantees fast convergence by using a fixed threshold value. We also propose a generalised velocity-dependent formulation of the seislet transform as an effective sparse transform, in which the non-hyperbolic normal moveout equation serves as a bridge between local slope patterns and moveout parameters in the common-midpoint domain. It reduces to the traditional velocity-dependent seislet transform when a special heterogeneity parameter is selected. The generalised velocity-dependent seislet transform predicts prestack reflection data in offset coordinates, which provides high compression of reflection events. The method was applied to synthetic and field data examples, and the results show that the generalised velocity-dependent seislet transform can reconstruct missing data with the help of the modified Bregman iteration, even for non-hyperbolic reflections under complex conditions such as vertically transversely isotropic (VTI) media or aliasing.
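The iterative scheme described above, a sparsity-promoting transform alternated with a data-consistency step using a fixed threshold, can be illustrated with a minimal sketch. This is a generic iterative-thresholding (POCS-style) interpolation using an FFT as a stand-in sparse transform, not the paper's modified Bregman iteration or the velocity-dependent seislet transform; the test signal, decimation mask, and threshold level are illustrative assumptions.

```python
import numpy as np

def interpolate_pocs(data, mask, n_iter=100, thresh=0.1):
    """Recover missing samples by iterative thresholding in a sparse (Fourier) domain.
    data: observed trace with zeros at missing positions; mask: 1 where observed."""
    x = data.copy()
    for _ in range(n_iter):
        coeffs = np.fft.fft(x)
        # fixed relative threshold: keep only the strongest coefficients
        coeffs[np.abs(coeffs) < thresh * np.abs(coeffs).max()] = 0
        x = np.real(np.fft.ifft(coeffs))
        # data-consistency step: reinsert the known samples
        x = data * mask + x * (1 - mask)
    return x

# Example: a two-tone signal with roughly 30% of its samples removed
t = np.linspace(0, 1, 256, endpoint=False)
true = np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 21 * t)
rng = np.random.default_rng(0)
mask = (rng.random(256) > 0.3).astype(float)
recovered = interpolate_pocs(true * mask, mask)
print(np.max(np.abs(recovered - true)))
```

Because the test signal is sparse in the Fourier domain, the iteration restores the decimated trace almost exactly; the seislet transform plays the analogous role for curved reflection events.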

3.
In this study, we investigate the accuracy of approximating constant-Q wave propagation by series of Zener or standard linear solid (SLS) mechanisms. Modelling in viscoacoustic and viscoelastic media is implemented in the time domain using the finite-difference (FD) method. The accuracy of the numerical solutions is evaluated by comparison with the analytical solution in homogeneous media. We found that FD solutions using three SLS relaxation mechanisms, as well as a single SLS mechanism with properly chosen relaxation times, are quite accurate for both weak and strong attenuation. Although the RMS errors of FD simulations using a single relaxation mechanism increase with offset, especially for strong attenuation (Q = 20), the results are still acceptable for practical applications. The synthetic data of the Marmousi-II model further illustrate that a single SLS mechanism is efficient and sufficiently accurate for modelling constant Q. Moreover, it has a lower computational cost in both run time and memory.
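The "properly chosen relaxation times" of a single SLS mechanism can be made explicit. The sketch below uses one standard closed-form choice that pins the quality factor to a target Q0 at a reference frequency f0; the exact fitting procedure of the paper may differ, and Q0 = 20 and f0 = 30 Hz are illustrative assumptions.

```python
import numpy as np

def sls_relaxation_times(Q0, f0):
    """Relaxation times of a single standard linear solid (Zener) mechanism,
    chosen so that the quality factor equals Q0 at reference frequency f0."""
    w0 = 2 * np.pi * f0
    tau_sigma = (np.sqrt(1 + 1 / Q0**2) - 1 / Q0) / w0
    tau_eps = 1 / (w0**2 * tau_sigma)
    return tau_eps, tau_sigma

def sls_Q(f, tau_eps, tau_sigma):
    """Frequency-dependent Q of the SLS:
    1/Q = w (tau_eps - tau_sigma) / (1 + w^2 tau_eps tau_sigma)."""
    w = 2 * np.pi * f
    return (1 + w**2 * tau_eps * tau_sigma) / (w * (tau_eps - tau_sigma))

tau_eps, tau_sigma = sls_relaxation_times(Q0=20, f0=30.0)
# Q equals Q0 at f0 by construction, and rises away from f0
print(round(sls_Q(30.0, tau_eps, tau_sigma), 3))
```

A single mechanism gives exactly constant Q only at f0; the deviation of sls_Q(f, ...) from Q0 across the source band is what the paper's FD-versus-analytical comparison quantifies.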

4.
Selecting a seismic time-to-depth conversion method can be a subjective choice made by geophysicists, and it is particularly difficult when the accuracy of these methods is unknown. This study presents an automated statistical approach for assessing seismic time-to-depth conversion accuracy by integrating the cross-validation method with four commonly used seismic time-to-depth conversion methods. To showcase this automated approach, we use a regional dataset from the Cooper and Eromanga basins, Australia, consisting of 13 three-dimensional (3D) seismic surveys, 73 two-way-time surface grids and 729 wells. Approximately 10,000 error values (predicted depth vs. measured well depth) and associated variables were calculated. The average velocity method was the most accurate overall (7.6 m mean error); however, the most accurate method and the expected error changed by several metres depending on the combination and value of the most significant variables. Cluster analysis tested the significance of the associated variables and found that the seismic survey location (potentially related to local geology (i.e. sedimentology, structural geology, cementation, pore pressure, etc.), processing workflow, or seismic vintage), the formation (potentially associated with reduced signal-to-noise with increasing depth or changes in lithology), the distance to the nearest well control, and the spatial location of the predicted well relative to the existing well-data envelope had the largest impact on accuracy. Importantly, the effect of these significant variables on accuracy was found to be more important than the choice among the four methods, highlighting the importance of better understanding seismic time-to-depth conversions, which can be achieved by applying this automated cross-validation method.
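The core of the approach, cross-validating a time-to-depth method against well control, can be sketched in its simplest leave-one-out form for the average velocity method (depth = average velocity × one-way time). The well picks below are made-up numbers for illustration only, not the Cooper–Eromanga data; the paper's workflow additionally stratifies errors by survey, formation, and distance variables.

```python
import numpy as np

# Hypothetical well control: two-way time (s) and measured depth (m) at one horizon
twt = np.array([1.10, 1.25, 0.98, 1.40, 1.18])
depth = np.array([1650.0, 1905.0, 1440.0, 2170.0, 1770.0])

def loo_errors(twt, depth):
    """Leave-one-out cross-validation of the average-velocity method:
    predict each well's depth from the mean velocity of the remaining wells."""
    errors = []
    for i in range(len(twt)):
        others = np.arange(len(twt)) != i
        v_avg = np.mean(depth[others] / (twt[others] / 2))  # average velocity (m/s)
        predicted = v_avg * twt[i] / 2
        errors.append(predicted - depth[i])
    return np.array(errors)

errs = loo_errors(twt, depth)
print(np.round(np.mean(np.abs(errs)), 1))  # mean absolute depth error (m)
```

Repeating this for each candidate conversion method and comparing the error distributions is what lets the selection be made statistically rather than subjectively.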

5.
Hydrocarbon production and fluid injection affect the level of subsurface stress and the physical properties of the subsurface, and can cause reservoir-related issues such as compaction and subsidence. Monitoring of oil and gas reservoirs is therefore crucial. Time-lapse seismic is used to monitor reservoirs and provide evidence of saturation and pressure changes within the reservoir. However, relative to background velocities and reflector depths, the time-lapse changes in velocity and geomechanical properties between consecutive surveys are typically small. These changes can be measured using the apparent displacement between migrated images obtained from the recorded data of multiple time-lapse surveys. Apparent displacements measured with the classical cross-correlation method are poorly resolved. Here, we propose the use of a phase-correlation method, developed in satellite imaging for sub-pixel registration of images, to overcome the limitations of cross-correlation. Phase correlation provides both vertical and horizontal displacements with much better resolution. After testing the method on synthetic data, we apply it to a real dataset from the Norne oil field and show that the phase-correlation method can indeed provide better resolution.
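Phase correlation itself is compact enough to sketch. The idea, borrowed from image registration, is that the normalized cross-power spectrum of two shifted signals collapses to an impulse at the displacement. The 1-D integer-shift version below is a simplification (the paper works on 2-D migrated images and recovers sub-pixel shifts); the Gaussian "reflector" is an illustrative stand-in for a seismic event.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the displacement between two traces via phase correlation:
    whiten the cross-power spectrum, then locate the impulse in its inverse FFT."""
    A, B = np.fft.fft(a), np.fft.fft(b)
    cross = A * np.conj(B)
    r = np.real(np.fft.ifft(cross / (np.abs(cross) + 1e-12)))
    shift = int(np.argmax(r))
    if shift > len(a) // 2:   # interpret wrap-around as a negative shift
        shift -= len(a)
    return shift

t = np.arange(256)
trace = np.exp(-((t - 100) / 8.0) ** 2)         # baseline "reflector"
monitor = np.roll(trace, 3)                      # time-lapse trace shifted by 3 samples
print(phase_correlation_shift(monitor, trace))   # 3
```

Because the spectrum is whitened before the inverse transform, the peak is much sharper than the broad lobe of a plain cross-correlation, which is the resolution advantage the abstract refers to.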

6.
The injection of CO2 at the Ketzin pilot CO2 storage site started in June 2008 and ended in August 2013. During the 62 months of injection, a total of about 67 kt of CO2 was injected into a saline aquifer. A third repeat three-dimensional seismic survey, serving as the first post-injection survey, was acquired in 2015, aiming to investigate the recent movement of the injected CO2. Consistent with the previous two time-lapse surveys, a predominantly west–northwest migration of the gaseous CO2 plume in the up-dip direction within the reservoir is inferred in this first post-injection survey. No systematic anomalies are detected through the reservoir overburden. The extent of the CO2 plume west of the injection site is almost identical to that found in the 2012 second repeat survey (after injection of 61 kt); however, there is a significant decrease in its size east of the injection site. Assessment of the CO2 plume distribution suggests that the decrease in the size of the anomaly may be due to multiple factors, such as limited vertical resolution, CO2 dissolution, and CO2 migration into thin layers, in addition to the effects of ambient noise. Four-dimensional seismic modelling based on dynamic flow simulations indicates that a dynamic balance between the newly injected CO2 after the second repeat survey and the CO2 migrating into thin layers and being dissolved was reached by the time of the first post-injection survey. In view of the significant uncertainties in CO2 mass estimation, both patchy and non-patchy saturation models for the Ketzin site were taken into consideration.

7.
Spectral decomposition is a widely used technique in the analysis and interpretation of seismic data. According to the uncertainty principle, there exists a lower bound on the joint time–frequency resolution of seismic signals. The highest temporal resolution is achieved by a matching-pursuit approach, which uses waveforms from a dictionary of functions (atoms). In its pure mathematical form, this method can result in atoms whose shape and phase have no relation to the seismic trace. The high-definition frequency decomposition algorithm presented in this paper interleaves iterations of atom matching and optimization. It divides the seismic trace into independent sections delineated by envelope troughs and simultaneously matches atoms to all peaks. Co-optimization of overlapping atoms ensures that the effects of interference between them are minimized. Finally, a second atom-matching and optimization phase is performed to minimize the difference between the original and the reconstructed trace. The fully reconstructed traces can be used as inputs for frequency-based reconstruction and red–green–blue colour blending. Comparison with the results of the original matching-pursuit frequency decomposition illustrates that colour blends based on high-definition frequency decomposition provide very high temporal resolution, even in the low-energy parts of the seismic data, enabling precise analysis of geometrical variations of geological features.
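The matching-pursuit baseline that the high-definition algorithm improves upon is easy to state concretely: greedily pick the dictionary atom with the largest inner product against the residual, subtract its projection, and repeat. The sketch below uses a small Ricker-wavelet dictionary on a synthetic two-event trace; the dictionary, frequencies, and trace are illustrative assumptions, and the paper's atom co-optimization step is not reproduced.

```python
import numpy as np

def ricker(t, t0, f):
    """Ricker wavelet atom centred at t0 with dominant frequency f (Hz)."""
    a = (np.pi * f * (t - t0)) ** 2
    return (1 - 2 * a) * np.exp(-a)

def matching_pursuit(trace, t, freqs, n_atoms=2):
    """Greedy matching pursuit: repeatedly select the unit-norm atom with the
    largest correlation against the residual and subtract its projection."""
    residual = trace.copy()
    picks = []
    for _ in range(n_atoms):
        best = None
        for t0 in t:
            for f in freqs:
                atom = ricker(t, t0, f)
                atom = atom / np.linalg.norm(atom)
                c = np.dot(residual, atom)
                if best is None or abs(c) > abs(best[0]):
                    best = (c, t0, f, atom)
        c, t0, f, atom = best
        residual = residual - c * atom
        picks.append((t0, f))
    return picks, residual

t = np.linspace(0, 1, 200)
trace = ricker(t, t[60], 25.0) + 0.6 * ricker(t, t[140], 40.0)
picks, residual = matching_pursuit(trace, t, freqs=[25.0, 40.0])
print(sorted(p[0] for p in picks))  # recovered atom centres
```

On this clean trace the greedy picks land on the true events; on real data, interfering atoms are exactly where the co-optimization of overlapping atoms described above earns its keep.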

8.
This paper describes the earthquake catalogue format of the SuperSeis system (the Eq3 format), explains in detail the conversion between different earthquake catalogue formats, analyses reading Excel files via ODBC and manipulating Access databases via ADO, and implements direct printing of Eq3-format catalogue files. The tool is compact and simple to operate, and is convenient for a sizeable base of users.  

9.
Compressed sensing has recently proved itself a successful tool for addressing the challenges of acquiring and processing seismic data sets. Compressed sensing shows that the information contained in sparse signals can be recovered accurately from a small number of linear measurements using a sparsity-promoting regularization. This paper investigates two aspects of compressed sensing in seismic exploration: (i) using a general non-convex regularizer instead of the conventional one-norm minimization for sparsity promotion and (ii) using a frequency mask to additionally subsample the acquired traces in the frequency–space domain. The proposed non-convex regularizer has better sparse-recovery performance than one-norm minimization, and the additional frequency mask allows us to incorporate a priori information about the events contained in the wavefields into the reconstruction. For example, (i) seismic data are band-limited, so one can use only a partial set of frequency coefficients within the reflection band, where the signal-to-noise ratio is high and spatial aliasing is low, to reconstruct the original wavefield, and (ii) the low-frequency character of coherent ground roll allows it to be eliminated directly during reconstruction by disregarding the corresponding frequency coefficients (usually below 10 Hz) via a frequency mask. The results of this paper show that several challenges of reconstruction and denoising in seismic exploration can be addressed under a unified formulation. It is illustrated numerically that compressed-sensing performance for seismic data interpolation improves significantly when an additional coherent subsampling is performed in the frequency–space domain rather than in the time–space domain alone. Numerical experiments on both simulated and real field data illustrate the effectiveness of the presented method.  

10.
Seismic inversion has drawn the attention of researchers due to its capability of building an accurate earth model. Such a model needs to be finely discretised, so the dimensionality of the inversion problem is very high. In this paper, we propose an efficient differential evolution algorithm and apply it to high-dimensional seismic inversion. Our method takes into account the differences among individuals, which are disregarded in conventional differential evolution methods, resulting in a better balance between exploration and exploitation. We divide the entire population into three subpopulations and propose a novel two-phase mutation strategy. Furthermore, we optimise the crossover operator by incorporating the components with the best objective-function values into it. We embed this strategy into cooperative coevolutionary differential evolution and propose a new algorithm referred to as differential evolution with subpopulations. We then apply our scheme to both synthetic and field data; the results of high-dimensional seismic inversion show that the proposed differential evolution with subpopulations achieves faster convergence and higher-quality solutions.
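For readers unfamiliar with the baseline being improved, here is a minimal sketch of plain DE/rand/1/bin applied to a toy "inversion" (fitting a parameter vector by misfit minimization). This is the conventional algorithm, not the paper's subpopulation variant with two-phase mutation; the population size, F, CR, and the 5-parameter target model are illustrative assumptions.

```python
import numpy as np

def differential_evolution(obj, bounds, pop_size=30, F=0.7, CR=0.9, n_gen=200, seed=1):
    """Baseline DE/rand/1/bin: mutate with a scaled difference of two random
    individuals, binomially cross over, and keep the trial if it is no worse."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([obj(x) for x in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True       # guarantee at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = obj(trial)
            if f_trial <= fit[i]:                  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Toy inversion: recover a 5-parameter model by minimizing a data misfit
target = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
misfit = lambda m: float(np.sum((m - target) ** 2))
m_best, f_best = differential_evolution(misfit, [(-5, 5)] * 5)
print(np.round(m_best, 2))
```

The paper's contribution sits on top of this loop: splitting the population into subpopulations with distinct mutation roles aims to keep exploration alive while the best individuals exploit, which matters far more once the model dimension grows from 5 to thousands.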

11.
Least-squares Fourier reconstruction is essentially the solution of a discrete linear inverse problem that attempts to recover the Fourier spectrum of the seismic wavefield from data irregularly sampled along the spatial coordinates. The estimated Fourier coefficients are then used to reconstruct the data on a regular grid via a standard inverse Fourier transform (inverse discrete Fourier transform or inverse fast Fourier transform). Unfortunately, this kind of inverse problem is usually under-determined and ill-conditioned. For this reason, minimum-norm least-squares Fourier reconstruction adopts a damped least-squares inversion to retrieve a unique and stable solution. In this work, we show how the damping can introduce artefacts in the reconstructed 3D data. To describe this issue quantitatively, we introduce the concept of an "extended" model resolution matrix, and we formulate the reconstruction problem as an appraisal problem. Through simultaneous analysis of the extended model resolution matrix and of the noise term, we discuss the limits of minimum-norm Fourier reconstruction, assess the validity of the reconstructed data, and identify the possible bias introduced by the inversion process. We can also guide the parameterization of the forward problem to minimize the occurrence of unwanted artefacts. A simple synthetic example and real data from a 3D marine common-shot gather are used to discuss our approach and to show the results of minimum-norm Fourier reconstruction.
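The damping bias the abstract discusses can be seen directly in a small numerical sketch. For irregular sample positions x_j, the forward operator maps Fourier coefficients to data via F_{jk} = exp(2πi k x_j / L); damped least squares gives the model resolution matrix R = (FᴴF + εI)⁻¹FᴴF, whose diagonal falls below 1 whenever ε > 0. The sampling geometry and damping weight below are illustrative assumptions (this is the plain, not the paper's "extended", resolution matrix).

```python
import numpy as np

rng = np.random.default_rng(2)
L, nk = 1.0, 15                      # spatial extent and number of Fourier coefficients
x = np.sort(rng.uniform(0, L, 40))   # irregular spatial sampling positions
k = np.arange(-(nk // 2), nk // 2 + 1)
F = np.exp(2j * np.pi * np.outer(x, k) / L)   # forward operator: coefficients -> data

eps = 0.1 * len(x)                   # damping weight of the minimum-norm solution
FhF = F.conj().T @ F
inv = np.linalg.inv(FhF + eps * np.eye(nk))
R = inv @ FhF                        # model resolution matrix of the damped inversion

# Damping biases the estimate: every diagonal entry of R drops below 1
print(np.all(np.diag(R).real < 1.0))  # True
```

Since R = I − ε(FᴴF + εI)⁻¹, perfect resolution (R = I) is recovered only as ε → 0; inspecting R for a given geometry and damping is exactly the appraisal step advocated above.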

12.
In this paper, we propose a workflow based on SalSi for the detection and delineation of geological structures such as salt domes. SalSi is a seismic attribute designed by modelling the human visual system; it detects salient features and captures the spatial correlation within seismic volumes for delineating seismic structures. Using this attribute, we can not only highlight the neighbouring regions of salt domes to assist a seismic interpreter but also delineate such structures using a region-growing method and post-processing. The proposed delineation workflow detects the salt-dome boundary with very good precision and accuracy. Experimental results show the effectiveness of the proposed workflow on a real seismic dataset acquired from the North Sea, F3 block. For the subjective evaluation of the results of different salt-dome delineation algorithms, we use a reference salt-dome boundary interpreted by a geophysicist. For the objective evaluation, we use five different metrics based on pixels, shape, and curvedness to establish the effectiveness of the proposed workflow. The proposed workflow is not only fast but also yields better results than other salt-dome delineation algorithms, and it shows promising potential in seismic interpretation.

13.
The idea of curvature analysis has been widely used in subsurface structure interpretation from three-dimensional seismic data (e.g., fault/fracture detection and geomorphology delineation) by measuring the lateral changes in the geometry of seismic events. However, such geometric curvature utilizes only the kinematic information (two-way traveltime) of the available seismic signals. When analysing the dynamic information (waveform), the traditional approaches (e.g., complex trace analysis) are often trace-wise and thereby fail to take into account seismic reflector continuity, deviating from the true direction of geologic deposition, especially for steeply dipping formations. This study proposes extending three-dimensional curvature analysis to the waveforms in a seismic profile, here denoted the waveform curvature, and investigates the associated implications for assisting seismic interpretation. Applications to the F3 seismic dataset over the Netherlands North Sea demonstrate the added value of the proposed waveform curvature analysis in four aspects. First, the capability of the curvature operator to differentiate convex and concave bending allows automatic decomposition of a seismic image by reflector type (peaks, troughs and zero crossings), which can greatly facilitate computer-aided horizon interpretation and modelling from three-dimensional seismic data. Second, the signed minimum curvature offers a new analytical approach for estimating the fundamental and important reflector dip attribute by searching for the orientation of least waveform variation. Third, the signed maximum curvature makes it possible to analyse the seismic signals along the normal direction of the reflection events. Finally, the curvature analysis boosts the frequency band of the seismic signals and thereby enhances the apparent resolution in identifying and interpreting subtle seismic features.
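The first claim, that the sign of the curvature separates convex from concave bending and hence troughs from peaks, can be checked on a 1-D toy trace. This is a deliberately simplified illustration (the paper computes signed curvatures of a 3-D waveform field, not a single trace); the sinusoidal "trace" is an assumption for demonstration.

```python
import numpy as np

# The sign of the waveform's second derivative (and hence of its curvature)
# distinguishes concave bending at peaks from convex bending at troughs.
t = np.linspace(0, 1, 500)
trace = np.sin(2 * np.pi * 3 * t)
d1 = np.gradient(trace, t)
d2 = np.gradient(d1, t)
curvature = d2 / (1 + d1 ** 2) ** 1.5   # signed curvature of the graph of the trace

peak_idx = int(np.argmax(trace))
trough_idx = int(np.argmin(trace))
# Curvature is negative at a peak and positive at a trough
print(curvature[peak_idx] < 0, curvature[trough_idx] > 0)  # True True
```

Classifying every sample by this sign (plus the zero crossings of the trace itself) yields exactly the peak/trough/zero-crossing decomposition described above; note also that the differentiation inherent in curvature amplifies high frequencies, consistent with the resolution remark in the abstract.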

14.
In order to deconvolve the ghost response from marine seismic data, an estimate of the ghost operator is required. Typically, this estimate is made using a model of in-plane propagation, i.e., the ray path at the receiver falls in the vertical plane defined by the source and receiver locations. Unfortunately, this model breaks down when the source is in a crossline position relative to the receiver spread. In this situation, in-plane signals can only exist in a small region of the signal cone. In this paper, we use Bayes' theorem to model the posterior probability distribution functions of the vertical component of the ray vector, given the known source–receiver azimuth and the measured inline component of the ray vector. This provides a model for the ghost delay time based on the acquisition geometry and the dip of the wave in the plane of the streamer. The model is fairly robust with regard to the prior assumptions and is controlled by a single parameter related to the likelihood of in-plane propagation. The expected values of the resulting distributions are consistent with the deterministic in-plane model when the in-plane likelihood is high, yet remain valid everywhere in the signal cone. Relaxing the in-plane likelihood to a reasonable degree radically simplifies the shape of the expected-value surface, lending itself to use in deghosting algorithms. The model can also be extended to other plane-wave processing problems such as interpolation.

15.
Reflection seismic data were acquired in two field campaigns in the Blötberget, Ludvika mining area of central Sweden, for deep imaging of iron-oxide mineralization that was known to extend down to 800–850 m depth. The two surveys, conducted in 2015 and 2016 (one employing a seismic landstreamer and geophones connected to wireless recorders, the other cabled geophones and wireless recorders), aimed to delineate the geometry and depth extent of the iron-oxide mineralization for when mining commences in the area. Even with minimal, conventional processing, the merged datasets provide encouraging information about the depth continuation of the mineralized horizons and the geological setting of the study area. Multiple sets of strong reflections represent a possible continuation of the known deposits, extending approximately 300 m further down-dip than the 850 m depth established by historical drilling. They show excellent correlation in shape and strength with those of the Blötberget deposits. Furthermore, several reflections in the footwall of the known mineralization may represent additional resources underlying the known ones. The results of these seismic surveys are encouraging for mineral-exploration purposes, given the good quality of the final section and the speed of seismic surveys employing a simple, cost-effective and readily available impact-type seismic source.

16.
Reverse-time migration has become an industry standard for imaging in complex geological areas. We present an approach for increasing its imaging resolution by employing time-shift gathers. The method consists of two steps: (i) migrating seismic data with the extended imaging condition to obtain time-shift gathers and (ii) accumulating the information from the time-shift gathers after they are transformed to zero time-shift lag by a post-stack depth migration on a finer grid. The final image is generated on a grid denser than that of the original image, thus improving the resolution of the migrated images. Our method is based on the observation that non-zero-lag time-shift images recorded on the regular computing grid contain information about the zero-lag time-shift image on a denser grid, and that this information can be continued to zero time-shift lag and refocused at the correct locations on the denser grid. The extra computational cost of the proposed method amounts to that of a zero-offset migration and is almost negligible compared with the cost of pre-stack shot-record reverse-time migration. Numerical tests on synthetic models demonstrate that the method can effectively improve reverse-time migration resolution. It can also be regarded as a way to improve the efficiency of reverse-time migration by performing wavefield extrapolation on a coarse grid and generating the final image on the desired fine grid.

17.
Seismic data reconstruction based on the non-uniform Fourier transform
Irregularly sampled seismic data seriously affect multichannel seismic processing. We combine the non-uniform Fourier transform with Bayesian parameter inversion to reconstruct irregularly sampled, spatially band-limited seismic data. For each frequency, the bandwidth of the reconstructed data is determined from the minimum apparent velocity, and the spatial Fourier coefficients of the reconstructed data are then estimated from the irregular data. Reconstruction of irregular seismic data is treated as a geophysical inverse problem of information recovery, and Bayesian parameter-inversion theory is used to estimate the Fourier coefficients. A conjugate-gradient algorithm is used in the inversion to guarantee the stability of the solution and accelerate its convergence. Tests on theoretical models and field data verify the effectiveness and practicality of the method.  

18.
Prestack image volumes may be decomposed into specular and non-specular parts by filters defined in the dip-angle domain. For space-shift extended image volumes, the dip-angle decomposition is derived via a local Radon transform in depth and midpoint coordinates, followed by averaging over space-shifts. We propose to employ prestack space-shift extended reverse-time migration and dip-angle decomposition for imaging small-scale structural elements, considered as seismic diffractors, in models of arbitrary complexity. A suitable design of a specularity filter in the dip-angle domain rejects the dominant reflectors and enhances diffractors and other non-specular image content. The filter exploits a clear discrimination in dip between specular reflections and diffractions: the former are stationary at the specular dip, whereas the latter are non-stationary without a preferred dip direction. While the filtered image volume contains features other than diffractor images (for example, noise and truncation artefacts are also present), synthetic and field data examples suggest that diffractors tend to dominate and are readily recognisable. Averaging over space-shifts in the filter construction makes the rejection of reflectors robust against migration-velocity errors. Another consequence of the space-shift extension and its angle-domain transforms is the possibility of exploring the image in multiple sets of common-image gathers. The filtered diffractions may be analysed simultaneously in space-shift, scattering-angle, and dip-angle image gathers by means of a single migration job. The deliverables of our method enrich the processed material on the interpreter's desk, and we expect them to further supplement our understanding of the Earth's interior.

19.
We present the results of a probabilistic seismic hazard assessment and disaggregation analysis aimed at understanding the dominant magnitudes and source-to-site distances of the earthquakes that control the hazard at the Celano site in the Abruzzo region of central Italy. First, we calculated a peak ground acceleration map for the central Apennines area using a model of seismogenic sources defined on a geological–structural basis. The source-model definition and the probabilistic seismic hazard evaluation at the regional scale (central Apennines) were obtained using three different seismicity models (a Gutenberg–Richter model, a characteristic-earthquake model, and a hybrid model), consistent with the available seismological information. Moreover, a simplified time-dependent hypothesis was introduced, computing the conditional probability of earthquake occurrence from Brownian passage-time distributions. Subsequently, we carried out the disaggregation analysis with a modified version of the SEISRISK III code in order to separate the contribution of each source to the total hazard. The results show the percentage contribution to the Celano hazard of the various seismogenic sources for different expected peak ground acceleration classes. The analysis was differentiated for close (distance from Celano <20 km) and distant (distance from Celano >20 km) seismogenic sources.
We propose three different "scenario earthquakes", useful for site-condition studies and for seismic microzoning: (1) a large (M = 6.6) local (Celano-epicentre distance 16 km) earthquake, with a mean recurrence time of 590 years; (2) a moderate (M = 5.5) local (Celano-epicentre distance 7.5 km) earthquake, with a mean recurrence time of 500 years; and (3) a large (M = 6.6) distant (Celano-epicentre distance 24 km) earthquake, with a mean recurrence time of 980 years. The probabilistic, time-dependent approach to defining the "scenario earthquakes" clearly changes the results compared with a traditional deterministic analysis, with consequences for engineering design and seismic-risk reduction.  
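As a pointer to the kind of arithmetic behind the Gutenberg–Richter seismicity model mentioned above, the sketch below converts a recurrence law into an annual rate and a Poisson exceedance probability. The a and b values and the 50-year window are illustrative assumptions, not the parameters of the Celano study, and the time-dependent Brownian passage-time refinement is not reproduced.

```python
import numpy as np

def gr_annual_rate(a, b, m):
    """Gutenberg-Richter law: cumulative annual rate of events with magnitude >= m,
    log10 N(m) = a - b * m."""
    return 10 ** (a - b * m)

# Hypothetical a, b values for a regional source zone
a_val, b_val = 4.2, 1.0
rate = gr_annual_rate(a_val, b_val, 6.6)   # events per year with M >= 6.6
recurrence = 1.0 / rate                     # mean recurrence time in years
# Poisson probability of at least one such event in a 50-year exposure window
p50 = 1 - np.exp(-rate * 50)
print(round(recurrence), round(p50, 3))
```

The disaggregation step in the paper then apportions such rates among individual sources to identify which magnitude–distance pairs dominate the hazard at the site.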

20.
The automatic detection of geological features such as faults and channels is a challenging problem in today's seismic exploration industry. Edge-detection filters are generally applied to locate features, and it is desirable to reduce noise in the data before edge detection. Applying smoothing or low-pass filters suppresses noise, but it also blurs edges. Edge-preserving smoothing is a technique that achieves noise suppression and edge preservation simultaneously. Until now, edge-preserving smoothing has been carried out on rectangularly sampled seismic data. In this paper, an attempt is made to detect edges by applying edge-preserving smoothing as a pre-processing step in the spatial domain of hexagonally sampled seismic data. Hexagonal sampling is efficient and has greater symmetry than rectangular sampling. Here, spiral architecture is employed to handle the hexagonally sampled seismic data. A comparison of edge-preserving smoothing on both rectangularly and hexagonally sampled seismic data is carried out, using data provided by Saudi Aramco. It is shown that hexagonal processing results in well-defined edges with fewer computations.
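The principle of edge-preserving smoothing, averaging only within the neighbouring window of lowest variance so that no mean is taken across an edge, can be shown in one dimension. This Kuwahara-style filter is a generic stand-in; the paper's specific filter, its 2-D windows, and the hexagonal/spiral-architecture indexing are not reproduced, and the noisy step signal is an illustrative assumption.

```python
import numpy as np

def kuwahara_1d(signal, half=4):
    """Edge-preserving smoothing: for each sample, replace it by the mean of the
    adjacent window (left or right) with the smaller variance, so averaging
    never straddles an edge."""
    out = signal.astype(float).copy()
    for i in range(half, len(signal) - half):
        left = signal[i - half:i + 1]
        right = signal[i:i + half + 1]
        out[i] = left.mean() if left.var() <= right.var() else right.mean()
    return out

# A noisy step edge: plain averaging would blur it; the variance test keeps it sharp
rng = np.random.default_rng(0)
step = np.where(np.arange(200) < 100, 0.0, 1.0) + 0.05 * rng.standard_normal(200)
smoothed = kuwahara_1d(step)
print(smoothed[99], smoothed[100])  # values on either side of the preserved edge
```

On either side of sample 100 the filter picks the homogeneous window, so the noise is suppressed while the full step height survives, which is exactly the behaviour the comparison in the paper evaluates on rectangular versus hexagonal grids.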


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号