Similar Documents
20 similar documents found (search time: 15 ms).
1.
Seismic data processing is a challenging task, especially when dealing with vector-valued datasets. These data are characterized by correlated components, each corrupted by a different level of uncorrelated random noise. Mitigating such noise while preserving the signal of interest is a primary goal of the seismic-processing workflow. Frequency-space deconvolution is a well-known linear prediction technique commonly used for random noise suppression. This paper represents vector-field seismic data through quaternion arrays and shows how to mitigate random noise by extending frequency-space deconvolution to its hypercomplex version, the quaternion frequency-space deconvolution. It also shows how a widely linear prediction model exploits the correlation between data components of improper signals. The widely linear scheme, named widely-linear quaternion frequency-space deconvolution, produces longer prediction filters whose enhanced signal-preservation capabilities are demonstrated on synthetic and field vector-valued data examples.
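For reference, the scalar baseline that the quaternion method above generalizes can be sketched in a few lines of NumPy: at each temporal frequency, every trace is predicted from its neighbours with a least-squares complex filter, and the predictable part is kept as signal. This is a minimal single-component sketch (the function name and filter length are illustrative); the paper's quaternion and widely linear arithmetic is not reproduced here.

```python
import numpy as np

def fx_deconvolution(data, filt_len=4):
    """Denoise a 2-D seismic section (time x traces) by frequency-space
    (f-x) prediction filtering: at each temporal frequency, each trace is
    predicted from its neighbours with a least-squares complex filter,
    and only the predictable (signal-like) part is kept."""
    nt, nx = data.shape
    spec = np.fft.rfft(data, axis=0)          # one spectrum per trace
    out = np.zeros_like(spec)
    for k in range(spec.shape[0]):
        s = spec[k]                           # complex samples along x
        # Toeplitz-like system: s[n] ~ sum_j a[j] * s[n-1-j]
        rows = [s[n - filt_len:n][::-1] for n in range(filt_len, nx)]
        A = np.array(rows)
        b = s[filt_len:]
        a, *_ = np.linalg.lstsq(A, b, rcond=None)  # prediction filter
        pred = s.copy()
        pred[filt_len:] = A @ a               # keep predictable part
        out[k] = pred
    return np.fft.irfft(out, n=nt, axis=0)
```

Laterally coherent events are predictable along x at each frequency and survive the projection, while incoherent noise is largely rejected.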

2.
A method for interpolation of multicomponent streamer data based on the local directionality structure is presented. The derivative components are used to estimate a vector field that locally describes the direction of least variability. Given this vector field, the interpolation can be phrased as the solution of a partial differential equation that describes how energy is transported into regions of missing data. The approach can be efficiently implemented using readily available routines for computer graphics. The method is robust to noise in the measurements, particularly to the high levels of low-frequency noise present in the derivative components of multicomponent streamer data.
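A minimal sketch of the interpolation-as-PDE idea, assuming an isotropic heat equation rather than the direction-steered transport the paper actually uses (the helper name and iteration count are illustrative): known samples are held fixed while diffusion carries energy from the surrounding data into the gaps.

```python
import numpy as np

def diffusion_interpolate(data, mask, n_iter=2000):
    """Fill missing samples (mask == False) by harmonic (heat-equation)
    inpainting: measured samples are re-imposed at every step, so the PDE
    only evolves the unknown region.  Isotropic simplification of the
    direction-steered scheme described in the abstract."""
    u = np.where(mask, data, 0.0).astype(float)
    for _ in range(n_iter):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u = u + 0.2 * lap          # explicit Euler step (stable for dt <= 0.25)
        u[mask] = data[mask]       # re-impose the measured samples
    return u
```

Replacing the Laplacian with a diffusion tensor aligned to the estimated direction field would give the anisotropic behaviour the paper exploits.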

3.
A marine source generates both a direct wavefield and a ghost wavefield. This is caused by the strong surface reflectivity, resulting in a blended source array, the blending process being natural. The two unblended response wavefields correspond to the real source at the actual location below the water level and to the ghost source at the mirrored location above the water level. As a consequence, deghosting becomes deblending (‘echo‐deblending’) and can be carried out with a deblending algorithm. In this paper we present source deghosting by an iterative deblending algorithm that properly includes the angle dependence of the ghost: It represents a closed‐loop, non‐causal solution. The proposed echo‐deblending algorithm is also applied to the detector deghosting problem. The detector cable may be slanted, and shot records may be generated by blended source arrays, the blending being created by simultaneous sources. Similar to surface‐related multiple elimination the method is independent of the complexity of the subsurface; only what happens at and near the surface is relevant. This means that the actual sea state may cause the reflection coefficient to become frequency dependent, and the water velocity may not be constant due to temporal and lateral variations in the pressure, temperature, and salinity. As a consequence, we propose that estimation of the actual ghost model should be part of the echo‐deblending algorithm. This is particularly true for source deghosting, where interaction of the source wavefield with the surface may be far from linear. The echo‐deblending theory also shows how multi‐level source acquisition and multi‐level streamer acquisition can be numerically simulated from standard acquisition data. The simulated multi‐level measurements increase the performance of the echo‐deblending process. 
The output of the echo-deblending algorithm on the source side consists of two ghost-free records: one generated by the real source at the actual location below the water level and one generated by the ghost source at the mirrored location above the water level. If we apply our algorithm at the detector side as well, we end up with four ghost-free shot records. All these records are input to migration. Finally, we demonstrate that the proposed echo-deblending algorithm is robust to background noise.

4.
Most seismic processing algorithms treat the sea surface as a flat reflector. However, marine seismic data are often acquired in weather conditions where this approximation is inaccurate. The distortion that a rough sea introduces into the seismic wavelet may influence, for example, deghosting results, as deghosting operators are typically recursive and sensitive to changes in the seismic signal. In this paper, we study the effect of sea-surface roughness on conventional (5–160 Hz) and ultra-high-resolution (200–3500 Hz) single-component towed-streamer data. To this end, we numerically simulate reflections from a rough sea surface using the Kirchhoff approximation. Our modelling demonstrates that, for the conventional seismic frequency band, sea roughness can distort the results of standard one-dimensional and two-dimensional deterministic deghosting. To mitigate this effect, we introduce regularisation and optimisation based on the minimum-energy criterion and show that this improves the processing output significantly. Analysis of ultra-high-resolution field data in conjunction with modelling shows that even a relatively calm sea state (e.g., 15 cm wave height) introduces significant changes in the seismic signal for the ultra-high-frequency band. These changes in amplitude and arrival time may degrade deghosting results. Using the field dataset, we show how minimum-energy optimisation of the deghosting parameters improves the processing result.
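The ghost that deghosting operators must undo can be illustrated under the textbook flat-sea, vertical-incidence assumption that the paper's rough-sea modelling perturbs:

```python
import numpy as np

def ghost_response(freqs_hz, depth_m, c=1500.0, r=-1.0):
    """Vertical-incidence receiver-ghost transfer function over a flat sea:
    the surface reflection (coefficient r ~ -1) arrives delayed by the
    two-way path 2*depth/c, producing spectral notches at multiples of
    c / (2 * depth)."""
    delay = 2.0 * depth_m / c
    return 1.0 + r * np.exp(-2j * np.pi * freqs_hz * delay)
```

Deterministic 1D deghosting essentially divides the recorded spectrum by this operator (with stabilization near the notches); a rough sea perturbs both r and the delay, which is why the notch regions become unreliable and why the minimum-energy optimisation described above helps.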

5.
We develop true-amplitude prestack migration of multicomponent data, based on elastic Gaussian beams, for walkaway vertical seismic profile (VSP) acquisition systems. It consists of a weighted summation of multishot data with specific weights, computed by tracing elastic Gaussian beams from each imaging point of the target area towards the sources and receivers. Each pair of beams may be connected with either a pair of P-rays (PP-image) or a P-ray towards the sources and an S-ray towards the receivers (PS-image), and is uniquely determined by its dip angle (the angle between the bisector of the rays and the vertical direction) and opening angle (the angle between the rays). Shooting from the bottom towards the acquisition system helps to avoid well-known difficulties with the imaging conditions in complex velocity models, in particular multipathing. The ability to fix the dip angle and sum over opening angles leads to so-called selective images that contain mostly interfaces with the desired slopes. On the other hand, a set of images computed for a range of opening angles by summation over all available dip angles is used as input to an AVO-like inversion procedure for the recovery of elastic parameters. The feasibility of this imaging procedure is verified on synthetic data for realistic 2D elastic models.

6.
7.
Wave field reconstruction – the estimation of a three‐dimensional (3D) wave field representing upgoing, downgoing or the combined total pressure at an arbitrary point within a marine streamer array – is enabled by simultaneous measurements of the crossline and vertical components of particle acceleration in addition to pressure in a multicomponent marine streamer. We examine a repeated sail line of North Sea data acquired by a prototype multicomponent towed‐streamer array for both wave field reconstruction fidelity (or accuracy) and reconstruction repeatability. Data from six cables, finely sampled in‐line but spaced at 75 m crossline, are reconstructed and placed on a rectangular data grid uniformly spaced at 6.25 m in‐line and crossline. Benchmarks are generated using recorded pressure data and compared with wave fields reconstructed from pressure alone, and from combinations of pressure, crossline acceleration and vertical acceleration. We find that reconstruction using pressure and both crossline and vertical acceleration has excellent fidelity, recapturing highly aliased diffractions that are lost by interpolation of pressure‐only data. We model wave field reconstruction error as a linear function of distance from the nearest physical sensor and find, for this data set with some mismatched shot positions, that the reconstructed wave field error sensitivity to sensor mispositioning is one‐third that of the recorded wave field sensitivity. Multicomponent reconstruction is also more repeatable, outperforming single‐component reconstruction in which wave field mismatch correlates with geometry mismatch. We find that adequate repeatability may mask poor reconstruction fidelity and that aliased reconstructions will repeat if the survey geometry repeats. 
Although the multicomponent 3D data have only 500 m in‐line aperture, limiting the attenuation of non‐repeating multiples, the level of repeatability achieved is extremely encouraging compared to full‐aperture, pressure‐only, time‐lapse data sets at an equivalent stage of processing.

8.
Recently, mode-converted shear waves (C-waves) have been shown to enable overpressure prediction in media where primary-wave acquisition is inhibited by gas and fluid effects: C-wave moveout is analysed, and a long-standing relationship between differential stress and primary-wave (P-wave) velocity is modified and employed. Though pore-pressure prediction based on C-waves is supported by empirical evidence from laboratory and field experiments, a theoretical justification has yet to be developed. In this research note, we provide supporting algebra for the original relationship between pore pressure and C-wave velocity.

9.
In this paper, we introduce a new method of geophysical data interpretation based on simultaneous analysis of images and sounds. The final objective is to expand the interpretation workflow through multimodal (visual–audio) perception of the same information. We show how seismic data can be effectively converted into standard formats commonly used in digital music. This conversion of geophysical data into the musical domain can be done by applying appropriate time–frequency transforms. Using real data, we demonstrate that the Stockwell transform provides a very accurate and reliable conversion. Once converted into musical files, geophysical datasets can be played and interpreted using modern computer-music tools, such as sequencers. This approach is complementary to, not a substitute for, interpretation methods based on imaging. It can be applied not only to seismic data but also to well logs and any type of geophysical time/depth series. To show the practical implications of our integrated visual–audio method of interpretation, we discuss an application to a real seismic dataset associated with an important hydrocarbon discovery.

10.
Three-component borehole magnetics provide important additional information compared to total-field or horizontal and vertical measurements. These data can be used for several tasks, such as the localization of ferromagnetic objects, the determination of apparent polar wander curves and the computation of the magnetization of rock units. However, the crucial point in three-component borehole magnetics is the reorientation of the measured data from the tool's frame to the geographic reference frame (North, East and Down). For this purpose, our tool, the Göttinger Borehole Magnetometer, comprises three orthogonally aligned fibre-optic gyros along with three fluxgate sensors. With these sensors, the vector of the magnetic field and the tool rotation can be recorded continuously during the measurement. Using the high-precision gyro data, we compute the vector of the magnetic anomaly with respect to the Earth's reference frame. Based on a comparison of several logs measured in the Outokumpu Deep Drill Hole (OKU R2500, Finland), the repeatability of the magnetic field vector is 0.8° in azimuth, 0.08° in inclination and 71 nT in magnitude.
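The reorientation step amounts to composing rotation matrices built from the gyro-derived attitude and applying them to the fluxgate vector. The sketch below is illustrative only: the angle conventions (toolface about the tool axis, then tilt from vertical, then heading) are an assumption, not the instrument's documented ones.

```python
import numpy as np

def tool_to_ned(b_tool, azimuth, inclination, toolface):
    """Rotate a fluxgate magnetic-field vector from the tool frame into the
    geographic North-East-Down frame, given borehole attitude angles in
    radians.  Hypothetical angle conventions for illustration."""
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    ci, si = np.cos(inclination), np.sin(inclination)
    ct, st = np.cos(toolface), np.sin(toolface)
    Rz_tf = np.array([[ct, -st, 0], [st, ct, 0], [0, 0, 1]])   # tool rotation
    Ry_inc = np.array([[ci, 0, si], [0, 1, 0], [-si, 0, ci]])  # tilt from vertical
    Rz_az = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])   # heading
    return Rz_az @ Ry_inc @ Rz_tf @ np.asarray(b_tool, float)
```

Because all three factors are proper rotations, the field magnitude is preserved exactly; only the direction is re-expressed in the NED frame.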

11.
Seismic detection of faults, dykes, potholes and iron-rich ultramafic pegmatitic bodies is of great importance to the platinum mining industry, as these structures affect safety and efficiency. The application of conventional seismic attributes (such as instantaneous amplitude, phase and frequency) in the hard-rock environment is more challenging than in soft-rock settings because the geology is often complex, reflections disrupted and the seismic energy strongly scattered. We have developed new seismic attributes that sharpen seismic reflections, enabling additional structural information to be extracted from hard-rock seismic data. The symmetry attribute is based on the invariance of an object with respect to transformations such as rotation and reflection; it is independent of the trace reflection amplitude, and hence a better indicator of the lateral continuity of thin and weak reflections. The reflection-continuity detector attribute is based on the Hilbert transform; it enhances the visibility of the peaks and troughs of the seismic traces, and hence the continuity of weak reflections. We demonstrate the effectiveness of these new seismic attributes by applying them to a legacy 3D seismic data set from the Bushveld Complex in South Africa. These seismic attributes show good detection of deep-seated thin (∼1.5 m thick) platinum ore bodies and their associated complex geological structures (faults, dykes, potholes and iron-rich ultramafic pegmatites). They provide a fast, cost-effective and efficient interpretation tool that, when coupled with horizon-based seismic attributes, can reveal structures not seen in conventional interpretations.
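The Hilbert-transform building block behind the reflection-continuity detector can be sketched as the standard analytic-signal envelope; the attribute described in the abstract is more elaborate, so this is only the underlying primitive:

```python
import numpy as np

def instantaneous_amplitude(trace):
    """Envelope of a seismic trace via the analytic signal (FFT-based
    Hilbert transform): positive frequencies are doubled, negative ones
    zeroed, and the magnitude of the inverse transform is the envelope.
    Highlights peaks/troughs of weak reflections independently of phase."""
    n = len(trace)
    spec = np.fft.fft(trace)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)
    return np.abs(analytic)
```

The envelope is phase-independent, which is why it improves the visibility of weak but laterally continuous reflections.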

12.
Linear prediction filters are an effective tool for reducing random noise in seismic records. Unfortunately, their ability to enhance seismic records deteriorates when the data are contaminated by erratic noise. Erratic noise in this article designates non-Gaussian noise consisting of large isolated events with known or unknown distribution. We propose a robust f-x projection filtering scheme for simultaneous attenuation of erratic noise and Gaussian random noise. Instead of adopting the ℓ2-norm, as commonly used in the conventional design of f-x filters, we utilize a hybrid ℓ1/ℓ2-norm to penalize the energy of the additive noise. The estimation of the prediction error filter and the additive noise sequence is performed in an alternating fashion. First, the additive noise sequence is fixed, and the prediction error filter is estimated via the least-squares solution of a system of linear equations. Then, the prediction error filter is fixed, and the additive noise sequence is estimated through a cost function containing a hybrid ℓ1/ℓ2-norm that prevents erratic noise from influencing the final solution. In other words, we propose and design a robust M-estimate of a special autoregressive moving-average model in the f-x domain. Synthetic and field data examples are used to evaluate the performance of the proposed algorithm.
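The hybrid-norm robustness can be illustrated on a generic least-squares problem via iteratively reweighted least squares. This is a sketch of the M-estimation idea only, not the paper's full alternating f-x scheme; `eps` and the weight rule are illustrative choices.

```python
import numpy as np

def robust_lstsq(A, b, eps=1.0, n_iter=20):
    """Iteratively reweighted least squares for a hybrid l1/l2 misfit:
    residuals much larger than eps receive l1-type down-weighting, so
    isolated erratic samples stop dragging the fit, while small residuals
    keep the efficient l2 behaviour."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        r = b - A @ x
        w = 1.0 / np.sqrt(1.0 + (r / eps) ** 2)   # hybrid-norm weights
        Aw = A * w[:, None]
        x = np.linalg.lstsq(Aw, w * b, rcond=None)[0]
    return x
```

On a line fit with one gross outlier, ordinary least squares tilts noticeably while the reweighted solution stays near the true slope, which is exactly the behaviour the hybrid norm buys for prediction-filter design.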

13.
Compressed sensing has recently proved itself a successful tool for addressing the challenges of acquiring and processing seismic data sets. It shows that the information contained in sparse signals can be recovered accurately from a small number of linear measurements using a sparsity-promoting regularization. This paper investigates two aspects of compressed sensing in seismic exploration: (i) using a general non-convex regularizer instead of the conventional one-norm minimization for sparsity promotion and (ii) using a frequency mask to additionally subsample the acquired traces in the frequency-space (f-x) domain. The proposed non-convex regularizer has better sparse recovery performance than one-norm minimization, and the additional frequency mask allows us to incorporate a priori information about the events contained in the wavefields into the reconstruction. For example, (i) seismic data are band-limited, so one can reconstruct the original wavefield using only the partial set of frequency coefficients within the reflection band, where the signal-to-noise ratio is high and spatial aliasing is low, and (ii) the low-frequency character of coherent ground roll allows its direct elimination during reconstruction by disregarding the corresponding frequency coefficients (usually below 10 Hz) via a frequency mask. The results of this paper show that several challenges of reconstruction and denoising in seismic exploration can be addressed under a unified formulation. It is illustrated numerically that compressed sensing performance for seismic data interpolation improves significantly when an additional coherent subsampling is performed in the f-x domain compared with the t-x domain case. Numerical experiments on both simulated and real field data illustrate the effectiveness of the presented method.
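A minimal sketch of sparsity-promoting reconstruction of decimated traces, using a simple decreasing-threshold projection (POCS-style) in the Fourier domain. The paper's non-convex regularizer and f-x frequency mask would replace this plain thresholding rule; the schedule and iteration count here are illustrative.

```python
import numpy as np

def pocs_interpolate(y, mask, n_iter=200):
    """Reconstruct a signal with missing samples (mask == False) by
    alternating (a) thresholding of the Fourier coefficients, which
    promotes sparsity, and (b) re-insertion of the measured samples."""
    x = np.where(mask, y, 0.0)
    for k in range(n_iter):
        c = np.fft.fft(x)
        tau = np.abs(c).max() * (1.0 - (k + 1) / n_iter)  # decreasing threshold
        c[np.abs(c) < tau] = 0.0
        x = np.fft.ifft(c).real
        x[mask] = y[mask]                                 # honour measured samples
    return x
```

For a signal that is sparse in the Fourier domain, the surviving coefficients rebuild the gaps, and re-imposing the data pulls their amplitudes toward the true values at each pass.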

14.
The idea of curvature analysis has been widely used in subsurface structure interpretation from three-dimensional seismic data (e.g., fault/fracture detection and geomorphology delineation) by measuring the lateral changes in the geometry of seismic events. However, such geometric curvature utilizes only the kinematic information (two-way traveltime) of the available seismic signals. When analysing the dynamic information (waveform), the traditional approaches (e.g., complex trace analysis) are often trace-wise; they thereby fail to account for seismic reflector continuity and deviate from the true direction of geologic deposition, especially for steeply dipping formations. This study proposes extending three-dimensional curvature analysis to the waveforms in a seismic profile, here denoted the waveform curvature, and investigates the associated implications for assisting seismic interpretation. Applications to the F3 seismic dataset over the Netherlands North Sea demonstrate the added value of the proposed waveform curvature analysis in four aspects. First, the capability of the curvature operator to differentiate convex from concave bending allows automatic decomposition of a seismic image by reflector type (peaks, troughs and zero crossings), which can greatly facilitate computer-aided horizon interpretation and modelling from three-dimensional seismic data. Second, the signed minimum curvature offers a new analytical approach for estimating the fundamental and important reflector dip attribute by searching for the orientation associated with the least waveform variation. Third, the signed maximum curvature makes it possible to analyse the seismic signals along the normal direction of the reflection events. Finally, the curvature analysis boosts the frequency band of the seismic signals and thereby enhances the apparent resolution for identifying and interpreting subtle seismic features.
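The curvature operator itself reduces to the eigenvalues of the local Hessian (second-derivative matrix) of the image; a minimal 2D sketch, with illustrative names:

```python
import numpy as np

def principal_curvatures(img):
    """Signed minimum/maximum curvature at every sample of a 2-D image,
    from the eigenvalues of the local Hessian estimated by finite
    differences.  The sign separates convex from concave bending."""
    gy, gx = np.gradient(img)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    mean = 0.5 * (gxx + gyy)                 # mean curvature (trace / 2)
    det = gxx * gyy - gxy * gyx              # Gaussian curvature (determinant)
    disc = np.sqrt(np.maximum(mean ** 2 - det, 0.0))
    return mean - disc, mean + disc          # (k_min, k_max)
```

Applied sample-by-sample to waveforms rather than to a picked horizon, the signs and magnitudes of these eigenvalues provide the peak/trough/zero-crossing decomposition and dip search described above.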

15.
Elastic imaging from ocean-bottom cable (OBC) data can be challenging because it requires prior estimation of both compressional-wave (P-wave) and shear-wave (S-wave) velocity fields. Seismic interferometry is an attractive technique for processing OBC data because it performs model-independent redatuming, retrieving 'pseudo-sources' at the positions of the receivers. The purpose of this study is to investigate multicomponent applications of interferometry for processing OBC data. This translates into using interferometry to retrieve pseudo-source data on the sea-bed not only for multiple suppression but for obtaining P-, converted P-to-S (PS-wave) and possibly pure-mode S-waves. We discuss scattering-based elastic interferometry with synthetic and field OBC datasets. Conventional and scattering-based interferometry integrands computed from a synthetic are compared to show that the latter yields little anti-causal response. A four-component (4C) pseudo-source response retrieves pure-mode S-reflections as well as P- and PS-reflections. Pseudo-source responses observed in OBC data are related to P-wave conversions at the seabed rather than to true horizontal or vertical point forces. From a Gulf of Mexico OBC data set, diagonal components of a nine-component pseudo-source response demonstrate that the P-wave to S-wave velocity ratio (VP/VS) at the sea-bed is an important factor in the conversion of P to S for obtaining the pure-mode S-wave reflections.
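In its simplest acoustic form, the interferometric redatuming described above amounts to cross-correlating two receiver recordings of a common source: receiver A acts as a 'pseudo-source', and the correlation peak lag estimates the inter-receiver traveltime without any velocity model. The elastic, multicomponent version in the paper is far richer; this sketch shows only the core operation.

```python
import numpy as np

def interferometric_lag(trace_a, trace_b, dt):
    """Cross-correlate two receiver recordings of the same source and
    return the lag (in seconds) of the correlation maximum, i.e. the
    apparent travel time from receiver A to receiver B."""
    xc = np.correlate(trace_b, trace_a, mode="full")
    lags = np.arange(-len(trace_a) + 1, len(trace_b))
    return lags[np.argmax(xc)] * dt
```

Summing such correlations over a surrounding source aperture is what turns the peak into an estimate of the Green's function between the two receivers.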

16.
Full-waveform inversion is re-emerging as a powerful data-fitting procedure for quantitative seismic imaging of the subsurface from wide-azimuth seismic data. This method is suitable for building high-resolution velocity models provided that the targeted area is sampled by both diving waves and reflected waves. However, the conventional formulation of full-waveform inversion prevents the reconstruction of the small-wavenumber components of the velocity model when the subsurface is sampled by reflected waves only. This typically occurs as the depth becomes significant with respect to the length of the receiver array. This study first aims to highlight the limits of the conventional form of full-waveform inversion when applied to seismic reflection data, through a simple canonical example of seismic imaging, and then proposes a new inversion workflow that overcomes these limitations. The governing idea is to decompose the subsurface model into a background part, which we seek to update, and a singular part that corresponds to some prior knowledge of the reflectivity. Forcing this scale uncoupling in the full-waveform inversion formalism brings out, in the sensitivity kernel, the transmitted wavepaths that connect the sources and receivers to the reflectors; the kernel is otherwise dominated by the migration impulse responses formed by the correlation of the downgoing direct wavefields coming from the shot and receiver positions. This transmission regime makes full-waveform inversion amenable to updating the long-to-intermediate wavelengths of the background model from the wide scattering-angle information. However, we show that this prior knowledge of the reflectivity does not remove the need for a suitable misfit measurement based on cross-correlation, to avoid cycle-skipping issues, or for a suitable inversion domain, such as the pseudo-depth domain, that preserves the invariant property of the zero-offset time.
This latter feature is useful to avoid updating the reflectivity information at each non‐linear iteration of the full‐waveform inversion, hence considerably reducing the computational cost of the entire workflow. Prior information of the reflectivity in the full‐waveform inversion formalism, a robust misfit function that prevents cycle‐skipping issues and a suitable inversion domain that preserves the seismic invariant are the three key ingredients that should ensure well‐posedness and computational efficiency of full‐waveform inversion algorithms for seismic reflection data.

17.
In marine acquisition, reflections of sound energy from the water–air interface result in ghosts in the seismic data, on both the source side and the receiver side. Ghosts limit the bandwidth of the useful signal and blur the final image. The process of separating the ghost and primary signals, called deghosting, can fill the ghost notch, broaden the frequency band, and help achieve high-resolution images. A low signal-to-noise ratio near the notch frequencies and 3D effects are two challenges that the deghosting process has to face. In this paper, starting from an introduction to the deghosting process, we present and compare two strategies to address the latter. The first is an adaptive mechanism that adjusts the deghosting operator to compensate for 3D effects or errors in source/receiver depth measurement. This method does not explicitly include the crossline slowness component and is not affected by the sparse sampling in that direction. The second is an inversion-type approach that does include the crossline slowness component in the algorithm and handles the 3D effects explicitly. Synthetic and field data examples in wide-azimuth acquisition settings are shown to compare the two strategies. Both methods provide satisfactory results.

18.
In 2005, a multicomponent ocean-bottom node data set was collected by BP and BHP Billiton in the Atlantis field in the Gulf of Mexico. Our results are based on data from a few sparse nodes, with millions of shots, analysed as common-receiver azimuthal gathers. A first-order look at P-wave arrivals on a common-receiver gather at constant offset reveals variation of P-wave arrival time as a function of azimuth, indicating azimuthal anisotropy in the top few layers. This prompted us to investigate shear arrivals on the horizontal-component data. After preliminary processing, including a static correction, the data were optimally rotated to radial (R) and transverse (T) components. The R component shows azimuthal variation of traveltime, indicating variation of velocity with azimuth; the corresponding T component shows azimuthal variation of amplitude and phase (polarity reversal). The observed shear-wave (S-wave) splitting, together with the previously observed azimuthal P-wave velocity and amplitude variations, indicates anisotropy in the shallow subsea sediment (just below the seafloor) in the area. From the radial-component azimuthal gather, we analysed the PP- and PS-wave amplitude variation for the first few layers and determined the corresponding anisotropy parameter and VP/VS values. Since fracturing at this depth is unlikely, we attribute the observed azimuthal anisotropy to microcracks and stress-induced grain-boundary orientation. The evidence of anisotropy is ubiquitous in this data set, which argues strongly for considering anisotropy in depth imaging, at the least, to obtain realistic subsurface images.
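The rotation to radial and transverse components used above is a plain 2D coordinate rotation; the conventions in this sketch (x toward North, y toward East, source azimuth measured from North, in radians) are assumptions for illustration:

```python
import numpy as np

def rotate_to_radial_transverse(x, y, source_azimuth):
    """Rotate horizontal-component recordings into radial (pointing away
    from the source) and transverse components.  S-wave splitting then
    shows up as azimuth-dependent energy and polarity flips on the
    transverse component."""
    ca, sa = np.cos(source_azimuth), np.sin(source_azimuth)
    radial = ca * x + sa * y
    transverse = -sa * x + ca * y
    return radial, transverse
```

In an isotropic medium the transverse component would be essentially dead for a P-S conversion; systematic transverse energy and polarity reversals with azimuth are the splitting signature the study describes.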

19.
Due to the complicated geophysical character of the tight gas sands in the Sulige gasfield of China, conventional surface seismic has faced great challenges in reservoir delineation. To improve this situation, a large-scale 3D-3C vertical seismic profiling (VSP) survey (more than 15 000 shots) was conducted simultaneously with 3D-3C surface seismic data acquisition in this area in 2005. This paper presents a case study on the delineation of tight gas sands using multicomponent 3D VSP technology. Two imaging volumes (PP compressional wave; PSv converted wave) were generated from the 3D-3C VSP data processing. By comparison, the dominant frequencies of the 3D VSP images were 10–15 Hz higher than those of the surface seismic images. Delineation of the tight gas sands is achieved by using the multicomponent information in the VSP data, reducing uncertainties in data interpretation. We performed a routine data interpretation on these images and developed a new attribute, the 'Centroid Frequency Ratio of PSv and PP Waves', as an indicator of the tight gas sands. The results demonstrated that the new attribute is sensitive to this type of reservoir. By combining geologic, drilling and log data, a comprehensive evaluation based on the 3D VSP data was conducted, and a new well location for drilling was proposed. The major lesson of this paper is that successful application of 3D-3C VSP technology is accomplished only through a synthesis of many disciplines: detailed analysis is needed at each step of planning, acquisition, processing and interpretation. High resolution, successful processing of the multicomponent information, combination of the PP and PSv volumes to extract useful attributes, receiver depth information and offset/azimuth-dependent anisotropy in the 3D VSP data are the major accomplishments derived from our attention to detail in these steps.
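The proposed attribute builds on the spectral centroid of a trace; a minimal sketch of computing one centroid frequency (the PSv/PP ratio would then be a simple division of two such values for co-located windows):

```python
import numpy as np

def centroid_frequency(trace, dt):
    """Amplitude-weighted mean frequency of a trace's spectrum, i.e. the
    spectral centroid.  Computed from the one-sided FFT magnitudes."""
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), dt)
    return np.sum(freqs * spec) / np.sum(spec)
```

Because converted PSv waves generally carry a different frequency content than PP waves, the ratio of the two centroids can respond to lithology and fluid effects in a way neither volume does alone.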

20.
The added value of joint pre-stack inversion of PP (incident P-wave, reflected P-wave) and PS (incident P-wave, reflected S-wave) seismic data for time-lapse applications is shown. We focus on the application of this technique to the time-lapse (four-dimensional) multicomponent permanent reservoir monitoring seismic data of the Jubarte field. The joint inversion results are less sensitive to noise in the input data and show a better match with the rock physics models calibrated for the field. Further, joint inversion improves S-impedance estimates and provides a more robust quantitative interpretation, allowing enhanced differentiation between pore-pressure and fluid-saturation changes, which will be extremely useful for reservoir management. Small changes in reservoir properties are expected in the short time between the time-lapse seismic acquisitions used in the Jubarte project (only 1 year apart). Attempts to recover subtle four-dimensional effects via elastic inversion are recurrent in reservoir characterization projects, whether due to the small sensitivity of the reservoirs to fluid and pressure changes or the short interval between acquisitions. Therefore, methodologies that minimize the uncertainty of four-dimensional inversion outputs are of fundamental importance. Here, we also show the differences between PP-only and joint PP–PS inversion workflows and parameterizations that can be applied in other projects. We show the impact of using multicomponent data as input to elastic seismic inversions in the analysis of time-lapse differences of the elastic properties. The larger investment in the acquisition and processing of multicomponent seismic data is shown to be justified by the improved results of the four-dimensional joint inversion.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号