Similar Literature
19 similar documents found (search time: 187 ms)
1.
Multiple-superposition probability imaging of the secondary electric current field   Cited by: 3 (self-citations: 1, others: 2)
Based on the integral formula for the subsurface current field, the discretized secondary current field is decomposed into the algebraic sum of the fields of a series of point charges located on electrical discontinuity interfaces. The field of a unit positive point charge is introduced as the spatial scanning function (SDS), and the charge occurrence probability (COP) function is defined as the cross-correlation of the secondary current field with the SDS. To allow quantitative interpretation of the probability imaging results, a normalized charge occurrence probability (NCOP) function is proposed. By synthesizing secondary current fields for 2D geoelectric models with a finite-element algorithm, multiple-superposition probability imaging of the secondary current field is realized. The results show that, for electrical structures consisting of geological anomalies embedded in a homogeneous half-space, the probability imaging method gives a good indication of the spatial position of the subsurface anomalies.
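The cross-correlation that defines the COP/NCOP functions can be sketched numerically. Everything below (the surface profile, the simplified point-charge field expression, the trial grid) is an invented toy setup, not the paper's finite-element workflow:

```python
import numpy as np

# Hypothetical sketch of charge-occurrence-probability (COP) imaging:
# the secondary field observed on a surface profile is cross-correlated with
# the field of a unit positive point charge placed at trial subsurface points.

def point_charge_field(xs, x0, z0):
    """Vertical field component of a unit charge at (x0, z0), observed on
    the surface z = 0 (physical constants absorbed)."""
    r2 = (xs - x0) ** 2 + z0 ** 2
    return z0 / r2 ** 1.5  # ~ z0 / r^3

def cop(xs, e_obs, trial_x, trial_z):
    """Normalized cross-correlation (NCOP-style) between the observed field
    and the scanning function of each trial charge location."""
    out = np.zeros((len(trial_z), len(trial_x)))
    for i, z0 in enumerate(trial_z):
        for j, x0 in enumerate(trial_x):
            s = point_charge_field(xs, x0, z0)
            out[i, j] = np.dot(e_obs, s) / (np.linalg.norm(e_obs) * np.linalg.norm(s))
    return out

xs = np.linspace(-50.0, 50.0, 201)          # surface profile (m)
e_obs = point_charge_field(xs, 10.0, 15.0)  # "observed" field of a charge at (10, 15)
trial_x = np.linspace(-40.0, 40.0, 81)
trial_z = np.linspace(5.0, 40.0, 36)
ncop = cop(xs, e_obs, trial_x, trial_z)
i, j = np.unravel_index(np.argmax(ncop), ncop.shape)  # peak marks the likely charge
```

The normalized correlation is bounded by 1 and peaks where the scanning function best matches the observed field, which is the sense in which the NCOP acts as a probability-like indicator.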

2.
Migration repositions seismic data generated by subsurface targets back to the reflecting interfaces, producing an image of the subsurface structure. However, because of factors such as the acquisition aperture, complex overburden, and the dip of the imaging target, the result is often a distorted image of the subsurface. Seismic illumination and resolution analysis provide a quantitative description of how these factors affect the image, and the point-spread function contains all the information needed for such analysis. The staining algorithm establishes the correspondence between a particular subsurface structure and the seismic wavefield and data. In this paper, the staining algorithm is used to compute point-spread functions and, from them, angle-domain illumination information; the image is then corrected by deconvolving the original migration result with the point-spread function. Results for the SEG salt model are presented. The staining algorithm provides an efficient computational tool for calculating point-spread functions and for broadband seismic illumination and resolution analysis.
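The correction step, deconvolving the migrated image with the point-spread function, can be illustrated in one dimension. The Gaussian PSF, the Wiener-style damping, and the reflectivity spikes below are assumptions for the sketch; the paper computes true PSFs with the staining algorithm:

```python
import numpy as np

# Illustrative sketch: an "image" that is the true reflectivity blurred by a
# point-spread function (PSF) is corrected by frequency-domain (Wiener-style)
# deconvolution with that PSF.

n = 128
reflectivity = np.zeros(n)
reflectivity[40] = 1.0
reflectivity[80] = -0.7

# A band-limited, zero-phase Gaussian PSF standing in for the true PSF.
t = np.arange(n) - n // 2
psf = np.exp(-(t / 4.0) ** 2)
psf /= psf.sum()

P = np.fft.fft(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft(np.fft.fft(reflectivity) * P))  # migrated image

# Wiener deconvolution: divide by the PSF spectrum with damping eps.
eps = 1e-3
restored = np.real(np.fft.ifft(np.fft.fft(blurred) * np.conj(P) / (np.abs(P) ** 2 + eps)))
```

The restored trace sharpens both spikes back toward their true positions; the damping term eps prevents division by the near-zero high-frequency part of the PSF spectrum.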

3.
Reverse time migration (RTM) is an advanced seismic migration imaging method whose image quality depends on many factors, the most critical of which is velocity: the more accurate the velocity, the better the image. In practice, however, the velocity is unknown and can only be approximated by velocity inversion and other estimation techniques, so the RTM image contains errors that degrade subsequent seismic interpretation. How large, then, is the effect of velocity error on RTM imaging? Because the approximate velocity estimated in practice may differ at every point of the model, a direct systematic analysis of RTM velocity error is difficult. For simplicity, this paper uses an equivalent velocity error, the overall average error of the velocity model, to analyze RTM imaging. The RTM uses the cross-correlation imaging condition; low-frequency noise is suppressed with an amplitude-compensated Laplacian filter; and the forward extrapolation of the source wavefield and backward extrapolation of the receiver wavefield use the first-order stress-velocity form of the wave equation with a staggered-grid finite-difference scheme. We first compare the effect of different velocity errors on RTM images and contrast the results with one-way wave-equation migration; we then analyze how velocity error shifts the imaged positions in both methods; finally, we run RTM and one-way wave-equation migration tests using an approximate velocity model estimated by migration velocity analysis. The results show that, across the tested velocity errors, RTM is superior to one-way wave-equation migration in event continuity and energy focusing, while the two methods are affected to a comparable degree in terms of imaging position. These findings provide a useful reference for the practical application of RTM.
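A minimal sketch of the cross-correlation imaging condition mentioned above, with analytic Gaussian pulses standing in for finite-difference wavefields (the geometry and pulse speed are invented):

```python
import numpy as np

# Toy sketch of the zero-lag cross-correlation imaging condition used in RTM:
# I(x) = sum_t S(x, t) * R(x, t). The wavefields below are analytic Gaussian
# pulses, not solutions of the first-order stress-velocity wave equation.

nx, nt, speed = 200, 400, 0.5          # grid points, time steps, cells per step
x = np.arange(nx)

def pulse(center):
    return np.exp(-((x - center) / 5.0) ** 2)

# Forward-propagated source wavefield (pulse leaving x = 0) and
# back-propagated receiver wavefield (pulse leaving x = nx - 1).
src = np.stack([pulse(speed * t) for t in range(nt)])
rcv = np.stack([pulse((nx - 1) - speed * t) for t in range(nt)])

image = (src * rcv).sum(axis=0)        # zero-lag cross-correlation image
laplacian = np.gradient(np.gradient(image))  # crude 1D stand-in for the Laplacian filter
peak = int(np.argmax(image))           # the two pulses coincide near mid-profile
```

The image peaks where the source and receiver wavefields overlap in time, which is the kinematic content of the imaging condition; a velocity error would shift where that overlap occurs.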

4.
Reverse time migration (RTM) is an advanced seismic migration imaging method whose image quality depends on many factors, a key one being the imaging condition. The cross-correlation imaging condition is generally used: the source wavefield is extrapolated forward in time, the receiver wavefield is extrapolated backward in time, and the two are cross-correlated to obtain the migrated image. Both extrapolations use the same wave equation, here the first-order stress-velocity form, solved numerically with a staggered-grid finite-difference method. An amplitude-compensated Laplacian filter suppresses the low-frequency noise in RTM images, and cubic spline interpolation addresses waveform roughness. In the backward extrapolation, the seismic records serve as boundary conditions; in the forward extrapolation, a source wavelet must be specified as the initial condition. In theory an impulsive source is optimal, but it is impractical, so a wavelet function is generally used, most commonly the Ricker wavelet. Source wavelets with different dominant frequencies produce source wavefields of different frequency content and hence different cross-correlation imaging results. How large, then, is the effect of the source-wavelet dominant frequency on RTM imaging? For simplicity, this paper uses Ricker wavelets of different dominant frequencies to carry out a preliminary analysis of the resulting RTM images, including tests with different levels of velocity-model error. The results show that, across the tested velocity errors, a lower dominant frequency yields better event continuity and stronger focusing of scatterer energy, whereas a higher dominant frequency improves resolution but degrades event continuity, weakens scatterer focusing, and increases background noise. These results provide a useful reference for the practical application of RTM.
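The Ricker wavelet and its dominant frequency, central to the analysis above, can be sketched as follows; the sampling rate and the two frequencies are arbitrary choices:

```python
import numpy as np

# Sketch of the Ricker wavelet used as a source term, illustrating how the
# dominant frequency fm shapes the source spectrum (values are arbitrary).

def ricker(t, fm):
    """Ricker wavelet with dominant frequency fm (Hz)."""
    a = (np.pi * fm * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.001                          # 1 ms sampling
t = np.arange(-0.1, 0.1, dt)
w_low, w_high = ricker(t, 15.0), ricker(t, 40.0)

# The amplitude spectrum of a Ricker wavelet peaks at its dominant frequency.
freqs = np.fft.rfftfreq(len(t), dt)
spec = np.abs(np.fft.rfft(w_high))
f_peak = freqs[np.argmax(spec)]
```

Lowering fm concentrates energy at low frequencies (smoother, more continuous images), while raising it extends the spectrum upward (higher resolution but, as the abstract notes, weaker focusing and more noise).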

5.
Comparative analysis of several relative-amplitude-preserving prestack migration methods   Cited by: 5 (self-citations: 4, others: 1)
This paper first reviews several amplitude-preserving prestack migration methods. A comparison shows that each method accounts for different influencing factors and therefore uses a different weighting function. The first type of weighting function considers neither caustics nor aperture effects; the second considers caustics but not aperture; the third considers both; and the fourth starts from the wave equation, decomposes the wavefield into up-going and down-going components, and derives the weight from the imaging condition. The effectiveness of common-offset amplitude-preserving migration is then verified on a model, demonstrating the amplitude compensation achieved by true-amplitude migration. Finally, the relationship between migration aperture and the weighting function is considered: they are linked mainly through a continuous window function μ(ξ,sξt), which must be controlled to effectively suppress the noise generated during migration.

6.
Taking the expected magnitude, which characterizes the background level of regional seismicity, as the target value for individual earthquakes, the relative quiescence of seismic activity is analyzed through the variation over time of the cumulative magnitude sum (C value), together with a significance test. The relationship between quiescence anomalies and large earthquakes is analyzed quantitatively, and a kernel-function method is proposed for probabilistic extrapolation of the occurrence time of large earthquakes. Applying these methods to Shanxi in North China, parts of the Zhangjiakou-Bohai seismic belt, and the Xinjiang region shows that the approach can describe seismic quiescence and provides reasonable probabilistic extrapolations of the occurrence time of future large earthquakes.
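The magnitude cumulative-sum (C value) idea can be sketched on a synthetic catalog: C_i = sum over j <= i of (M_j - E[M]) drifts near zero during background activity and trends downward during quiescence. The catalog statistics below are invented:

```python
import numpy as np

# Hedged sketch of a magnitude cumulative-sum (CUSUM-style) statistic:
# C_i = sum_{j<=i} (M_j - E[M]). A sustained downward trend in C flags a
# relative quiescence. The synthetic catalog mixes a background period with
# a quiet period of smaller magnitudes.

rng = np.random.default_rng(0)
m_expect = 3.0
normal = rng.normal(3.0, 0.4, 200)    # background seismicity
quiet = rng.normal(2.6, 0.3, 100)     # quiescence: systematically smaller events
catalog = np.concatenate([normal, quiet])

c = np.cumsum(catalog - m_expect)
slope_quiet = (c[-1] - c[199]) / 100.0   # clearly negative during the quiet window
```

A significance test (not sketched here) would then ask whether such a downward run is unlikely under the background distribution.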

7.
Least-squares migration can remove the adverse effects of irregular acquisition, band-limited wavelets, and other factors on migration results, improving image resolution and amplitude fidelity, but its high computational cost severely limits its practicality. Based on ray-theory Green's functions and a first-order approximation of seismic traveltimes, this paper proposes a fast method for computing point-spread functions that greatly reduces their computational cost while accommodating arbitrary spatial sampling to preserve interpolation accuracy. On this basis, an image-domain least-squares migration method is developed that updates the subsurface reflectivity iteratively in an efficient and flexible way, yielding inverted images with higher resolution and more balanced illumination. Tests on synthetic models and field data verify the correctness and effectiveness of the method.
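The image-domain iterative reflectivity update can be sketched with a convolutional stand-in for the PSF-based Hessian; the kernel, step size, and model below are assumptions, not the paper's ray-based construction:

```python
import numpy as np

# Minimal sketch of image-domain least-squares migration: given a migrated
# image d = H m (true reflectivity blurred by the Hessian/PSF operator H),
# the reflectivity is updated iteratively, m <- m + a * H^T (d - H m).
# H here is a simple symmetric convolution standing in for the PSF Hessian.

n = 100
m_true = np.zeros(n)
m_true[30], m_true[70] = 1.0, -0.5

k = np.exp(-0.5 * (np.arange(-10, 11) / 1.5) ** 2)
k /= k.sum()

def H(m):  # blur operator; symmetric kernel, so H^T = H
    return np.convolve(m, k, mode="same")

d = H(m_true)                  # the "migrated image"
m = np.zeros(n)
a = 1.0                        # steepest-descent step (stable here: ||H|| <= 1)
for _ in range(500):
    m = m + a * H(d - H(m))
```

The iterations progressively deblur the image within the bandwidth of the PSF; the spike amplitudes recover toward their true values while the data residual shrinks.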

8.
Applied research on prestack time migration of 3D seismic data   Cited by: 17 (self-citations: 16, others: 1)
By selecting suitable prestack time migration software, migration tests were carried out on two 3D seismic data volumes to verify the main factors that affect prestack time migration imaging. The core of the software's migration algorithm is curved-ray processing, unlike the straight-ray assumption commonly used in industry. Migration velocity is the principal factor in imaging quality: iterating between migration and velocity analysis until the common-image gathers are flattened yields accurate structural imaging. Migration aperture is another key parameter, whose choice depends on the dip, depth, and velocity of the target horizon. Anti-aliasing parameters also have some effect on the image and must be considered during migration.

9.
In conventional reverse-time source-location imaging, the imaging values of strong sources are usually far larger than, and therefore mask, those of weak sources; meanwhile, suppressing and eliminating artifacts in the image has long been a difficult problem of particular concern. To address this, this paper combines a hybrid imaging condition with high-pass filtering to enhance the imaging from the standpoint of image contrast. A random-selection scheme for back-propagation receivers is proposed: by repeatedly drawing random selections and random groupings, more information about the different sources, including some redundancy, is obtained, and fusing this information improves location reliability. The concept of a sieve model is introduced: the back-propagated wavefield sequence at each point is brought into the source-discrimination criterion, a function is constructed to roughly quantify the possibility that a source exists, and, with a threshold, a "sieve model" of 0s and 1s is built that performs a pass/reject selection on the imaging result, eliminating artifacts and improving the correctness of source identification. Tests on simple and complex models verify the effectiveness of the proposed methods and their adaptability and robustness to various interfering factors.
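The 0/1 "sieve model" thresholding can be sketched directly; the imaging values and possibility scores below are synthetic stand-ins for the wavefield-derived quantities in the paper:

```python
import numpy as np

# Toy version of the "sieve model": a per-point possibility score is
# thresholded into a 0/1 mask that passes likely sources and mutes artifacts.

image = np.array([0.02, 0.9, 0.03, 0.15, 0.01, 0.25, 0.02])  # raw location image
score = np.array([0.1, 0.95, 0.2, 0.3, 0.1, 0.9, 0.15])      # source-possibility score

sieve = (score >= 0.5).astype(float)   # the 0/1 sieve model
cleaned = image * sieve                # artifact at index 3 is muted

# A contrast stretch (here a log scaling, standing in for the high-pass /
# contrast step) keeps the weak source visible next to the strong one.
enhanced = np.log1p(cleaned / (cleaned[cleaned > 0].min() + 1e-12))
```

Both the strong source (index 1) and the weak source (index 5) survive the sieve, while the higher-amplitude artifact at index 3 is rejected because its possibility score is low.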

10.
To allow the interpretation of magnetic data to delineate complex magnetic bodies, three-dimensional susceptibility imaging is investigated for inverting magnetic data. A smooth imaging algorithm with model-roughness constraints, based on Occam inversion, is applied to solve the purely underdetermined susceptibility imaging problem. In solving the resulting large linear system, a preconditioned conjugate-gradient method is used with the depth-weighting function as the preconditioner, which both speeds up convergence and places the susceptibility distribution at reasonable depths. Absolute constraints are imposed on the range of inverted model values, with good effect. Numerical experiments on typical 2D and 3D models show that the susceptibility imaging method can recover the main features of the true model. Applying the technique to field data from the Shizishan ore block of the Daye iron mine yields imaging results that correspond well with the geological information.
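The role of the depth-weighting function can be sketched with a one-datum minimum-norm toy inversion, using a Li-Oldenburg-style weight w(z) = (z + z0)^(-beta/2); the kernel and constants are illustrative, and the paper's preconditioned conjugate-gradient machinery is not reproduced:

```python
import numpy as np

# Toy sketch of depth weighting in minimum-norm magnetic inversion. A single
# datum over a 1D column of cells keeps the algebra explicit: without
# weighting the recovered model piles up at the surface; with the weight it
# is allowed to sit at depth.

nz = 30
z = np.arange(1.0, nz + 1)
g = 1.0 / (z + 1.0) ** 3            # magnetic kernel decays roughly like 1/z^3
d = g[14] * 1.0                     # datum from a unit block at depth index 14

def weighted_min_norm(g, d, w):
    """Minimize ||diag(w) m|| subject to g.m = d -> m = w^-2 g d / (g . w^-2 g)."""
    gw = g / w ** 2
    return gw * d / (g @ gw)

m_plain = weighted_min_norm(g, d, np.ones(nz))
w = (z + 1.0) ** (-3.0 / 2.0)       # w(z) = (z + z0)^(-beta/2) with beta = 3
m_depth = weighted_min_norm(g, d, w)

def centroid(m):
    return float(np.sum(z * np.abs(m)) / np.sum(np.abs(m)))
```

The unweighted solution mirrors the decaying kernel and concentrates near the surface, while the depth-weighted solution distributes susceptibility to depth, which is exactly the effect the weighting is designed to produce.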

11.
Scattering of plane harmonic P, SV, or Rayleigh waves by a two-dimensional rough cavity completely embedded in an isotropic elastic half-space is investigated using a direct boundary integral equation method. The cavity's roughness is assumed to take the form of periodic or random perturbations of arbitrary amplitude superimposed on a smooth elliptical shape. For the randomly corrugated cavities, normal or uniform probability distribution functions are assumed. From multiple random-cavity realizations, the corresponding average surface response is computed and compared with the responses of the periodically corrugated and smooth cavities. The surface response is evaluated for different cavity shapes, incident waves, and a range of frequencies, and the surface-motion results are used to determine the peak surface-motion frequencies. These depend strongly on the basic inclusion shape (the principal axes) and the nature of the incident wave. Strong similarity in the peak surface-motion frequencies can be observed between the rough and smooth cavity models for both circular and elliptical shapes. To quantify the importance of the cavity corrugation for the surface motion, a roughness influence factor is defined in terms of the rough and smooth cavity surface responses. This factor depends strongly on the type of incident wave, the nature of the corrugation, the basic cavity shape, and the frequency, and it clearly shows the effect of the cavity roughness on the surface motion.
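The cavity geometry, a smooth ellipse with a periodic corrugation superimposed, and one plausible form of a roughness influence factor can be sketched as follows (the surface-response vectors are placeholders, not boundary-integral solutions):

```python
import numpy as np

# Sketch of the corrugated-cavity geometry: a smooth ellipse with principal
# semi-axes (a, b) perturbed by a periodic corrugation of amplitude `amp`
# with `n_lobes` lobes. The influence-factor definition is one plausible
# reading of "defined in terms of the rough and smooth surface responses".

theta = np.linspace(0.0, 2.0 * np.pi, 361)
a, b = 2.0, 1.0                      # ellipse principal semi-axes
amp, n_lobes = 0.1, 8                # corrugation amplitude and periodicity

r_smooth = np.stack([a * np.cos(theta), b * np.sin(theta)])
bump = 1.0 + amp * np.cos(n_lobes * theta)
r_rough = r_smooth * bump            # periodic perturbation of the boundary

def roughness_influence(u_rough, u_smooth):
    """Relative L2 deviation of the rough-cavity surface motion from the
    smooth-cavity motion (a hypothetical influence-factor form)."""
    return np.linalg.norm(u_rough - u_smooth) / np.linalg.norm(u_smooth)
```

A random corrugation would replace the cosine by draws from a normal or uniform distribution, matching the two random models considered in the abstract.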

12.
Influential input classification in probabilistic multimedia models   Cited by: 1 (self-citations: 1, others: 1)
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the variance (uncertainty and/or variability) associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution, and the validity of these distributions directly influences the accuracy and reliability of the model outcome. To allocate resources for constructing distributions efficiently, one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model, and an even smaller set determines the spread of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site-specific models and for improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
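The idea of ranking inputs and checking how much variance a small influential subset reproduces can be sketched with a toy Monte Carlo experiment; the five-input linear "fate model" below is a made-up stand-in:

```python
import numpy as np

# Monte Carlo sensitivity sketch: sample all inputs, rank them by squared
# correlation with the model output, then freeze all but the top inputs at
# their means and measure how much outcome variance survives.

rng = np.random.default_rng(42)
n = 20_000
X = rng.normal(0.0, 1.0, (n, 5))

def f(X):  # toy "multimedia model": two inputs dominate the outcome
    return 3.0 * X[:, 0] + 1.0 * X[:, 2] + 0.1 * X[:, 1] + 0.1 * X[:, 3] + 0.05 * X[:, 4]

y = f(X)
corr2 = np.array([np.corrcoef(X[:, i], y)[0, 1] ** 2 for i in range(5)])
ranking = np.argsort(corr2)[::-1]       # most influential inputs first

# Keep only the top-2 inputs stochastic; freeze the rest at their means (0).
X_frozen = np.zeros_like(X)
X_frozen[:, ranking[:2]] = X[:, ranking[:2]]
captured = np.var(f(X_frozen)) / np.var(y)
```

Here two of five inputs reproduce essentially all of the outcome variance, mirroring the abstract's finding that a small stochastic subset can suffice.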

13.
The impact of initial velocity models on final image reconstruction, and how to construct a proper initial velocity model in near-surface tomography, are investigated on a two-layer synthetic model whose velocity increases gradually with depth. Refraction-based initial velocity models and linear velocity-function models are tested on both synthetic and field data to obtain images close to reality. It is concluded that, to obtain optimum subsurface images in refraction (diving-wave) seismic tomography, velocity-function initial models should be preferred where soft alluvial deposits exist within the investigated depths, whereas refraction-based initial models should be preferred where a groundwater table or strong refractors exist within the investigated depths.
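The two families of initial models compared above can be sketched on a 1D depth axis; all velocities and depths are illustrative:

```python
import numpy as np

# The two initial-model families discussed above: a linear velocity function
# v(z) = v0 + k z, and a blocky two-layer "refraction" model with a strong
# refractor (e.g. a water table). All numbers are invented.

z = np.linspace(0.0, 50.0, 101)        # depth (m)

def linear_model(z, v0=300.0, k=30.0):
    """Gradient model suited to soft alluvium (velocities in m/s)."""
    return v0 + k * z

def refraction_model(z, z_refr=20.0, v1=400.0, v2=1800.0):
    """Two-layer model suited to a water table / strong refractor."""
    return np.where(z < z_refr, v1, v2)

v_lin = linear_model(z)
v_ref = refraction_model(z)
```

The gradient model bends rays smoothly (diving waves), while the blocky model concentrates the velocity contrast at the refractor, which is why each suits a different near-surface setting.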

14.
Full waveform inversion aims to use all the information in seismic data to deliver high-resolution models of subsurface parameters. However, multiparameter full waveform inversion suffers from an inherent trade-off between parameters and from ill-posedness due to its highly non-linear nature. The models recovered by elastic full waveform inversion are also subject to local minima when the initial models are far from the optimal solution. In addition, an objective function based purely on the misfit between recorded and modelled data may honour the seismic data yet disregard the geological context, so the inverted models may be geologically inconsistent and not represent feasible lithological units. We propose that all of these difficulties can be alleviated by explicitly incorporating petrophysical information into the inversion through a penalty function based on multiple probability density functions, where each probability density function represents a different lithology with distinct properties. We treat lithological units as clusters and use unsupervised K-means clustering to separate the petrophysical information into units of distinct lithologies that are not otherwise easily distinguishable. Through several synthetic examples, we demonstrate that the proposed framework leads full waveform inversion to elastic models superior to those obtained either without petrophysical information or with a probabilistic penalty function based on a single probability density function.
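The K-means step can be sketched on synthetic petrophysical samples, together with a simple quadratic penalty pulling a model point toward the nearest lithology centre (a plausible form for illustration, not necessarily the paper's exact penalty):

```python
import numpy as np

# Sketch of the clustering idea: K-means separates petrophysical samples
# (here Vp-density pairs) into lithology clusters, and a penalty term pulls
# inverted model points toward the nearest cluster centre. Data synthetic.

rng = np.random.default_rng(1)
lith_a = rng.normal([2.0, 2.1], 0.05, (100, 2))   # e.g. shale-like samples
lith_b = rng.normal([3.5, 2.5], 0.05, (100, 2))   # e.g. carbonate-like samples
pts = np.vstack([lith_a, lith_b])

def kmeans(pts, k=2, iters=20):
    # deterministic spread initialisation keeps this tiny sketch stable
    centres = np.array([pts.min(axis=0), pts.max(axis=0)])
    for _ in range(iters):
        lab = np.linalg.norm(pts[:, None] - centres[None], axis=2).argmin(axis=1)
        centres = np.array([pts[lab == j].mean(axis=0) for j in range(k)])
    return centres, lab

centres, lab = kmeans(pts)

def penalty(model_pt, centres):
    """Quadratic pull toward the nearest lithology centre."""
    return float(np.min(np.sum((centres - model_pt) ** 2, axis=1)))
```

In the paper's framework each recovered cluster would seed one probability density function in the multi-PDF penalty; this sketch keeps only the distance-to-centre core of that idea.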

15.
The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor to judge whether liquefaction will occur, but they cannot determine the liquefaction probability associated with a given safety factor. Reliability analysis offers an answer to this problem. This paper presents a reliability analysis method based on the popular Seed'85 liquefaction analysis method. It uses the empirical acceleration attenuation law for the Taiwan area to derive the probability density function (PDF) and statistics of the earthquake-induced cyclic shear stress ratio (CSR). The PDF and statistics of the cyclic resistance ratio (CRR) are deduced from probabilistic cyclic resistance curves, produced by regression of liquefaction and non-liquefaction data from the Chi-Chi earthquake and other earthquakes worldwide using, with minor modifications, the logistic model proposed by Liao [J. Geotech. Eng. 114 (1988) 389]. The CSR and CRR statistics are then used with the first-order second-moment method to calculate the relation between liquefaction probability, safety factor, and reliability index. With the proposed method, the liquefaction probability associated with a safety factor can be easily calculated, and the influence of soil parameters on the liquefaction probability can be evaluated quantitatively.
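The first-order second-moment link between safety factor, reliability index, and liquefaction probability can be sketched as beta = (mu_CRR - mu_CSR) / sqrt(sd_CRR^2 + sd_CSR^2) and P_L = Phi(-beta); the CSR/CRR statistics below are invented for illustration (the paper derives them from attenuation laws and case histories):

```python
import math

# FOSM sketch: reliability index and liquefaction probability from the first
# two moments of CSR (demand) and CRR (capacity). Values are illustrative.

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def liquefaction_probability(mu_csr, sd_csr, mu_crr, sd_crr):
    beta = (mu_crr - mu_csr) / math.sqrt(sd_crr ** 2 + sd_csr ** 2)
    return phi(-beta), beta

p, beta = liquefaction_probability(mu_csr=0.20, sd_csr=0.04, mu_crr=0.28, sd_crr=0.05)
fs = 0.28 / 0.20   # the corresponding deterministic safety factor, ~1.4
```

This is the sense in which a single deterministic safety factor (here 1.4) maps to a non-trivial liquefaction probability (here roughly 10%) once the parameter scatter is accounted for.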

16.
This paper defines a new scoring rule, the relative model score (RMS), for evaluating ensemble simulations of environmental models. RMS implicitly incorporates measures of ensemble-mean accuracy, prediction-interval precision, and prediction-interval reliability for evaluating overall model predictive performance. RMS is evaluated numerically from the probability density functions of ensemble simulations given by individual models, or by several models via model averaging. We demonstrate the advantages of RMS through an example of soil respiration modeling. The example considers two alternative models of different fidelity, and for each model Bayesian inverse modeling is conducted using two different likelihood functions, giving four single-model ensembles of simulations. For each likelihood function, Bayesian model averaging is applied to the ensemble simulations of the two models, resulting in two multi-model prediction ensembles. Predictive performance for these ensembles is evaluated using various scoring rules. Results show that RMS outperforms the commonly used log-score, the pseudo Bayes factor based on Bayesian model evidence (BME), and the continuous ranked probability score (CRPS). RMS avoids the rounding-error problem specific to the log-score; being applicable to any likelihood function, it has broader applicability than BME, which requires multiple models to share the same likelihood function; and by directly considering the relative score of candidate models at each cross-validation datum, it yields more plausible model rankings than CRPS. RMS is therefore a robust scoring rule for evaluating the predictive performance of single-model and multi-model prediction ensembles.
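One of the competing scoring rules, the sample-based CRPS, is easy to sketch: CRPS = E|X - y| - 0.5 E|X - X'| for an ensemble X and observation y, with lower scores being better. The ensembles below are synthetic:

```python
import numpy as np

# Sample-based continuous ranked probability score for an ensemble `x` and a
# scalar observation `y`: CRPS = E|X - y| - 0.5 E|X - X'| (lower is better).

def crps(ensemble, y):
    x = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(x - y))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

rng = np.random.default_rng(7)
sharp = rng.normal(0.0, 0.2, 2000)     # accurate, tight ensemble
diffuse = rng.normal(0.5, 1.0, 2000)   # biased, wide ensemble
obs = 0.0
```

The accurate, sharp ensemble scores much better than the biased, diffuse one, illustrating how CRPS rewards both calibration and sharpness; RMS is proposed in the paper as an alternative that additionally compares candidate models datum by datum.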

17.
Worldwide experience repeatedly shows that earthquake damage to structures depends strongly on site condition and epicentral distance. In this paper, a 21-storey shear-wall structure built in the 1960s in Hong Kong is selected as an example to investigate these two effects. Under various design earthquake intensities and site conditions, the fragility curves (damage probability matrix) of the building are quantified in terms of the ductility factor, which is estimated from the ratio of storey yield shear to inter-storey seismic shear. For high-rise buildings, a higher probability of damage is obtained for softer site conditions, and damage is more severe for far-field earthquakes than for near-field earthquakes. For earthquake intensity VIII, the probability of complete collapse (P) increases from 1 to 24% for near-field earthquakes and from 1 to 41% for far-field earthquakes if the building is moved from a rock site to a site with an 80 m thick soft clay layer. For intensity IX, P increases from 6 to 69% for near-field earthquakes and from 14 to 79% for far-field earthquakes for the same change of site. The site effect is therefore very important and not to be neglected, and similar site and epicentral effects should be expected for other types of high-rise structures.
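The storey-level check described above, estimating a ductility demand from the ratio of inter-storey seismic shear to storey yield shear and mapping it to a damage state, can be sketched as follows; the shears and damage thresholds are invented:

```python
# Toy storey-level damage grading: the demand-to-capacity shear ratio is used
# as a ductility-demand proxy and mapped to a damage state. All numbers and
# thresholds are hypothetical, not the paper's calibrated fragility values.

def ductility_demand(seismic_shear, yield_shear):
    return seismic_shear / yield_shear

def damage_state(mu):
    """Hypothetical damage grading by ductility demand."""
    if mu < 1.0:
        return "none"       # storey remains elastic
    if mu < 2.0:
        return "moderate"
    if mu < 4.0:
        return "severe"
    return "collapse"

storey_yield = [5000.0, 4200.0, 3500.0]    # storey yield shears (kN)
storey_demand = [4000.0, 4600.0, 9000.0]   # inter-storey seismic shears (kN)
states = [damage_state(ductility_demand(d, y))
          for d, y in zip(storey_demand, storey_yield)]
```

Aggregating such storey-by-storey states over many ground motions is what builds up a fragility curve or damage probability matrix.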

18.
To preserve biodiversity over centuries, ecosystem management will need to be accepted and practiced by individuals from a broad spectrum of society's strata. Management decisions will also need to be based on reliable judgments of the cause-and-effect relationships that govern an ecosystem's dynamics. This article describes an extant, web-based ecosystem management system (EMS) that allows (a) wide participation in ecosystem assessment and policy impact predictions, (b) convenient construction of probabilistic models of ecosystem processes through an influence diagram, and (c) automatic creation of ecosystem assessment reports. For illustration, the system is used first to model the cheetah population in Kenya and then to assess the impact of different management options on this population. The influence diagram used herein extends standard influence diagram theory to allow representation of variables governed by stochastic differential equations, birth-death processes, and other non-Gaussian, continuous probability distributions. For many ecosystems, data sets on ecosystem health indicators can be incomplete, small, and subject to unknown measurement errors, while some knowledge of an ecosystem's dynamics may exist in the form of expert opinion derived from ecological theory. The proposed EMS uses a non-Bayesian parameter estimation method, called consistency analysis, which finds parameter estimates such that the fitted ecosystem model is as faithful as possible to both the available data and the collected body of expert opinion. For illustration, consistency analysis is used to estimate the cheetah viability influence diagram using all known cheetah surveys in Kenya plus current understanding of factors, such as habitat and prey availability, that affect cheetah population dynamics.

19.
Different models were developed for the probabilistic three-dimensional (3D) stability analysis of earth slopes and embankments under earthquake loading, using both the safety-factor and the displacement criteria of slope failure. In the 3D analysis, the critical and total slope widths become two new and important parameters. The probabilistic models evaluate the probability of failure under seismic loading considering the different sources of uncertainty involved, i.e. uncertainties stemming from the discrepancies between laboratory-measured and in-situ values of shear strength parameters, the randomness of earthquake occurrence, and earthquake-induced acceleration. The models also take into consideration the spatial variability and correlations of soil properties. Five probabilistic models of earthquake-induced displacement were developed based on a non-exceedance-of-a-limiting-value criterion, and a probabilistic model for dynamic slope stability analysis was developed based on the 3D dynamic safety factor. These models are formulated and incorporated in a computer program (PTDDSSA). A sensitivity analysis was conducted on the different parameters involved by applying the models to a well-known landslide (the Selset landslide) under different levels of seismic hazard. The parametric study evaluated the effect of different input parameters on the resulting critical failure width, 3D dynamic safety factor, earthquake-induced displacement, and probability of failure. Input parameters include the average values and coefficients of variation of the water table, cohesion, and angle of friction for effective stress analysis; scales of fluctuation in both distance and time; hypocentral distance; earthquake magnitude; earthquake strong-shaking period; etc. The hypocentral distance and earthquake magnitude were found to have a major influence on the earthquake-induced displacement, the probability of failure (i.e. the probability of allowable-displacement exceedance), and the dynamic 2D and 3D safety factors.
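The probability-of-failure idea can be sketched with a Monte Carlo experiment on a pseudo-static infinite-slope safety factor; the formula and parameter statistics are deliberate simplifications of the paper's 3D probabilistic models:

```python
import numpy as np

# Monte Carlo sketch of seismic slope probability of failure: sample shear
# strength parameters (cohesion c, friction angle phi) and a horizontal
# seismic coefficient k, evaluate a pseudo-static infinite-slope safety
# factor, and count FS < 1. All statistics and geometry are invented.

rng = np.random.default_rng(3)
n = 100_000
c = rng.normal(12.0, 2.0, n)                   # cohesion (kPa)
phi = np.radians(rng.normal(30.0, 3.0, n))     # friction angle
k = rng.uniform(0.0, 0.3, n)                   # horizontal seismic coefficient

gamma, h, beta = 18.0, 5.0, np.radians(25.0)   # unit weight, depth, slope angle
tau = gamma * h * (np.sin(beta) + k * np.cos(beta))      # driving shear stress
strength = c + gamma * h * np.cos(beta) * np.tan(phi)    # resisting shear stress
fs = strength / tau
p_fail = float(np.mean(fs < 1.0))
```

Even though the mean safety factor exceeds one, a sizeable fraction of realizations fail, which is precisely the information a deterministic safety factor alone cannot convey.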


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号