Similar Documents
20 similar documents found.
1.
We present a method for fitting trishear models to surface profile data, by restoring bedding dip data and inverting for model parameters using a Markov chain Monte Carlo method. Trishear is a widely-used kinematic model for fault-propagation folds. It lacks an analytic solution, but a variety of data inversion techniques can be used to fit trishear models to data. Where the geometry of an entire folded bed is known, models can be tested by restoring the bed to its pre-folding orientation. When data include bedding attitudes, however, previous approaches have relied on computationally-intensive forward modeling. This paper presents an equation for the rate of change of dip in the trishear zone, which can be used to restore dips directly to their pre-folding values. The resulting error can be used to calculate a probability for each model, which allows solution by Markov chain Monte Carlo methods and inversion of datasets that combine dips and contact locations. These methods are tested using synthetic and real datasets. Results are used to approximate multimodal probability density functions and to estimate uncertainty in model parameters. The relative value of dips and contacts in constraining parameters and the effects of uncertainty in the data are investigated.
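As a rough, hypothetical illustration of the kind of inversion described in this abstract (not the authors' trishear code), the sketch below runs a one-parameter random-walk Metropolis sampler against a stand-in Gaussian misfit; the `log_prob` function and its "true" parameter value of 2.0 are invented for the example.

```python
import math
import random

def metropolis(log_prob, x0, step, n_iter, seed=0):
    """Generic 1-D random-walk Metropolis sampler: propose x + N(0, step),
    accept with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_prob(x0)
    samples = []
    for _ in range(n_iter):
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_prob(x_new)
        if math.log(rng.random() + 1e-300) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Invented stand-in "misfit": the summed dip-restoration error behaves like
# a Gaussian around the true parameter 2.0 with sigma = 0.5.
def log_prob(theta):
    residual = theta - 2.0
    sigma = 0.5
    return -0.5 * (residual / sigma) ** 2

samples = metropolis(log_prob, x0=0.0, step=0.3, n_iter=20000)
burned = samples[5000:]          # discard burn-in
mean = sum(burned) / len(burned)
```

The posterior mean of the retained samples should sit close to the invented true value of 2.0.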

2.
Markov Chain Monte Carlo Implementation of Rock Fracture Modelling
This paper deals with the problem of estimating fracture planes, given only the data at borehole intersections with fractures. We formulate an appropriate model for the problem and give a solution to fitting the planes using a Markov chain Monte Carlo (MCMC) implementation. The basics of MCMC are presented, with particular emphasis given to reversible jump, which is required for changing dimensions. We also give a detailed worked example of the MCMC implementation with reversible jump since our implementation relies heavily on this new methodology. The methods are tested on both simulated and real data. The latter is a unique data set in the form of a granite block, which was sectioned into slices. All joints were located and recorded, and the joint planes obtained by stacking strike lines. This work is important in the risk assessment for the underground storage of hazardous waste. Problems and extensions are discussed.

3.
In this paper we develop a generalized statistical methodology for characterizing geochronological data, represented by a distribution of single mineral ages. The main characteristics of such data are the heterogeneity and error associated with their collection. The former property means that mixture models are often appropriate for their analysis, in order to identify discrete age components in the overall distribution. We demonstrate that current methods (e.g., Sambridge and Compston, 1994) for analyzing such problems are not always suitable due to the restriction of the class of component densities that may be fitted to the data. This is important when modelling geochronological data, as it is often the case that skewed and heavy-tailed distributions will fit the data well. We concentrate on developing (Bayesian) mixture models with flexibility in the class of component densities, using Markov chain Monte Carlo (MCMC) methods to fit the models. Our method allows us to use any component density to fit the data, as well as returning a probability distribution for the number of components. Furthermore, rather than dealing with the observed ages, as in previous approaches, we make inferences about the components from the “true” ages, i.e., the ages had we been able to observe them without measurement error. We demonstrate our approach on two data sets: uranium-lead (U-Pb) zircon ages from the Khorat basin of northern Thailand and the Carrickalinga Head formation of southern Australia.
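The Bayesian mixture idea above can be caricatured with a minimal Gibbs sampler for a two-component Gaussian mixture with known, equal variances and implicitly equal weights; the synthetic "ages" clustered near 100 and 110 Ma are invented for the sketch, and the model is far simpler than the flexible-component approach the paper develops.

```python
import math
import random

def gibbs_two_component(ages, n_iter, sigma=1.0, seed=6):
    """Minimal Gibbs sampler: alternately (i) assign each age to one of two
    components given the current means, (ii) redraw each mean from its
    conditional given the assignments (flat prior on the means)."""
    rng = random.Random(seed)
    mu = [min(ages), max(ages)]          # crude initialisation
    trace = []
    for _ in range(n_iter):
        groups = ([], [])
        for y in ages:
            w0 = math.exp(-0.5 * ((y - mu[0]) / sigma) ** 2)
            w1 = math.exp(-0.5 * ((y - mu[1]) / sigma) ** 2)
            k = 1 if rng.random() < w1 / (w0 + w1) else 0
            groups[k].append(y)
        for k in (0, 1):
            if groups[k]:
                m = sum(groups[k]) / len(groups[k])
                mu[k] = rng.gauss(m, sigma / math.sqrt(len(groups[k])))
        trace.append(tuple(mu))
    return trace

# Invented "ages": two clusters near 100 and 110 Ma, sigma = 1.
rng = random.Random(7)
ages = [rng.gauss(100, 1) for _ in range(60)] + [rng.gauss(110, 1) for _ in range(60)]
trace = gibbs_two_component(ages, 2000)
mu0 = sum(t[0] for t in trace[500:]) / 1500
mu1 = sum(t[1] for t in trace[500:]) / 1500
```

With well-separated clusters the posterior means of the two components recover the invented centres.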

4.
Stochastic back analysis of unsaturated soil seepage parameters based on the MCMC method
左自波, 张璐璐, 程演, 王建华, 何晔 《岩土力学》(Rock and Soil Mechanics), 2013, 34(8): 2393-2400
Based on Bayesian theory, a stochastic back-analysis and model-prediction framework using time-varying monitoring data is established, with the adaptive differential evolution Metropolis algorithm of the Markov chain Monte Carlo (MCMC) method used to sample the posterior distributions of the parameters. Taking a rainfall-infiltration test on a natural slope at Tung Chung, Hong Kong as an example, the algorithm is applied to the stochastic back analysis of the parameters of a one-dimensional unsaturated-soil seepage model under time-varying rainfall; the statistical characteristics of the parameter posterior distributions are examined, and model-predicted pore pressures are compared with measured values in both the calibration and validation periods. The results show that the standard deviations of the posterior distributions obtained by the DREAM algorithm are markedly smaller than those of the prior distributions. After calibration against the measured pore-pressure data the model is highly accurate: the coverage of the 95% total confidence interval reaches 0.964 in the calibration period and 0.52, 0.79 and 0.79 in stages 2-4 of the validation period, and the model predictions agree well with the measured values.
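A minimal sketch of the differential-evolution Metropolis proposal underlying DREAM-type samplers (not the paper's implementation, and with an invented one-dimensional Gaussian posterior standing in for the seepage-parameter posterior):

```python
import math
import random

def de_mc(log_prob, n_chains, n_iter, init, seed=1):
    """Differential-evolution Metropolis (DE-MC) sketch: each chain proposes
    x_i + gamma*(x_a - x_b) + small jitter, where a and b are two other
    randomly chosen chains, then accepts with the Metropolis rule."""
    rng = random.Random(seed)
    gamma = 2.38 / math.sqrt(2.0)        # standard DE-MC scaling for d = 1
    x = list(init)
    lp = [log_prob(v) for v in x]
    history = []
    for _ in range(n_iter):
        for i in range(n_chains):
            a, b = rng.sample([j for j in range(n_chains) if j != i], 2)
            prop = x[i] + gamma * (x[a] - x[b]) + rng.gauss(0.0, 1e-3)
            lp_prop = log_prob(prop)
            if math.log(rng.random() + 1e-300) < lp_prop - lp[i]:
                x[i], lp[i] = prop, lp_prop
            history.append(x[i])
    return history

# Invented posterior: Gaussian with mean 5, sd 1.
log_post = lambda t: -0.5 * (t - 5.0) ** 2
chains = de_mc(log_post, n_chains=5, n_iter=4000, init=[0.0, 2.0, 4.0, 6.0, 8.0])
post = chains[len(chains) // 2:]        # keep the second half
mean = sum(post) / len(post)
```

The appeal of the proposal is that its scale adapts to the current spread of the chains, so no step size needs tuning by hand.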

5.
Three-dimensional stochastic seepage field analysis based on the Monte Carlo stochastic finite element method
王林, 徐青 《岩土力学》(Rock and Soil Mechanics), 2014, 35(1): 287-292
A composite sampling method combining improved Latin hypercube sampling with antithetic sampling is established to improve the computational efficiency of the Monte Carlo method, and is incorporated into the Monte Carlo stochastic finite element method (MCSFEM). Based on a three-dimensional finite element model, MCSFEM is applied to the stochastic seepage field of the Shanping earth-rock dam to study how the randomness of the permeability coefficient and of the head boundary conditions affects the seepage field, with sensitivity analyses of the coefficient of variation and the number of samples. Finally, the probability distributions of the computed seepage quantities are analyzed. The results show that the variability of the total head, flow velocity and seepage body force grows as the randomness of the permeability coefficient increases; that the composite sampling method both accelerates the convergence of the Monte Carlo method and reduces the statistical correlation between samples, demonstrating its practicality and effectiveness; and that when the permeability coefficient follows a normal distribution, the head and hydraulic gradient at the examined nodes of the seepage field are also normally distributed.
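The composite sampling idea, Latin hypercube stratification combined with antithetic pairing, can be sketched in one dimension as follows; the log-normal permeability parameters (mu = -11, sigma = 0.5) are invented for the example.

```python
import random
from statistics import NormalDist, mean

def latin_hypercube_uniforms(n, seed=2):
    """Stratified uniforms: one draw per equal-probability stratum,
    then shuffled -- the core of Latin hypercube sampling."""
    rng = random.Random(seed)
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return u

def antithetic(u):
    """Pair each uniform u with its mirror 1 - u to reduce variance."""
    return u + [1.0 - v for v in u]

# Invented log-normal permeability: ln(k) ~ N(mu=-11, sigma=0.5).
nd = NormalDist(mu=-11.0, sigma=0.5)
u = antithetic(latin_hypercube_uniforms(500))
ln_k = [nd.inv_cdf(v) for v in u]
est = mean(ln_k)
```

Because `inv_cdf` of a normal is antisymmetric about the mean, the antithetic pairs cancel almost exactly and the sample mean of `ln_k` lands nearly on mu with essentially zero variance, which is the variance-reduction effect the abstract exploits.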

6.
Spatial datasets are common in the environmental sciences. In this study we suggest a hierarchical model for a spatial stochastic field. The main focus of this article is to approximate a stochastic field with a Gaussian Markov Random Field (GMRF) to exploit the computational advantages of the Markov field, concerning predictions, etc. The variation of the stochastic field is modelled as a linear trend plus microvariation in the form of a GMRF defined on a lattice. To estimate model parameters we adopt a Bayesian perspective and use Monte Carlo integration with samples from Markov chain simulations. Our method does not demand lattice or near-lattice data, but is developed for a general spatial dataset, leaving the lattice to be specified by the modeller. The model-selection problem that comes with the artificial grid is addressed here with cross-validation, but we also suggest other alternatives. From the application of the methods to a dataset of the elemental composition of forest soil, we obtained predictive distributions at arbitrary locations as well as estimates of model parameters.

7.
Based on the assumption of the plane-strain problem, various optimization or random search methods have been developed for locating critical slip surfaces in slope-stability analysis, but none of these methods is applicable to the 3D case. In this paper, a simple Monte Carlo random simulation method is proposed to identify the 3D critical slip surface. Assuming the initial slip surface to be the lower part of a slip ellipsoid, the 3D critical slip surface is located by minimizing the 3D safety factor, which is calculated with a column-based 3D slope-stability analysis model. In this study, some practical cases with known minimum safety factors and critical slip surfaces in 2D analysis are extended to 3D slope problems to locate the critical slip surfaces. Compared with the 2D result, the resulting 3D critical slip surface shows no apparent difference in cross-section, but the associated 3D safety factor is definitely higher.
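A bare-bones version of the Monte Carlo random search could look like the sketch below, with a smooth, invented "safety factor" function over two slip-ellipsoid parameters standing in for the column-based 3D stability calculation.

```python
import random

def monte_carlo_minimize(f, bounds, n_trials, seed=3):
    """Plain Monte Carlo random search: sample candidate slip-surface
    parameters uniformly within their bounds and keep the smallest value."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_trials):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Invented smooth "safety factor" over (center, radius) of a slip ellipsoid;
# its true minimum of 1.1 sits at center = 40, radius = 25.
def safety_factor(p):
    c, r = p
    return 1.1 + 0.0004 * (c - 40.0) ** 2 + 0.0009 * (r - 25.0) ** 2

x_best, f_best = monte_carlo_minimize(safety_factor, [(0, 100), (5, 50)], 20000)
```

With enough trials the search closes in on the invented minimum; a real application would replace `safety_factor` with the column-based 3D computation.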

8.

Deformation rate is a key parameter for measuring the intensity of tectonic activity, and constraining it accurately has long been a focus of tectonic geomorphology and active-tectonics research. Using fluvial landforms as reference surfaces to constrain deformation rates is currently the most common approach. Based on recent research experience and case studies, we argue that although reliable displacement and landform-age data can now be obtained, deriving a reasonable and reliable deformation rate requires attention to how these two kinds of data are matched and to the plausibility of the fault slip history they imply. Compared with constraining the rate by the ratio of displacement to deformation time, or by linear regression of the two, a Monte Carlo approach that simulates plausible fault slip histories can effectively reduce the uncertainty of the estimated deformation rate, making the estimate more reasonable and providing a reliable basis for seismic hazard assessment.

9.
Hurst's rescaled range analysis is a useful tool for examining a time series, designed to measure memory content and determine its fractal texture. This study applies the Hurst method to a new earthquake catalogue for Greece, and adopts Monte Carlo simulations to provide a statistical test underpinning the Hurst analyses. Together these reveal basic temporal fractal characteristics in the memory of the earthquake occurrence time-histories. Three regions are considered: the whole of Greece with some surrounding areas, and the sub-zones of the Hellenic Arc and the Gulf of Corinth. Three temporal textures are considered: elapsed time between earthquakes, strain energy release, and earthquake frequency. The elapsed-time textures for the whole-Greece zone indicate distinct characteristics in chronological order and possess long memory; they belong to the class of non-random patterns. However, these characteristics generally disappear, and the patterns become random, when the sub-zones are considered; the Monte Carlo simulations support this. Therefore, memoryless statistical seismic hazard estimates may not be suitable for the whole of Greece but could be useful for the sub-zones. The strain-energy-release textures for the whole of Greece and for the sub-zones, although they seem to possess long memory at first analysis, are all random patterns; in other words, the Monte Carlo simulations demonstrate that these patterns are much more likely to happen by chance. The seismic-frequency textures for the whole of Greece and for the sub-zones suggest long memory; however, only the texture for the Hellenic Arc zone (MS ≥ 5.0) and that for the whole of Greece (MS ≥ 4.0) approach demonstrably non-random patterns, while the other patterns can happen by chance.
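A compact version of the rescaled range statistic and the resulting Hurst exponent estimate (slope of log R/S versus log window size) might look like this; the white-noise test series is synthetic, and for such a memoryless series the estimate should fall near H = 0.5.

```python
import math
import random

def rescaled_range(x):
    """R/S statistic: range of cumulative mean-adjusted sums over the std."""
    n = len(x)
    m = sum(x) / n
    z, cum = [], 0.0
    for v in x:
        cum += v - m
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
    return r / s

def hurst(x, window_sizes):
    """Estimate H as the least-squares slope of log(R/S) vs log(n),
    averaging R/S over non-overlapping windows of each size."""
    pts = []
    for w in window_sizes:
        rs = [rescaled_range(x[i:i + w]) for i in range(0, len(x) - w + 1, w)]
        pts.append((math.log(w), math.log(sum(rs) / len(rs))))
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    return sum((p[0] - mx) * (p[1] - my) for p in pts) / sum((p[0] - mx) ** 2 for p in pts)

rng = random.Random(4)
white = [rng.gauss(0, 1) for _ in range(4096)]
H = hurst(white, [16, 32, 64, 128, 256])
```

Small-sample R/S estimates of H are biased slightly upward for white noise, which is exactly why a Monte Carlo null test, as in the abstract, is needed before declaring long memory.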

10.
Design and development of Monte Carlo software for petroleum resource assessment
钱伟, 陆现彩 《江苏地质》(Jiangsu Geology), 2002, 26(3): 145-149
Drawing on the available literature and comparing the various methods of petroleum resource assessment, this paper discusses in detail the principles and characteristics of the Monte Carlo method as applied to resource assessment, and resolves at the algorithmic level the method's key implementation issues, namely pseudo-random number generation and random sampling from distributions. Software was designed and developed using an API, covering program design, algorithm implementation and interface development; it was applied to resource assessment in a real area, and a worked example is given.
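The two implementation issues named here, pseudo-random number generation and sampling from a distribution, can be sketched with a minimal linear congruential generator feeding an inverse-CDF transform; the triangular distribution parameters (10, 30, 100) are an invented example of a volumetric input.

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator producing uniforms in [0, 1)
    (Numerical Recipes constants)."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)
    return out

def triangular_inv_cdf(u, lo, mode, hi):
    """Inverse CDF of a triangular distribution -- a common shape for
    volumetric inputs in resource assessment."""
    fc = (mode - lo) / (hi - lo)
    if u < fc:
        return lo + (u * (hi - lo) * (mode - lo)) ** 0.5
    return hi - ((1 - u) * (hi - lo) * (hi - mode)) ** 0.5

u = lcg(seed=12345, n=50000)
resource = [triangular_inv_cdf(v, 10.0, 30.0, 100.0) for v in u]
mean_est = sum(resource) / len(resource)
```

The sample mean should approach the triangular mean (10 + 30 + 100)/3 ≈ 46.7 as the sample grows.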

11.
In the present study, a reliability analysis of a near-surface disposal facility is performed by assessing the probability of sequential failure of the multi-barrier system using a contaminant transport model. The concentration and dose rate of the radionuclides evolve with time, hence a time-dependent reliability analysis is needed. Due to the low expected probabilities of failure, an enhanced Monte Carlo (EMC) method and subset simulation are employed. The results of the analysis show that the EMC method is useful for evaluating the probability of failure of the barrier system, which has a low probability of failure.
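To see why enhanced methods are needed at low failure probabilities, a crude Monte Carlo estimator and its coefficient of variation can be sketched as below; the limit-state function `g` and the 2.5-sigma threshold are invented for illustration.

```python
import math
import random

def crude_mc_failure_prob(g, sample, n, seed=5):
    """Crude Monte Carlo estimate of P(g(X) < 0) and its coefficient of
    variation, which grows as the failure probability shrinks -- the
    motivation for enhanced MC and subset simulation."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(sample(rng)) < 0)
    pf = fails / n
    cov = math.sqrt((1 - pf) / (pf * n)) if fails else float("inf")
    return pf, cov

# Invented limit state: failure when a standard-normal load exceeds 2.5.
g = lambda x: 2.5 - x
pf, cov = crude_mc_failure_prob(g, lambda r: r.gauss(0, 1), n=200000)
```

For rarer events (say pf around 1e-6), the same accuracy would need millions more samples, since the coefficient of variation scales roughly as 1/sqrt(pf·n).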

12.
A recently developed Bayesian Monte Carlo (BMC) method and its application to the safety assessment of structures are described in this paper. We use a one-dimensional BMC method, proposed in 2009 by Rajabalinejad, to develop a weighted logical dependence between successive Monte Carlo simulations. Our main objective in this research is to show that the extended BMC can dramatically improve simulation efficiency by using prior information from modelling and the outcomes of preceding simulations. We provide theory and numerical algorithms for an extended BMC method for multi-dimensional problems, integrate it with a probabilistic finite element model, and apply these coupled models to the reliability assessment of a flood defence for the 17th Street Flood Wall system in New Orleans. This is the first successful application of the BMC method to a complex system. We provide a comparison of the numerical efficiency of the BMC, Monte Carlo (MC) and Dynamic Bounds methods used in the reliability assessment of complex infrastructures.

13.
Identification of cyclic sequences gives valuable insight into depositional associations of stratigraphic facies. An embedded Markov chain is a reasonable general model for facies transitions. But a model with independent random occurrences of facies is not an appropriate null hypothesis for testing the presence of cycles, because observations of transitions are by definition restricted to those between different facies. This is a common stratigraphic situation, and the problem has been raised recently by several authors. We present here a test statistic for a null hypothesis derived from the concept of partial independence and inherent to the model of embedded Markov processes.
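The definitional restriction discussed here, that an embedded chain records only transitions between different facies, can be made concrete with a small tally function; the facies string is an invented example.

```python
from collections import Counter

def embedded_transition_counts(sequence):
    """Tally the transitions of an *embedded* chain: consecutive repeats are
    collapsed first, so the diagonal is structurally zero -- the very
    restriction that invalidates a plain independence null hypothesis."""
    collapsed = [sequence[0]]
    for s in sequence[1:]:
        if s != collapsed[-1]:
            collapsed.append(s)
    counts = Counter(zip(collapsed, collapsed[1:]))
    states = sorted(set(collapsed))
    return {(a, b): counts.get((a, b), 0) for a in states for b in states}

seq = list("AABBCAACCBABBC")   # invented facies succession
t = embedded_transition_counts(seq)
```

Every diagonal entry such as `t[('A', 'A')]` is forced to zero by construction, so expected counts under the null must be computed under quasi-independence rather than full independence.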

14.
This paper presents an efficient Bayesian back-analysis procedure for braced excavations using wall deflection data at multiple points. Response surfaces obtained from finite element analyses are adopted to evaluate the wall responses efficiently. Deflection data for 49 wall sections from 11 case histories are collected to characterize the model error of the finite element method in evaluating deflections at various points. A braced excavation project in Hang Zhou, China is chosen to illustrate the effectiveness of the proposed procedure. The results indicate that the soil parameters are updated more significantly when the deflection data at multiple points are used than when only the maximum deflection data are used. The predicted deflections from the updated parameters agree well with the field observations. The main significance of the proposed procedure is that it improves the efficiency of updating the soil parameters without adding monitoring effort, compared with the traditional method that uses only the maximum deflection data.

15.
Traditional approaches to develop 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully provide quantification of uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model in a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing 3D model results to independently-constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty for a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions with <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, there is reasonable applicability to deformed terranes such as the Gascoyne Province. Ultimately, modification of the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.

16.
Carbon (25–30 nm in thickness) is the most common coating material used in the electron probe microanalysis (EPMA) of geological samples. A gold coating is also used in special cases to reduce surface damage by electron bombardment. Monte Carlo simulations have been performed for monazite with a 25 nm carbon and a 10 nm gold coating to understand the effect of a coating film in quantitative EPMA at E0 = 15 keV and 25 keV. The simulations showed that carbon-coated monazite gave the same depth distribution of generated X-rays in the monazite as uncoated monazite, whilst gold-coated monazite gave a distorted depth distribution. At an X-ray take-off angle of 40 degrees, the k-ratio between monazite and pure thorium with a 10 nm gold coating was 1.06 (15 keV) and 1.05 (25 keV) times that with a 25 nm carbon coating. Thus, a 10 nm gold coating is a possible source of inaccuracy in the quantitative EPMA of monazite, while a 25 nm carbon coating does not have a significant effect.

17.
Geophysical techniques can help to bridge the inherent gap that exists with regard to spatial resolution and coverage for classical hydrological methods. This has led to the emergence of a new and rapidly growing research domain generally referred to as hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, their inherent trade-off between resolution and range, as well as the notoriously site-specific nature of petrophysical parameter relations, the fundamental usefulness of multi-method surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is consistent with all available measurements. To this end, we present a novel approach to hydrogeophysical data integration based on a Monte-Carlo-type conditional stochastic simulation method that we consider particularly suitable for high-resolution local-scale studies. Monte Carlo techniques are flexible and versatile, allow a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, realistic models of the pertinent target parameter distributions. Compared to more conventional approaches, such as co-kriging or cluster analysis, our approach provides significant advancements in the way that larger-scale structural information contained in the hydrogeophysical data can be accounted for. After outlining the methodological background of our algorithm, we present the results of its application to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the detailed local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to a field dataset collected at the Boise Hydrogeophysical Research Site. Finally, we compare the performance of our data integration approach to that of more conventional methods with regard to the prediction of flow and transport phenomena in highly heterogeneous media and discuss the implications arising.

18.
This work deals with the geostatistical simulation of mineral grades whose distribution exhibits spatial trends within the ore deposit. It is suggested that these trends can be reproduced by using a stationary random field model and by conditioning the realizations to data that incorporate the available information on the local grade distribution. These can be hard data (e.g., assays on samples) or soft data (e.g., rock-type information) that account for expert geological knowledge and make up for the lack of hard data in scarcely sampled areas. Two algorithms are proposed, depending on the kind of soft data under consideration: interval constraints or local moment constraints. An application to a porphyry copper deposit is presented, in which it is shown that the incorporation of soft conditioning data associated with the prevailing rock type improves the modeling of the uncertainty in the actual copper grades.

19.
In site investigation, the amount of observation data obtained for geotechnical property characterisation is often too sparse to obtain meaningful statistics and probability distributions of geotechnical properties. To address this problem, a Bayesian equivalent sample method was recently developed. This paper aims to generalize the Bayesian equivalent sample method to various geotechnical properties, when measured by different direct or indirect test procedures, and to implement the generalized method in Excel by developing an Excel VBA program called Bayesian Equivalent Sample Toolkit (BEST). The BEST program makes it possible for practitioners to apply the Bayesian equivalent sample method without being compromised by sophisticated algorithms in probability, statistics and simulation. The program is demonstrated and validated through examples of soil and rock property characterisations.

20.
A standard procedure for conditioning a stochastic channel to well-test pressure data requires the minimization of an objective function. The Levenberg–Marquardt algorithm is a natural choice for minimization, but may suffer from slow convergence or converge to a local minimum which gives an unacceptable match of observed pressure data if a poor initial guess is used. In this work, we present a procedure to generate a good initial guess when the Levenberg–Marquardt algorithm is used to condition a stochastic channel to pressure data and well observations of channel facies, channel thickness, and channel top depth. This technique yields improved computational efficiency when the Levenberg–Marquardt method is used as the optimization procedure for generating realizations of the model by the randomized maximum likelihood method.
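A self-contained Levenberg–Marquardt iteration for a two-parameter least-squares problem might be sketched as below; the exponential "pressure decline" model and its parameter values are invented, and the example simply illustrates how the damped step and a reasonable initial guess behave (it is not the conditioning procedure of the paper).

```python
import math

def levenberg_marquardt(residual, jacobian, p0, n_iter=50, lam=1e-2):
    """LM for a 2-parameter least-squares fit: each step solves
    (J^T J + lam*I) dp = -J^T r, shrinking lam after a successful step and
    growing it after a failed one. A good initial guess p0 keeps the
    iteration out of poor local minima."""
    p = list(p0)
    cost = sum(ri * ri for ri in residual(p))
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        # Damped normal equations for two parameters, solved by Cramer's rule.
        a11 = sum(j[0] * j[0] for j in J) + lam
        a22 = sum(j[1] * j[1] for j in J) + lam
        a12 = sum(j[0] * j[1] for j in J)
        g1 = -sum(j[0] * ri for j, ri in zip(J, r))
        g2 = -sum(j[1] * ri for j, ri in zip(J, r))
        det = a11 * a22 - a12 * a12
        dp = ((g1 * a22 - g2 * a12) / det, (g2 * a11 - g1 * a12) / det)
        trial = [p[0] + dp[0], p[1] + dp[1]]
        new_cost = sum(ri * ri for ri in residual(trial))
        if new_cost < cost:
            p, cost, lam = trial, new_cost, lam * 0.5
        else:
            lam *= 4.0
    return p

# Invented decline model q(t) = a*exp(-b*t); noiseless data from a=10, b=0.3,
# fitted from a deliberately rough starting guess.
ts = [i * 0.5 for i in range(20)]
ys = [10.0 * math.exp(-0.3 * t) for t in ts]
res = lambda p: [p[0] * math.exp(-p[1] * t) - y for t, y in zip(ts, ys)]
jac = lambda p: [[math.exp(-p[1] * t), -p[0] * t * math.exp(-p[1] * t)] for t in ts]
p_fit = levenberg_marquardt(res, jac, p0=[8.0, 0.5])
```

Starting far from the solution can still stall such an iteration at a poor match, which is exactly the sensitivity to the initial guess that the paper's procedure addresses.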
