Similar Articles
18 similar articles found
1.
Gamma ray logging is a method routinely employed by geophysicists and environmental engineers in site geology evaluations. Modelling of gamma ray data from individual boreholes assists in the local identification of major lithological changes; modelling these data from a network of boreholes assists with lithological mapping and spatial stratigraphic correlation. In this paper we employ Bayesian spatial partition models to analyse gamma ray data spatially. In particular, a spatial partition is defined via a Voronoi tessellation and the mean intensity is assumed constant in each cell of the partition. The number of vertices generating the tessellation as well as the locations of vertices are assumed unknown, and uncertainty about these quantities is described via a hierarchical prior distribution. We describe the advantages of the spatial partition modelling approach in the context of smoothing gamma ray count data and describe an implementation that may be extended to the fitting of a more general model than a constant mean within each cell of the partition. As an illustration of the methodology we consider a data set collected from a network of eight boreholes, which is part of a geophysical study to assist in mapping the lithology of a site. Gamma ray logs are linked with geological information from cores and the spatial analysis of log data assists with predicting the lithology at unsampled locations.
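A minimal sketch of the cell-wise estimation step this abstract describes, assuming the Voronoi vertices are given: each observation is assigned to its nearest vertex, and a constant mean intensity per cell follows from Gamma-Poisson conjugacy. All names and values are illustrative, not from the paper; the full method also samples the number and locations of vertices.

```python
# Hedged sketch: piecewise-constant intensity over a Voronoi partition.
# Vertices are fixed here; the paper places a hierarchical prior on them.
import numpy as np

def cell_posterior_means(obs_xy, counts, vertices, a0=1.0, b0=1.0):
    """Assign each observation to its nearest Voronoi vertex and return
    the posterior mean intensity per cell under a Gamma(a0, b0) prior."""
    # nearest-vertex assignment defines the Voronoi partition
    d = np.linalg.norm(obs_xy[:, None, :] - vertices[None, :, :], axis=2)
    cell = d.argmin(axis=1)
    means = np.empty(len(vertices))
    for k in range(len(vertices)):
        in_k = cell == k
        # Gamma-Poisson conjugacy: posterior mean = (a0 + sum y) / (b0 + n)
        means[k] = (a0 + counts[in_k].sum()) / (b0 + in_k.sum())
    return cell, means

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(200, 2))          # synthetic log locations
verts = rng.uniform(0, 10, size=(5, 2))         # Voronoi generating points
true_rate = np.array([2.0, 5.0, 9.0, 3.0, 7.0])
nearest = np.linalg.norm(xy[:, None] - verts[None], axis=2).argmin(1)
y = rng.poisson(true_rate[nearest])             # synthetic gamma ray counts
_, m = cell_posterior_means(xy, y, verts)
print(np.round(m, 2))
```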

2.
In this paper we develop a generalized statistical methodology for characterizing geochronological data, represented by a distribution of single mineral ages. The main characteristics of such data are the heterogeneity and error associated with their collection. The former property means that mixture models are often appropriate for their analysis, in order to identify discrete age components in the overall distribution. We demonstrate that current methods (e.g., Sambridge and Compston, 1994) for analyzing such problems are not always suitable due to the restriction of the class of component densities that may be fitted to the data. This matters when modelling geochronological data, as skewed and heavy-tailed distributions often fit the data well. We concentrate on developing (Bayesian) mixture models with flexibility in the class of component densities, using Markov chain Monte Carlo (MCMC) methods to fit the models. Our method allows us to use any component density to fit the data, as well as returning a probability distribution for the number of components. Furthermore, rather than dealing with the observed ages, as in previous approaches, we make inferences about components from the “true” ages, i.e., the ages had we been able to observe them without measurement error. We demonstrate our approach on two data sets: uranium-lead (U-Pb) zircon ages from the Khorat basin of northern Thailand and the Carrickalinga Head formation of southern Australia.
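To make the flexible-component idea concrete, here is a hedged sketch of a random-walk Metropolis sampler for a two-component Student-t (heavy-tailed) mixture of ages. The paper's full method also infers the number of components and treats measurement error; both are omitted here, and all data and tuning values are synthetic.

```python
# Hedged sketch: Metropolis sampling of a two-component heavy-tailed
# mixture. K is fixed at 2 for brevity; the paper samples K as well.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ages = np.concatenate([rng.normal(450, 10, 80), rng.normal(520, 25, 40)])

def log_post(theta):
    w, m1, m2, s1, s2 = theta
    if not (0 < w < 1 and s1 > 0 and s2 > 0):
        return -np.inf
    # heavy-tailed components: Student-t with 4 degrees of freedom
    dens = (w * stats.t.pdf(ages, df=4, loc=m1, scale=s1)
            + (1 - w) * stats.t.pdf(ages, df=4, loc=m2, scale=s2))
    return np.log(dens).sum()      # flat priors within the constraints

theta = np.array([0.5, 440.0, 530.0, 15.0, 15.0])
lp = log_post(theta)
step = np.array([0.05, 2.0, 2.0, 1.0, 1.0])
samples = []
for it in range(5000):
    prop = theta + rng.normal(0, step)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
print(np.mean(samples[2500:], axis=0).round(2))  # posterior means
```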

3.
R. Rotondi & E. Varini, Tectonophysics, 2006, 423(1-4): 107.
We consider point processes defined on the space–time domain which model physical processes characterized qualitatively by the gradual increase of some energy over time until a threshold is reached, after which an event causing a loss of energy occurs. The risk function therefore increases piecewise, with a sudden drop at each event. This kind of behaviour is described by Reid's theory of elastic rebound in the earthquake generating process, where the accumulated quantity is the strain energy or stress due to the relative movement of tectonic plates. The complexity and the intrinsic randomness of the phenomenon call for probabilistic models; in particular, the stochastic translation of Reid's theory is given by stress release models. In this article we use such models to assess the time-dependent seismic hazard of the seismogenic zone of the Corinthos Gulf. For each event we consider the occurrence time and the magnitude, the latter modelled by a probability distribution depending on the stress level present in the region at any instant; hence we are dealing with a marked point process. We perform the Bayesian analysis of this model by applying stochastic simulation methods based on the generation of Markov chains, the so-called Markov chain Monte Carlo (MCMC) methods, which reconcile the model's complexity with the computational burden of the inferential procedure. Stress release and Poisson models are compared on the basis of the Bayes factor.
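A hedged sketch of the hazard behaviour described above: the stress release model's conditional intensity increases with time and drops at each event by an amount tied to the released stress (here scaled from magnitude as 10^(0.75 M), a common convention). Parameter values are illustrative only, not the Corinthos Gulf estimates.

```python
# Hedged sketch: stress release conditional intensity
#   lambda(t) = exp(a + b * (t - c * S(t))),
# where S(t) accumulates the stress released by events before t.
import numpy as np

def conditional_intensity(t, event_times, event_mags,
                          a=-2.0, b=0.05, c=1e-4):
    """Piecewise-increasing hazard with a drop at each past event."""
    released = sum(10 ** (0.75 * m)
                   for ti, m in zip(event_times, event_mags) if ti < t)
    return np.exp(a + b * (t - c * released))

times = [12.0, 30.0, 55.0]     # illustrative occurrence times
mags = [5.8, 6.2, 5.5]         # illustrative magnitudes
for t in [10, 20, 40, 60]:
    print(t, round(float(conditional_intensity(t, times, mags)), 4))
```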

4.
Traditional approaches to develop 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully provide quantification of uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model in a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing 3D model results to independently-constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty for a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions with <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, there is reasonable applicability to deformed terranes such as the Gascoyne Province. Ultimately, modification of the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.
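A hedged sketch of how the headline certainty figure can be computed: given posterior samples of the geological-unit label in each cell (here synthetic placeholders, not Obsidian output), the per-cell certainty is the modal-class frequency, and the summary is the fraction of cells exceeding 95%.

```python
# Hedged sketch: per-cell certainty from posterior unit-label samples.
# The label draws below are synthetic; a real run would take them from
# the posterior sampler.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_samples, n_units = 10000, 400, 6
labels = rng.integers(0, n_units, size=(n_samples, n_cells))
labels[:, :9000] = 2                 # make most cells near-certain

# modal-class frequency per cell = posterior certainty of the best unit
counts = np.apply_along_axis(np.bincount, 0, labels, minlength=n_units)
certainty = counts.max(axis=0) / n_samples
print(f"{(certainty > 0.95).mean():.0%} of cells exceed 95% certainty")
```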

5.
Markov Chain Monte Carlo Implementation of Rock Fracture Modelling
This paper deals with the problem of estimating fracture planes, given only the data at borehole intersections with fractures. We formulate an appropriate model for the problem and give a solution to fitting the planes using a Markov chain Monte Carlo (MCMC) implementation. The basics of MCMC are presented, with particular emphasis given to reversible jump, which is required for moves that change dimension. We also give a detailed worked example of the MCMC implementation with reversible jump, since our implementation relies heavily on this new methodology. The methods are tested on both simulated and real data. The latter is a unique data set in the form of a granite block, which was sectioned into slices. All joints were located and recorded, and the joint planes were obtained by stacking strike lines. This work is important in the risk assessment for the underground storage of hazardous waste. Problems and extensions are discussed.
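The dimension-changing moves mentioned above rest on Green's (1995) reversible-jump acceptance probability. As a reference point (this is the standard general form, not the paper's specific move design), for a move from (k, θ) to (k', θ') proposed via auxiliary variables u ~ q(u) with the bijection θ' = g(θ, u) and model-move probabilities j(·):

```latex
\alpha\bigl((k,\theta)\to(k',\theta')\bigr)
  = \min\!\left\{1,\;
    \frac{\pi(k',\theta'\mid y)\, j(k'\!\to\! k)}
         {\pi(k,\theta\mid y)\, j(k\!\to\! k')\, q(u)}
    \left|\frac{\partial \theta'}{\partial(\theta, u)}\right|
  \right\}
```

The Jacobian term is what distinguishes reversible jump from fixed-dimension Metropolis-Hastings; a birth move (adding a fracture plane) uses this form directly, and the matching death move uses its reciprocal.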

6.
Spatial datasets are common in the environmental sciences. In this study we suggest a hierarchical model for a spatial stochastic field. The main focus of this article is to approximate a stochastic field with a Gaussian Markov Random Field (GMRF) to exploit the computational advantages of the Markov field, concerning predictions, etc. The variation of the stochastic field is modelled as a linear trend plus microvariation in the form of a GMRF defined on a lattice. To estimate model parameters we adopt a Bayesian perspective and use Monte Carlo integration with samples from Markov chain simulations. Our method does not demand lattice or near-lattice data, but is developed for general spatial data sets, leaving the lattice to be specified by the modeller. The model selection problem introduced by the artificial grid is addressed here with cross-validation, though we also suggest other alternatives. From the application of the methods to a data set of elemental composition of forest soil, we obtained predictive distributions at arbitrary locations as well as estimates of model parameters.
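A hedged sketch of the computational advantage being exploited: a first-order GMRF on a regular lattice has a sparse precision matrix (here κI plus τ times the graph Laplacian of the 4-neighbour grid, an illustrative choice), and sampling uses the standard Cholesky identity x = L⁻ᵀz with Q = LLᵀ. Grid size and parameters are placeholders.

```python
# Hedged sketch: sparse GMRF precision on an n-by-n lattice and one
# exact draw from N(0, Q^{-1}) via the Cholesky factor.
import numpy as np
import scipy.sparse as sp

def lattice_precision(n, kappa=0.1, tau=1.0):
    """Precision matrix of a first-order GMRF on an n x n grid."""
    idx = lambda i, j: i * n + j
    rows, cols, vals = [], [], []
    for i in range(n):
        for j in range(n):
            nbrs = [(i + di, j + dj)
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= i + di < n and 0 <= j + dj < n]
            rows.append(idx(i, j)); cols.append(idx(i, j))
            vals.append(kappa + tau * len(nbrs))      # diagonal entry
            for a, b in nbrs:
                rows.append(idx(i, j)); cols.append(idx(a, b))
                vals.append(-tau)                     # neighbour entries
    return sp.csc_matrix((vals, (rows, cols)), shape=(n * n, n * n))

n = 30
Q = lattice_precision(n).toarray()     # small enough to densify here
L = np.linalg.cholesky(Q)              # Q = L L^T
z = np.random.default_rng(0).standard_normal(n * n)
x = np.linalg.solve(L.T, z)            # x ~ N(0, Q^{-1})
print(x.reshape(n, n)[:3, :3].round(2))
```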

7.
Multilayer perceptrons (MLPs) can be used to discover a function which can be used to map from a set of input variables onto a value representing the conditional probability of mineralization. The standard approach to training MLPs is gradient descent, in which the error between the network output and the target output is reduced in each iteration of the training algorithm. In order to prevent overfitting, a split-sample validation procedure is used, in which the data is partitioned into two sets: a training set, which is used for weight optimization, and a validation set, which is used to optimize various parameters that can be used to prevent overfitting. One of the problems with this approach is that the resulting maps can display significant variability which stems from (i) the (randomly initialized) starting weights and (ii) the particular training/validation set partition (also determined randomly). This problem is especially pertinent in mineral potential mapping tasks, in which the number of deposit cells is a very small proportion of the total number of cells in the study area. In contrast to gradient descent methods, Bayesian learning techniques do not find a single weight vector; rather, they infer the posterior distribution of the weights given the data. Predictions are then made by integrating over this distribution. An important advantage of the Bayesian approach is that the optimization of parameters such as the weight decay regularization coefficient can be performed using training data alone, thus avoiding the noise introduced through split-sample validation. This paper reports results of applying Bayesian learning techniques to the production of maps representing gold mineralization potential over the Castlemaine region of Victoria, Australia. Maps produced using the Bayesian approach display significantly less variability than those produced using gradient descent training. They are also more reliable at predicting the presence of unknown deposits.
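A hedged sketch of the key contrast drawn above: the Bayesian predictive probability averages the network output over posterior weight samples, p(y|x, D) ≈ (1/S) Σ p(y|x, w_s), rather than using one optimized weight vector. The weight samples below are random placeholders; in practice they would come from an MCMC or other posterior-sampling scheme.

```python
# Hedged sketch: Bayesian predictive averaging for a one-hidden-layer MLP.
import numpy as np

def mlp_prob(x, W1, b1, W2, b2):
    """Sigmoid output = P(mineralized | x, weights) for one weight draw."""
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def bayesian_predict(x, posterior_samples):
    # p(y | x, D) ~= (1/S) * sum_s p(y | x, w_s)
    return np.mean([mlp_prob(x, *w) for w in posterior_samples], axis=0)

rng = np.random.default_rng(0)
samples = [(rng.normal(0, 1, (4, 8)), rng.normal(0, 1, 8),
            rng.normal(0, 1, 8), rng.normal()) for _ in range(50)]
x = rng.normal(size=(3, 4))    # three cells, four input variables
print(bayesian_predict(x, samples).round(3))
```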

8.
Current studies have focused on selecting constitutive models using optimization methods, or on selecting simple formulas or models using Bayesian methods. In contrast, this paper addresses the challenge of proposing an effective Bayesian selection method for advanced soil models that accounts for soil uncertainty. Four representative critical-state-based advanced sand models are chosen as the constitutive-model database. Triaxial tests on Hostun sand are selected as training and testing data. The Bayesian method is enhanced with the transitional Markov chain Monte Carlo method, whereby the generalization ability of each model is evaluated simultaneously during model selection. The most plausible/suitable model in terms of predictive ability, generalization ability, and model complexity is selected using the training data. The performance of the method is then validated with the testing data. Finally, a series of drained triaxial tests on Karlsruhe sand is used to evaluate the performance further.
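One concrete piece of the selection step, sketched under assumptions: transitional MCMC yields an estimate of each model's log-evidence as a by-product, and posterior model probabilities then follow from a softmax over log-evidence plus log-prior. The numbers below are illustrative, not the paper's results.

```python
# Hedged sketch: posterior model probabilities from log-evidence values.
import numpy as np

def model_probabilities(log_evidence, log_prior=None):
    le = np.asarray(log_evidence, dtype=float)
    if log_prior is not None:
        le = le + np.asarray(log_prior, dtype=float)
    le -= le.max()                    # guard against overflow
    w = np.exp(le)
    return w / w.sum()

# four candidate sand models with equal prior probability (illustrative)
log_Z = [-152.3, -149.8, -150.4, -158.9]
print(model_probabilities(log_Z).round(3))
```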

9.
Soil pollution studies typically collect multivariate measurements at sampling locations, e.g., lead, zinc, copper or cadmium levels. With the increased collection of such multivariate geostatistical spatial data arises the need for flexible explanatory stochastic models. Here, we propose a general constructive approach for building suitable models based upon convolution of covariance functions. We begin with a general theorem which asserts that, under weak conditions, cross convolution of covariance functions provides a valid cross-covariance function. We also obtain a result on the dependence induced by such convolution. Since the convolution integral generally has no closed form, we discuss efficient computation. We then suggest introducing such a specification through a Gaussian process to model multivariate spatial random effects within a hierarchical model. We note that modeling spatial random effects in this way is parsimonious relative to, say, the linear model of coregionalization. Through a limited simulation, we informally demonstrate that the performance of these two specifications appears to be indistinguishable, encouraging the parsimonious choice. Finally, we use the convolved covariance model to analyze a trivariate pollution dataset from California.
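A hedged numerical illustration of the construction: given valid covariances C1 and C2, define C_ij as the convolution C_i * C_j; the block matrix assembled from C11, C12, C22 is then a valid bivariate covariance, which the code checks via its smallest eigenvalue (expected nonnegative up to discretization error). Gaussian covariances and the grid are assumptions for the demonstration.

```python
# Hedged sketch: validity of convolution-built cross-covariances, checked
# numerically on a 1-D grid.
import numpy as np

h = np.linspace(-10, 10, 401)
dh = h[1] - h[0]
C1 = np.exp(-0.5 * (h / 1.0) ** 2)     # valid covariance function
C2 = np.exp(-0.5 * (h / 2.0) ** 2)     # another valid covariance

def conv(a, b):
    return np.convolve(a, b, mode="same") * dh   # numeric convolution

C11, C22, C12 = conv(C1, C1), conv(C2, C2), conv(C1, C2)

# evaluate on a small set of lags and assemble the 2x2-block covariance
lags = np.arange(-5, 6)
def toep(C):
    return np.interp(lags[:, None] - lags[None, :], h, C)

Sigma = np.block([[toep(C11), toep(C12)],
                  [toep(C12).T, toep(C22)]])
# nonnegative up to numerical truncation error
print("min eigenvalue:", np.linalg.eigvalsh(Sigma).min().round(6))
```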

10.
A source identification method for sudden water pollution incidents
To solve the source identification problem for sudden water pollution incidents quickly and accurately, a new tracing method based on differential evolution and Monte Carlo simulation is proposed. The method treats source identification as a Bayesian estimation problem and derives the posterior probability density functions of the unknown parameters, namely the source strength, location and release time; the posterior distribution is then sampled by combining differential evolution with Monte Carlo simulation, from which the unknown parameters are estimated and the source term determined. A comparison with the Bayesian-Monte Carlo method on a numerical example shows that the proposed method reduces the number of iterations by three quarters, reduces the mean relative errors of source strength, location and release time by 1.23%, 2.23% and 4.15% respectively, and lowers the corresponding mean errors by 0.39%, 0.83% and 1.49%. Its stability and reliability are markedly higher than those of the Bayesian-Monte Carlo method, and it identifies sudden water pollution sources well, providing a new approach to the difficult problem of source tracing in sudden water pollution events.
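A hedged sketch of the sampler family described: differential-evolution MCMC (ter Braak, 2006) proposes each chain's move from the difference of two other chains, x* = x_i + γ(x_a − x_b) + ε, and accepts by Metropolis. The forward model below is a 1-D instantaneous-release advection-diffusion solution standing in for the river transport model; all values are illustrative.

```python
# Hedged sketch: DE-MC for the three unknowns (strength M, location x0,
# release time t0) of a sudden pollution source.
import numpy as np

def forward(theta, x, t, u=0.5, D=0.2):
    M, x0, t0 = theta
    dt = np.maximum(t - t0, 1e-6)
    return (M / np.sqrt(4 * np.pi * D * dt)
            * np.exp(-(x - x0 - u * dt) ** 2 / (4 * D * dt)))

rng = np.random.default_rng(0)
x_obs, t_obs = 8.0, np.linspace(5, 40, 30)
truth = np.array([10.0, 1.0, 2.0])
y = forward(truth, x_obs, t_obs) + rng.normal(0, 0.02, t_obs.size)

def log_post(theta):
    if theta[0] <= 0:
        return -np.inf
    r = y - forward(theta, x_obs, t_obs)
    return -0.5 * np.sum(r ** 2) / 0.02 ** 2   # flat priors assumed

N, d = 8, 3
gamma = 2.38 / np.sqrt(2 * d)                  # standard DE-MC step scale
chains = truth + rng.normal(0, 0.5, (N, d))
lp = np.array([log_post(c) for c in chains])
for it in range(3000):
    for i in range(N):
        a, b = rng.choice([j for j in range(N) if j != i], 2, replace=False)
        prop = chains[i] + gamma * (chains[a] - chains[b]) \
               + rng.normal(0, 1e-4, d)
        lpp = log_post(prop)
        if np.log(rng.uniform()) < lpp - lp[i]:
            chains[i], lp[i] = prop, lpp
print(chains.mean(axis=0).round(2))            # ~ [M, x0, t0]
```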

11.
This paper presents an efficient Bayesian back-analysis procedure for braced excavations using wall deflection data at multiple points. Response surfaces obtained from finite element analyses are adopted to efficiently evaluate the wall responses. Deflection data for 49 wall sections from 11 case histories are collected to characterize the model error of the finite element method for evaluating the deflections at various points. A braced excavation project in Hangzhou, China is chosen to illustrate the effectiveness of the proposed procedure. The results indicate that the soil parameters are updated more significantly when the deflection data at multiple points are used than when only the maximum deflection data are used. The predicted deflections from the updated parameters agree fairly well with the field observations. The main significance of the proposed procedure is that it improves the updating efficiency of the soil parameters without adding monitoring effort, compared with the traditional method that uses the maximum deflection data.
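A hedged sketch of the multi-point updating step: a cheap response surface g(θ) stands in for the finite-element deflection predictions, and a Gaussian model-error term feeds a likelihood over all monitoring depths at once. The functional form, priors, and data are placeholders, not the project's calibrated values.

```python
# Hedged sketch: Metropolis updating of two soil parameters against
# deflections at ten depths via a surrogate response surface.
import numpy as np

rng = np.random.default_rng(0)
depths = np.linspace(2, 20, 10)                   # monitoring points (m)

def g(theta):
    E, phi = theta                                # illustrative parameters
    return (50.0 / E) * np.exp(-((depths - 8 - 0.1 * phi) / 7) ** 2)

obs = g(np.array([1.2, 30.0])) + rng.normal(0, 1.0, depths.size)
sigma_e = 1.5                # combined model + measurement error (mm)

def log_post(theta):
    if theta[0] <= 0:
        return -np.inf
    lp_prior = (-0.5 * ((theta[0] - 1.0) / 0.5) ** 2
                - 0.5 * ((theta[1] - 28.0) / 5.0) ** 2)
    r = obs - g(theta)
    return lp_prior - 0.5 * np.sum((r / sigma_e) ** 2)

theta = np.array([1.0, 28.0])
lp = log_post(theta)
draws = []
for it in range(8000):
    prop = theta + rng.normal(0, [0.05, 0.8])
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:
        theta, lp = prop, lpp
    draws.append(theta.copy())
print(np.mean(draws[4000:], axis=0).round(2))     # updated parameters
```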

12.
Spatial data are often sparse by nature. However, in many instances, information may exist in the form of soft data, such as expert opinion. Scientists in the field often have a good understanding of the phenomenon under study and may be able to provide valuable information on its likely behavior. It is thus useful to have a sensible mechanism that incorporates expert opinion in inference. The Bayesian paradigm suffers from an inherent subjectivity that is unacceptable to many scientists. Aside from this philosophical problem, elicitation of prior distributions is a difficult task. Moreover, an intentionally misleading expert can have substantial influence on Bayesian inference. In our experience, eliciting data is much more natural to the experts than eliciting prior distributions on the parameters of a probability model that is a purely statistical construct. In this paper we elicit data, i.e., guess values for the realization of the process, from the experts. Utilizing a hierarchical modeling framework, we combine elicited data and actual observed data for inferential purposes. A distinguishing feature of this approach is that even an intentionally misleading expert proves to be useful. Theoretical results and simulations illustrate that incorporating expert opinion via elicited data substantially improves the estimation, prediction, and design aspects of statistical inference for spatial data.
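A hedged sketch of the core idea in its simplest form: expert guesses of the process realization enter the likelihood as a second data source with their own error variance, so a noisy or biased expert is down-weighted rather than discarded. The conjugate normal algebra below is illustrative, not the paper's hierarchical model.

```python
# Hedged sketch: combining sparse observations with elicited expert
# "data" via conjugate normal updating of a single site value.
import numpy as np

def combine(y_obs, s_obs, y_exp, s_exp, mu0=0.0, s0=10.0):
    """Posterior mean/sd of a site value from observed + elicited data."""
    prec = 1 / s0**2 + len(y_obs) / s_obs**2 + len(y_exp) / s_exp**2
    num = (mu0 / s0**2 + np.sum(y_obs) / s_obs**2
           + np.sum(y_exp) / s_exp**2)
    return num / prec, prec ** -0.5

y_obs = np.array([3.1, 2.7])          # sparse field observations
y_exp = np.array([3.5, 3.4, 3.6])     # expert guesses, larger error
mean, sd = combine(y_obs, 0.3, y_exp, 1.0)
print(round(mean, 2), round(sd, 2))
```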

13.
To study snowmelt runoff processes in typical watersheds of cold and arid mountainous regions, and to improve the accuracy of hydrological simulation by the snowmelt runoff model (SRM) in mountainous snowmelt areas, this paper takes the Tizinafu River basin in Xinjiang as a representative study area, augments the SRM runoff calculation with suitable baseflow data, and performs an uncertainty analysis. Four common baseflow separation methods are considered: the digital filter method, the Kalinin method, the BFI (sliding minimum) method, and the HYSEP (hydrograph separation program) method. Based on Bayesian theory, Markov chain Monte Carlo (MCMC) simulation is used for parameter uncertainty analysis, and the snowmelt runoff simulations of SRM driven by the different baseflow data are evaluated comprehensively. The analysis shows that the model based on the Kalinin baseflow separation method (SRMK) simulates the snowmelt runoff process of the study area best (Nash-Sutcliffe efficiency NSE of 0.866 and 0.721 in the calibration and validation periods, respectively, higher than the other models compared). MCMC simulation identifies the SRM parameters well and yields reliable posterior probability distributions for the parameters. When observed precipitation data are lacking or poorly representative, TRMM (Tropical Rainfall Measuring Mission) satellite data can describe the precipitation characteristics of the study area.
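A hedged sketch of one of the four separation schemes compared, the one-parameter digital filter (Lyne-Hollick form): quickflow is filtered from the hydrograph and the remainder is baseflow. The filter parameter 0.925 is a commonly used value and the discharge series is synthetic; a production run would typically apply multiple filter passes.

```python
# Hedged sketch: one-parameter digital baseflow filter.
import numpy as np

def digital_filter(Q, alpha=0.925):
    qf = np.zeros_like(Q, dtype=float)        # quickflow component
    for t in range(1, len(Q)):
        qf[t] = alpha * qf[t-1] + 0.5 * (1 + alpha) * (Q[t] - Q[t-1])
        qf[t] = min(max(qf[t], 0.0), Q[t])    # keep baseflow in [0, Q]
    return Q - qf                             # baseflow series

Q = np.array([5, 6, 14, 30, 22, 15, 11, 9, 8, 7], dtype=float)
print(digital_filter(Q).round(2))
```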

14.
Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been the method of simulated annealing, in which practitioners attempt to minimize the difference between the "true" and simulated production data, and the "true" and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.
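A hedged sketch in the spirit of the prior-preserving perturbations described (not the paper's actual move design): the preconditioned Crank-Nicolson (pCN) proposal leaves a Gaussian prior, and hence the variogram, exactly invariant, so the Metropolis acceptance depends only on the data misfit. The "simulator" here is a linear stand-in for pressure sensitivities.

```python
# Hedged sketch: pCN MCMC for a Gaussian log-permeability field
# conditioned on a few "pressure" observations.
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Gaussian prior with exponential covariance (i.e., a fixed variogram)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
C = np.exp(-dist / 10.0)
Lc = np.linalg.cholesky(C + 1e-10 * np.eye(n))
prior_draw = lambda: Lc @ rng.standard_normal(n)

G = rng.normal(0, 1, (5, n)) / np.sqrt(n)     # stand-in sensitivities
m_true = prior_draw()
d_obs = G @ m_true + rng.normal(0, 0.05, 5)   # synthetic pressure data

def log_like(m):
    r = d_obs - G @ m
    return -0.5 * np.sum((r / 0.05) ** 2)

beta = 0.2
m = prior_draw()
ll = log_like(m)
keep = []
for it in range(5000):
    prop = np.sqrt(1 - beta**2) * m + beta * prior_draw()  # pCN move
    llp = log_like(prop)
    if np.log(rng.uniform()) < llp - ll:   # prior terms cancel exactly
        m, ll = prop, llp
    keep.append(m.copy())
print(np.mean(keep[2500:], axis=0)[:5].round(2))
```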

15.
In site investigation, the observation data obtained for geotechnical property characterisation are often too sparse to yield meaningful statistics and probability distributions of geotechnical properties. To address this problem, a Bayesian equivalent sample method was recently developed. This paper aims to generalize the Bayesian equivalent sample method to various geotechnical properties, as measured by different direct or indirect test procedures, and to implement the generalized method in Excel through a VBA program called the Bayesian Equivalent Sample Toolkit (BEST). The BEST program makes it possible for practitioners to apply the Bayesian equivalent sample method without having to master the sophisticated probability, statistics and simulation algorithms behind it. The program is demonstrated and validated through examples of soil and rock property characterisation.
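A hedged sketch of the equivalent-sample idea (the toolkit itself is VBA; Python is used here only for illustration): MCMC samples of the property's mean and standard deviation are drawn from the posterior given a handful of measurements, and a large "equivalent sample" from the posterior predictive then supports conventional statistics. Priors and data are placeholders.

```python
# Hedged sketch: equivalent samples from a posterior-predictive MCMC.
import numpy as np

rng = np.random.default_rng(0)
data = np.array([18.2, 21.5, 19.9, 23.1, 20.4])   # sparse measurements

def log_post(mu, sd):
    if not (5 < mu < 40 and 0.5 < sd < 15):        # uniform priors
        return -np.inf
    return -len(data) * np.log(sd) - 0.5 * np.sum(((data - mu) / sd) ** 2)

mu, sd = 20.0, 3.0
lp = log_post(mu, sd)
equiv = []
for it in range(20000):
    mu_p, sd_p = mu + rng.normal(0, 1.0), sd + rng.normal(0, 0.5)
    lpp = log_post(mu_p, sd_p)
    if np.log(rng.uniform()) < lpp - lp:
        mu, sd, lp = mu_p, sd_p, lpp
    if it > 5000:                                  # after burn-in
        equiv.append(rng.normal(mu, sd))           # one equivalent sample
print(round(np.mean(equiv), 2), round(np.std(equiv), 2))
```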

16.
On the reliability of statistical inference for hydrometeorological extremes
董双林, 水科学进展 (Advances in Water Science), 2012, 23(4): 575-580.
This paper discusses the manifestations and causes of unreliable statistical inference for extremes, together with ways to improve reliability, practicable methods, and reliability evaluation. It proposes the convergence-domain criterion, fine and coarse maximum likelihood estimation, generalized likelihood estimation, and the multi-population phenomenon, and applies concepts and methods such as the over-fitting phenomenon, sampling error, and Monte Carlo stochastic simulation to obtain reliable extreme-value inference results for multiple meteorological elements at weather stations across China, taking initial steps toward a small-sample statistical theory.
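A hedged sketch of one reliability check in this spirit: quantify the sampling error of a small-sample extreme-value estimate by Monte Carlo. A GEV fit (scipy's genextreme) and the 100-year return level serve as the example; the data and parameters are synthetic, not from any station record.

```python
# Hedged sketch: parametric bootstrap of a 100-year return level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_max = genextreme.rvs(c=-0.1, loc=30, scale=5, size=40,
                            random_state=rng)   # synthetic 40-year record

c, loc, scale = genextreme.fit(annual_max)      # maximum likelihood fit
rl100 = genextreme.ppf(1 - 1/100, c, loc, scale)

# refit on simulated samples of the same (small) size
boot = []
for _ in range(500):
    sim = genextreme.rvs(c, loc=loc, scale=scale,
                         size=annual_max.size, random_state=rng)
    cb, lb, sb = genextreme.fit(sim)
    boot.append(genextreme.ppf(1 - 1/100, cb, lb, sb))
print(round(rl100, 1), "+/-", round(np.std(boot), 1))  # sampling error
```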

17.
Geophysical techniques can help to bridge the inherent gap that exists with regard to spatial resolution and coverage for classical hydrological methods. This has led to the emergence of a new and rapidly growing research domain generally referred to as hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, their inherent trade-off between resolution and range, as well as the notoriously site-specific nature of petrophysical parameter relations, the fundamental usefulness of multi-method surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is consistent with all available measurements. To this end, we present a novel approach toward hydrogeophysical data integration based on a Monte-Carlo-type conditional stochastic simulation method that we consider to be particularly suitable for high-resolution local-scale studies. Monte Carlo techniques are flexible and versatile, allowing for accounting for a wide variety of data and constraints of differing resolution and hardness, and thus have the potential of providing, in a geostatistical sense, realistic models of the pertinent target parameter distributions. Compared to more conventional approaches, such as co-kriging or cluster analysis, our approach provides significant advancements in the way that larger-scale structural information contained in the hydrogeophysical data can be accounted for. After outlining the methodological background of our algorithm, we present the results of its application to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the detailed local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to a field dataset collected at the Boise Hydrogeophysical Research Site. Finally, we compare the performance of our data integration approach to that of more conventional methods with regard to the prediction of flow and transport phenomena in highly heterogeneous media and discuss the implications arising.
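A hedged sketch of one standard building block of conditional stochastic simulation (not the paper's full algorithm): an unconditional Gaussian realization is conditioned to hard data, e.g., porosity logs, by kriging the residuals, so the result honors the data exactly while preserving the covariance model. The covariance and data below are illustrative, not those of the Boise field study.

```python
# Hedged sketch: conditioning a Gaussian realization by kriging residuals.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 100, 201)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 15)   # exponential covariance
L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))

idx = np.array([20, 90, 160])                  # hard-data locations
z_data = np.array([0.5, -1.0, 1.2])            # conditioning values

W = np.linalg.solve(C[np.ix_(idx, idx)], C[idx, :]).T   # kriging weights
z_u = L @ rng.standard_normal(x.size)          # unconditional realization
z_c = z_u + W @ (z_data - z_u[idx])            # conditioned realization
print(z_c[idx].round(2))                       # reproduces z_data exactly
```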

18.
We present an open-source algorithm implemented in Mathematica (Wolfram Research) with transparent data reduction and Monte Carlo simulation of systematic and random uncertainties for U-Th geochronometry by multi-collector ICP-MS. Uranium and thorium were quantitatively separated from matrix elements through a single U/TEVA extraction chromatography step. A rigorous calibrator-sample bracketing routine was adopted using CRM-112A and IRMM-035 standard solutions, doped with an IRMM-3636a 233U/236U 'double-spike' to account for instrumental mass bias and deviations of measured isotope ratios from certified values. The means of 234U/238U and 230Th/232Th in the standard solutions varied within 0.42 and 0.25‰ (per mil) of the certified ratios, respectively, and were consistent with literature values within uncertainties. Based on multiple dissolutions with lithium metaborate flux fusion, U and Th concentrations in USGS BCR-2 CRM were updated to 1739 ± 2 and 5987 ± 50 ng g−1 (95% CI), respectively. The measurement reproducibility of our analytical technique was evaluated by analysing six aliquots of an in-house reference material, prepared by homogenising a piece of speleothem (CC3A) from Cathedral Cave, Utah, which returned a mean age of 21483 ± 63 years (95% CI, 2.9‰). Replicate analysis of ten samples from CC3A was consistent with ages previously measured at the University of Minnesota by single-collector ICP-MS within uncertainties.
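A hedged sketch of the same general strategy, not the paper's Mathematica code: the closed-system U-Th age equation is solved numerically, and the measured activity-ratio uncertainties are propagated by Monte Carlo. Decay constants follow Cheng et al. (2013); the input ratios are illustrative, not the CC3A values.

```python
# Hedged sketch: U-Th age from measured ratios with Monte Carlo errors.
import numpy as np
from scipy.optimize import brentq

L230, L234 = 9.1705e-6, 2.8221e-6        # decay constants (1/yr)

def age_eq(t, th230_u238, d234U):
    # closed-system age equation with measured delta-234U (per mil)
    return (1 - np.exp(-L230 * t)
            + (d234U / 1000) * L230 / (L230 - L234)
            * (1 - np.exp(-(L230 - L234) * t)) - th230_u238)

def solve_age(th230_u238, d234U):
    return brentq(age_eq, 1.0, 7e5, args=(th230_u238, d234U))

rng = np.random.default_rng(0)
r, r_sd = 0.180, 0.0005                  # (230Th/238U) activity ratio
d, d_sd = 120.0, 1.0                     # measured delta-234U (per mil)
ages = [solve_age(rng.normal(r, r_sd), rng.normal(d, d_sd))
        for _ in range(2000)]
print(round(np.mean(ages)), "+/-", round(2 * np.std(ages)), "yr (2s)")
```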
