Similar Articles
 A total of 20 similar articles were found (search time: 15 ms).
1.
Coregionalization analysis has been presented as a method of multi-scale analysis for multivariate spatial data. Despite an increasing use of this method in environmental and earth sciences, the uncertainty associated with the estimation of parameters in coregionalization analysis (e.g., sills and functions of sills) is potentially high and has not yet been characterized. This article aims to discuss the theory underlying coregionalization analysis and assess the robustness and limits of the method. A theoretical framework is developed to calculate the ergodic and fluctuation variance-covariance matrices of least-squares estimators of sills in the linear model of coregionalization. To adjust for the positive semidefiniteness constraint on estimated coregionalization matrices, a confidence interval estimation procedure for sills and functions of sills is presented. Thereafter, the relative importance of uncertainty measures (bias and variance) for sills and structural coefficients of correlation and determination is assessed under different scenarios to identify factors controlling their uncertainty. Our results show that the sampling grid density, the choice of the least-squares estimator of sills, the positive semidefiniteness constraint, the presence of scale dependence in the correlations, and the number and range of variogram models all affect the level of uncertainty, sometimes through multiple interactions. The asymptotic properties of variogram model parameter estimators in a bounded sampling domain impose a theoretical limit to their accuracy and precision. Because of this limit, the uncertainty was found to be high for several scenarios, especially with three variogram models, and was often more dependent on the ratio of variogram range to domain extent than on the sampling grid density. In practice, in the coregionalization analysis of a real dataset, the circular requirement for sill estimates in the calculation of uncertainty measures makes the quantification of uncertainty very problematic, if not impossible. Coregionalization analysis should therefore be used with due awareness of the uncertainty levels and limits of the method.
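The structural coefficients of correlation mentioned in this abstract can be illustrated with a minimal sketch: given estimated coregionalization (sill) matrices, one per variogram structure, the structural correlation at each spatial scale is the off-diagonal sill divided by the square root of the product of the diagonal sills, and positive semidefiniteness can be checked from the eigenvalues. The matrices and structure names below are invented for illustration, not taken from the article.

```python
import numpy as np

# Hypothetical estimated coregionalization (sill) matrices for two variables,
# one matrix per structure of the linear model of coregionalization.
B = {
    "nugget":      np.array([[0.30, 0.05], [0.05, 0.40]]),
    "short_range": np.array([[0.50, 0.35], [0.35, 0.60]]),
    "long_range":  np.array([[0.20, 0.18], [0.18, 0.25]]),
}

for name, Bs in B.items():
    # A valid coregionalization matrix must be positive semidefinite.
    eigvals = np.linalg.eigvalsh(Bs)
    psd = bool(np.all(eigvals >= -1e-10))
    # Structural correlation between the two variables at this spatial scale.
    r = Bs[0, 1] / np.sqrt(Bs[0, 0] * Bs[1, 1])
    print(f"{name}: structural correlation = {r:.3f}, PSD = {psd}")
```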

2.
3.
Uncertainty quantification for geomechanical and reservoir predictions is in general a computationally intensive problem, especially if a direct Monte Carlo approach with large numbers of full-physics simulations is used. A common solution to this problem, well known for fluid flow simulations, is the adoption of surrogate modeling, which approximates the physical behavior with respect to variations in uncertain parameters. The objective of this work is to quantify such uncertainty in both geomechanical and fluid-flow predictions using a specific surrogate modeling technique based on a functional approach. The methodology approximates full-physics simulated outputs that vary in time and space as the uncertain parameters change, which is particularly important for predicting the uncertainty in vertical displacement resulting from geomechanical modeling. The developed methodology has been applied both to a subsidence uncertainty quantification example and to a real reservoir forecast risk assessment. The surrogate quality obtained in these applications confirms that the proposed method makes it possible to perform reliable risk assessment for time- and space-varying outputs at low computational cost, provided the uncertainty space is low-dimensional.
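A rough illustration of this kind of functional surrogate (not the authors' implementation): compress the space-time simulation outputs with principal components, regress the component scores on the uncertain parameters, and map new parameter values back to full space-time predictions. The dimensions, parameter ranges, and stand-in "simulator" below are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical training set: 50 full-physics runs, each producing a flattened
# space-time output (e.g., vertical displacement on a grid at several times).
n_runs, n_outputs, n_params = 50, 2000, 3
params = rng.uniform(0.0, 1.0, size=(n_runs, n_params))             # uncertain parameters
outputs = np.sin(params @ rng.normal(size=(n_params, n_outputs)))   # stand-in for the simulator

# Compress the space-time response, then regress the scores on the parameters.
pca = PCA(n_components=5).fit(outputs)
scores = pca.transform(outputs)
poly = PolynomialFeatures(degree=2, include_bias=False)
reg = LinearRegression().fit(poly.fit_transform(params), scores)

# Cheap surrogate prediction of the full space-time field for a new parameter set.
new_params = rng.uniform(0.0, 1.0, size=(1, n_params))
predicted_field = pca.inverse_transform(reg.predict(poly.transform(new_params)))
print(predicted_field.shape)  # (1, 2000)
```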

4.
Stability is a key issue in any mining or tunnelling activity. Joint frequency constitutes an important input into stability analyses. Three techniques are used herein to quantify the local and spatial joint frequency uncertainty, or the possible joint frequencies given joint frequency data, at unsampled locations. Rock quality designation is estimated from the predicted joint frequencies. The first method is based on kriging with subsequent Poisson sampling. The second method transforms the data to near-Gaussian variables and uses the turning bands method to generate a range of possible joint frequencies. The third method assumes that the data are Poisson distributed and models the log-intensity of these data with a spatially smooth Gaussian prior distribution. Intensities are obtained and Poisson variables are generated to examine the expected joint frequency and associated variability. The joint frequency data are from an iron ore deposit in the northern part of Norway. The methods are tested at unsampled locations and validated at sampled locations. All three methods perform quite well when predicting sampled points. The probability that the joint frequency exceeds 5 joints per metre is also estimated to illustrate a more realistic use. The resulting probability map highlights zones in the ore where stability problems have occurred. It is therefore concluded that the methods work and that more emphasis should have been placed on these kinds of analyses when the mine was planned. By using simulation instead of estimation, it is possible to obtain a clear picture of possible joint frequency values or ranges, i.e. the uncertainty.
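The spirit of the first method (a smooth intensity estimate followed by Poisson sampling) can be sketched as below. The borehole data are invented, and a simple inverse-distance weighted mean stands in for a real kriging estimate; the exceedance threshold of 5 joints per metre follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical borehole data: positions (m) along a section and observed joint counts per metre.
x_obs = np.array([0.0, 5.0, 12.0, 20.0, 30.0])
freq_obs = np.array([2.0, 4.0, 7.0, 3.0, 6.0])   # joints per metre

def smoothed_intensity(x0, power=2.0):
    """Inverse-distance weighted mean, used here as a stand-in for kriging."""
    d = np.abs(x_obs - x0)
    if np.any(d == 0):
        return float(freq_obs[d == 0][0])
    w = 1.0 / d**power
    return float(np.sum(w * freq_obs) / np.sum(w))

# At an unsampled location, treat the smoothed value as a Poisson intensity
# and sample to characterise the local uncertainty.
x0 = 16.0
lam = smoothed_intensity(x0)
samples = rng.poisson(lam, size=10_000)
print(f"intensity at x={x0}: {lam:.2f} joints/m")
print(f"P(frequency > 5 joints/m) = {np.mean(samples > 5):.3f}")
```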

5.
Response surface experimental designs provide a framework for evaluating sensitivities and assessing uncertainties in reservoir-production forecasts for continuous parameters (e.g. permeability, flow rate, etc.). In this paper, the method is extended to integrate both continuous and discrete parameters (e.g. fault status: open/closed, injection scheme: SWAG/WAG, etc.). The paper presents an appropriate experimental design approach, notably the associated regression models and their statistical interpretation (sensitivity study, Monte Carlo simulations, etc.). The method has been successfully applied to a reservoir oil-production simulation problem. The objective was to define the best production scheme by optimizing the well-completion level. This application has highlighted the advantages of the new approach, both in decreasing simulation cost and in improving the quality of interpretation.
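One way to picture a mixed continuous/discrete response surface of the kind described here is to dummy-code the discrete parameter, fit a polynomial regression, and then run Monte Carlo on the fitted surface instead of the simulator. The parameters, ranges, and synthetic response below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)

# Hypothetical design: permeability multiplier (continuous), injection rate
# (continuous), and fault status (discrete: 0 = closed, 1 = open).
n = 40
perm = rng.uniform(0.5, 2.0, n)
rate = rng.uniform(100.0, 500.0, n)
fault_open = rng.integers(0, 2, n)
X = np.column_stack([perm, rate, fault_open])

# Stand-in for the simulated cumulative oil production of each run.
y = 50 * perm + 0.1 * rate + 20 * fault_open + 5 * perm * fault_open + rng.normal(0, 2, n)

# Quadratic response surface, including interactions with the dummy-coded factor.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Monte Carlo on the cheap surface rather than on the simulator.
mc = np.column_stack([
    rng.uniform(0.5, 2.0, 100_000),
    rng.uniform(100.0, 500.0, 100_000),
    rng.integers(0, 2, 100_000),
])
pred = model.predict(poly.transform(mc))
print("P10/P50/P90 of production:", np.percentile(pred, [10, 50, 90]).round(1))
```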

6.
7.
This paper presents a strain-rate dependent plastic constitutive model for clays. Based on the concepts of critical-state soil mechanics and bounding surface plasticity theory, the model reproduces the mechanical response of clays under triaxial and simple shear loading conditions. The model parameters are determined for Boston Blue Clay, London Clay and Kaolin Clay, and the performance of the model in simulating the mechanical response of these clays is demonstrated for low to medium strain rates. The sensitivity of each model parameter is checked by perturbing the calibrated values by ±20%. Subsequently, a probabilistic analysis using Monte Carlo simulations is performed by treating the model parameters as random variables, and the impact of the statistics of the parameters on the undrained shear strength is investigated.
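A schematic version of the two uncertainty steps named here (one-at-a-time ±20% perturbation, then Monte Carlo propagation) is sketched below. The strength function is a placeholder, not the constitutive model of the paper, and the parameter names, values, and lognormal assumption are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical calibrated parameters and a placeholder for the model's predicted
# undrained shear strength (the real constitutive model is far more involved).
calibrated = {"M": 1.2, "lambda": 0.15, "kappa": 0.03, "rate_coeff": 0.05}

def undrained_strength(p):
    return 100.0 * p["M"] * (1.0 + p["rate_coeff"]) * np.exp(-p["kappa"] / p["lambda"])

base = undrained_strength(calibrated)

# One-at-a-time sensitivity: perturb each calibrated value by +/-20%.
for name in calibrated:
    for factor in (0.8, 1.2):
        perturbed = dict(calibrated, **{name: calibrated[name] * factor})
        change = 100.0 * (undrained_strength(perturbed) - base) / base
        print(f"{name} x{factor:.1f}: strength change = {change:+.1f}%")

# Monte Carlo: treat parameters as independent lognormal random variables
# (an assumption made only for this sketch) and propagate to the strength.
n = 20_000
samples = {k: rng.lognormal(np.log(v), 0.1, n) for k, v in calibrated.items()}
su = undrained_strength(samples)
print(f"strength mean = {su.mean():.1f}, coefficient of variation = {su.std() / su.mean():.3f}")
```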

8.
This paper reviews tidal analysis and modelling methods and summarizes the developing institutions, data sources, and construction methods of the FES, CSR, GOT, NAO, TPXO, EOT, DTU, HAMTIDE and OSU series of global ocean tide models. A comparison of the M2 amplitudes in the Southern Ocean from several models released after 2010 (FES2012, EOT11a, DTU10 and HAMTIDE11a) shows that the differences between models are concentrated mainly in shallow-water and polar regions; in the polar regions, the lack of high-precision satellite altimetry data and the seasonal variation of sea ice are the main causes of the poorer modelling accuracy. Finally, some suggestions on the future development of ocean tide models are given.

9.
10.
Chen Changjun, Zheng Xiongwei, Zhang Weifei. 《水文》 (Hydrology), 2012, 32(2): 16-20
The study of model uncertainty is an important topic in hydrological science. Taking the Bagmati basin in Nepal as a case study, three methods, Markov chain Monte Carlo, Monte Carlo, and Latin hypercube sampling, were applied to analyse the uncertainty of the Tank model outputs, and the parameter uncertainties obtained with the three methods were compared. In addition, a Meta-Gaussian model was used to compute the total uncertainty, and, on the basis of the adopted likelihood function, the model-output uncertainty caused by the parameters was compared with the total output uncertainty. The results show that model uncertainty is more important than parameter uncertainty; they also show that, although the Monte Carlo and Latin hypercube sampling methods produce nearly identical results, both differ considerably from the Markov chain Monte Carlo method.
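The difference between plain Monte Carlo and Latin hypercube sampling of model parameters, two of the three schemes compared in this study, can be sketched as follows. The parameter count, ranges, and sample size are illustrative, not those of the Tank model calibration.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(4)
n = 200
bounds_low, bounds_high = [0.0, 0.0, 10.0], [1.0, 0.5, 100.0]

# Plain Monte Carlo sampling of three hypothetical model parameters.
mc = rng.uniform(bounds_low, bounds_high, size=(n, 3))

# Latin hypercube sampling: stratifies each marginal into n intervals.
lhs = qmc.LatinHypercube(d=3, seed=4).random(n)
lhs = qmc.scale(lhs, bounds_low, bounds_high)

# LHS covers each parameter range more evenly for the same sample size.
for name, s in [("Monte Carlo", mc), ("Latin hypercube", lhs)]:
    print(name, "marginal means:", s.mean(axis=0).round(3))
```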

11.
Uncertainty quantification for subsurface flow problems is typically accomplished through model-based inversion procedures in which multiple posterior (history-matched) geological models are generated and used for flow predictions. These procedures can be computationally demanding, however, and it is not always straightforward to maintain geological realism in the resulting history-matched models. In some applications, it is the flow predictions themselves (and the uncertainty associated with these predictions), rather than the posterior geological models, that are of primary interest. This is the motivation for the data-space inversion (DSI) procedure developed in this paper. In the DSI approach, an ensemble of prior model realizations, honoring prior geostatistical information and hard data at wells, is generated and then (flow) simulated. The resulting production data are assembled into data vectors that represent prior 'realizations' in the data space. Pattern-based mapping operations and principal component analysis are applied to transform non-Gaussian data variables into lower-dimensional variables that are closer to multivariate Gaussian. The data-space inversion is posed within a Bayesian framework, and a data-space randomized maximum likelihood method is introduced to sample the conditional distribution of data variables given observed data. Extensive numerical results are presented for two example cases involving oil–water flow in a bimodal channelized system and oil–water–gas flow in a Gaussian permeability system. For both cases, DSI results for uncertainty quantification (e.g., P10, P50, P90 posterior predictions) are compared with those obtained from a strict rejection sampling (RS) procedure. Close agreement between the DSI and RS results is consistently achieved, even when the (synthetic) true data to be matched fall near the edge of the prior distribution. Computational savings using DSI are very substantial: RS requires on the order of 10^5 to 10^6 flow simulations, in contrast to 500 for DSI, for the cases considered.
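The mapping of prior-simulated production data to lower-dimensional variables that underpins DSI can be sketched very loosely, without the pattern-based mapping or the randomized maximum likelihood step, as a PCA of the prior data vectors. The ensemble size and data dimensions below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# Hypothetical prior ensemble: 500 flow-simulated realizations, each giving a
# data vector of production quantities at several wells and report times.
n_real, n_data = 500, 120
prior_data = np.cumsum(rng.lognormal(0.0, 0.3, size=(n_real, n_data)), axis=1)

# Reduce the data vectors to a few components that are closer to Gaussian.
pca = PCA(n_components=10).fit(prior_data)
xi = pca.transform(prior_data)            # low-dimensional data variables
print("explained variance:", pca.explained_variance_ratio_.sum().round(3))

# Any sample of the low-dimensional variables maps back to a full data vector,
# which is the object that is then conditioned on the observed data.
reconstructed = pca.inverse_transform(xi[:1])
print(reconstructed.shape)  # (1, 120)
```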

12.
13.
Mathematical Geosciences - In mine planning, geospatial estimates of variables such as comminution indexes and metallurgical recovery are extremely important to locate blocks for which the energy...

14.
Taking the Ailaiwukugou-Taijizhao exploration area in the northern Dongsheng coalfield, Inner Mongolia, as an example, this study addresses the shortcomings of 3D geological modelling methods in the use of multi-source, multi-type geological data, dynamic modelling, and uncertainty assessment of modelling results, and implements 3D modelling of coal seams. The key steps are as follows: first, data fusion is carried out at two levels, the intrinsic logical relationships and the geometric structure of the data, converting borehole, cross-section and fault data into discretized sample data of simple and consistent form; next, based on these data and the various boundary lines, the coal-seam roof and floor elevations are interpolated by inverse distance weighting (IDW), and a coal-seam structural model is generated dynamically by combining these attribute values with TIN (triangulated irregular network) tessellation; finally, an operator associated with the roof and floor elevation estimates, the interpolation variance, is used to assess the local uncertainty of the coal-seam model. The overall workflow of 3D coal-seam modelling is discussed in detail and implemented on the 3D geoscience information system platform QuantyView. Practical application shows that the proposed method has significant practical value.
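The IDW interpolation of seam roof elevations and the accompanying interpolation variance described above can be sketched in a few lines. The borehole coordinates and elevations are invented, no TIN step is included, and the weighted-spread variance used here is only a rough stand-in for the operator defined in the paper.

```python
import numpy as np

# Hypothetical borehole collars (x, y in metres) and coal-seam roof elevations (m).
xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0], [50.0, 60.0]])
roof_z = np.array([1250.0, 1244.0, 1258.0, 1249.0, 1252.0])

def idw(query, points, values, power=2.0, eps=1e-12):
    """Inverse distance weighting estimate and a simple interpolation variance."""
    d = np.linalg.norm(points - query, axis=1)
    w = 1.0 / (d**power + eps)
    w /= w.sum()
    estimate = float(w @ values)
    # Weighted spread of the data about the estimate, a rough local uncertainty measure.
    variance = float(w @ (values - estimate) ** 2)
    return estimate, variance

z_hat, var = idw(np.array([40.0, 40.0]), xy, roof_z)
print(f"roof elevation estimate = {z_hat:.1f} m, interpolation variance = {var:.2f} m^2")
```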

15.
As the underlying framework of the Deep-time Digital Earth project, global paleogeographic reconstruction models comprise two components: the positions and motion paths of plates through geological history, and the characteristics of the Earth's surface. Over the past few decades, global paleogeographic reconstruction models based on different methods and different data have continued to emerge. A common current approach is to interpret paleogeography by integrating knowledge and data from paleomagnetism, paleontology, sedimentology, geophysics, geochemistry and geodynamics, and to build digital, editable, time-evolving models. This paper introduces the construction methods of global paleogeographic reconstruction models at home and abroad and compares six mainstream models (PaleoMap, PLATES, UNIL, GOLONKA, GMAP and EarthByte), with the aim of providing a reference for related research in China. The paper also describes applications of digital global paleogeographic reconstruction models, and the knowledge discovered from them, in paleoclimate, plate-tectonic driving forces and basin evolution. Based on this review of existing models, an outlook is proposed: under the framework of the Deep-time Digital Earth program, outstanding scientists at home and abroad should be brought together to redesign and build a truly unified four-dimensional paleogeographic reconstruction model.

16.
Accuracy assessment of global ocean tide models based on satellite altimetry
Using the harmonic constants of eight major tidal constituents (M2, S2, K1, O1, N2, K2, P1 and Q1) observed at 152 pelagic tide-gauge stations and tide-gauge stations on oceanic islands, the accuracy of seven existing global ocean tide models is examined. The results show that all models reach fairly high accuracy in the deep ocean: the root-mean-square (RMS) deviation of the M2 tidal height is 1.0-1.3 cm, and the root-sum-square deviation over the eight constituents is 2.1-2.3 cm, a further improvement in accuracy over earlier models. The accuracy of five of the models is also examined against the harmonic constants of 18 islands in the Chinese coastal seas; there the RMS deviation of the M2 constituent is 4.4-10 cm, clearly higher than in the open ocean. Among the models, NAO99 from the National Astronomical Observatory of Japan gives relatively accurate results in the Chinese coastal seas.
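A common convention for the RMS and root-sum-square statistics quoted above (and presumably the one used here, though the paper's exact definition is not reproduced) is the vector-difference RMS between observed and modelled harmonic constants, accumulated over constituents. The amplitudes and phases below are fabricated purely to show the calculation.

```python
import numpy as np

rng = np.random.default_rng(6)
constituents = ["M2", "S2", "K1", "O1", "N2", "K2", "P1", "Q1"]
n_stations = 152

def rms_misfit(H_obs, g_obs, H_mod, g_mod):
    """RMS of the vector difference between observed and modelled constituents."""
    do = H_obs * np.exp(1j * np.deg2rad(g_obs))
    dm = H_mod * np.exp(1j * np.deg2rad(g_mod))
    return np.sqrt(0.5 * np.mean(np.abs(do - dm) ** 2))

rms_by_constituent = []
for c in constituents:
    H_obs = rng.uniform(2.0, 80.0, n_stations)          # amplitude (cm), fabricated
    g_obs = rng.uniform(0.0, 360.0, n_stations)         # phase lag (degrees), fabricated
    H_mod = H_obs + rng.normal(0.0, 1.0, n_stations)    # "model" close to observations
    g_mod = g_obs + rng.normal(0.0, 2.0, n_stations)
    rms_by_constituent.append(rms_misfit(H_obs, g_obs, H_mod, g_mod))

rss = np.sqrt(np.sum(np.square(rms_by_constituent)))
print("per-constituent RMS (cm):", np.round(rms_by_constituent, 2))
print(f"root-sum-square over 8 constituents: {rss:.2f} cm")
```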

17.
Evaluation of the applicability of general circulation models in the Yellow River basin
Based on simulations of temperature and precipitation over the Yellow River basin for 1961-1990 by five widely used general circulation models (HadCM3, GFDL, ECHAM4, CSIRO-Mk2 and CGCM2), this paper analyses the applicability of each GCM in the basin by comparison with observations over the same period. The results show that HadCM3 and GFDL simulate temperature over the basin relatively well, while ECHAM4 and HadCM3 simulate precipitation relatively well. All five GCMs simulate temperature noticeably better than precipitation. Overall, the UK HadCM3 model is the most applicable in the Yellow River basin and can provide future climate-change scenarios for related studies, such as the response of hydrology, water resources and soil erosion in the basin to global climate change.
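The applicability check described here amounts to comparing each model's simulated series with observations using simple skill scores. A hedged sketch with synthetic monthly temperature series and generic model labels (not results for the actual GCMs) follows.

```python
import numpy as np

rng = np.random.default_rng(7)
months = 360                                   # 1961-1990, monthly
obs = 10 + 12 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 1, months)

# Synthetic series standing in for basin-averaged GCM output.
gcms = {
    "GCM_A": obs + rng.normal(0.5, 1.5, months),
    "GCM_B": obs + rng.normal(2.0, 3.0, months),
}

for name, sim in gcms.items():
    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    corr = np.corrcoef(sim, obs)[0, 1]
    print(f"{name}: bias = {bias:+.2f} degC, RMSE = {rmse:.2f} degC, r = {corr:.3f}")
```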

18.
Skilful prediction of the monthly and seasonal summer monsoon rainfall over India at a smaller spatial scale is a major challenge for the scientific community. The present study is aimed at achieving this objective by hybridising two mathematical techniques, namely synthetic superensemble (SSE) and supervised principal component regression (SPCR), on six state-of-the-art Global Climate Models (GCMs). The performance of the mathematical model is evaluated using correlation analysis, the root mean square error, and the Nash–Sutcliffe efficiency index. Results feature reasonable improvement over central India, which is a zone of maximum rainfall activity in the summer monsoon season. The study also highlights improvement in the monthly prediction of rainfall over raw GCMs (15–20% improvement), with exceptional improvement in July. The developed model is also examined for anomalous monsoon years, and it is found that the model is able to capture the signs of anomalies over different grid points of the Indian domain.
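The evaluation metrics named here (correlation, RMSE and the Nash–Sutcliffe efficiency) and the basic superensemble idea, regressing observations on member forecasts over a training window and reusing the weights, can be sketched as follows. All series and weights are synthetic; this is not the paper's SSE/SPCR hybrid.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 120                                             # e.g. monthly rainfall values
obs = rng.gamma(2.0, 50.0, n)                       # synthetic observed rainfall (mm)

# Synthetic member forecasts (biased, noisy copies of the observations).
members = np.column_stack([obs * s + rng.normal(0, 30, n) for s in (0.7, 1.1, 1.3)])

# Superensemble in miniature: regress observations on member forecasts over a
# training window, then combine members with the fitted weights.
train, test = slice(0, 80), slice(80, n)
A = np.column_stack([members[train], np.ones(80)])
weights, *_ = np.linalg.lstsq(A, obs[train], rcond=None)
forecast = np.column_stack([members, np.ones(n)]) @ weights

def nash_sutcliffe(sim, ref):
    return 1.0 - np.sum((sim - ref) ** 2) / np.sum((ref - ref.mean()) ** 2)

print(f"NSE  = {nash_sutcliffe(forecast[test], obs[test]):.3f}")
print(f"RMSE = {np.sqrt(np.mean((forecast[test] - obs[test]) ** 2)):.1f} mm")
print(f"corr = {np.corrcoef(forecast[test], obs[test])[0, 1]:.3f}")
```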

19.
A new uncertainty quantification framework is adopted for carbon sequestration to evaluate the effect of spatial heterogeneity of reservoir permeability on CO2 migration. Sequential Gaussian simulation is used to generate multiple realizations of permeability fields with various spatial statistical attributes. In order to deal with the computational difficulties, the following ideas/approaches are integrated. First, different efficient sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) are used to reduce the number of forward calculations, explore effectively the parameter space, and quantify the input uncertainty. Second, a scalable numerical simulator, extreme-scale Subsurface Transport Over Multiple Phases, is adopted as the forward modeling simulator for CO2 migration. The framework has the capability to quantify input uncertainty, generate exploratory samples effectively, perform scalable numerical simulations, visualize output uncertainty, and evaluate input-output relationships. The framework is demonstrated with a given CO2 injection scenario in heterogeneous sandstone reservoirs. Results show that geostatistical parameters for permeability have different impacts on CO2 plume radius: the mean parameter has positive effects at the top layers, but affects the bottom layers negatively. The variance generally has a positive effect on the plume radius at all layers, particularly at middle layers, where the transport of CO2 is highly influenced by the subsurface heterogeneity structure. The anisotropy ratio has weak impacts on the plume radius, but affects the shape of the CO2 plume.
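Of the sampling approaches listed, quasi-Monte Carlo is the easiest to show compactly: a low-discrepancy (Sobol) sequence covers the space of uncertain geostatistical inputs (mean, variance, anisotropy ratio) more uniformly than random sampling, so fewer forward runs are needed. The parameter ranges and sample size below are placeholders.

```python
import numpy as np
from scipy.stats import qmc

# Uncertain geostatistical inputs for the permeability field:
# log-mean, log-variance, and anisotropy ratio (ranges are placeholders).
lower = np.array([-14.0, 0.5, 1.0])
upper = np.array([-12.0, 2.0, 50.0])

# Quasi-Monte Carlo (Sobol) design: low-discrepancy coverage of the input space.
sobol_unit = qmc.Sobol(d=3, scramble=True, seed=9).random_base2(m=6)   # 2**6 = 64 samples
design = qmc.scale(sobol_unit, lower, upper)
print(design.shape)   # (64, 3): one row of inputs per forward CO2-migration run

# Plain Monte Carlo design of the same size, for comparison of space coverage.
rng = np.random.default_rng(9)
mc_unit = rng.random((64, 3))
print("Sobol discrepancy:", round(qmc.discrepancy(sobol_unit), 4))
print("MC discrepancy:   ", round(qmc.discrepancy(mc_unit), 4))
```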

20.
Significant uncertainties are associated with the definition of both the exploration targeting criteria and computational algorithms used to generate mineral prospectivity maps. In prospectivity modeling, the input and computational uncertainties are generally made implicit, by making a series of best-guess or best-fit decisions, on the basis of incomplete and imprecise information. The individual uncertainties are then compounded and propagated into the final prospectivity map as an implicit combined uncertainty which is impossible to directly analyze and use for decision making. This paper proposes a new approach to explicitly define uncertainties of individual targeting criteria and propagate them through a computational algorithm to evaluate the combined uncertainty of a prospectivity map. Applied to fuzzy logic prospectivity models, this approach involves replacing point estimates of fuzzy membership values by statistical distributions deemed representative of likely variability of the corresponding fuzzy membership values. Uncertainty is then propagated through a fuzzy logic inference system by applying Monte Carlo simulations. A final prospectivity map is represented by a grid of statistical distributions of fuzzy prospectivity. Such modeling of uncertainty in prospectivity analyses allows better definition of exploration target quality, as understanding of uncertainty is consistently captured, propagated and visualized in a transparent manner. The explicit uncertainty information of prospectivity maps can support further risk analysis and decision making. The proposed probabilistic fuzzy logic approach can be used in any area of geosciences to model uncertainty of complex fuzzy systems.
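A toy version of this probabilistic fuzzy logic idea for a single map cell: each targeting criterion's fuzzy membership is drawn from a distribution instead of being a point value, the draws are combined with a fuzzy operator, and the result is a distribution of prospectivity. The Beta distributions and the choice of a gamma operator with gamma = 0.9 are assumptions made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(10)
n_sim = 5_000

# Two hypothetical targeting criteria for one map cell, each with an uncertain
# fuzzy membership modelled as a Beta distribution around its best guess.
membership_a = rng.beta(a=8.0, b=2.0, size=n_sim)   # best guess near 0.8, fairly certain
membership_b = rng.beta(a=3.0, b=3.0, size=n_sim)   # best guess near 0.5, very uncertain

# Fuzzy gamma operator combining the two criteria (gamma = 0.9 assumed).
gamma = 0.9
fuzzy_and = membership_a * membership_b                       # algebraic product
fuzzy_or = 1.0 - (1.0 - membership_a) * (1.0 - membership_b)  # algebraic sum
prospectivity = fuzzy_and ** (1.0 - gamma) * fuzzy_or ** gamma

# The cell is now described by a distribution, not a single prospectivity value.
p10, p50, p90 = np.percentile(prospectivity, [10, 50, 90])
print(f"prospectivity P10/P50/P90 = {p10:.2f}/{p50:.2f}/{p90:.2f}")
```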
