Similar Documents
20 similar documents found.
1.
Minimum Acceptance Criteria for Geostatistical Realizations
Geostatistical simulation is being used increasingly for numerical modeling of natural phenomena. The development of simulation as an alternative to kriging is the result of improved characterization of heterogeneity and a model of joint uncertainty. The popularity of simulation has increased in both mining and petroleum industries. Simulation is widely available in commercial software. Many of these software packages, however, do not necessarily provide the tools for careful checking of the geostatistical realizations prior to their use in decision-making. Moreover, practitioners may not understand all that should be checked. There are some basic checks that should be performed on all geostatistical models. This paper identifies (1) the minimum criteria that should be met by all geostatistical simulation models, and (2) the checks required to verify that these minimum criteria are satisfied. All realizations should honor the input information including the geological interpretation, the data values at their locations, the data distribution, and the correlation structure, within acceptable statistical fluctuations. Moreover, the uncertainty measured by the differences between simulated realizations should be a reasonable measure of uncertainty. A number of different applications are shown to illustrate the various checks. These checks should be an integral part of any simulation modeling work flow.
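The data- and distribution-reproduction checks listed above can be sketched in a few lines. Everything below is illustrative: the data, the tolerance, and the pass criteria are hypothetical, and the paper's full criteria also cover the geological interpretation and the correlation structure.

```python
import random, statistics

def check_realization(realization, data, tol=0.1):
    """Two minimum acceptance checks for one realization (a sketch;
    thresholds are illustrative, not from the paper)."""
    # 1. Data reproduction: a conditional simulation must honor the
    #    data values exactly at their locations.
    honors_data = all(abs(realization[loc] - v) < 1e-9
                      for loc, v in data.items())
    # 2. Distribution reproduction: realization statistics should match
    #    the data statistics within acceptable statistical fluctuation.
    sim_vals = list(realization.values())
    dat_vals = list(data.values())
    mean_ok = abs(statistics.mean(sim_vals) - statistics.mean(dat_vals)) < tol
    return honors_data and mean_ok

random.seed(0)
data = {(i, 0): random.gauss(0.0, 1.0) for i in range(50)}
# A trivially conditioned "realization": data values at data locations,
# values drawn from the same distribution elsewhere.
realization = dict(data)
realization.update({(i, 1): random.gauss(0.0, 1.0) for i in range(50)})
print(check_realization(realization, data, tol=0.5))
```

In practice the same pattern is repeated for the variogram and, where a training image or geological model exists, for facies proportions.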

2.
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools such as neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

3.
Multiple-Point Statistics for Training Image Selection
Selecting a training image (TI) that is representative of the target spatial phenomenon (reservoir, mineral deposit, soil type, etc.) is essential for an effective application of multiple-point statistics (MPS) simulation. It is often possible to narrow potential TIs to a general subset based on the available geological knowledge; however, this is largely subjective. A method is presented that compares the distribution of runs and the multiple-point density function from available exploration data and TIs. The difference in the MPS can be used to select the TI that is most representative of the data set. This tool may be applied to further narrow a suite of TIs for a more realistic model of spatial uncertainty. In addition, significant differences between the spatial statistics of local conditioning data and a TI may lead to artifacts in MPS. The utilization of this tool will identify contradictions between conditioning data and TIs. TI selection is demonstrated for a deepwater reservoir with 32 wells.
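A 1-D toy version of this comparison: length-n pattern frequencies stand in for the multiple-point density function, the two candidate TIs are hypothetical, and the TI whose pattern statistics are closest to the data (smallest L1 distance) is selected.

```python
from collections import Counter

def pattern_freqs(seq, n=3):
    """Frequencies of length-n multiple-point patterns along a 1-D
    binary sequence (an illustrative stand-in for the paper's
    multiple-point density function)."""
    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

def mps_distance(seq_a, seq_b, n=3):
    """L1 distance between the pattern frequency tables of two sequences."""
    fa, fb = pattern_freqs(seq_a, n), pattern_freqs(seq_b, n)
    keys = set(fa) | set(fb)
    return sum(abs(fa.get(k, 0.0) - fb.get(k, 0.0)) for k in keys)

data = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1]   # hypothetical well data
ti_blocky = [0] * 4 + [1] * 4 + [0] * 4 + [1] * 4   # long runs
ti_noisy = [0, 1] * 8                               # alternating
# Pick the TI whose pattern statistics are closest to the data.
best = min([("blocky", ti_blocky), ("noisy", ti_noisy)],
           key=lambda t: mps_distance(data, t[1]))
print(best[0])
```

The data has runs of two and three, so the blocky TI is closer than the alternating one; a large remaining distance would flag the contradictions between conditioning data and TI mentioned above.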

4.
This paper presents a variant of p-field simulation that allows generation of spatial realizations through sampling of a set of conditional cumulative distribution functions (ccdf) by sets of probability values, called p-fields. Whereas in the common implementation of the algorithm the p-fields are nonconditional realizations of random functions with uniform marginal distributions, they are here conditioned to 0.5 probability values at data locations, which entails a preferential sampling of the central part of the ccdf around these locations. The approach is illustrated using a randomly sampled (200 observations of the NIR channel) SPOT scene of a semi-deciduous tropical forest. Results indicate that the use of conditional probability fields improves the reproduction of statistics such as the histogram and semivariogram, while yielding more accurate predictions of reflectance values than the common p-field implementation or the more CPU-intensive sequential indicator simulation. Pixel values are then classified as forest or savannah depending on whether the simulated reflectance value exceeds a given threshold value. In this case study, the proposed approach leads to a more precise and accurate prediction of the size of contiguous areas covered by savannah than the two other simulation algorithms.
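The sampling mechanism can be sketched as follows: each location's ccdf is read off at the probability given by the p-field. Gaussian ccdfs and the particular p-field values below are illustrative assumptions, not the paper's data; the point is that a probability of 0.5 at a data location returns the ccdf median.

```python
from statistics import NormalDist

def pfield_draw(ccdf_mean, ccdf_sd, p):
    """Sample one ccdf by reading off its p-quantile -- the core of
    p-field simulation (a Gaussian ccdf is an illustrative choice)."""
    return NormalDist(ccdf_mean, ccdf_sd).inv_cdf(p)

# Conditional p-field: probability 0.5 at data locations, so the draw
# falls at the ccdf median there; nearby p-values stay near 0.5, which
# preferentially samples the central part of the ccdf around the data.
locations = ["data", "near", "far"]
p_field = {"data": 0.5, "near": 0.45, "far": 0.9}          # hypothetical
ccdfs = {"data": (3.0, 0.0001), "near": (3.1, 0.5), "far": (2.0, 1.0)}
sim = {loc: pfield_draw(*ccdfs[loc], p_field[loc]) for loc in locations}
print(round(sim["data"], 3))
```

A full implementation would generate the p-field itself as a spatially correlated realization so that neighboring draws are consistent.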

5.
Advances in Geostatistical Methodology
郭怀成, 周丰, 刀谞. 《地理研究》 (Geographical Research), 2008, 27(5): 1191-1202
Geostatistics has become a key tool for spatial prediction and uncertainty analysis. This study reviews the geostatistical methodology literature from 1967 to 2005 from two perspectives: bibliometrics and the evolution of the methods themselves. Development trends, application areas, and usage patterns are first analyzed at a macro level; the evolution, suitability, and selection principles of the methods are then summarized. The review shows that geostatistical methodology has evolved from stationary to non-stationary models, from univariate to multivariate (including secondary information) approaches, toward a complementary use of parametric and non-parametric methods, from linear to nonlinear methods, and from static spatial to dynamic space-time models. Future research directions include new methods of semivariogram estimation, uncertainty-oriented geostatistics, space-time and multiple-point geostatistics, the coupling of process-based models with geostatistics, and uncertainty-based decision-making built on geostatistical simulation.

6.
An important aim of modern geostatistical modeling is to quantify uncertainty in geological systems. Geostatistical modeling requires many input parameters. The input univariate distribution or histogram is perhaps the most important. A new method for assessing uncertainty in the histogram, particularly uncertainty in the mean, is presented. This method, referred to as the conditional finite-domain (CFD) approach, accounts for the size of the domain and the local conditioning data. It is a stochastic approach based on a multivariate Gaussian distribution. The CFD approach is shown to be convergent, design independent, and parameterization invariant. The performance of the CFD approach is illustrated in a case study focusing on the impact of the number of data and the range of correlation on the limiting uncertainty in the parameters. The spatial bootstrap method and CFD approach are compared. As the number of data increases, uncertainty in the sample mean decreases in both the spatial bootstrap and the CFD. Contrary to spatial bootstrap, uncertainty in the sample mean in the CFD approach decreases as the range of correlation increases. This is a direct result of the conditioning data being more correlated to unsampled locations in the finite domain. The sensitivity of the limiting uncertainty relative to the variogram and the variable limits are also discussed.
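The spatial-bootstrap side of this comparison can be sketched with NumPy: correlated realizations of the data values are drawn from an assumed covariance model and the spread of their means is measured. The 1-D transect, the exponential covariance with unit sill, and all numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_bootstrap_mean_sd(coords, a_range, n_draws=2000):
    """Monte Carlo spread of the sample mean under spatial correlation
    (a sketch of the spatial bootstrap; exponential covariance with
    unit sill and practical range `a_range` is an assumed model)."""
    d = np.abs(coords[:, None] - coords[None, :])
    C = np.exp(-3.0 * d / a_range)                  # exponential covariance
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(coords)))
    draws = L @ rng.standard_normal((len(coords), n_draws))
    return draws.mean(axis=0).std()

coords = np.linspace(0.0, 10.0, 25)
sd_short = spatial_bootstrap_mean_sd(coords, a_range=1.0)
sd_long = spatial_bootstrap_mean_sd(coords, a_range=8.0)
# In the spatial bootstrap, a longer range of correlation means fewer
# effectively independent data, hence MORE uncertainty in the mean --
# the opposite of the CFD behavior described in the abstract.
print(sd_short < sd_long)
```

This makes the contrast in the abstract concrete: the CFD approach reverses this trend because, in a finite domain, more correlated conditioning data constrain the unsampled locations more tightly.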

7.
Continuous depletion of groundwater levels from deliberate and uncontrolled exploitation of groundwater resources leads to severe problems in arid and semi-arid hard-rock regions of the world. Geostatistics and geographic information systems (GIS) have proved to be successful tools for efficient planning and management of groundwater resources. The present study demonstrates the applicability of geostatistics and GIS to understanding the spatial and temporal behavior of groundwater levels in a semi-arid hard-rock aquifer of Western India. Monthly groundwater levels at 50 sites in the study area over a 36-month period (May 2006 to June 2009, excluding 3 months) were analyzed to find spatial autocorrelation and variances in the groundwater levels. The experimental variogram of the observed groundwater levels was computed at a 750-m lag distance interval, and the four most widely used geostatistical models were fitted to the experimental variogram. The best-fit geostatistical model was selected using two goodness-of-fit criteria, i.e., root mean square error (RMSE) and correlation coefficient (r). Spatial maps of the groundwater levels were then prepared through kriging using the best-fit geostatistical model. Results of two spatial statistics (Geary's C and Moran's I) indicated a strong positive autocorrelation in the groundwater levels within a 3-km lag distance. It is emphasized that spatial statistics are promising tools for geostatistical modeling, which help choose appropriate values of model parameters. The nugget-sill ratio (<0.25) revealed that the groundwater levels have strong spatial dependence in the area. The statistical indicators (RMSE and r) suggested that any of three geostatistical models, i.e., spherical, circular, and exponential, could be selected as the best-fit model for reliable and accurate spatial interpolation; the exponential model was used as the best-fit model in the present study. Selection of the exponential model was further supported by very high values of the coefficient of determination (r² ranging from 0.927 to 0.994). Spatial distribution maps of groundwater levels indicated that the groundwater levels are strongly affected by surface topography and the presence of surface water bodies in the study area. The temporal pattern of the groundwater levels is mainly controlled by rainy-season recharge and the amount of groundwater extraction. Furthermore, it was found that the kriging technique is helpful in identifying critical locations over the study area where water saving and groundwater augmentation techniques need to be implemented to protect depleting groundwater resources.
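The model-selection step described above can be sketched as a grid search over exponential-model parameters scored by RMSE. The experimental variogram values and the parameter grid below are hypothetical, and only the RMSE criterion is shown (the study also uses the correlation coefficient r).

```python
import math

def exponential_model(h, nugget, sill, a):
    """Exponential variogram model with practical range a."""
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / a))

def rmse(model, params, lags, gamma_exp):
    """Root mean square error between a fitted model and the
    experimental variogram points."""
    errs = [(model(h, *params) - g) ** 2 for h, g in zip(lags, gamma_exp)]
    return math.sqrt(sum(errs) / len(errs))

# Hypothetical standardized experimental variogram at 750-m lag multiples.
lags = [750 * k for k in range(1, 7)]
gamma_exp = [0.55, 0.79, 0.90, 0.95, 0.97, 0.99]
# Crude grid search for the best-fit (nugget, sill, range) by RMSE.
best = min(((n, 1.0, a) for n in (0.0, 0.05, 0.1)
            for a in (1500, 2250, 3000, 3750)),
           key=lambda p: rmse(exponential_model, p, lags, gamma_exp))
print(best)
```

A nugget-sill ratio below 0.25, as reported in the study, would be read directly off the fitted parameters (here 0.05/1.0).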

8.

Experimental variograms are crucial for most geostatistical studies. In kriging, for example, the variography has a direct influence on the interpolation weights. Despite the great importance of variogram estimators in predicting geostatistical features, they are commonly influenced by outliers in the dataset. The effect of some randomly spatially distributed outliers can mask the pattern of the experimental variogram and produce a destructuration effect, implying that the true data spatial continuity cannot be reproduced. In this paper, an algorithm to detect and remove the effect of outliers in experimental variograms using the Mahalanobis distance is proposed. An example of the algorithm’s application is presented, showing that the developed technique is able to satisfactorily detect and remove outliers from a variogram.
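A NumPy sketch of the idea, assuming the outlier test is applied to head/tail value pairs before they are averaged into the experimental variogram. The contaminated data and the distance threshold of 3.0 are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def mahalanobis_filter(pairs, threshold=3.0):
    """Drop head/tail value pairs whose Mahalanobis distance from the
    bulk of the pairs exceeds `threshold` (illustrative cutoff), so
    that outliers cannot mask the variogram structure."""
    mu = pairs.mean(axis=0)
    cov = np.cov(pairs.T)
    inv = np.linalg.inv(cov)
    diff = pairs - mu
    d2 = np.einsum("ij,jk,ik->i", diff, inv, diff)   # quadratic form per row
    return pairs[np.sqrt(d2) <= threshold]

rng = np.random.default_rng(1)
# Well-correlated pairs, as expected at a short lag...
base = rng.multivariate_normal([10.0, 10.0], [[1.0, 0.8], [0.8, 1.0]],
                               size=200)
# ...plus two outliers that would inflate the variogram at that lag.
outliers = np.array([[10.0, 25.0], [25.0, 10.0]])
pairs = np.vstack([base, outliers])
clean = mahalanobis_filter(pairs)
print(len(pairs) - len(clean) >= 2)
```

After filtering, each lag's variogram value would be recomputed from the retained pairs only, restoring the spatial continuity the outliers had destructured.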


9.
Spatial data uncertainty models (SDUM) are necessary tools that quantify the reliability of results from geographical information system (GIS) applications. One technique used by SDUM is Monte Carlo simulation, a technique that quantifies spatial data and application uncertainty by determining the possible range of application results. A complete Monte Carlo SDUM for generalized continuous surfaces typically has three components: an error magnitude model, a spatial statistical model defining error shapes, and a heuristic that creates multiple realizations of error fields added to the generalized elevation map. This paper introduces a spatial statistical model that represents multiple statistics simultaneously and weighted against each other. This paper's case study builds a SDUM for a digital elevation model (DEM). The case study accounts for relevant shape patterns in elevation errors by reintroducing specific topological shapes, such as ridges and valleys, in appropriate localized positions. The spatial statistical model also minimizes topological artefacts, such as cells without outward drainage and inappropriate gradient distributions, which are frequent problems with random field-based SDUM. Multiple weighted spatial statistics enable two conflicting SDUM philosophies to co-exist. The two philosophies are ‘errors are only measured from higher quality data’ and ‘SDUM need to model reality’. This article uses an automatic parameter fitting random field model to initialize Monte Carlo input realizations followed by an inter-map cell-swapping heuristic to adjust the realizations to fit multiple spatial statistics. The inter-map cell-swapping heuristic allows spatial data uncertainty modelers to choose the appropriate probability model and weighted multiple spatial statistics which best represent errors caused by map generalization. This article also presents a lag-based measure to better represent gradient within a SDUM. 
This article covers the inter-map cell-swapping heuristic as well as both probability and spatial statistical models in detail.

10.

Delineation of facies in the subsurface and quantification of uncertainty in their boundaries are significant steps in mineral resource evaluation and reservoir modeling, which impact downstream analyses of a mining or petroleum project. This paper investigates the ability of nonparametric geostatistical simulation algorithms (sequential indicator, single normal equation and filter-based simulation) to construct realizations that reproduce some expected statistical and spatial features, namely facies proportions, boundary regularity, contact relationships and spatial correlation structure, as well as the expected fluctuations of these features across the realizations. The investigation is held through a synthetic case study and a real case study, in which a pluri-Gaussian model is considered as the reference for comparing the simulation results. Sequential indicator simulation and single normal equation simulation based on over-restricted neighborhood implementations yield the poorest results, followed by filter-based simulation, whereas single normal equation simulation with a large neighborhood implementation provides results that are closest to the reference pluri-Gaussian model. However, some biases and inaccurate fluctuations in the realization statistics (facies proportions, indicator direct and cross-variograms) still arise, which can be explained by the use of a single finite-size training image to construct the realizations.


11.
Spatial uncertainty analysis is a complex and difficult task for orebody estimation in the mining industry. Conventional models (kriging and its variants) with variogram-based statistics fail to capture the spatial complexity of an orebody. As a result, grade and tonnage are incorrectly estimated, leading to inaccurate mine plans and costly financial decisions. Multiple-point geostatistical simulation can overcome the limitations of conventional two-point spatial models. In this study, a multiple-point geostatistical method, SNESIM, was applied to generate multiple equiprobable orebody models for a copper deposit in Africa and to analyze the uncertainty in the ore tonnage of the deposit. Grade uncertainty was evaluated by sequential Gaussian simulation within each equiprobable orebody model. The results were validated by reproducing the marginal distribution and two- and three-point statistics. Deviations in the volume of the simulated orebody models vary from −3 to 5% compared to the training image. The grade simulations show that the average grades of the different realizations vary from 3.77 to 4.92%, with an overall average grade of 4.33%. The results also show that the volume and grade uncertainty model overestimates the orebody volume compared to the conventional orebody model. This study demonstrates that incorporating grade and volume uncertainty leads to significant changes in resource estimates.

12.

Mineral resource classification plays an important role in the downstream activities of a mining project. Spatial modeling of the grade variability in a deposit directly impacts the evaluation of recovery functions, such as the tonnage, metal quantity and mean grade above cutoffs. The use of geostatistical simulations for this purpose is becoming popular among practitioners because they reproduce the statistical parameters of the sample dataset, both globally (e.g., histograms) and locally (e.g., variograms). Conditional simulations can also be assessed to quantify the uncertainty within the blocks. In this sense, mineral resource classification based on the obtained realizations allows reliable computation of recovery functions, showing the worst and best scenarios. However, applying the proper geostatistical (co)-simulation algorithms is critical when modeling variables with strong cross-correlation structures. In this context, enhanced approaches such as projection pursuit multivariate transforms (PPMTs) are highly desirable. In this paper, the mineral resources in an iron ore deposit are computed and categorized employing the PPMT method, and the outputs are compared with conventional (co)-simulation methods for the reproduction of statistical parameters and for the calculation of tonnage at different levels of cutoff grades. The results show that the PPMT outperforms conventional (co)-simulation approaches not only in terms of local and global cross-correlation reproduction between the two underlying grades (Fe and Al2O3) in this iron deposit but also in terms of mineral resource categories according to the Joint Ore Reserves Committee standard.
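The recovery functions named above (tonnage, metal quantity, mean grade above a cutoff) can be computed per realization and then summarized across realizations to expose the worst and best scenarios. The grades, block tonnage, and cutoff below are hypothetical.

```python
import random, statistics

def recovery_functions(block_grades, cutoff, block_tonnes=1000.0):
    """Tonnage, contained metal, and mean grade above `cutoff` for one
    realization (block tonnage is a hypothetical constant)."""
    above = [g for g in block_grades if g >= cutoff]
    tonnage = len(above) * block_tonnes
    mean_grade = statistics.mean(above) if above else 0.0
    metal = tonnage * mean_grade / 100.0     # grades expressed in percent
    return tonnage, metal, mean_grade

random.seed(42)
# Hypothetical Fe grades (%) for 500 blocks in each of 20 realizations.
realizations = [[random.gauss(55.0, 5.0) for _ in range(500)]
                for _ in range(20)]
tonnages = [recovery_functions(r, cutoff=58.0)[0] for r in realizations]
# Worst and best scenarios across realizations, as in the abstract.
print(min(tonnages) <= statistics.mean(tonnages) <= max(tonnages))
```

Repeating this over a sweep of cutoffs yields the grade-tonnage curves at different cutoff levels that the paper uses to compare PPMT against conventional (co)-simulation.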


13.
This article describes a proposed work-sequence to generate accurate reservoir-architecture models, describing the geometry of bounding surfaces (i.e., fault locations and extents), of a structurally complex geologic setting in the Jeffara Basin (South East Tunisia) by means of geostatistical modeling. This uses the variogram as the main tool to measure the spatial variability of the studied geologic medium before making any estimation or simulation. However, it is not always easy to fit complex experimental variograms to theoretical models. Thus, our primary purpose was to establish a relationship between the geology and the components of the variograms in order to fit a mathematically consistent and geologically interpretable variogram model for improved predictions of surface geometries. We used a three-step approach based on available well data and seismic information. First, we determined the structural framework: a seismo-tectonic data analysis was carried out, and we showed that the study area is cut mainly by NW–SE-trending normal faults, which were classified according to geometric criteria (strike, throw magnitude, dip, and dip direction). We showed that these normal faults are at the origin of a large-scale trend structure (surfaces tilted toward the north-east). At a smaller scale, the normal faults create a distinct compartmentalization of the reservoirs. Then, a model of the reservoir system architecture was built by geostatistical methods, and an efficient methodology was developed to estimate the bounding faulted surfaces of the reservoir units. Emphasis was placed on (i) elaborating a methodology for variogram interpretation and modeling, whereby the importance of each variogram component is assessed in terms of the probable geologic factor controlling the behavior of each structure; and (ii) integrating the relevant fault characteristics, deduced from the previous fault classification analysis, as constraints in the kriging estimation of bounding surfaces to best reflect the geologic structure of the study area. Finally, the estimated bounding surfaces, together with seismic data and variogram interpretations, were used to obtain further insights into the tectonic evolution of the study area that produced the current reservoir configuration.

14.

Incorporating locally varying anisotropy (LVA) in geostatistical modeling improves estimates for structurally complex domains where a single set of anisotropic parameters modeled globally do not account for all geological features. In this work, the properties of two LVA-geostatistical modeling frameworks are explored through application to a complexly folded gold deposit in Ghana. The inference of necessary parameters is a significant requirement of geostatistical modeling with LVA; this work focuses on the case where LVA orientations, derived from expert geological interpretation, are used to improve the grade estimates. The different methodologies for inferring the required parameters in this context are explored. The results of considering different estimation frameworks and alternate methods of parameterization are evaluated with a cross-validation study, as well as visual inspection of grade continuity along select cross sections. Results show that stationary methodologies are outperformed by all LVA techniques, even when the LVA framework has minimal guidance on parameterization. Findings also show that additional improvements are gained by considering parameter inference where the LVA orientations and point data are used to infer the local range of anisotropy. Considering LVA for geostatistical modeling of the deposit considered in this work results in better reproduction of curvilinear geological features.


15.
The factors determining the suitability of limestone for industrial use and its commercial value are the amounts of calcium oxide (CaO) and impurities. From 244 sample points in 18 drillhole sites in a limestone mine, southwestern Japan, data on four impurity elements, SiO2, Fe2O3, MnO, and P2O5 were collected. It generally is difficult to estimate spatial distributions of these contents, because most of the limestone bodies in Japan are located in the accretionary complex lithologies of Paleozoic and Mesozoic age. Because the spatial correlations of content data are not clearly shown by variogram analysis, a feedforward neural network was applied to estimate the content distributions. The network structure consists of three layers: input, middle, and output. The input layer has 17 neurons and the output layer four. Three neurons in the input layer correspond with x, y, z coordinates of a sample point and the others are rock types such as crystalline and conglomeratic limestones, and fossil types related to the geologic age of the limestone. Four neurons in the output layer correspond to the amounts of SiO2, Fe2O3, MnO, and P2O5. Numbers of neurons in the middle layer and training data differ with each estimation point to avoid the overfitting of the network. We could detect several important characteristics of the three-dimensional content distributions through the network such as a continuity of low content zones of SiO2 along a Lower Permian fossil zone trending NE-SW, and low-quality zones located in depths shallower than 50 m. The capability of the neural network-based method compared with the geostatistical method is demonstrated from the viewpoints of estimation errors and spatial characteristics of multivariate data. To evaluate the uncertainty of estimates, a method that draws several outputs by changing coordinates slightly from the target point and inputting them to the same trained network is proposed. 
Uncertainty differs with impurity elements, and is not based on just the spatial arrangement of data points.

16.

This paper proposes a new approach to the mining exploration drillholes positioning problem (DPP) that incorporates both geostatistical and optimization techniques. A metaheuristic was developed to solve the DPP taking into account an uncertainty index that quantifies the reliability of the current interpretation of the mineral deposit. The uncertainty index was calculated from multiple deposit realizations obtained by truncated Gaussian simulations conditional to the available drillholes samplings. A linear programming model was defined to select the subset of future drillholes that maximizes coverage of the uncertainty. A Tabu Search algorithm was developed to solve large instances of this set partitioning problem. The proposed Tabu Search algorithm is shown to provide good quality solutions approaching 95% of the optimal solution in a reasonable computing time, allowing close to optimal coverage of uncertainty for a fixed investment in drilling.
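A greedy stand-in for the paper's LP/Tabu Search formulation, shown only to illustrate the coverage objective: each candidate hole is assumed to cover a hypothetical set of high-uncertainty blocks, and holes are picked one at a time to maximize newly covered blocks.

```python
def greedy_drillhole_selection(candidates, budget):
    """Pick `budget` candidate drillholes maximizing covered
    high-uncertainty blocks (greedy approximation of the set-cover
    objective; the real DPP solution uses LP plus Tabu Search)."""
    chosen, covered = [], set()
    for _ in range(budget):
        # Take the hole that covers the most not-yet-covered blocks.
        best = max(candidates, key=lambda c: len(candidates[c] - covered))
        chosen.append(best)
        covered |= candidates[best]
    return chosen, covered

# Hypothetical coverage sets: block IDs whose uncertainty index each
# candidate hole would resolve (in the paper these come from multiple
# truncated Gaussian realizations conditioned to existing drillholes).
candidates = {
    "DH1": {1, 2, 3, 4},
    "DH2": {3, 4, 5},
    "DH3": {6, 7},
    "DH4": {1, 6},
}
chosen, covered = greedy_drillhole_selection(candidates, budget=2)
print(sorted(chosen), len(covered))
```

For the maximum-coverage objective, greedy selection carries a classical (1 − 1/e) approximation guarantee, which is one reason metaheuristics like Tabu Search are benchmarked against it on large instances.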


17.
A fundamental task for petroleum exploration decision-making is to evaluate the uncertainty of well outcomes. The recent development of geostatistical simulation techniques provides an effective means for generating a full uncertainty model for any random variable. Sequential indicator simulation has been used as a tool to generate alternate, equiprobable stochastic models, from which various representations of uncertainty can be created. These results can be used as input for the quantification of various risks associated with a wildcat drilling program or the estimation of petroleum resources. A simple case study is given to demonstrate the use of sequential indicator simulation. The data involve a set of wildcat wells in a gas play. The multiple simulated stochastic models are then post-processed to characterize various uncertainties associated with drilling outcomes.

18.
This paper is concerned with the problem of predicting the surface elevation of the Braden breccia pipe at the El Teniente mine in Chile. This mine is one of the world's largest and most complex porphyry-copper ore systems. As the pipe surface constitutes the limit of the deposit and the mining operation, predicting it accurately is important. The problem is tackled by applying a geostatistical approach based on closed-form non-stationary covariance functions with locally varying anisotropy. This approach relies on the mild assumption of local stationarity and involves a kernel-based experimental local variogram, a weighted local least-squares method for the inference of local covariance parameters, and a kernel smoothing technique for knitting the local covariance parameters together for kriging purposes. According to the results, this non-stationary geostatistical method outperforms the traditional stationary geostatistical method in terms of prediction and prediction-uncertainty accuracy.

19.
In this study, we demonstrate a novel use of comaps to explore spatially the performance, specification and parameterisation of a non-stationary geostatistical predictor. The comap allows the spatial investigation of the relationship between two geographically referenced variables via conditional distributions. Rather than investigating bivariate relationships in the study data, we use comaps to investigate bivariate relationships in the key outputs of a spatial predictor. In particular, we calibrate moving window kriging (MWK) models, where a local variogram is found at every target location. This predictor has often proved worthy for processes that are heterogeneous, and most standard (global variogram) kriging algorithms can be adapted in this manner. We show that the use of comaps enables a better understanding of our chosen MWK models, which in turn allows a more informed choice when selecting one MWK specification over another. As case studies, we apply four variants of MWK to two heterogeneous example data sets: (i) freshwater acidification critical load data for Great Britain and (ii) London house price data. As both of these data sets are strewn with local anomalies, three of our chosen models are robust (and novel) extensions of MWK, where at least one of which is shown to perform better than a non-robust counterpart.
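The key ingredient of MWK, a variogram estimated only from the data inside a window around the target location, can be sketched in 1-D. The heterogeneous field and every setting below are synthetic; a full MWK implementation would fit a model to each local variogram and krige with it.

```python
import numpy as np

def local_variogram(x, z, center, window, lags, tol):
    """Experimental variogram computed only from points within
    `window` of `center` -- the moving-window step of MWK
    (1-D; window, lags, and tolerance are illustrative)."""
    mask = np.abs(x - center) <= window
    xs, zs = x[mask], z[mask]
    d = np.abs(xs[:, None] - xs[None, :])
    gammas = []
    for h in lags:
        pair = np.abs(d - h) <= tol                  # pairs near lag h
        diffs = (zs[:, None] - zs[None, :])[pair]
        gammas.append(0.5 * np.mean(diffs ** 2) if diffs.size else np.nan)
    return np.array(gammas)

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0.0, 100.0, 400))
# Heterogeneous process: smooth on the left half, noisy on the right.
z = np.where(x < 50.0, np.sin(x / 5.0), rng.standard_normal(400))
g_left = local_variogram(x, z, center=25.0, window=20.0, lags=[2.0], tol=1.0)
g_right = local_variogram(x, z, center=75.0, window=20.0, lags=[2.0], tol=1.0)
# The two local variograms differ sharply, which is why a single
# global variogram would misrepresent this heterogeneous field.
print(g_left[0] < g_right[0])
```

Comparing such local variograms across target locations is exactly the kind of predictor output the comaps above are used to explore.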

20.
Additional Samples: Where They Should Be Located
Information for mine planning needs to be more closely spaced than the grid used for exploration and resource assessment. The additional samples collected during quasi-mining are usually located on the same pattern as the original diamond drillhole grid, only more closely spaced. This procedure is not the best way, in a mathematical sense, of selecting a location. The impact of additional information on reducing uncertainty about the parameter being modeled is not the same everywhere within the deposit: some locations are more effective at reducing local and global uncertainty than others. This study introduces a methodology for selecting additional sample locations based on stochastic simulation. The procedure takes into account data variability and spatial location. Multiple equally probable models representing a geological attribute are generated via geostatistical simulation; these models share essentially the same histogram and the same variogram obtained from the original data set. At each block belonging to the model, a value is obtained from each of the n simulations, and their combination allows one to assess local variability. Variability is measured using a proposed uncertainty index, which is used to map zones of high variability. A value extracted from a given simulation is added to the original data set in a zone identified as erratic in the previous maps. The process of adding samples and simulating is repeated, and the benefit of the additional sample is evaluated in terms of uncertainty reduction, both locally and globally. The procedure proved to be robust and theoretically sound, mapping zones where additional information is most beneficial. A case study in a coal mine using coal seam thickness illustrates the method.
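The local uncertainty index idea can be sketched as follows. The paper proposes its own index, so the coefficient of variation used here, like the simulated block values, is only an illustrative stand-in: the next sample goes to the block where the simulations disagree most.

```python
import random, statistics

def uncertainty_index(simulated_values):
    """Coefficient-of-variation style index of local uncertainty across
    the n simulated values of one block (an illustrative choice; the
    paper defines its own index)."""
    m = statistics.mean(simulated_values)
    return statistics.pstdev(simulated_values) / m if m else float("inf")

random.seed(7)
n_sims, blocks = 50, ["A", "B", "C"]
# Hypothetical coal-seam thickness simulations (m): block B is erratic.
spread = {"A": 0.2, "B": 1.5, "C": 0.4}
sims = {b: [random.gauss(3.0, spread[b]) for _ in range(n_sims)]
        for b in blocks}
# Locate the next additional sample where the uncertainty index peaks.
target = max(blocks, key=lambda b: uncertainty_index(sims[b]))
print(target)
```

In the full method this step is iterated: a value is added at the chosen location, the simulations are rerun, and the drop in the index measures the benefit of that sample locally and globally.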
