Similar Documents
1.
Markov Chain Random Fields for Estimation of Categorical Variables
Multi-dimensional Markov chain conditional simulation (or interpolation) models have the potential to predict and simulate categorical variables more accurately from sample data because they can incorporate interclass relationships. This paper introduces a Markov chain random field (MCRF) theory for building one- to multi-dimensional Markov chain models for conditional simulation (or interpolation). An MCRF is defined as a single spatial Markov chain that moves (or jumps) through a space, with its conditional probability distribution at each location depending entirely on its nearest known neighbors in different directions. A general solution for the conditional probability distribution of a random variable in an MCRF is derived explicitly from Bayes' theorem and a conditional independence assumption. One- to multi-dimensional Markov chain models for prediction and conditional simulation of categorical variables follow from the general solution; MCRF-based multi-dimensional Markov chain models are nonlinear.
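A minimal sketch of the normalized product form that Bayes' theorem plus conditional independence yields: one transition matrix carries the chain from its "incoming" nearest neighbor, and the remaining matrices link the unknown location to the other neighbors. Single-step matrices per direction are a simplification (the paper uses distance-dependent transiograms), and the toy matrices and classes below are illustrative:

```python
import numpy as np

def mcrf_conditional(p_from, p_to_list, c_from, c_to):
    """Conditional class distribution at an unsampled location in an MCRF.

    p_from    : (K, K) transition matrix from the incoming neighbor
    p_to_list : list of (K, K) transition matrices toward other neighbors
    c_from    : class observed at the incoming neighbor
    c_to      : list of classes observed at the other neighbors
    """
    w = p_from[c_from, :].copy()          # p_{c_from, k} for every class k
    for P, c in zip(p_to_list, c_to):
        w *= P[:, c]                      # times p_{k, c} for each direction
    return w / w.sum()                    # normalization from Bayes' theorem

# Toy example: 3 classes, one horizontal and one vertical transiogram.
ph = np.array([[.6, .3, .1], [.2, .6, .2], [.1, .3, .6]])
pv = np.array([[.5, .4, .1], [.3, .5, .2], [.2, .2, .6]])
print(mcrf_conditional(ph, [pv], c_from=0, c_to=[2]))
```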

2.
Uncertainty in future reservoir performance is usually evaluated from the simulated performance of a small number of reservoir realizations. Unfortunately, most practical methods for generating realizations conditional to production data are only approximately correct. It has not been known whether the recently developed Gradual Deformation method is likewise approximate or whether it actually generates correctly distributed realizations. In this paper, we evaluate the ability of the Gradual Deformation method to correctly assess the uncertainty in reservoir predictions by comparing the distribution of conditional realizations for a small test problem with the reference distribution from a Markov chain Monte Carlo (MCMC) method, which is known to be correct, and with distributions from several approximate methods. Although the Gradual Deformation algorithm samples inefficiently for this test problem and is clearly not an exact method, it gives uncertainty estimates similar to those obtained by the MCMC method based on a relatively small number of realizations.
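The core of gradual deformation can be sketched in a few lines: two independent Gaussian realizations are combined as z(t) = z1 cos(πt) + z2 sin(πt), which stays Gaussian for any t because cos² + sin² = 1, and a one-dimensional search over t selects the best data match. The "data" and misfit below are stand-ins for real production data and a simulator:

```python
import numpy as np

rng = np.random.default_rng(0)
z1, z2 = rng.standard_normal((2, 100))   # two independent N(0,1) realizations

def deform(t):
    """Gradual-deformation combination; variance is preserved for any t."""
    return z1 * np.cos(np.pi * t) + z2 * np.sin(np.pi * t)

# One deformation step: search t for the best match to (stand-in) data;
# chaining such searches with fresh z2 draws explores the realization space.
data = rng.standard_normal(100)
ts = np.linspace(-1, 1, 201)
best_t = min(ts, key=lambda t: np.sum((deform(t) - data) ** 2))
print(best_t, np.sum((deform(best_t) - data) ** 2))
```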

3.
Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been simulated annealing, in which practitioners attempt to minimize the difference between the "true" and simulated production data, and between the "true" and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear, and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.
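A minimal random-walk Metropolis sketch of the underlying principle: accepted states are asymptotically samples from the posterior, which is what makes the resulting uncertainty estimate correct. A linear forward model and Gaussian noise stand in for the reservoir simulator and pressure data; all dimensions and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in pieces: a linear "simulator" G and Gaussian errors replace the
# reservoir simulator and real pressure data.
n = 20
G = rng.standard_normal((5, n))
d_obs = G @ rng.standard_normal(n) + 0.1 * rng.standard_normal(5)
sigma = 0.1

def log_post(m):
    """Log posterior: N(0, I) prior on m plus Gaussian data misfit."""
    r = d_obs - G @ m
    return -0.5 * m @ m - 0.5 * (r @ r) / sigma**2

m = np.zeros(n)
lp = log_post(m)
samples = []
for it in range(20000):
    m_prop = m + 0.05 * rng.standard_normal(n)   # random-walk proposal
    lp_prop = log_post(m_prop)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis acceptance rule
        m, lp = m_prop, lp_prop
    samples.append(m.copy())
print(np.mean(samples[5000:], axis=0)[:3])       # posterior mean, first 3 comps
```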

4.
This paper extends the two-dimensional coupled Markov chain model developed by Elfeki and Dekking (2001) and supplements it with extensive simulations. We focus on the development of various coupled Markov chain models: the so-called fully forward, fully backward, and forward–backward Markov chain models. We address many issues: sensitivity analysis of optimal sampling intervals in the horizontal and lateral directions, directional dependency, use of Walther's law to describe lateral variability, the effect of the number of conditioning boreholes on model performance, stability of the Monte Carlo realizations, various implementation strategies, use of cross-validation techniques to evaluate model performance, and image division for statistically non-homogeneous deposits. Applications are made to three sites: two in the Netherlands and one in the USA. The purpose of these applications is to show under which conditions the Markov models can be used and to provide some guidelines for practice. Entropy maps are good tools for indicating places of high uncertainty and so can be used to design sampling networks that reduce uncertainty at those locations. Symmetric and diagonally dominant horizontal transition probabilities with a proper sampling interval give plausible results (consistent with geologists' predictions) in terms of delineating heterogeneous subsurface structures. Walther's law can be utilised with a proper sampling interval to account for lateral variability.
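As an unconditional illustration of the fully forward variant, the sketch below conditions each cell on its left and top neighbors by multiplying one horizontal and one vertical transition row; the transition matrices and the boundary cells (standing in for borehole data) are toy values, not estimates from the paper's sites:

```python
import numpy as np

rng = np.random.default_rng(2)
K = 3
ph = np.array([[.8, .1, .1], [.1, .8, .1], [.1, .1, .8]])  # horizontal transitions
pv = np.array([[.7, .2, .1], [.2, .7, .1], [.1, .2, .7]])  # vertical transitions

ny, nx = 30, 50
grid = np.zeros((ny, nx), dtype=int)
grid[0, :] = rng.integers(K, size=nx)   # top boundary row
grid[:, 0] = rng.integers(K, size=ny)   # left boundary column

# Fully forward coupled chain: the two one-dimensional chains are coupled by
# multiplying their transition rows and renormalizing.
for i in range(1, ny):
    for j in range(1, nx):
        w = ph[grid[i, j-1], :] * pv[grid[i-1, j], :]
        grid[i, j] = rng.choice(K, p=w / w.sum())

print(np.bincount(grid.ravel(), minlength=K))   # facies proportions
```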

5.
Spatial uncertainty modelling is a complex and challenging job in orebody modelling for mining, reservoir characterization in petroleum, and contamination modelling in air and water. Stochastic simulation algorithms are popular methods for such modelling. In this paper, a discrete wavelet transform (DWT)-based multiple-point simulation algorithm for continuous variables is proposed that handles multi-scale spatial characteristics in datasets and training images. The DWT of a training image provides multi-scale high-frequency wavelet images and one low-frequency scaling image at the coarsest scale. The simulation is performed in the frequency (wavelet) domain, where the scaling image and the wavelet images across scales are simulated jointly. The inverse DWT reconstructs simulated realizations of the attribute of interest in the space domain. An automatic scale-selection algorithm using the dominant mode difference is applied to select the optimal scale of wavelet decomposition. The proposed algorithm reduces the computational time required for simulating large domains compared to a spatial-domain multiple-point simulation algorithm. The algorithm is tested with an exhaustive dataset using conditional and unconditional simulation on two- and three-dimensional fluvial reservoir and mining blasted rock data. The realizations generated by the proposed algorithm perform well and reproduce the statistics of the training image. A comparison with the spatial-domain FILTERSIM multiple-point simulation algorithm suggests that the proposed algorithm generates equally good realizations at lower computational cost.
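The transform scaffolding such an algorithm rests on can be sketched with PyWavelets. Only the decomposition/reconstruction round trip is shown, not the joint multiple-point simulation of the coefficients; the Haar wavelet, the random stand-in training image, and the two-level decomposition are arbitrary choices:

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(3)
ti = rng.random((64, 64))                 # stand-in for a training image

# Multi-scale decomposition: one coarse scaling image plus per-scale
# (horizontal, vertical, diagonal) wavelet images.
coeffs = pywt.wavedec2(ti, wavelet='haar', level=2)
cA2, details = coeffs[0], coeffs[1:]
print(cA2.shape, [tuple(d.shape for d in lvl) for lvl in details])

# The inverse DWT maps (simulated) coefficients back to the space domain;
# here the original coefficients are round-tripped as a correctness check.
recon = pywt.waverec2(coeffs, wavelet='haar')
print(np.allclose(recon, ti))
```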

6.
Spatially distributed and varying natural phenomena encountered in geoscience and engineering problem solving are typically incompatible with Gaussian models, exhibiting nonlinear spatial patterns and complex, multiple-point connectivity of extreme values. Stochastic simulation of such phenomena has historically been founded on second-order spatial statistical approaches, which are limited in their capacity to model complex spatial uncertainty. The newer multiple-point (MP) simulation framework addresses past limits by establishing the concept of a training image, but arguably has drawbacks of its own. An alternative to current MP approaches is founded upon new high-order measures of spatial complexity, termed "high-order spatial cumulants." These are combinations of moments of statistical parameters that characterize non-Gaussian random fields and can describe complex spatial information. Stochastic simulation of complex spatial processes is developed based on high-order spatial cumulants in the high-dimensional space of Legendre polynomials. Starting with discrete Legendre polynomials, a set of discrete orthogonal cumulants is introduced as a tool to characterize spatial shapes. Weighted orthonormal Legendre polynomials define the so-called Legendre cumulants, which are high-order conditional spatial cumulants inferred from training images and combined with available sparse data sets. Advantages of the high-order sequential simulation approach developed herein include the absence of any distribution-related assumptions and of pre- or post-processing steps. The method is shown to generate realizations of complex spatial patterns and to reproduce bimodal data distributions, data variograms, and high-order spatial cumulants of the data. In addition, it is shown that the available hard data dominate the simulation process and have a definitive effect on the simulated realizations, whereas the training images are only used to fill in high-order relations that cannot be inferred from the data. Compared to the MP framework, the proposed approach is data-driven and consistently reconstructs the lower-order spatial complexity in the data used, in addition to the high-order complexity.
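An experimental third-order spatial cumulant, which for a zero-mean field coincides with the third moment E[z(u) z(u+h1) z(u+h2)], can be estimated from a gridded field with a template of two lag vectors. This generic estimator is illustrative only and is not the paper's Legendre-domain machinery:

```python
import numpy as np

def third_order_cumulant(z, h1, h2):
    """Estimate E[z(u) z(u+h1) z(u+h2)] for a centered 2-D field;
    h1, h2 are (dy, dx) lag vectors on the grid."""
    z = z - z.mean()
    ny, nx = z.shape
    dys = [0, h1[0], h2[0]]
    dxs = [0, h1[1], h2[1]]
    # Overlap window where all three shifted copies stay on the grid.
    y0, y1 = -min(dys), ny - max(dys)
    x0, x1 = -min(dxs), nx - max(dxs)
    prod = np.ones((y1 - y0, x1 - x0))
    for dy, dx in zip(dys, dxs):
        prod *= z[y0 + dy:y1 + dy, x0 + dx:x1 + dx]
    return prod.mean()

rng = np.random.default_rng(4)
field = rng.random((100, 100))
print(third_order_cumulant(field, (0, 5), (5, 0)))
```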

7.
Comparing Training-Image Based Algorithms Using an Analysis of Distance
As additional multiple-point statistical (MPS) algorithms are developed, there is an increasing need for scientifically sound ways to compare them, beyond the usual visual comparison or simple metrics such as connectivity measures. We start from the general observation that any geostatistical simulation algorithm (not just MPS) represents two types of variability: (1) the within-realization variability, namely that realizations reproduce a spatial continuity model (variogram, Boolean, or training-image based), and (2) the between-realization variability, representing a model of spatial uncertainty. It is argued that any comparison of algorithms needs, at a minimum, to be based on these two randomizations. In fact, for certain MPS algorithms, different examples illustrate that there is often a trade-off: increased pattern reproduction entails reduced spatial uncertainty. We make the subjective choice that the best algorithm is the one that maximizes pattern reproduction while at the same time maximizing spatial uncertainty. The discussion is limited to fairly standard multiple-point algorithms; the method does not necessarily apply to more recent or possibly future developments. To render these fundamental principles quantitative, this paper relies on a distance-based measure for both within-realization variability (pattern reproduction) and between-realization variability (spatial uncertainty). The method is shown to be efficient and effective for two-dimensional, three-dimensional, continuous, and discrete training images.
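The two randomizations can both be summarized from one distance computation. In this sketch each realization is reduced to a feature vector (a placeholder for pattern statistics such as a pattern histogram): within-realization variability is measured as distance to the training-image statistics, between-realization variability as the spread of pairwise distances in the ensemble.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(5)

# Stand-ins: 30 realizations summarized by 64 pattern statistics each,
# plus the corresponding training-image statistics.
reals = rng.random((30, 64))
ti_stats = rng.random(64)

# Within-realization variability: pattern reproduction (smaller = better).
within = np.linalg.norm(reals - ti_stats, axis=1)

# Between-realization variability: spatial uncertainty (larger = more spread).
between = squareform(pdist(reals))

print(within.mean(), between[np.triu_indices(30, k=1)].mean())
```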

8.
Bayesian lithology/fluid inversion—comparison of two algorithms
Algorithms for the inversion of prestack seismic AVO data into lithology-fluid classes along a vertical profile are evaluated. The inversion is defined in a Bayesian setting where the prior model for the lithology-fluid classes is a Markov chain and the likelihood model relates seismic data and elastic material properties to these classes. The likelihood model is approximated such that the posterior model can be calculated recursively using the extremely efficient forward–backward algorithm. The impact of the approximation in the likelihood model is evaluated empirically by comparing results from the approximate approach with results generated from the exact posterior model. The exact posterior is assessed by sampling, using a sophisticated Markov chain Monte Carlo simulation algorithm; the simulation algorithm is iterative and requires considerable computer resources. Seven realistic evaluation models are defined, from which synthetic seismic data are generated. Using identical seismic data, the approximate marginal posterior is calculated and the exact marginal posterior is assessed. It is concluded that the approximate likelihood model preserves 50% to 90% of the information content of the exact likelihood model.
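A minimal sketch of the forward–backward recursion for a Markov-chain prior, assuming the likelihood approximation factorizes over positions (the structure that makes the recursive computation possible); the two classes, transitions, and random likelihoods below are toy values:

```python
import numpy as np

def forward_backward(pi0, P, lik):
    """Marginal posteriors p(x_t = k | all data) for a Markov-chain prior.

    pi0 : (K,) initial distribution,  P : (K, K) transition matrix,
    lik : (T, K) likelihood of the data at each position given each class.
    """
    T, K = lik.shape
    alpha = np.zeros((T, K))
    beta = np.ones((T, K))
    alpha[0] = pi0 * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                      # forward pass (scaled)
        alpha[t] = (alpha[t-1] @ P) * lik[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):             # backward pass (scaled)
        beta[t] = P @ (lik[t+1] * beta[t+1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

P = np.array([[.9, .1], [.2, .8]])             # e.g. shale <-> gas sand
lik = np.random.default_rng(6).random((50, 2))
print(forward_backward(np.array([.5, .5]), P, lik)[:3])
```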

9.
In earth and environmental science applications, uncertainty analysis of the outputs of models whose parameters are spatially varying (or spatially distributed) is often performed in a Monte Carlo framework. In this context, alternative realizations of the spatial distribution of model inputs, typically conditioned to reproduce attribute values at locations where measurements are obtained, are generated via geostatistical simulation using simple random (SR) sampling. The environmental model under consideration is then evaluated using each of these realizations as a plausible input, in order to construct a distribution of plausible model outputs for uncertainty analysis purposes. In hydrogeological investigations, for example, conditional simulations of saturated hydraulic conductivity are used as input to physically based simulators of flow and transport to evaluate the associated uncertainty in the spatial distribution of solute concentration. Realistic uncertainty analysis via SR sampling, however, requires a large number of simulated attribute realizations for the model inputs in order to yield a representative distribution of model outputs; this often hinders the application of uncertainty analysis due to the computational expense of evaluating complex environmental models. Stratified sampling methods, including variants of Latin hypercube sampling, constitute more efficient sampling alternatives, often resulting in a more representative distribution of model outputs (e.g., solute concentration) with fewer model input realizations (e.g., hydraulic conductivity), thus reducing the computational cost of uncertainty analysis. The application of stratified and Latin hypercube sampling in a geostatistical simulation context, however, is not widespread and, apart from a few exceptions, has been limited to the unconditional simulation case. This paper proposes methodological modifications for adapting existing stratified sampling methods (including Latin hypercube sampling), employed to date in an unconditional geostatistical simulation context, to the efficient conditional simulation of Gaussian random fields. The proposed conditional simulation methods are compared to traditional geostatistical simulation based on SR sampling, in the context of a hydrogeological flow and transport model, via a synthetic case study. The results indicate that stratified sampling methods (including Latin hypercube sampling) are more efficient than SR, reproducing statistics of the conductivity (and subsequently concentration) fields to a similar extent overall, yet with smaller sampling variability. These findings suggest that the proposed efficient conditional sampling methods could contribute to the wider application of uncertainty analysis in spatially distributed environmental models using geostatistical simulation.
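The contrast between simple random and Latin hypercube sampling of the normal deviates that drive a Gaussian-field simulator can be shown directly with SciPy: LHS stratifies each marginal, so even a small ensemble covers every equal-probability stratum exactly once, while SR leaves duplicates and gaps. Sample sizes and dimensions below are arbitrary:

```python
import numpy as np
from scipy.stats import norm, qmc

n, d = 16, 4                                      # realizations, dimensions
u = qmc.LatinHypercube(d=d, seed=7).random(n)     # LHS points in [0, 1)^d
z_lhs = norm.ppf(u)                               # stratified normal scores
z_sr = np.random.default_rng(7).standard_normal((n, d))  # simple random

# Each LHS column hits each of the n strata exactly once (0..15 below);
# the SR column typically shows repeated and missing strata.
print(np.sort((u[:, 0] * n).astype(int)))
print(np.sort((norm.cdf(z_sr[:, 0]) * n).astype(int)))
```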

10.
Representing Spatial Uncertainty Using Distances and Kernels
Assessing the uncertainty of a spatial phenomenon requires the analysis of a large number of parameters, which must be processed by a transfer function. To capture the possibly wide range of uncertainty in the transfer-function response, a large set of geostatistical model realizations needs to be processed. Stochastic spatial simulation can rapidly provide multiple, equally probable realizations. However, since the transfer function is often computationally demanding, only a small number of models can be evaluated in practice, and these are usually selected through a ranking procedure. Traditional ranking techniques for selecting probabilistic ranges of response (P10, P50, and P90) are highly dependent on the static property used. In this paper, we propose to parameterize the spatial uncertainty represented by a large set of geostatistical realizations through a distance function measuring the "dissimilarity" between any two realizations. The distance function allows a mapping of the space of uncertainty and can be tailored to the particular problem. The multi-dimensional space of uncertainty can be modeled using kernel techniques such as kernel principal component analysis (KPCA) or kernel clustering. These tools allow the selection of a subset of representative realizations with properties similar to those of the larger set. Without losing accuracy, decisions and strategies can then be made by applying the transfer function to the subset, without the need to exhaustively evaluate each realization. The method is applied to a synthetic oil reservoir, where the spatial uncertainty of channel facies is modeled through multiple realizations generated using a multiple-point geostatistical algorithm and several training images.
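A minimal sketch of the distance-to-kernel pipeline: build a distance matrix, turn it into a radial kernel, embed with kernel PCA, and keep the realization nearest each cluster center as a representative. The Euclidean distance and random "realizations" are placeholders for a problem-tailored (e.g. flow-based) dissimilarity:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(8)
reals = rng.random((200, 50 * 50))            # 200 flattened realizations

D = squareform(pdist(reals))                  # tailored distance goes here
Kmat = np.exp(-D**2 / (2 * np.median(D)**2))  # radial kernel from distances

# Map the space of uncertainty to a few dimensions, cluster it, and select
# the realization closest to each cluster center.
emb = KernelPCA(n_components=3, kernel='precomputed').fit_transform(Kmat)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(emb)
reps = [int(np.argmin(np.linalg.norm(emb - c, axis=1)))
        for c in km.cluster_centers_]
print(reps)   # indices to pass through the expensive transfer function
```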

11.
Direct Pattern-Based Simulation of Non-stationary Geostatistical Models
Non-stationary models often capture the spatial variation of real-world spatial phenomena better than stationary ones. However, the construction of such models can be tedious, as it requires modeling both a statistical trend and a stationary stochastic component. Non-stationarity is an important issue in the recent development of multiple-point geostatistical models. This new modeling paradigm, with its reliance on the training image as the source of spatial statistics or patterns, has had considerable practical appeal. However, the role and construction of the training image in the non-stationary case remain problematic from both a modeling and a practical point of view. In this paper, we provide an easy-to-use, computationally efficient methodology for creating non-stationary multiple-point geostatistical models, for both discrete and continuous variables, based on distance-based modeling and simulation of patterns. In that regard, the paper builds on pattern-based modeling previously published by the authors, whereby a geostatistical realization is created by laying down patterns as puzzle pieces on the simulation grid, such that the simulated patterns are consistent (in terms of a similarity definition) with any previously simulated ones. Here we add the spatial coordinate to the pattern-similarity calculation, thereby borrowing patterns only locally from the training image instead of globally; the latter would entail an assumption of stationarity. Two ways of adding the geographical coordinate are presented: (1) based on a functional that decreases gradually away from the location where the pattern is simulated, and (2) based on an automatic segmentation of the training image into stationary regions. Using ample two- and three-dimensional case studies, we study the behavior of the generated realizations in terms of spatial and ensemble uncertainty.
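The first variant can be sketched as a pattern dissimilarity augmented with a geographic penalty, so candidate patterns are effectively borrowed locally; the weight, patch size, and random candidates below are illustrative, not the paper's functional:

```python
import numpy as np

def nonstationary_distance(pat_a, loc_a, pat_b, loc_b, w=1.0):
    """Pattern dissimilarity plus a term that grows with geographic
    separation, penalizing patterns borrowed from far away; w tunes how
    local the borrowing is."""
    d_pattern = np.linalg.norm(pat_a - pat_b)
    d_space = np.linalg.norm(np.asarray(loc_a) - np.asarray(loc_b))
    return d_pattern + w * d_space

rng = np.random.default_rng(9)
target = rng.random((5, 5))                 # data event being simulated
cands = rng.random((100, 5, 5))             # candidate training-image patterns
locs = rng.random((100, 2)) * 100           # their locations in the image
best = min(range(100),
           key=lambda i: nonstationary_distance(target, (10, 10),
                                                cands[i], locs[i]))
print(best)
```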

12.
On Modelling Discrete Geological Structures as Markov Random Fields
The purpose of this paper is to extend the locally based prediction methodology of BayMar to a global one by modelling discrete spatial structures as Markov random fields. BayMar uses one-dimensional Markov properties for estimating spatial correlation and Bayesian updating for locally integrating prior and additional information. The methodology of this paper introduces a new estimator of the field parameters based on the maximum likelihood technique for one-dimensional Markov chains. This makes the estimator straightforward to calculate even when there are many missing observations, as is often the case in geological applications. We perform simulations (both unconditional and conditional on the observed data) and maximum a posteriori predictions (restorations) of the non-observed data using Markov chain Monte Carlo methods, in the restoration case employing simulated annealing. The described method gives satisfactory predictions, while more work is needed on the simulation side, since the method appears to have a tendency to overestimate strong spatial dependence. It provides an important development over the BayMar methodology by facilitating global predictions and improved use of sparse data.
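A minimal count-based sketch of maximum-likelihood estimation for a one-dimensional Markov chain with missing observations: only consecutively observed pairs contribute to the counts, which is what keeps the estimator straightforward under gaps (the paper's treatment of missing data may be more refined than this):

```python
import numpy as np

def transition_mle(seq, K):
    """Maximum-likelihood transition matrix from a 1-D sequence with
    missing values (None); rows with no observed transitions fall back
    to a uniform distribution."""
    counts = np.zeros((K, K))
    for a, b in zip(seq[:-1], seq[1:]):
        if a is not None and b is not None:
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows,
                     out=np.full((K, K), 1.0 / K), where=rows > 0)

seq = [0, 0, 1, None, 1, 2, 2, None, None, 0, 0, 1]
print(transition_mle(seq, K=3))
```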

13.
Geologic uncertainties and limited well data often render recovery forecasting a difficult undertaking in typical appraisal and early development settings. Recent advances in geologic modeling algorithms permit automation of the model generation process via macros and geostatistical tools, allowing rapid construction of multiple alternative geologic realizations. Despite these advances, computing the reservoir dynamic response via full-physics reservoir simulation remains computationally expensive, so only a few of the many probable realizations are simulated in practice. Experimental design techniques typically focus on a few discrete geologic realizations, as they are inherently more suitable for continuous engineering parameters and can only crudely approximate the impact of geology. As an alternative, a flow-based pattern recognition algorithm (FPRA) has been developed for quantifying forecast uncertainty. The proposed algorithm relies on the rapid characterization of the geologic uncertainty space represented by an ensemble of sufficiently diverse static model realizations. FPRA characterizes the geologic uncertainty space by calculating connectivity distances, which quantify how different each individual realization is from all others in terms of recovery response; fast streamline simulations are employed to evaluate these distances. By applying pattern recognition techniques to the connectivity distances, a few representative realizations are identified within the model ensemble for full-physics simulation. In turn, the recovery factor probability distribution is derived from these intelligently selected simulation runs. Here, FPRA is tested on an example case where the objective is to accurately compute the recovery factor statistics as a function of geologic uncertainty in a channelized turbidite reservoir. Recovery factor cumulative distribution functions computed by FPRA compare well to the one computed via exhaustive full-physics simulations.
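The selection step can be sketched as clustering on response distances and keeping each cluster's medoid for full-physics simulation. The synthetic exponential "recovery curves" below stand in for fast streamline-simulation output, and the five clusters are an arbitrary choice:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(10)

# Stand-in fast responses: one recovery-vs-time curve per static realization.
t = np.linspace(0, 1, 50)
curves = np.array([1 - np.exp(-t / tau) for tau in rng.uniform(0.1, 1.0, 100)])

# Cluster realizations by response distance; each cluster's medoid is the
# representative sent to expensive full-physics simulation.
dvec = pdist(curves)
labels = fcluster(linkage(dvec, method='average'), t=5, criterion='maxclust')
D = squareform(dvec)
reps = []
for c in range(1, 6):
    idx = np.where(labels == c)[0]
    reps.append(int(idx[np.argmin(D[np.ix_(idx, idx)].sum(axis=0))]))
print(sorted(reps))
```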

14.
Geophysical tomography captures the spatial distribution of an underlying geophysical property at relatively high resolution, but tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling, and they are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predict the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and on an aquifer analog of alluvial sedimentary structures with five facies. For both cases, the MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement over approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints, and (3) generation of multiple realizations that enables uncertainty assessment.

15.

One main problem in the modeling of mineral deposits is to design a block model that divides the deposit into homogeneous subdomains. The spatial uncertainty in the geological boundaries becomes a critical factor prior to the modeling of the ore properties; for this reason, reducing the uncertainty of geological models leads to an improved mineral resource evaluation. This work addresses the problem of updating geological models using actual online-sensor measurement data. A novel algorithm is provided, which integrates the discrete wavelet transform into the ensemble Kalman filter for assimilating online-sensor production data into geological models. The geological realizations at each time step are transformed to frequency coefficients and, after each assimilation step, the updated realizations are back-transformed to the original categorical distribution. Furthermore, a reconciliation process compares the online-sensor data derived from the production blocks with the updated realizations at each time step. The algorithm is illustrated through an application to the Golgohar iron deposit located southwest of Sirjan, Iran, and proves to reproduce the statistical parameters and connectivity values of the primary geological realizations.
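A minimal sketch of the stochastic (perturbed-observations) ensemble Kalman filter analysis step that such a workflow builds on; in the paper's scheme the state ensemble would hold wavelet coefficients of the geological realizations rather than raw fields, and the observation operator, noise level, and sizes below are illustrative:

```python
import numpy as np

def enkf_update(X, y, H, r, rng):
    """Stochastic EnKF analysis step on a state ensemble X (n_state x n_ens)
    given observations y with error variance r and linear operator H."""
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)       # observation anomalies
    C = (HA @ HA.T) / (N - 1) + r * np.eye(len(y)) # innovation covariance
    K = (A @ HA.T) / (N - 1) @ np.linalg.inv(C)    # Kalman gain
    Y = y[:, None] + np.sqrt(r) * rng.standard_normal((len(y), N))
    return X + K @ (Y - HX)                        # updated ensemble

rng = np.random.default_rng(11)
X = rng.standard_normal((100, 40))                 # 40-member ensemble
H = np.zeros((3, 100))
H[[0, 1, 2], [10, 50, 90]] = 1.0                   # observe 3 state cells
y = np.array([1.0, -0.5, 0.3])
print(enkf_update(X, y, H, r=0.01, rng=rng).shape)
```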


16.
Markov Processes and Discrete Multifractals
Fractals and multifractals are a natural consequence of the self-similarity resulting from scale-independent processes. Multifractals are spatially intertwined fractals, which can be further grouped into two classes according to the characteristics of their fractal dimension spectra: continuous and discrete multifractals. The concept of multifractals emphasizes spatial associations between fractals and fractal spectra. Distinguishing discrete from continuous multifractals makes it possible to describe discrete physical processes from a multifractal point of view. It is shown that multiplicative cascade processes can generate continuous multifractals and that Markov processes result in discrete multifractals. The latter result provides not only theoretical evidence for the existence of discrete multifractals but also a fundamental model illustrating their general properties. Classical prefractal examples are used to show how asymmetrical Markov processes can be applied to generate prefractal sets and discrete multifractals. The discrete multifractal model based on Markov processes was applied to a dataset of gold deposits in the Great Basin, Nevada, USA. The gold deposits were regarded as discrete multifractals consisting of three spatially interrelated point sets (small, medium, and large deposits), yielding fractal dimensions of 0.541 for the small deposits (<25 tons Au), 0.296 for the medium deposits (25–500 tons Au), and 0.09 for the large deposits (>500 tons Au).
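Fractal dimensions of point sets such as deposit locations are commonly estimated by box counting: the dimension is the slope of log N(s) versus log(1/s), where N(s) counts occupied boxes of side s. The sketch below is a generic estimator on synthetic uniform points (expected dimension near 2), not the paper's dataset:

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Box-counting dimension of a 2-D point set: slope of
    log N(s) against log(1/s) over the given box sizes s."""
    points = np.asarray(points)
    counts = []
    for s in sizes:
        boxes = np.unique(np.floor(points / s), axis=0)  # occupied boxes
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(12)
pts = rng.random((2000, 2))          # uniform points fill the plane
print(box_counting_dimension(pts, sizes=[0.5, 0.25, 0.125, 0.0625, 0.03125]))
```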

17.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a workflow based on a Markov chain Monte Carlo method consistent with geology, well logs, seismic data, and rock-physics information. It uses direct sampling, a multiple-point geostatistical method, for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform approximate sampling from the posterior distribution conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations to generate the posterior distribution, it is inefficient and not suitable for reservoir modeling; Metropolis sampling can perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable while preserving its spatial structure by conditioning to subset points. In most practical applications, however, when the subset conditioning points are selected at random, the chain can get stuck for a very long time in a non-optimal local minimum. It is demonstrated here that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.

18.
In the oil industry and in subsurface hydrology, geostatistical models are often used to represent the porosity or permeability field. In history matching of a geostatistical reservoir model, we attempt to find multiple realizations that are conditional to dynamic data and representative of the model uncertainty space. A relevant way to simulate the conditioned realizations is to generate Markov chain Monte Carlo (MCMC) samples. The huge dimension (number of parameters) of the model and the computational cost of each iteration are two important pitfalls for the use of MCMC: in practice, the chain must be stopped long before it has explored the whole support of the posterior probability density function. Furthermore, as the relationship between the production data and the random field is highly nonlinear, the posterior can be strongly multimodal and the chain may stay stuck in one of the modes. In this work, we propose a methodology to enhance the sampling properties of a classical single MCMC chain in history matching. We first show how to reduce the dimension of the problem by using a truncated Karhunen–Loève expansion of the random field of interest, and we assess the number of components to be kept. Then we show how to improve the mixing properties of MCMC, without increasing the global computational cost, by using parallel interacting Markov chains. Finally, we show the encouraging results obtained when applying the method to a synthetic history-matching case.
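The dimension-reduction step can be sketched directly: eigendecompose the field's covariance matrix and keep the leading modes, so a realization is parameterized by a handful of independent N(0,1) coefficients, which is the reduced space the chains then explore. The exponential covariance, grid, and 95% variance cutoff below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(13)

# Covariance of a 1-D Gaussian field on n grid nodes (exponential model).
n, corr_len = 200, 0.2
x = np.linspace(0, 1, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Truncated Karhunen-Loeve expansion: keep the m leading eigenpairs that
# capture 95% of the variance, so m << n coefficients describe the field.
lam, phi = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, phi = lam[order], phi[:, order]
m = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95)) + 1
xi = rng.standard_normal(m)                      # reduced parameter vector
z = phi[:, :m] @ (np.sqrt(lam[:m]) * xi)         # one field realization
print(m, z.shape)
```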
