Similar Articles
20 similar articles found.
1.
In history matching of a lithofacies reservoir model, we attempt to find multiple realizations of the lithofacies configuration that are conditional to dynamic data and representative of the model uncertainty space. This problem can be formalized in the Bayesian framework. Given a truncated Gaussian model as a prior and the dynamic data with their associated measurement error, we want to sample from the conditional distribution of the facies given the data. A relevant way to generate conditioned realizations is to use Markov chain Monte Carlo (MCMC). However, the dimension of the model and the computational cost of each iteration are two important pitfalls for the use of MCMC. Furthermore, classical MCMC algorithms mix slowly; that is, they will not explore the whole support of the posterior within the time of the simulation. In this paper, we extend the methodology already described in a previous work to the problem of history matching of a Gaussian-related lithofacies reservoir model. We first show how to drastically reduce the dimension of the problem by using a truncated Karhunen-Loève expansion of the Gaussian random field underlying the lithofacies model. Moreover, we propose an innovative criterion for choosing the number of components, based on the connectivity function. Then, we show how we improve the mixing properties of a classical single MCMC chain, without increasing the global computational cost, through the use of parallel interacting Markov chains. Applying the dimension reduction and this innovative sampling method drastically lowers the number of iterations needed to sample efficiently from the posterior. We show the encouraging results obtained when applying the methodology to a synthetic history-matching case.
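The dimension-reduction step above can be illustrated with a minimal sketch (not the authors' implementation): build a covariance matrix for a Gaussian field on a grid, keep only the leading eigenpairs, and represent any realization by a small vector of independent standard normals. The exponential covariance, grid size, and number of retained components are illustrative assumptions.

```python
import numpy as np

# Exponential covariance matrix on a 1-D grid (stand-in for the 2-D
# reservoir grid; the construction is identical in higher dimensions).
n, corr_len = 200, 0.1
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Truncated Karhunen-Loeve expansion: keep the m leading eigenpairs.
eigvals, eigvecs = np.linalg.eigh(C)          # eigh returns ascending order
idx = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

m = 20                                         # retained components (assumed)
energy = eigvals[:m].sum() / eigvals.sum()     # fraction of variance kept

# A realization of the field is a linear map of m iid N(0,1) variables,
# so a sampler only needs to explore an m-dimensional space instead of n.
rng = np.random.default_rng(0)
xi = rng.standard_normal(m)
field = eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi)
print(round(energy, 3), field.shape)
```

In the paper the truncation level is chosen from the connectivity function of the resulting facies fields; here `m` is simply fixed for illustration.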

2.
Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been the method of simulated annealing, in which practitioners attempt to minimize the difference between the “true” and simulated production data, and between the “true” and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.
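The core MCMC mechanism the abstract relies on can be sketched with a random-walk Metropolis-Hastings sampler on a toy one-parameter problem; the linear "pressure response" g(k) = 2k, the datum, and the noise level are all hypothetical stand-ins chosen so the posterior is known in closed form.

```python
import numpy as np

# Random-walk Metropolis-Hastings sketch: sample a scalar "permeability"
# parameter k with prior N(0,1), assumed forward model g(k) = 2k, and one
# observation d = 1.0 with noise std 0.5. The exact posterior is Gaussian
# with mean 8/17, which lets us check the sampler.
rng = np.random.default_rng(1)
d, sigma = 1.0, 0.5

def log_post(k):
    return -0.5 * k**2 - 0.5 * ((d - 2.0 * k) / sigma) ** 2

k, lp = 0.0, log_post(0.0)
samples = []
for _ in range(50_000):
    k_new = k + 0.4 * rng.standard_normal()    # perturb the current state
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:    # accept/reject step
        k, lp = k_new, lp_new
    samples.append(k)

post = np.array(samples[5_000:])               # discard burn-in
print(post.mean())                             # analytic mean is 8/17 ~ 0.471
```

The paper's contribution is the choice of perturbation (variogram-preserving, guided by sensitivity coefficients); the plain random walk here is only the baseline such perturbations replace.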

3.
Calibrating a stochastic reservoir model on a large, fine grid to hydrodynamic data requires consistent methods to modify the petrophysical properties of the model. Several methods have been developed to address this problem. Recent methods include the Gradual Deformation Method (GDM) and the Probability Perturbation Method (PPM). The GDM has been applied to pixel-based models of continuous and categorical variables, as well as object-based models. Initially, the PPM was applied to pixel-based models of categorical variables generated by sequential simulation. In addition, the PPM relies on an analytical formula (known as the tau-model) to approximate conditional probabilities. In this paper, an extension of the PPM to any type of probability distribution (discrete, continuous, or mixed) is presented. This extension is still constrained by the approximation using the tau-model. However, when applying the method to white noises, this approximation is no longer necessary. The result is an entirely new and rigorous method for perturbing any type of stochastic model: a modified PPM employed in a similar manner to the GDM.
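The tau-model mentioned above combines a prior probability with several conditional probabilities through odds ratios. A minimal sketch (assuming the standard ratio form; the function name and example numbers are illustrative):

```python
def tau_model(p0, conditionals, taus=None):
    """Combine a prior probability p0 with conditional probabilities
    P(A|B), P(A|C), ... via the tau-model: work with odds ratios
    r = (1 - p) / p and weight each source with an exponent tau."""
    if taus is None:
        taus = [1.0] * len(conditionals)      # tau = 1: permanence of ratios
    r0 = (1.0 - p0) / p0
    x = r0
    for p, tau in zip(conditionals, taus):
        r = (1.0 - p) / p
        x *= (r / r0) ** tau
    return 1.0 / (1.0 + x)

# With all tau = 1 an uninformative source (p = prior) leaves the
# estimate unchanged, so combining 0.3 (the prior itself) with 0.8
# should return ~0.8.
print(tau_model(0.3, [0.3, 0.8]))
```

With all exponents equal to 1 this reduces to the permanence-of-ratios rule, i.e. conditional independence of the information sources.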

4.
5.
6.
In the oil industry and subsurface hydrology, geostatistical models are often used to represent the porosity or permeability field. In history matching of a geostatistical reservoir model, we attempt to find multiple realizations that are conditional to dynamic data and representative of the model uncertainty space. A relevant way to simulate the conditioned realizations is to generate them with Markov chain Monte Carlo (MCMC) methods. The huge dimension (number of parameters) of the model and the computational cost of each iteration are two important pitfalls for the use of MCMC. In practice, we have to stop the chain well before it has explored the whole support of the posterior probability density function. Furthermore, as the relationship between the production data and the random field is highly nonlinear, the posterior can be strongly multimodal and the chain may stay stuck in one of the modes. In this work, we propose a methodology to enhance the sampling properties of a classical single MCMC chain in history matching. We first show how to reduce the dimension of the problem by using a truncated Karhunen–Loève expansion of the random field of interest and how to assess the number of components to be kept. Then, we show how we can improve the mixing properties of MCMC, without increasing the global computational cost, by using parallel interacting Markov chains. Finally, we show the encouraging results obtained when applying the method to a synthetic history-matching case.

7.
Stochastic fractal (fGn and fBm) porosity and permeability fields are conditioned to given variogram, static (or hard), and multiwell pressure data within a Bayesian estimation framework. Because fGn distributions are normal/second-order stationary, it is shown that the Bayesian estimation methods based on the assumption of normal/second-order stationary distributions can be directly used to generate fGn porosity/permeability fields conditional to pressure data. However, because fBm is not second-order stationary, it is shown that such Bayesian estimation methods can be used with implementation of a pseudocovariance approach to generate fBm porosity/permeability fields conditional to multiwell pressure data. In addition, we provide methods to generate unconditional realizations of fBm/fGn fields honoring all variogram parameters. These unconditional realizations can then be conditioned to hard and pressure data observed at wells by using the randomized maximum likelihood method. Synthetic examples generated from one-, two-, and three-dimensional single-phase flow simulators are used to show the applicability of our methodology for generating realizations of fBm/fGn porosity and permeability fields conditioned to well-test pressure data and evaluating the uncertainty in reservoir performance predictions appropriately using these history-matched realizations.
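An unconditional fBm realization of the kind described above can be sketched directly from its exact covariance via Cholesky factorization (practical only for small grids; the Hurst exponent and grid are illustrative assumptions, and this is not the authors' generation method).

```python
import numpy as np

# Fractional Brownian motion realization via Cholesky factorization of
# its exact covariance Cov(s, t) = 0.5 * (s^2H + t^2H - |s - t|^2H).
H = 0.7                                   # Hurst exponent (assumed)
n = 200
t = np.arange(1, n + 1) / n
cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
             - np.abs(t[:, None] - t[None, :]) ** (2 * H))

L = np.linalg.cholesky(cov)               # cov is positive definite
rng = np.random.default_rng(3)
path = L @ rng.standard_normal(n)         # one unconditional fBm path

# fGn is simply the increment process of fBm, and unlike fBm it is
# second-order stationary.
fgn = np.diff(np.concatenate([[0.0], path]))
print(path.shape, fgn.shape)
```

Note the covariance depends on s and t separately, not just on |s - t|, which is exactly the non-stationarity that forces the pseudocovariance treatment in the paper.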

8.
Gradual deformation is a parameterization method that considerably reduces the unknown parameter space of stochastic models. This method can be used in an iterative optimization procedure for constraining stochastic simulations to data that are complex, nonanalytical functions of the simulated variables. This method is based on the fact that linear combinations of multi-Gaussian random functions remain multi-Gaussian random functions. During the past few years, we developed the gradual deformation method by combining independent realizations. This paper investigates another alternative: the combination of dependent realizations. One of our motivations for combining dependent realizations was to improve the numerical stability of the gradual deformation method. Because of limitations both in the size of simulation grids and in the precision of simulation algorithms, numerical realizations of a stochastic model are never perfectly independent. It was shown that the accumulation of very small dependence between realizations might result in significant structural drift from the initial stochastic model. From the combination of random functions whose covariance and cross-covariance are proportional to each other, we derived a new formulation of the gradual deformation method that can explicitly take into account the numerical dependence between realizations. This new formulation allows us to reduce the structural deterioration during the iterative optimization. The problem of combining dependent realizations also arises when deforming conditional realizations of a stochastic model. As opposed to the combination of independent realizations, combining conditional realizations avoids the additional conditioning step during the optimization process. However, this procedure is limited to global deformations with fixed structural parameters.
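The basic independent-realization form of gradual deformation is a one-parameter trigonometric combination: because cos² + sin² = 1, every point on the path is again a standard Gaussian realization. A minimal sketch (the quadratic objective standing in for the data mismatch is hypothetical):

```python
import numpy as np

# Gradual deformation between two independent standard Gaussian
# realizations z1, z2: deform(t) is exactly Gaussian for every t,
# since cos(pi t)^2 + sin(pi t)^2 = 1.
rng = np.random.default_rng(4)
n = 10_000
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)

def deform(t):
    return z1 * np.cos(np.pi * t) + z2 * np.sin(np.pi * t)

# In history matching, t would be optimized against a data-mismatch
# objective; a hypothetical quadratic stand-in is used here.
ts = np.linspace(0.0, 1.0, 101)
obj = [np.mean((deform(t) - 0.1) ** 2) for t in ts]
t_best = ts[int(np.argmin(obj))]
print(np.std(deform(t_best)))   # close to 1: the model structure survives
```

The paper's point is that when z1 and z2 are slightly *dependent* this identity no longer holds exactly, and repeating the combination accumulates structural drift, hence the modified formulation using cross-covariances.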

9.
Assessment of uncertainty in the performance of fluvial reservoirs often requires the ability to generate realizations of channel sands that are conditional to well observations. For channels with low sinuosity this problem has been effectively solved. When the sinuosity is large, however, the standard stochastic models for fluvial reservoirs are not valid, because the deviation of the channel from a principal direction line is multivalued. In this paper, I show how the method of randomized maximum likelihood can be used to generate conditional realizations of channels with large sinuosity. In one example, a Gaussian random field model is used to generate an unconditional realization of a channel with large sinuosity, and this realization is then conditioned to well observations. Channels generated in the second approach are less realistic, but may be sufficient for modeling reservoir connectivity in a realistic way. In the second example, an unconditional realization of a channel is generated by a complex geologic model with random forcing. It is then adjusted in a meaningful way to honor well observations. The key feature in the solution is the use of channel direction instead of channel deviation as the characteristic random function describing the geometry of the channel.

10.
11.
In earth and environmental sciences applications, uncertainty analysis regarding the outputs of models whose parameters are spatially varying (or spatially distributed) is often performed in a Monte Carlo framework. In this context, alternative realizations of the spatial distribution of model inputs, typically conditioned to reproduce attribute values at locations where measurements are obtained, are generated via geostatistical simulation using simple random (SR) sampling. The environmental model under consideration is then evaluated using each of these realizations as a plausible input, in order to construct a distribution of plausible model outputs for uncertainty analysis purposes. In hydrogeological investigations, for example, conditional simulations of saturated hydraulic conductivity are used as input to physically-based simulators of flow and transport to evaluate the associated uncertainty in the spatial distribution of solute concentration. Realistic uncertainty analysis via SR sampling, however, requires a large number of simulated attribute realizations for the model inputs in order to yield a representative distribution of model outputs; this often hinders the application of uncertainty analysis due to the computational expense of evaluating complex environmental models. Stratified sampling methods, including variants of Latin hypercube sampling, constitute more efficient sampling alternatives, often resulting in a more representative distribution of model outputs (e.g., solute concentration) with fewer model input realizations (e.g., hydraulic conductivity), thus reducing the computational cost of uncertainty analysis. The application of stratified and Latin hypercube sampling in a geostatistical simulation context, however, is not widespread, and, apart from a few exceptions, has been limited to the unconditional simulation case. This paper proposes methodological modifications for adapting existing methods for stratified sampling (including Latin hypercube sampling), employed to date in an unconditional geostatistical simulation context, for the purpose of efficient conditional simulation of Gaussian random fields. The proposed conditional simulation methods are compared to traditional geostatistical simulation, based on SR sampling, in the context of a hydrogeological flow and transport model via a synthetic case study. The results indicate that stratified sampling methods (including Latin hypercube sampling) are more efficient than SR, reproducing statistics of the conductivity (and, subsequently, concentration) fields to a similar extent, yet with smaller sampling variability. These findings suggest that the proposed efficient conditional sampling methods could contribute to the wider application of uncertainty analysis in spatially distributed environmental models using geostatistical simulation.
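The variance-reduction effect of Latin hypercube sampling over simple random sampling can be demonstrated on a single standard-normal input (a deliberately reduced stand-in for the conditional-field setting; the sample sizes are arbitrary):

```python
import numpy as np
from statistics import NormalDist

# Latin hypercube vs simple random (SR) sampling of one N(0,1) input:
# LHS stratifies [0, 1] into n equal bins and draws exactly once per bin,
# which lowers the sampling variability of output statistics.
rng = np.random.default_rng(5)
n, trials = 50, 500
ppf = np.vectorize(NormalDist().inv_cdf)   # inverse normal CDF

sr_means, lhs_means = [], []
for _ in range(trials):
    sr = ppf(rng.uniform(size=n))                         # SR sampling
    strata = (rng.permutation(n) + rng.uniform(size=n)) / n
    lhs = ppf(strata)                                     # one draw per stratum
    sr_means.append(sr.mean())
    lhs_means.append(lhs.mean())

# The LHS estimate of the mean fluctuates far less across trials.
print(np.std(sr_means), np.std(lhs_means))
```

Extending this idea to *conditional* Gaussian field simulation, where the stratification must respect the conditioning data, is exactly the methodological gap the paper addresses.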

12.
Ensemble methods present a practical framework for parameter estimation, performance prediction, and uncertainty quantification in subsurface flow and transport modeling. In particular, the ensemble Kalman filter (EnKF) has received significant attention for its promising performance in calibrating heterogeneous subsurface flow models. Since an ensemble of model realizations is used to compute the statistical moments needed to perform the EnKF updates, large ensemble sizes are required to provide accurate updates and uncertainty assessment. However, for realistic problems that involve large-scale models with computationally demanding flow simulation runs, the EnKF implementation is limited to small-sized ensembles. As a result, spurious numerical correlations can develop and lead to inaccurate EnKF updates, which tend to underestimate or even eliminate the ensemble spread. Ad hoc practical remedies, such as localization, local analysis, and covariance inflation schemes, have been developed and applied to reduce the effect of sampling errors due to small ensemble sizes. In this paper, a fast linear approximate forecast method is proposed as an alternative approach to enable the use of large ensemble sizes in operational settings to obtain improved sample statistics and EnKF updates. The proposed method first clusters a large number of initial geologic model realizations into a small number of groups. A representative member from each group is used to run a full forward flow simulation. The flow predictions for the remaining realizations in each group are approximated by a linearization around the full simulation results of the representative model (centroid) of the respective cluster. The linearization can be performed using either adjoint-based or ensemble-based gradients. Results from several numerical experiments with two-phase and three-phase flow systems in this paper suggest that the proposed method can be applied to improve the EnKF performance in large-scale problems where the number of full simulations is constrained.
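The EnKF update whose sample statistics the method aims to improve can be sketched for a scalar observation (the state dimension, observation operator, and noise level below are illustrative assumptions, not from the paper):

```python
import numpy as np

# Ensemble Kalman filter update sketch: each realization x_j is shifted
# by K * (d_j - y_j), with the Kalman gain K built from ensemble
# covariances -- the statistics that suffer sampling error when the
# ensemble is small.
rng = np.random.default_rng(6)
ne, nx = 100, 50                       # ensemble size, state dimension

X = rng.standard_normal((nx, ne))      # prior ensemble (toy "log-perm" fields)
H = np.zeros(nx); H[0] = 1.0           # observe the first cell only
d_true, r = 2.0, 0.1                   # datum and observation-error variance

Y = H @ X                              # predicted data, shape (ne,)
D = d_true + np.sqrt(r) * rng.standard_normal(ne)   # perturbed observations

Cxy = (X - X.mean(1, keepdims=True)) @ (Y - Y.mean()) / (ne - 1)
Cyy = Y.var(ddof=1)
K = Cxy / (Cyy + r)                    # Kalman gain, shape (nx,)

Xa = X + np.outer(K, D - Y)            # analysis ensemble
print(X[0].mean(), Xa[0].mean())       # observed cell pulled toward 2.0
```

The gain depends on `Cxy` and `Cyy` estimated from the ensemble; with few members these estimates are noisy, producing the spurious correlations and spread collapse the paper's large-ensemble approximation is designed to avoid.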

13.
Representing Spatial Uncertainty Using Distances and Kernels (total citations: 8; self-citations: 7; citations by others: 1)
Assessing uncertainty of a spatial phenomenon requires the analysis of a large number of parameters which must be processed by a transfer function. To capture the possibly wide range of uncertainty in the transfer function response, a large set of geostatistical model realizations needs to be processed. Stochastic spatial simulation can rapidly provide multiple, equally probable realizations. However, since the transfer function is often computationally demanding, only a small number of models can be evaluated in practice, and these are usually selected through a ranking procedure. Traditional ranking techniques for selection of probabilistic ranges of response (P10, P50, and P90) are highly dependent on the static property used. In this paper, we propose to parameterize the spatial uncertainty represented by a large set of geostatistical realizations through a distance function measuring the “dissimilarity” between any two geostatistical realizations. The distance function allows a mapping of the space of uncertainty, and the distance can be tailored to the particular problem. The multi-dimensional space of uncertainty can be modeled using kernel techniques, such as kernel principal component analysis (KPCA) or kernel clustering. These tools allow for the selection of a subset of representative realizations with properties similar to those of the larger set. Without losing accuracy, decisions and strategies can then be made by applying the transfer function to the subset, without the need to exhaustively evaluate each realization. This method is applied to a synthetic oil reservoir, where spatial uncertainty of channel facies is modeled through multiple realizations generated using a multi-point geostatistical algorithm and several training images.
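The distance-based mapping can be sketched with classical multidimensional scaling (closely related to kernel PCA with a centered distance kernel) followed by a crude representative selection; the Euclidean distance, ensemble size, and two-representative selection rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

# Map an ensemble of realizations into a low-dimensional "uncertainty
# space" from pairwise dissimilarities, then pick representative members.
rng = np.random.default_rng(7)
n_real, n_cells = 60, 400
ensemble = rng.standard_normal((n_real, n_cells))   # stand-in realizations

# Squared Euclidean distance matrix between realizations.
sq = ((ensemble[:, None, :] - ensemble[None, :, :]) ** 2).sum(-1)

# Classical MDS: double-center the squared distances and eigendecompose.
J = np.eye(n_real) - np.ones((n_real, n_real)) / n_real
B = -0.5 * J @ sq @ J
w, V = np.linalg.eigh(B)
coords = V[:, ::-1][:, :2] * np.sqrt(np.maximum(w[::-1][:2], 0.0))

# Crude selection along the first axis: one representative per half.
# (In the paper, kernel clustering would group similar realizations and
# the transfer function would be run only on the chosen members.)
order = np.argsort(coords[:, 0])
reps = [order[len(order) // 4], order[3 * len(order) // 4]]
print(coords.shape, reps)
```

A problem-tailored distance (e.g. based on flow responses rather than cell-wise differences) plugs into `sq` without changing anything downstream, which is the flexibility the abstract emphasizes.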

14.
Geologic uncertainties and limited well data often render recovery forecasting a difficult undertaking in typical appraisal and early development settings. Recent advances in geologic modeling algorithms permit automation of the model generation process via macros and geostatistical tools. This allows rapid construction of multiple alternative geologic realizations. Despite the advances in geologic modeling, computation of the reservoir dynamic response via full-physics reservoir simulation remains a computationally expensive task. Therefore, only a few of the many probable realizations are simulated in practice. Experimental design techniques typically focus on a few discrete geologic realizations, as they are inherently more suitable for continuous engineering parameters and can only crudely approximate the impact of geology. As an alternative, a flow-based pattern recognition algorithm (FPRA) has been developed for quantifying the forecast uncertainty. The proposed algorithm relies on the rapid characterization of the geologic uncertainty space represented by an ensemble of sufficiently diverse static model realizations. FPRA characterizes the geologic uncertainty space by calculating connectivity distances, which quantify how different each individual realization is from all others in terms of recovery response. Fast streamline simulations are employed in evaluating these distances. By applying pattern recognition techniques to connectivity distances, a few representative realizations are identified within the model ensemble for full-physics simulation. In turn, the recovery factor probability distribution is derived from these intelligently selected simulation runs. Here, FPRA is tested on an example case where the objective is to accurately compute the recovery factor statistics as a function of geologic uncertainty in a channelized turbidite reservoir. Recovery factor cumulative distribution functions computed by FPRA compare well to those computed via exhaustive full-physics simulations.

15.
An adequate representation of the detailed spatial variation of subsurface parameters for underground flow and mass transport simulation entails heterogeneous models. Uncertainty characterization generally calls for a Monte Carlo analysis of many equally likely realizations that honor both direct information (e.g., conductivity data) and information about the state of the system (e.g., piezometric head or concentration data). Thus, the problem faced is how to generate multiple realizations conditioned to parameter data, and inverse-conditioned to dependent state data. We propose using a Markov chain Monte Carlo (MCMC) approach with block updating, combined with upscaling, to achieve this purpose. Our proposal presents an alternative block updating scheme that permits the application of MCMC to inverse stochastic simulation of heterogeneous fields and incorporates upscaling in a multi-grid approach to speed up the generation of the realizations. The main advantage of MCMC, compared to other methods capable of generating inverse-conditioned realizations (such as the self-calibrating or the pilot point methods), is that it does not require the solution of a complex optimization inverse problem, although it does require the solution of the direct problem many times.

16.
Ensemble-based methods are becoming popular assisted history matching techniques with a growing number of field applications. These methods use an ensemble of model realizations, typically constructed by means of geostatistics, to represent the prior uncertainty. The performance of the history matching is highly dependent on the quality of the initial ensemble. However, there is a significant level of uncertainty in the parameters used to define the geostatistical model. From a Bayesian viewpoint, the uncertainty in the geostatistical modeling can be represented by a hyper-prior in a hierarchical formulation. This paper presents the first steps towards a general parametrization to address the problem of uncertainty in the prior modeling. The proposed parametrization is inspired by Gaussian mixtures, where the uncertainty in the prior mean and prior covariance is accounted for by defining weights for combining multiple Gaussian ensembles, which are estimated during the data assimilation. The parametrization was successfully tested on a simple reservoir problem where the orientation of the major anisotropic direction of the permeability field was unknown.

17.
The determination of the optimal type and placement of a nonconventional well in a heterogeneous reservoir represents a challenging optimization problem. This determination is significantly more complicated if uncertainty in the reservoir geology is included in the optimization. In this study, a genetic algorithm is applied to optimize the deployment of nonconventional wells. Geological uncertainty is accounted for by optimizing over multiple reservoir models (realizations) subject to a prescribed risk attitude. To reduce the excessive computational requirements of the base method, a new statistical proxy (which provides fast estimates of the objective function) based on cluster analysis is introduced into the optimization process. This proxy provides an estimate of the cumulative distribution function (CDF) of the scenario performance, which enables the quantification of proxy uncertainty. Knowledge of the proxy-based performance estimate in conjunction with the proxy CDF enables the systematic selection of the most appropriate scenarios for full simulation. Application of the overall method for the optimization of monobore and dual-lateral well placement demonstrates the performance of the hybrid optimization procedure. Specifically, it is shown that by simulating only 10% or 20% of the scenarios (as determined by application of the proxy), optimization results very close to those achieved by simulating all cases are obtained.

18.
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale, highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved, state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our implementation of the MCMC method provides the gold standard against which the aforementioned Gaussian approximations are assessed. We present numerical synthetic experiments where we quantify the capability of each of the ad hoc Gaussian approximations to reproduce the mean and the variance of the posterior distribution (characterized via MCMC) associated with a data assimilation problem. Both single-phase and two-phase (oil–water) reservoir models are considered so that fundamental differences in the resulting forward operators are highlighted. The main objective of our controlled experiments was to exhibit the substantial discrepancies in the approximation properties of standard ad hoc Gaussian approximations. Numerical investigations of the type we present here will lead to a greater understanding of the cost-efficient, but ad hoc, Bayesian techniques used for data assimilation in petroleum reservoirs, and hence ultimately to improved techniques with more accurate uncertainty quantification.
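One widely used ingredient of dimension-robust MCMC of the kind described above is the preconditioned Crank-Nicolson (pCN) proposal, whose acceptance rate does not collapse as the parameter dimension grows; the sketch below (with a hypothetical linear forward model and single datum) illustrates the mechanism, without claiming it is the specific sampler used in the paper.

```python
import numpy as np

# pCN sketch: propose u' = sqrt(1 - beta^2) * u + beta * xi with
# xi ~ N(0, I). This proposal preserves the N(0, I) prior exactly, so
# only the log-likelihood enters the accept/reject ratio.
rng = np.random.default_rng(8)
dim, beta = 100, 0.2
G = rng.standard_normal(dim) / np.sqrt(dim)   # toy linear forward model
d, sigma = 1.0, 0.3                            # assumed datum and noise std

def log_lik(u):
    return -0.5 * ((d - G @ u) / sigma) ** 2

u, ll = np.zeros(dim), log_lik(np.zeros(dim))
acc = 0
for _ in range(20_000):
    prop = np.sqrt(1.0 - beta**2) * u + beta * rng.standard_normal(dim)
    llp = log_lik(prop)
    if np.log(rng.uniform()) < llp - ll:       # prior terms cancel exactly
        u, ll, acc = prop, llp, acc + 1

print(acc / 20_000)   # acceptance rate stays healthy in 100 dimensions
```

A plain random-walk proposal in 100 dimensions would need a vanishingly small step to keep any acceptances; the prior-preserving pCN step sidesteps this, which is what makes fully resolved posterior probing feasible as a gold standard.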

19.
A stochastic channel embedded in a background facies is conditioned to data observed at wells. The background facies is a fixed rectangular box. The model parameters consist of geometric parameters that describe the shape, size, and location of the channel, and of permeability and porosity in the channel and nonchannel facies. We extend methodology previously developed to condition a stochastic channel to well-test pressure data and to well observations of the channel thickness and the depth of the top of the channel. The main objective of this work is to characterize the reduction in uncertainty in channel model parameters and predicted reservoir performance that can be achieved by conditioning to well-test pressure data at one or more wells. Multiple conditional realizations of the geometric parameters and rock properties are generated to evaluate the uncertainty in model parameters. The ensemble of predictions of reservoir performance generated from the suite of realizations provides a Monte Carlo estimate of the uncertainty in future performance predictions. In addition, we provide some insight into how prior variances, data measurement errors, and sensitivity coefficients interact to determine the reduction in uncertainty in model parameters obtained by conditioning to pressure data, and we examine the value of active and observation well data in resolving model parameters.

20.
Uncertainty in surfactant–polymer flooding is an important challenge to the wide-scale implementation of this process. Any successful design of this enhanced oil recovery process will necessitate a good understanding of uncertainty. Thus, it is essential to be able to quantify this uncertainty in an efficient manner. Monte Carlo simulation is the traditional approach used for quantifying parametric uncertainty. However, the convergence of Monte Carlo simulation is relatively slow, requiring a large number of realizations. This study proposes the use of the probabilistic collocation method for parametric uncertainty quantification in surfactant–polymer flooding using four synthetic reservoir models. Four sources of uncertainty were considered: the chemical flood residual oil saturation, surfactant adsorption, polymer adsorption, and the polymer viscosity multiplier. The output parameter approximated is the recovery factor. The output metrics were the input–output model response relationship, the probability density function, and the first two moments. These were compared with the results obtained from Monte Carlo simulation over a large number of realizations. Two methods for solving for the coefficients of the output parameter polynomial chaos expansion are compared: Gaussian quadrature and linear regression. The linear regression approach used two types of sampling: full-tensor product nodes and Chebyshev-derived nodes. In general, the probabilistic collocation method was applied successfully to quantify the uncertainty in the recovery factor. Applying the method using Gaussian quadrature produced more accurate results than using linear regression with full-tensor product nodes. Applying the method using linear regression with Chebyshev-derived sampling also performed relatively well. Possible enhancements to improve the performance of the probabilistic collocation method were discussed. These enhancements include improved sparse sampling, approximation order-independent sampling, and using arbitrary random input distributions that could be more representative of reality.
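The Gaussian-quadrature route to the polynomial chaos coefficients can be sketched for a single standard-normal input; the toy "recovery factor" model below is hypothetical, chosen so that the mean and variance are known exactly.

```python
import numpy as np
from math import factorial, sqrt, pi

# Probabilistic collocation sketch: project a model output f(x), with
# x ~ N(0,1), onto probabilists' Hermite polynomials using Gauss-Hermite
# quadrature, then read the mean and variance off the PCE coefficients.
f = lambda x: x**2 + 0.5 * x          # hypothetical "recovery factor" model

order, n_nodes = 4, 8
nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
weights = weights / sqrt(2.0 * pi)    # normalize to the N(0,1) density

def He(k, x):                          # probabilists' Hermite polynomial He_k
    c = np.zeros(k + 1); c[k] = 1.0
    return np.polynomial.hermite_e.hermeval(x, c)

# c_k = E[f(x) He_k(x)] / E[He_k(x)^2], with E[He_k^2] = k!
coeffs = [np.sum(weights * f(nodes) * He(k, nodes)) / factorial(k)
          for k in range(order + 1)]

mean = coeffs[0]                                   # mean = c_0
var = sum(c**2 * factorial(k)                      # var = sum_k>0 c_k^2 k!
          for k, c in enumerate(coeffs) if k > 0)
print(mean, var)   # for this f: mean = 1, var = 2.25 (exact)
```

For this quadratic model the 8-node quadrature is exact, so the PCE moments match the analytic ones; for a real reservoir simulator, f would be the simulated recovery factor evaluated at the collocation nodes.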
