Similar References (20 results)
1.
Reservoir characterization needs the integration of various data through history matching, especially dynamic information such as production or four-dimensional seismic data. To update geostatistical realizations, the local gradual deformation method can be used. However, history matching is a complex inverse problem, and the computational effort in terms of the number of reservoir simulations required in the optimization procedure increases with the number of matching parameters. History matching large fields with a large number of parameters has been an ongoing challenge in reservoir simulation. This paper presents a new technique to improve history matching with the local gradual deformation method using gradient-based optimizations. The new approach is based on approximate derivative calculations using the partial separability of the objective function. The objective function is first split into local components, and only the most influential parameters in each component are used for the derivative computation. A perturbation design is then proposed to simultaneously compute all the derivatives with only a few simulations. This new technique makes history matching using the local gradual deformation method with large numbers of parameters tractable.
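The core of the gradual deformation idea referenced above can be sketched in a few lines: two independent standard Gaussian realizations are combined with a single deformation parameter, and because the combination is a rotation, the result stays Gaussian with the same spatial statistics. The derivative with respect to the parameter is analytic, which is what makes gradient-based matching attractive. This is the basic global scheme as commonly published, not the paper's local variant, and all names here are illustrative.

```python
import numpy as np

def gradual_deformation(y1, y2, t):
    """Combine two independent N(0, 1) realizations into a new one.
    Because cos^2 + sin^2 = 1, the result is N(0, 1) for every t, so
    the spatial structure of the geostatistical prior is preserved.
    (Basic global scheme; the paper's local variant works per domain.)"""
    return y1 * np.cos(np.pi * t) + y2 * np.sin(np.pi * t)

def d_gradual_deformation_dt(y1, y2, t):
    """Analytic derivative of the combined realization with respect to
    the deformation parameter t, as needed by gradient-based matching."""
    return np.pi * (-y1 * np.sin(np.pi * t) + y2 * np.cos(np.pi * t))

rng = np.random.default_rng(0)
y1, y2 = rng.standard_normal(100_000), rng.standard_normal(100_000)
y = gradual_deformation(y1, y2, 0.3)

# finite-difference check of the analytic derivative
h = 1e-6
fd = (gradual_deformation(y1, y2, 0.3 + h)
      - gradual_deformation(y1, y2, 0.3 - h)) / (2 * h)
```

In the local variant, a separate parameter `t` would be attached to each geometric domain, which is what makes the objective function partially separable.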

2.
Reservoir characterization needs the integration of various data through history matching, especially dynamic information such as production or 4D seismic data. Although reservoir heterogeneities are commonly generated using geostatistical models, random realizations cannot generally match observed dynamic data. To constrain model realizations to reproduce measured dynamic data, an optimization procedure may be applied in an attempt to minimize an objective function, which quantifies the mismatch between real and simulated data. Such assisted history matching methods require a parameterization of the geostatistical model to allow the updating of an initial model realization. However, there are only a few parameterization methods available to update geostatistical models in a way consistent with the underlying geostatistical properties. This paper presents a local domain parameterization technique that updates geostatistical realizations using assisted history matching. This technique allows us to locally change model realizations through the variation of geometrical domains whose geometry and size can be easily controlled and parameterized. This approach provides a new way to parameterize geostatistical realizations in order to improve history matching efficiency.

3.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a work flow based on a Markov chain Monte Carlo method consistent with geology, well-logs, seismic data, and rock-physics information. It uses direct sampling as a multiple-point geostatistical method for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform an approximate sampling from the posterior distribution, conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations for generating the posterior distribution, it is inefficient and not suitable for reservoir modeling. Metropolis sampling is able to perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable, while preserving its spatial structure by conditioning to subset points. However, in most practical applications, when the subset conditioning points are selected at random, it can get stuck for a very long time in a non-optimal local minimum. In this paper it is demonstrated that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.
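The Metropolis-with-spatial-resampling loop described above can be sketched as a toy: each proposal redraws the field from the prior while a random subset of points is kept fixed at its current values, so spatial structure is conditioned through that subset. This is a minimal scalar-likelihood illustration, not the seismic workflow, and the function names, the keep fraction, and the toy likelihood are all assumptions.

```python
import numpy as np

def metropolis_isr(x0, loglik, sample_prior, n_iter, keep_frac=0.5, rng=None):
    """Toy Metropolis chain with iterative spatial resampling: each proposal
    redraws the field from the prior but keeps a random subset of points
    fixed at their current values (the conditioning subset), which preserves
    spatial structure. Names and keep_frac are illustrative assumptions."""
    rng = rng or np.random.default_rng(0)
    x, ll = x0.copy(), loglik(x0)
    chain = [x.copy()]
    for _ in range(n_iter):
        prop = sample_prior(x.size, rng)
        keep = rng.random(x.size) < keep_frac    # conditioning subset
        prop[keep] = x[keep]                     # freeze those points
        ll_prop = loglik(prop)
        if np.log(rng.random()) < ll_prop - ll:  # Metropolis acceptance
            x, ll = prop, ll_prop
        chain.append(x.copy())
    return np.array(chain)

# toy posterior: fields whose mean is near 2 are likely; prior mean is 2
loglik = lambda x: -50.0 * (x.mean() - 2.0) ** 2
sample_prior = lambda n, rng: rng.standard_normal(n) + 2.0
chain = metropolis_isr(np.zeros(20), loglik, sample_prior, 500)
```

The adaptive variant advocated in the abstract would choose the conditioning subset based on the local data mismatch instead of uniformly at random.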

4.
Representing Spatial Uncertainty Using Distances and Kernels
Assessing uncertainty of a spatial phenomenon requires the analysis of a large number of parameters which must be processed by a transfer function. To capture the possibly wide range of uncertainty in the transfer function response, a large set of geostatistical model realizations needs to be processed. Stochastic spatial simulation can rapidly provide multiple, equally probable realizations. However, since the transfer function is often computationally demanding, only a small number of models can be evaluated in practice, and these are usually selected through a ranking procedure. Traditional ranking techniques for selection of probabilistic ranges of response (P10, P50 and P90) are highly dependent on the static property used. In this paper, we propose to parameterize the spatial uncertainty represented by a large set of geostatistical realizations through a distance function measuring “dissimilarity” between any two geostatistical realizations. The distance function allows a mapping of the space of uncertainty. The distance can be tailored to the particular problem. The multi-dimensional space of uncertainty can be modeled using kernel techniques, such as kernel principal component analysis (KPCA) or kernel clustering. These tools allow for the selection of a subset of representative realizations with properties similar to those of the larger set. Without losing accuracy, decisions and strategies can then be performed by applying a transfer function on the subset without the need to exhaustively evaluate each realization. This method is applied to a synthetic oil reservoir, where spatial uncertainty of channel facies is modeled through multiple realizations generated using a multi-point geostatistical algorithm and several training images.
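The distance-based workflow above can be sketched with plain numpy: a dissimilarity matrix between realizations is embedded with classical multidimensional scaling (a linear-kernel stand-in for the KPCA the paper uses), and representatives spanning the mapped uncertainty space are picked greedily. The maximin selection is an illustrative substitute for kernel clustering; all names are assumptions.

```python
import numpy as np

def mds_embed(D, k=2):
    """Classical MDS: map realizations into k dimensions so Euclidean
    distances approximate the dissimilarity matrix D (a linear-kernel
    stand-in for the kernel PCA mapping of the uncertainty space)."""
    n = len(D)
    J = np.eye(n) - 1.0 / n                  # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:k]
    return V[:, top] * np.sqrt(np.maximum(w[top], 0.0))

def select_representatives(D, n_rep):
    """Greedy maximin selection in the embedded space: a simple,
    deterministic stand-in for the paper's kernel clustering step."""
    X = mds_embed(D)
    chosen = [0]
    for _ in range(n_rep - 1):
        dmin = np.min([((X - X[c]) ** 2).sum(1) for c in chosen], axis=0)
        chosen.append(int(np.argmax(dmin)))   # farthest from all chosen
    return chosen

# toy "realizations": 30 points in 3 well-separated response groups
rng = np.random.default_rng(1)
pts = np.concatenate([rng.normal(m, 0.1, (10, 2)) for m in (0, 5, 10)])
D = np.linalg.norm(pts[:, None] - pts[None], axis=-1)
reps = select_representatives(D, 3)          # one realization per group
```

In practice `D` would be a flow-based or pattern-based dissimilarity rather than Euclidean distance between points.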

5.
Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been the method of simulated annealing, in which practitioners attempt to minimize the difference between the “true” and simulated production data, and between the “true” and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.

6.
7.
In the oil industry and subsurface hydrology, geostatistical models are often used to represent the porosity or the permeability field. In history matching of a geostatistical reservoir model, we attempt to find multiple realizations that are conditional to dynamic data and representative of the model uncertainty space. A suitable way to simulate the conditioned realizations is to generate them with Markov chain Monte Carlo (MCMC) methods. The huge dimension (number of parameters) of the model and the computational cost of each iteration are two major obstacles to the use of MCMC. In practice, we have to stop the chain long before it has explored the whole support of the posterior probability density function. Furthermore, as the relationship between the production data and the random field is highly nonlinear, the posterior can be strongly multimodal and the chain may stay stuck in one of the modes. In this work, we propose a methodology to enhance the sampling properties of classical single MCMC in history matching. We first show how to reduce the dimension of the problem by using a truncated Karhunen–Loève expansion of the random field of interest and how to assess the number of components to be kept. Then, we show how we can improve the mixing properties of MCMC, without increasing the global computational cost, by using parallel interacting Markov chains. Finally, we show the encouraging results obtained when applying the method to a synthetic history matching case.

8.
Geologic uncertainties and limited well data often render recovery forecasting a difficult undertaking in typical appraisal and early development settings. Recent advances in geologic modeling algorithms permit automation of the model generation process via macros and geostatistical tools. This allows rapid construction of multiple alternative geologic realizations. Despite the advances in geologic modeling, computation of the reservoir dynamic response via full-physics reservoir simulation remains a computationally expensive task. Therefore, only a few of the many probable realizations are simulated in practice. Experimental design techniques typically focus on a few discrete geologic realizations as they are inherently more suitable for continuous engineering parameters and can only crudely approximate the impact of geology. A flow-based pattern recognition algorithm (FPRA) has been developed for quantifying the forecast uncertainty as an alternative. The proposed algorithm relies on the rapid characterization of the geologic uncertainty space represented by an ensemble of sufficiently diverse static model realizations. FPRA characterizes the geologic uncertainty space by calculating connectivity distances, which quantify how different each individual realization is from all others in terms of recovery response. Fast streamline simulations are employed in evaluating these distances. By applying pattern recognition techniques to connectivity distances, a few representative realizations are identified within the model ensemble for full-physics simulation. In turn, the recovery factor probability distribution is derived from these intelligently selected simulation runs. Here, FPRA is tested on an example case where the objective is to accurately compute the recovery factor statistics as a function of geologic uncertainty in a channelized turbidite reservoir. Recovery factor cumulative distribution functions computed by FPRA compare well to the one computed via exhaustive full-physics simulations.

9.
Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach, which has the flexibility to model non-stationary systems directly, was developed for multiple-point statistics simulation (MPS). The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (a case study in Denmark). The results show that, based on fragmented information of the formation boundaries, the MPS partition method is able to simulate a non-stationary system including valley structures embedded in a layered Miocene sequence in a single run. In addition, statistical information retrieved from the AEM data improved the simulation of the geology significantly, especially for the deep-seated buried valley sediments where borehole information is sparse.

10.
11.
Uncertainty in future reservoir performance is usually evaluated from the simulated performance of a small number of reservoir realizations. Unfortunately, most of the practical methods for generating realizations conditional to production data are only approximately correct. It is not known whether or not the recently developed method of Gradual Deformation is an approximate method or if it actually generates realizations that are distributed correctly. In this paper, we evaluate the ability of the Gradual Deformation method to correctly assess the uncertainty in reservoir predictions by comparing the distribution of conditional realizations for a small test problem with the standard distribution from a Markov chain Monte Carlo (MCMC) method, which is known to be correct, and with distributions from several approximate methods. Although the Gradual Deformation algorithm samples inefficiently for this test problem and is clearly not an exact method, it gives uncertainty estimates similar to those obtained by the MCMC method based on a relatively small number of realizations.

12.
A new approach based on principal component analysis (PCA) for the representation of complex geological models in terms of a small number of parameters is presented. The basis matrix required by the method is constructed from a set of prior geological realizations generated using a geostatistical algorithm. Unlike standard PCA-based methods, in which the high-dimensional model is constructed from a (small) set of parameters by simply performing a multiplication using the basis matrix, in this method the mapping is formulated as an optimization problem. This enables the inclusion of bound constraints and regularization, which are shown to be useful for capturing highly connected geological features and binary/bimodal (rather than Gaussian) property distributions. The approach, referred to as optimization-based PCA (O-PCA), is applied here mainly to binary-facies systems, in which case the requisite optimization problem is separable and convex. The solution of the optimization problem, as well as the derivative of the model with respect to the parameters, is obtained analytically. It is shown that the O-PCA mapping can also be viewed as a post-processing of the standard PCA model. The O-PCA procedure is applied both to generate new (random) realizations and for gradient-based history matching. For the latter, two- and three-dimensional systems, involving channelized and deltaic-fan geological models, are considered. The O-PCA method is shown to perform very well for these history matching problems, and to provide models that capture the key sand–sand and sand–shale connectivities evident in the true model. Finally, the approach is extended to generate bimodal systems in which the properties of both facies are characterized by Gaussian distributions. MATLAB code with the O-PCA implementation, and examples demonstrating its use, are provided online as Supplementary Materials.
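The "post-processing of standard PCA" view mentioned above can be sketched for the binary case: build a PCA basis from an ensemble of binary realizations, reconstruct a smooth field, then solve the cell-wise regularized problem in closed form to sharpen values toward 0/1. The regularization weight `gamma = 0.8` and the toy ensemble are assumptions; this is a sketch of the idea, not the paper's implementation.

```python
import numpy as np

def pca_basis(M, n_keep):
    """PCA basis from an ensemble of prior realizations (columns of M)."""
    mean = M.mean(axis=1)
    U, s, _ = np.linalg.svd(M - mean[:, None], full_matrices=False)
    return mean, U[:, :n_keep] * (s[:n_keep] / np.sqrt(M.shape[1] - 1))

def opca_map(mean, Phi, xi, gamma=0.8):
    """O-PCA-style mapping for a binary field in post-processing form:
    solve, cell by cell, min (x - p)^2 + gamma * x * (1 - x) over
    0 <= x <= 1, which sharpens the smooth PCA reconstruction p toward
    0/1 facies values. gamma = 0.8 is an assumed illustrative weight."""
    p = mean + Phi @ xi                              # standard PCA model
    x = (2.0 * p - gamma) / (2.0 * (1.0 - gamma))    # closed-form minimizer
    return np.clip(x, 0.0, 1.0)                      # bound constraints

# toy prior ensemble: 50 binary "sand body" realizations on a 100-cell column
rng = np.random.default_rng(0)
M = np.zeros((100, 50))
for j in range(50):
    start = rng.integers(0, 70)
    M[start:start + 30, j] = 1.0                     # sand body, fixed width

mean, Phi = pca_basis(M, 10)
xi = rng.standard_normal(10)
p_smooth = mean + Phi @ xi            # plain PCA: smooth, non-binary field
x_sharp = opca_map(mean, Phi, xi)     # O-PCA-style: near-binary field
```

Because the mapping is differentiable almost everywhere in `xi`, it composes with gradient-based history matching as the abstract describes.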

13.
To carry out the reservoir characterization process properly, an effective approach is to merge all the information available in the field into a single consistent model. Achieving this integration in practice is not a simple task, so it is necessary to use special methods such as seismic inversion. Seismic inversion makes it possible to combine well-log data and seismic data effectively and to obtain a model that can be validated through numerical fluid-flow simulation during the prediction process. Seismic inversion can be carried out by many methods, which fall into two broad classes: deterministic methods (represented by regression inversion and constrained sparse-spike inversion) and stochastic methods (represented by geostatistical inversion). In this study, a comparison of stochastic and deterministic inversion results shows how stochastic inversion improves the reservoir characterization process. In fact, stochastic inversion can use a higher sampling rate (close to the grid size of the reservoir model) to produce a more reliable model. Another advantage of stochastic inversion is that stochastic methods yield basic statistical measures that improve interpretation accuracy, and they can generate a large number of realizations during reservoir characterization, making it possible to study the uncertainty of the reservoir model.

14.
Ensemble-based methods are becoming popular assisted history matching techniques with a growing number of field applications. These methods use an ensemble of model realizations, typically constructed by means of geostatistics, to represent the prior uncertainty. The performance of the history matching is very dependent on the quality of the initial ensemble. However, there is a significant level of uncertainty in the parameters used to define the geostatistical model. From a Bayesian viewpoint, the uncertainty in the geostatistical modeling can be represented by a hyper-prior in a hierarchical formulation. This paper presents the first steps towards a general parameterization to address the problem of uncertainty in the prior modeling. The proposed parameterization is inspired by Gaussian mixtures, where the uncertainty in the prior mean and prior covariance is accounted for by defining weights for combining multiple Gaussian ensembles, which are estimated during the data assimilation. The parameterization was successfully tested in a simple reservoir problem where the orientation of the major anisotropic direction of the permeability field was unknown.
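The mixture idea above can be made concrete in a very simplified form: given several prior scenario ensembles (say, two assumed anisotropy orientations), an initial ensemble is drawn from them according to mixture weights, and it is these weights that a hierarchical data assimilation would then update. This is an illustrative stand-in for the paper's Gaussian-mixture parameterization, not its actual implementation, and every name here is an assumption.

```python
import numpy as np

def mixture_ensemble(scenarios, weights, n_members, rng=None):
    """Build an initial ensemble from a mixture of prior scenarios: each
    member is drawn from scenario k with probability weights[k]. The
    weights play the role of hyper-parameters that data assimilation
    would estimate (illustrative sketch only)."""
    rng = rng or np.random.default_rng(0)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                               # normalize the weights
    picks = rng.choice(len(scenarios), size=n_members, p=w)
    return np.array([scenarios[k][rng.integers(len(scenarios[k]))]
                     for k in picks])

# two prior scenarios, mocked up as Gaussian ensembles with different means
rng = np.random.default_rng(1)
scen_a = rng.standard_normal((200, 5))            # e.g. orientation A
scen_b = rng.standard_normal((200, 5)) + 10.0     # e.g. orientation B
ens = mixture_ensemble([scen_a, scen_b], [0.9, 0.1], 2000)
```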

15.
We present a method of aquifer characterization that is able to utilize multiple sources of conditioning data to build a more realistic model of heterogeneity. This modeling approach (InMod) uses geophysical data to delineate bounding surfaces within sedimentary deposits. The depositional volumes between bounding surfaces are identified automatically from the geophysical data by a region growing algorithm. Simple geometric rules are used to constrain the growth of the regions in 3-D. The nodes within each depositional volume are assigned to categorical lithologies using geostatistical realizations and a dynamic lookup routine that can be conditioned to field data. The realizations created with this method preserve geologically expected features and produce sharp juxtapositions of high and low hydraulic conductivity lithologies along bounding surfaces. The realizations created with InMod also have higher variance than models created only with geostatistics and honor the volumetric distribution of sediments measured from field data.

16.
Stochastic geostatistical techniques are essential tools for groundwater flow and transport modelling in highly heterogeneous media. Typically, these techniques require massive numbers of realizations to accurately simulate the high variability and account for the uncertainty. These massive numbers of realizations impose several constraints on the stochastic techniques (e.g., increased computational effort, limits on domain size, grid resolution, and time step, and convergence issues). Understanding the connectivity of the subsurface layers provides an opportunity to overcome these constraints. This research presents a sampling framework to reduce the number of required Monte Carlo realizations by utilizing the connectivity properties of the hydraulic conductivity distributions in a three-dimensional domain. Different geostatistical models were tested in this study, including an exponential model with the Turning Bands (TBM) algorithm and a spherical model using Sequential Gaussian Simulation (SGSIM). It is found that the total connected fraction of the largest clusters and its tortuosity are highly correlated with the percentage of mass arrival and the first arrival quantiles at different control planes. Applying different sampling techniques together with several indicators suggested that a compact sample representing only 10% of the total number of realizations can be used to produce results that are close to the results of the full set of realizations. Also, the proposed sampling techniques, especially those utilizing low-conductivity clustering, show very promising results in terms of matching the full range of realizations. Finally, the size of selected clusters relative to domain size significantly affects transport characteristics and the connectivity indicators.
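A minimal version of the connectivity indicator mentioned above (the connected fraction of the largest cluster) can be computed by thresholding a conductivity field and labeling connected components. The threshold quantile, grid size, and smoothing used to mock up a correlated field are assumptions; the paper's exact indicators (e.g., tortuosity) are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def largest_cluster_fraction(K, threshold):
    """Connectivity indicator: fraction of above-threshold cells that
    belong to the largest face-connected cluster of a 3-D field."""
    mask = K > threshold
    labels, n_clusters = ndimage.label(mask)     # 6-connectivity by default
    if n_clusters == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]      # drop background label 0
    return sizes.max() / mask.sum()

# toy correlated "conductivity" field: smoothed white noise on a 30^3 grid
rng = np.random.default_rng(0)
K = ndimage.uniform_filter(rng.standard_normal((30, 30, 30)), size=5)
frac = largest_cluster_fraction(K, np.quantile(K, 0.7))
```

Ranking realizations by such indicators is what allows a small, representative sample to stand in for the full Monte Carlo set.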

17.
Sedimentological processes often result in complex three-dimensional subsurface heterogeneity of hydrogeological parameter values. Variogram-based stochastic approaches are often not able to describe heterogeneity in such complex geological environments. This work shows how multiple-point geostatistics can be applied in a realistic hydrogeological application to determine the impact of complex geological heterogeneity on groundwater flow and transport. The approach is applied to a real aquifer in Belgium that exhibits a complex sedimentary heterogeneity and anisotropy. A training image is constructed based on geological and hydrogeological field data. Multiple-point statistics are borrowed from this training image to simulate hydrofacies occurrence, while intrafacies permeability variability is simulated using conventional variogram-based geostatistical methods. The simulated hydraulic conductivity realizations are used as input to a groundwater flow and transport model to investigate the effect of small-scale sedimentary heterogeneity on contaminant plume migration. Results show that small-scale sedimentary heterogeneity has a significant effect on contaminant transport in the studied aquifer. The uncertainty on the spatial facies distribution and intrafacies hydraulic conductivity distribution results in a significant uncertainty on the calculated concentration distribution. Comparison with standard variogram-based techniques shows that multiple-point geostatistics allow better reproduction of irregularly shaped low-permeability clay drapes that influence solute transport.

18.
Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling
The advent of multiple-point geostatistics (MPS) gave rise to the integration of complex subsurface geological structures and features into the model by the concept of training images. Initial algorithms generate geologically realistic realizations by using these training images to obtain conditional probabilities needed in a stochastic simulation framework. More recent pattern-based geostatistical algorithms attempt to improve the accuracy of the training image pattern reproduction. In these approaches, the training image is used to construct a pattern database. Consequently, sequential simulation will be carried out by selecting a pattern from the database and pasting it onto the simulation grid. One of the shortcomings of the present algorithms is the lack of a unifying framework for classifying and modeling the patterns from the training image. In this paper, an entirely different approach will be taken toward geostatistical modeling. A novel, principled and unified technique for pattern analysis and generation that ensures computational efficiency and enables a straightforward incorporation of domain knowledge will be presented.
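The pattern database construction described above reduces to scanning the training image with a fixed-size template and storing every pattern it contains. This is a minimal sketch with an assumed toy training image; real implementations classify and index the database rather than storing raw flattened patterns.

```python
import numpy as np

def build_pattern_db(ti, size):
    """Scan a training image and collect every size-by-size pattern into
    a database (one flattened pattern per row), the structure from which
    pattern-based algorithms select and paste during sequential
    simulation. Minimal sketch; real databases are classified/indexed."""
    h, w = ti.shape
    patterns = [ti[i:i + size, j:j + size].ravel()
                for i in range(h - size + 1)
                for j in range(w - size + 1)]
    return np.unique(np.array(patterns), axis=0)   # de-duplicated database

# toy training image: vertical stripes -> only two distinct 2x2 patterns
ti = np.tile(np.array([[0, 1], [0, 1]]), (4, 4))
db = build_pattern_db(ti, 2)
```

Distance-based modeling, as in the titled paper, would then operate on dissimilarities between the rows of `db` rather than on the raw patterns.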

19.
Conditioning Surface-Based Geological Models to Well and Thickness Data
Geostatistical simulation methods aim to represent spatial uncertainty through realizations that reflect a certain geological concept by means of a spatial continuity model. Most common spatial continuity models are either variogram, training image, or Boolean based. In this paper, a more recent spatial model of geological continuity is developed, termed the event, or surface-based model, which is specifically applicable to modeling cases with complex stratigraphy, such as in sedimentary systems. These methods rely on a rule-based stacking of events, which are mathematically represented by two-dimensional thickness variations over the domain, where positive thickness is associated with deposition and negative thickness with erosion. Although it has been demonstrated that the surface-based models accurately represent the geological variation present in complex layered systems, they are more difficult to constrain to hard and soft data as is typically required of practical geostatistical techniques. In this paper, we develop a practical methodology for constraining such models to hard data from wells and thickness data interpreted from geophysics, such as seismic data. Our iterative methodology relies on a decomposition of the parameter optimization problem into smaller, manageable problems that are solved sequentially. We demonstrate this method on a real case study of a turbidite sedimentary basin.
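The rule-based stacking of events described above can be sketched in one dimension: each event is a thickness profile, positive values deposit sediment and negative values erode, and a simple rule keeps erosion from cutting below the basement. This is a minimal sketch of the surface-based idea under assumed rules; the paper works with two-dimensional thickness maps and richer stacking rules.

```python
import numpy as np

def stack_events(nx, events):
    """Rule-based stacking of events over a 1-D domain: each event is a
    thickness profile (positive = deposition, negative = erosion), and
    erosion is clipped so it cannot cut below the basement at zero.
    Illustrative stacking rule only."""
    top = np.zeros(nx)
    surfaces = [top.copy()]
    for thickness in events:
        top = np.maximum(top + thickness, 0.0)   # clip erosion at basement
        surfaces.append(top.copy())
    return np.array(surfaces)

# deposit 2 units, erode 1, deposit 0.5 -> final top surface at 1.5
surf = stack_events(5, [np.full(5, 2.0), np.full(5, -1.0), np.full(5, 0.5)])
```

Conditioning to wells then amounts to optimizing the event thickness parameters so the stacked surfaces honor observed depths, which is the decomposition the abstract describes.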

20.
Optimization with the Gradual Deformation Method
Building reservoir models consistent with production data and prior geological knowledge is usually carried out through the minimization of an objective function. Such optimization problems are nonlinear and may be difficult to solve because they tend to be ill-posed and to involve many parameters. The gradual deformation technique was introduced recently to simplify these problems. Its main feature is the preservation of the spatial structure: perturbed realizations exhibit the same spatial variability as the starting ones. It is shown that optimizations based on gradual deformation converge exponentially to the global minimum, at least for linear problems. In addition, it appears that combining the gradual deformation parameterization with optimizations may remove, step by step, the structure preservation capability of the gradual deformation method. This bias is negligible when deformation is restricted to a few realization chains, but grows as the number of chains tends to infinity. Since, in practice, optimization of reservoir models is limited to a small number of iterations relative to the number of gridblocks, the spatial variability is preserved. Finally, the optimization processes are implemented on the basis of the Levenberg–Marquardt method. Although the objective functions, written in terms of Gaussian white noises, are reduced to the data mismatch term, the conditional realization space can be properly sampled.

