Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
This paper describes a novel approach for creating an efficient, general, and differentiable parameterization of large-scale non-Gaussian, non-stationary random fields (represented by multipoint geostatistics) that is capable of reproducing complex geological structures such as channels. Such parameterizations are appropriate for use with gradient-based algorithms applied to, for example, history matching or uncertainty propagation. It is known that the standard Karhunen–Loeve (K–L) expansion, also called linear principal component analysis or PCA, can be used as a differentiable parameterization of input random fields defining the geological model. The standard K–L model is, however, limited in two respects. It requires an eigen-decomposition of the covariance matrix of the random field, which is prohibitively expensive for large models. In addition, it preserves only the two-point statistics of a random field, which is insufficient for reproducing complex structures. In this work, kernel PCA is applied to address the limitations associated with the standard K–L expansion. Although kernel PCA is widely used in machine learning applications, it does not appear to have previously been applied to geological model parameterization. With kernel PCA, an eigen-decomposition of a small matrix called the kernel matrix is performed instead of the full covariance matrix. The method is much more efficient than the standard K–L procedure. Through the use of higher-order polynomial kernels, which implicitly define a high-dimensional feature space, kernel PCA further enables the preservation of high-order statistics of the random field, rather than just the two-point statistics preserved by the K–L method. The kernel PCA eigen-decomposition proceeds using a set of realizations created by geostatistical simulation (honoring two-point or multipoint statistics) rather than the analytical covariance function. We demonstrate that kernel PCA is capable of generating differentiable parameterizations that reproduce the essential features of complex geological structures represented by multipoint geostatistics. The kernel PCA representation is then applied to history match a waterflooding problem. This example demonstrates that kernel PCA can be used with gradient-based history matching to provide models that match production history while maintaining multipoint geostatistics consistent with the underlying training image.
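The eigen-decomposition over realizations described above can be sketched in a few lines of NumPy. This is a minimal illustration with a polynomial kernel applied to toy Gaussian "realizations", not the paper's geological models; all names and the kernel normalization are illustrative assumptions:

```python
import numpy as np

def kernel_pca_param(realizations, n_components, degree=1):
    """Parameterize random-field realizations via kernel PCA.

    realizations: (n_real, n_cells) array of geostatistical realizations.
    degree: polynomial kernel degree (degree=1 recovers linear PCA, i.e.,
            the K-L expansion estimated from realizations).
    Returns the leading eigenvalues/eigenvectors of the centered kernel
    matrix, usable as a low-dimensional parameterization.
    """
    X = np.asarray(realizations, dtype=float)
    n = X.shape[0]
    K = (X @ X.T / X.shape[1]) ** degree       # n x n polynomial kernel matrix
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one  # double-centering in feature space
    w, V = np.linalg.eigh(Kc)                   # small n x n eigenproblem,
    idx = np.argsort(w)[::-1][:n_components]    # not the full covariance matrix
    return w[idx], V[:, idx]

rng = np.random.default_rng(0)
real = rng.standard_normal((50, 400))           # 50 toy "realizations", 400 cells
w, V = kernel_pca_param(real, n_components=10)
print(w.shape, V.shape)
```

The key efficiency point from the abstract is visible here: the eigenproblem is n_real x n_real (50 x 50) rather than n_cells x n_cells (400 x 400).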

2.
A new approach based on principal component analysis (PCA) for the representation of complex geological models in terms of a small number of parameters is presented. The basis matrix required by the method is constructed from a set of prior geological realizations generated using a geostatistical algorithm. Unlike standard PCA-based methods, in which the high-dimensional model is constructed from a (small) set of parameters by simply performing a multiplication using the basis matrix, in this method the mapping is formulated as an optimization problem. This enables the inclusion of bound constraints and regularization, which are shown to be useful for capturing highly connected geological features and binary/bimodal (rather than Gaussian) property distributions. The approach, referred to as optimization-based PCA (O-PCA), is applied here mainly to binary-facies systems, in which case the requisite optimization problem is separable and convex. The solution of the optimization problem, as well as the derivative of the model with respect to the parameters, is obtained analytically. It is shown that the O-PCA mapping can also be viewed as a post-processing of the standard PCA model. The O-PCA procedure is applied both to generate new (random) realizations and for gradient-based history matching. For the latter, two- and three-dimensional systems, involving channelized and deltaic-fan geological models, are considered. The O-PCA method is shown to perform very well for these history matching problems, and to provide models that capture the key sand–sand and sand–shale connectivities evident in the true model. Finally, the approach is extended to generate bimodal systems in which the properties of both facies are characterized by Gaussian distributions. MATLAB code with the O-PCA implementation, along with examples demonstrating its use, is provided online as Supplementary Material.
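For the binary-facies case, the separable per-cell problem admits a closed-form solution. The sketch below assumes a regularization term of the form gamma * m * (1 - m) with gamma < 1; the symbols and the value of gamma are illustrative, not the paper's exact formulation:

```python
import numpy as np

def opca_postprocess(m_pca, gamma=0.8):
    """Per-cell analytical solution of a separable O-PCA-style problem:
        min_m (m - m_pca)^2 + gamma * m * (1 - m),  subject to 0 <= m <= 1.
    The stationary point of the convex objective (gamma < 1) is
        m* = (2*m_pca - gamma) / (2*(1 - gamma)),
    clipped to the bound constraints. This sharpens a smooth PCA model
    toward binary (0/1) facies values."""
    m = (2.0 * np.asarray(m_pca, dtype=float) - gamma) / (2.0 * (1.0 - gamma))
    return np.clip(m, 0.0, 1.0)

# a smooth PCA model gets pushed toward 0/1 facies values
print(opca_postprocess(np.array([0.1, 0.5, 0.9])))  # -> [0.  0.5 1. ]
```

Because the mapping is an analytical clip of an affine function, its derivative with respect to the PCA parameters is also available in closed form away from the bounds, which is what makes the approach compatible with gradient-based history matching.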

3.

A new low-dimensional parameterization based on principal component analysis (PCA) and convolutional neural networks (CNN) is developed to represent complex geological models. The CNN–PCA method is inspired by recent developments in computer vision using deep learning. CNN–PCA can be viewed as a generalization of an existing optimization-based PCA (O-PCA) method. Both CNN–PCA and O-PCA entail post-processing a PCA model to better honor complex geological features. In CNN–PCA, rather than using a histogram-based regularization as in O-PCA, a new regularization involving a set of metrics for multipoint statistics is introduced. The metrics are based on summary statistics of the nonlinear filter responses of geological models to a pre-trained deep CNN. In addition, in the CNN–PCA formulation presented here, a convolutional neural network is trained as an explicit transform function that can post-process PCA models quickly. CNN–PCA is shown to provide both unconditional and conditional realizations that honor the geological features present in reference SGeMS geostatistical realizations for a binary channelized system. Flow statistics obtained through simulation of random CNN–PCA models closely match results for random SGeMS models for a demanding case in which O-PCA models lead to significant discrepancies. Results for history matching are also presented. In this assessment CNN–PCA is applied with derivative-free optimization, and a subspace randomized maximum likelihood method is used to provide multiple posterior models. Data assimilation and significant uncertainty reduction are achieved for existing wells, and physically reasonable predictions are also obtained for new wells. Finally, the CNN–PCA method is extended to a more complex nonstationary bimodal deltaic fan system, and is shown to provide high-quality realizations for this challenging example.
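The multipoint-statistics metrics can be illustrated by summarizing the nonlinear filter responses of a model. The sketch below uses random (untrained) filters with a ReLU nonlinearity as a stand-in for the pre-trained deep CNN, so it only shows the shape of the computation, not the trained metric:

```python
import numpy as np

def filter_response_stats(model2d, filters):
    """Summary statistics (per-filter mean and std) of the ReLU responses of
    a 2D model to a filter bank. A stand-in for the CNN-based multipoint
    metrics: here the filters are random rather than pre-trained."""
    fh, fw = filters.shape[1], filters.shape[2]
    # gather all fh x fw patches and apply every filter to every patch
    patches = np.lib.stride_tricks.sliding_window_view(model2d, (fh, fw))
    patches = patches.reshape(-1, fh * fw)                  # (n_patches, fh*fw)
    resp = np.maximum(patches @ filters.reshape(len(filters), -1).T, 0.0)
    return np.concatenate([resp.mean(axis=0), resp.std(axis=0)])

rng = np.random.default_rng(1)
filt = rng.standard_normal((8, 3, 3))   # 8 random 3x3 "filters"
m = rng.random((32, 32))                # toy 2D model
s = filter_response_stats(m, filt)
print(s.shape)
```

A regularization term could then penalize the distance between these summary statistics for a post-processed PCA model and for the training image.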


4.
Traditional ensemble-based history matching methods, such as the ensemble Kalman filter and iterative ensemble filters, usually update reservoir parameter fields using numerical grid-based parameterization. Although the objective function from which these methods are derived contains a parameter constraint term, it is difficult to preserve the geological continuity of the parameter field in the updating process; this is especially the case in the estimation of statistically anisotropic fields (such as a statistically anisotropic Gaussian field, or a facies field with elongated facies) when the anisotropy direction is uncertain. In this work, we propose a Karhunen-Loeve expansion-based global parameterization technique that is combined with the ensemble-based history matching method for inverse modeling of statistically anisotropic fields. By using the Karhunen-Loeve expansion, a Gaussian random field can be parameterized by a group of independent Gaussian random variables. For a facies field, we combine the Karhunen-Loeve expansion and the level set technique to perform the parameterization; that is, each facies is parameterized by a Gaussian random field and a level set algorithm, and the Gaussian random field is in turn parameterized by the Karhunen-Loeve expansion. We treat the independent Gaussian random variables in the Karhunen-Loeve expansion as the model parameters. When the anisotropy direction of the statistically anisotropic field is uncertain, we also treat it as a model parameter for updating. After model parameterization, we use the ensemble randomized maximum likelihood filter to perform history matching. Because of the nature of the Karhunen-Loeve expansion, the geostatistical characteristics of the parameter field can be preserved in the updating process. Synthetic cases are set up to test the performance of the proposed method. Numerical results show that the proposed method is suitable for estimating statistically anisotropic fields.
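The Karhunen-Loeve parameterization of a Gaussian field can be sketched directly from a covariance matrix. The 1D grid and exponential covariance below are illustrative assumptions standing in for the anisotropic fields of the abstract:

```python
import numpy as np

def kl_expansion(cov, n_terms):
    """Truncated Karhunen-Loeve expansion: factor a covariance matrix C as
    Phi @ Phi.T (approximately), so that a zero-mean Gaussian field
    m = Phi @ xi is parameterized by independent standard-normal variables
    xi, which become the model parameters for history matching."""
    w, V = np.linalg.eigh(cov)
    idx = np.argsort(w)[::-1][:n_terms]                 # keep largest modes
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))  # Phi = V * sqrt(Lambda)

# exponential covariance on a 1D grid (an anisotropy direction would enter
# through a direction-dependent range in 2D/3D)
x = np.linspace(0.0, 1.0, 100)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)
Phi = kl_expansion(C, n_terms=20)
xi = np.random.default_rng(2).standard_normal(20)   # independent parameters
field = Phi @ xi                                    # one parameterized realization
print(Phi.shape, field.shape)
```

Updating xi (and, per the abstract, the uncertain anisotropy direction) instead of the grid values is what preserves the geostatistical character of the field through the ensemble updates.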

5.
6.
Advances in Multiple-Point Geostatistical Modeling Techniques and Their Application
Multiple-point statistical (MPS) geological modeling has a history of more than 20 years since it was first proposed; it has become an international frontier research direction in reservoir geological modeling, with substantial progress in both theory and application. Taking the development history of MPS modeling as the main line and its technical advances as the core, this paper reviews the research progress of MPS modeling, classifies the main MPS modeling methods, and systematically discusses the principles, characteristics, and open problems of the methods with the greatest development potential. As an application case, the S reservoir, a porous carbonate reservoir in the Zagros Basin, is studied, and multiple-point simulation is compared with sequential indicator simulation. The study shows that multiple-point simulation has clear advantages over sequential indicator simulation for the simulation of complex facies; the pattern-based Dispat method, which adopts a strategy of replacing data events with patterns, yields facies distributions that better conform to geologists' geological understanding. These findings provide a new approach to modeling porous carbonate reservoirs and a useful reference for the geological modeling of similar reservoirs.

7.
Assimilation of production data into reservoir models in which the distribution of porosity and permeability is largely controlled by facies has become increasingly common. When the locations of the facies bodies must be conditioned to observations, the truncated plurigaussian model has often been shown to be a useful modeling method, as it allows Gaussian variables to be updated instead of facies types. Previous experience has also shown that ensemble Kalman filter-like methods are particularly effective for assimilation of data into truncated plurigaussian models. In this paper, some limitations of ensemble-based and gradient-based methods are shown when they are applied to truncated plurigaussian models of a certain type that is likely to occur when modeling channel facies. It is also shown that it is possible to improve the data match and increase the ensemble spread by modifying the updating step using an approximate derivative of the truncation map.
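A truncation rule mapping Gaussian variables to facies can be sketched as follows. The two-field rule and zero thresholds are illustrative assumptions, not the specific truncation map analyzed in the paper:

```python
import numpy as np

def truncated_plurigaussian(z1, z2, t1=0.0, t2=0.0):
    """Map two Gaussian fields to three facies via a simple truncation rule:
    facies 0 where z1 < t1; otherwise facies 1 where z2 < t2; otherwise
    facies 2. History matching can then update the continuous Gaussian
    fields z1, z2 instead of the discrete facies types."""
    facies = np.full(z1.shape, 2, dtype=int)
    facies[z2 < t2] = 1
    facies[z1 < t1] = 0   # z1 rule takes precedence
    return facies

rng = np.random.default_rng(7)
z1, z2 = rng.standard_normal((2, 10000))   # independent toy Gaussian fields
f = truncated_plurigaussian(z1, z2)
print(np.bincount(f) / f.size)  # roughly [0.5, 0.25, 0.25] for these thresholds
```

The non-differentiability of this map at the thresholds is the root of the updating difficulties the abstract describes; the proposed fix smooths it with an approximate derivative of the truncation map.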

8.
Over recent years, the ensemble Kalman filter (EnKF) has become a very popular tool for history matching petroleum reservoirs. EnKF is an alternative to more traditional history matching techniques, as it is computationally fast and easy to implement. Instead of seeking one best model estimate, EnKF is a Monte Carlo method that represents the solution with an ensemble of state vectors. Lately, several ensemble-based methods have been proposed to improve upon the solution produced by EnKF. In this paper, we compare EnKF with one of the most recently proposed methods, the adaptive Gaussian mixture filter (AGM), on a 2D synthetic reservoir and the Punq-S3 test case. AGM was introduced to loosen the requirement of a Gaussian prior distribution as implicitly formulated in EnKF. By combining ideas from particle filters with EnKF, AGM extends the low-rank kernel particle Kalman filter. The simulation study shows that while both methods match the historical data well, AGM is better at preserving the geostatistics of the prior distribution. Further, AGM also produces estimated fields that have a higher empirical correlation with the reference field than the corresponding fields obtained with EnKF.
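The EnKF analysis step that AGM builds upon can be sketched in a few lines. This is a stochastic EnKF with perturbed observations and a linear observation operator; all names and sizes are illustrative:

```python
import numpy as np

def enkf_update(E, d_obs, Cd, H, rng):
    """Stochastic-EnKF analysis step (sketch): update an ensemble of state
    vectors E (n_state x n_ens) toward observations d_obs.
    H maps state to predicted data; Cd is the observation-error covariance."""
    n_ens = E.shape[1]
    D = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(d_obs)), Cd, size=n_ens).T           # perturbed observations
    Y = H @ E                                             # predicted data
    A = E - E.mean(axis=1, keepdims=True)                 # state anomalies
    Yp = Y - Y.mean(axis=1, keepdims=True)                # data anomalies
    K = (A @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (n_ens - 1) * Cd)  # Kalman gain
    return E + K @ (D - Y)

rng = np.random.default_rng(3)
E = rng.standard_normal((50, 30))                 # 50 state variables, 30 members
H = np.zeros((5, 50)); H[np.arange(5), np.arange(5)] = 1.0  # observe 5 states
d = np.ones(5)
Ea = enkf_update(E, d, 0.01 * np.eye(5), H, rng)
print(Ea.shape)
```

The update is a linear combination of the prior members, which is exactly why an implicitly Gaussian prior is assumed and why AGM-style mixture corrections can help for non-Gaussian priors.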

9.
The application of the ensemble Kalman filter (EnKF) for history matching petroleum reservoir models has been the subject of intense investigation during the past 10 years. Unfortunately, EnKF often fails to provide reasonable data matches for highly nonlinear problems. This fact motivated the development of several iterative ensemble-based methods in the last few years. However, no study in the literature compares the performance of these methods, especially in terms of their ability to quantify uncertainty correctly. In this paper, we compare the performance of nine ensemble-based methods in terms of the quality of the data matches, quantification of uncertainty, and computational cost. For this purpose, we use a small but highly nonlinear reservoir model, so that we can generate the reference posterior distribution of reservoir properties using a very long chain generated by a Markov chain Monte Carlo sampling algorithm. We also consider one adjoint-based implementation of the randomized maximum likelihood method in the comparisons.

10.
While 3D seismic has been the basis for geological model building for a long time, time-lapse seismic has primarily been used in a qualitative manner to assist in monitoring reservoir behavior. With the growing acceptance of assisted history matching methods has come an equally rising interest in incorporating 3D or time-lapse seismic data into the history matching process in a more quantitative manner. The common approach in recent studies has been to invert the seismic data to elastic or to dynamic reservoir properties, typically acoustic impedance or saturation changes. Here we consider the use of both 3D and time-lapse seismic amplitude data based on a forward modeling approach that does not require any inversion in the traditional sense. Advantages of such an approach may be better estimation and treatment of model and measurement errors, the combination of two inversion steps into one by removing the explicit inversion to state space variables, and more consistent dependence on the validity of assumptions underlying the inversion process. In this paper, we introduce this approach with the use of an assisted history matching method in mind. Two ensemble-based methods, the ensemble Kalman filter and the ensemble randomized maximum likelihood method, are used to investigate issues arising from the use of seismic amplitude data, and possible solutions are presented. Experiments with a 3D synthetic reservoir model show that additional information on the distribution of reservoir fluids, and on rock properties such as porosity and permeability, can be extracted from the seismic data. The roles of localization and iterative methods are discussed in detail.

11.
The degrees of freedom (DOF) in standard ensemble-based data assimilation are limited by the ensemble size. Successful assimilation of a data set with large information content (IC) therefore requires that the DOF be sufficiently large. Too few DOF relative to the IC may result in ensemble collapse, or at least in unwarranted uncertainty reduction in the estimation results. In this situation, one has two options to restore a proper balance between the DOF and the IC: to increase the DOF or to decrease the IC. Spatially dense data sets typically have a large IC. Within subsurface applications, inverted time-lapse seismic data used for reservoir history matching are an example of a spatially dense data set. Such data are considered to have great potential due to their large IC, but they also contain errors that are challenging to characterize properly. The computational cost of running the forward simulations for reservoir history matching with any kind of data is large for field cases, so that only a moderate ensemble size is standard. Realization of the potential in seismic data for ensemble-based reservoir history matching is therefore not straightforward, not only because of the unknown character of the associated data errors, but also due to the imbalance between a large IC and too few DOF. Distance-based localization is often applied to increase the DOF, but it is example specific and involves cumbersome implementation work. We consider methods to obtain a proper balance between the IC and the DOF when assimilating inverted seismic data for reservoir history matching. To decrease the IC, we consider three ways to reduce the influence of the data space: subspace pseudo inversion, data coarsening, and a novel way of performing front extraction. To increase the DOF, we consider coarse-scale simulation, which allows for an increase in the DOF by increasing the ensemble size without increasing the total computational cost. We also consider a combination of decreasing the IC and increasing the DOF through a novel method that combines data coarsening and coarse-scale simulation. The methods were compared on one small and one moderately large example with seismic bulk-velocity fields at four assimilation times as data. The size of the examples allows for calculation of a reference solution obtained with standard ensemble-based data assimilation methodology and an unrealistically large ensemble size. With the reference solution as the yardstick against which the quality of the other methods is measured, we find that the novel method combining data coarsening and coarse-scale simulation gave the best results. With very restricted computational resources available, this was the only method that gave satisfactory results.
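Data coarsening, one of the IC-reduction options above, can be sketched as simple block-averaging of a dense 2D data field (the block size and the synthetic data are illustrative):

```python
import numpy as np

def coarsen(data2d, block):
    """Reduce the information content of a spatially dense data set by
    block-averaging: each block of block x block dense data values is
    replaced by one averaged datum, so fewer, less redundant data are
    assimilated with a limited ensemble size."""
    H, W = data2d.shape
    h, w = H // block, W // block
    d = data2d[:h * block, :w * block]            # trim to a whole number of blocks
    return d.reshape(h, block, w, block).mean(axis=(1, 3))

dense = np.arange(64, dtype=float).reshape(8, 8)  # toy "seismic" data plane
print(coarsen(dense, 4).shape)  # -> (2, 2)
```

Averaging also tends to reduce uncorrelated data noise within each block, which is part of why coarsening can trade a small loss of information for a much better-conditioned assimilation problem.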

12.
An approach for geostatistically consistent matching of 3D flow simulation models and 3D geological models is proposed. This approach uses an optimization algorithm based on identification of the parameters of the geostatistical model (for example, the variogram parameters, such as range, sill, and nugget effect). Here, the inverse problem is considered in full generality, taking into account facies heterogeneity and variogram anisotropy. The parameters of the correlation dependence (porosity to log-permeability) are identified for each individual facies.
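A toy version of the variogram-parameter identification might look as follows: compute an empirical semivariogram and search for the range of an exponential model. The AR(1) test series, the fixed sill, and the search grid are all assumptions for illustration:

```python
import numpy as np

def empirical_variogram(z, max_lag):
    """Empirical semivariogram gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]
    for a regularly sampled 1D property (e.g., log-permeability)."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

def fit_exponential(gamma_emp, ranges, sill, nugget=0.0):
    """Grid-search identification of the (practical) range a of the model
    gamma(h) = nugget + sill * (1 - exp(-3h/a)): a toy stand-in for the
    geostatistical-parameter identification used in the matching loop."""
    h = np.arange(1, len(gamma_emp) + 1)
    return min(ranges, key=lambda a: np.sum(
        (nugget + sill * (1.0 - np.exp(-3.0 * h / a)) - gamma_emp) ** 2))

# a synthetic AR(1) series has an approximately exponential variogram
rng = np.random.default_rng(4)
z = np.zeros(5000)
for i in range(1, 5000):
    z[i] = 0.9 * z[i - 1] + rng.standard_normal()
g = empirical_variogram(z, 30)
a_hat = fit_exponential(g, ranges=np.arange(5, 61, 5), sill=z.var())
print(a_hat)
```

In the full approach the identified parameters would feed back into regenerating the property fields, so that the matched simulation model stays consistent with the geostatistical model.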

13.
Song, Suihong; Mukerji, Tapan; Hou, Jiagen. Mathematical Geosciences, 2021, 53(7): 1413-1444

Conditional facies modeling combines geological spatial patterns with different types of observed data to build earth models for predictions of subsurface resources. Recently, researchers have used generative adversarial networks (GANs) for conditional facies modeling, where an unconditional GAN is first trained to learn the geological patterns using the original GAN's loss function, and then appropriate latent vectors are searched to generate facies models that are consistent with the observed conditioning data. A problem with this approach is that the time-consuming search process needs to be conducted for every new set of conditioning data. As an alternative, we improve GANs for conditional facies simulation (called GANSim) by introducing an extra condition-based loss function and adjusting the architecture of the generator to take the conditioning data as inputs, based on progressive growing of GANs. The condition-based loss function is defined as the inconsistency between the input conditioning value and the corresponding characteristics exhibited by the output facies model, and forces the generator to learn to be consistent with the input conditioning data, together with the learning of geological patterns. Our input conditioning factors include global features (e.g., the mud facies proportion) alone, local features such as sparse well facies data alone, and the joint combination of global features and well facies data. After training, we evaluate both the quality of the generated facies models and the conditioning ability of the generators, by manual inspection and quantitative assessment. The trained generators are quite robust in generating high-quality facies models conditioned to various types of input conditioning information.
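The condition-based loss for a global feature can be sketched directly. The mud-proportion definition below (cells with value below 0.5 counted as mud) is an illustrative assumption, not GANSim's exact implementation:

```python
import numpy as np

def condition_loss(generated_facies, target_mud_proportion):
    """Condition-based loss (sketch): the squared inconsistency between the
    input global conditioning value (here, the mud facies proportion) and
    the proportion actually exhibited by the generated facies model."""
    prop = float(np.mean(generated_facies < 0.5))   # fraction of mud cells
    return (prop - target_mud_proportion) ** 2

model = np.zeros((8, 8)); model[:, 4:] = 1.0        # half mud, half channel
print(condition_loss(model, 0.5))   # consistent condition -> zero loss
```

During training such a term would be added to the adversarial loss, so the generator is penalized whenever its output contradicts the conditioning input it was given.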


14.
Reservoir management requires periodic updates of the simulation models using the production data available over time. Traditionally, validation of reservoir models with production data is done using a history matching process. Uncertainties in the data, as well as in the model, lead to a nonunique history matching inverse problem. It has been shown that the ensemble Kalman filter (EnKF) is an adequate method for predicting the dynamics of the reservoir. The EnKF is a sequential Monte Carlo approach that uses an ensemble of reservoir models. For realistic, large-scale applications, the ensemble size needs to be kept small due to computational cost. Consequently, the error space is not well covered (poor cross-correlation matrix approximations) and the updated parameter field becomes scattered and loses important geological features (for example, the contact between high- and low-permeability values). The prior geological knowledge present at the initial time is no longer found in the final updated parameter field. We propose a new approach to overcome some of the EnKF limitations. This paper shows the specifications and results of the ensemble multiscale filter (EnMSF) for automatic history matching. EnMSF replaces, at each update time, the prior sample covariance with a multiscale tree. The global dependence is preserved via the parent–child relation in the tree (nodes at the adjacent scales). After constructing the tree, the Kalman update is performed. The properties of the EnMSF are presented here with a small 2D, two-phase (oil and water) twin experiment, and the results are compared to the EnKF. The advantages of using the EnMSF are localization in space and scale, adaptability to prior information, and efficiency when many measurements are available. These advantages make the EnMSF a practical tool for many data assimilation problems.

15.
The spatial continuity of facies is one of the key factors controlling flow in reservoir models. Traditional pixel-based methods such as truncated Gaussian random fields and indicator simulation are based on only two-point statistics, which is insufficient to capture complex facies structures. Current methods for multi-point statistics either lack a consistent statistical model specification or are too computer intensive to be applicable. We propose a Markov mesh model based on generalized linear models for geological facies modeling. The approach defines a consistent statistical model that allows efficient estimation of model parameters and generation of realizations. Our presentation includes a formulation of the general framework, model specifications in two and three dimensions, and details on how the parameters can be estimated from a training image. We illustrate the method using multiple training images, including binary and trinary images and simulations in two and three dimensions. We also provide a thorough comparison with the snesim approach. We find that the current model formulation is applicable for multiple training images and compares favorably to the snesim approach in our test examples. The method is highly memory efficient.

16.
Application of Multiple Point Geostatistics to Non-stationary Images
Simulation of flow and solute transport through aquifers or oil reservoirs requires a precise representation of subsurface heterogeneity that can be achieved by stochastic simulation approaches. Traditional geostatistical methods based on variograms, such as truncated Gaussian simulation or sequential indicator simulation, may fail to generate the complex, curvilinear, continuous and interconnected facies distributions that are often encountered in real geological media, due to their reliance on two-point statistics. Multiple Point Geostatistics (MPG) overcomes this constraint by using more complex point configurations whose statistics are retrieved from training images. Obtaining representative statistics requires stationary training images, but geological understanding often suggests a priori facies variability patterns. This research aims to extend MPG to non-stationary facies distributions. The proposed method subdivides the training images into different areas. The statistics for each area are stored in separate frequency search trees. Several training images are used to ensure that the obtained statistics are representative. The facies probability distribution for each cell during simulation is calculated by weighting the probabilities from the frequency trees. The method is tested on two different object-based training image sets. Results show that non-stationary training images can be used to generate suitable non-stationary facies distributions.
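The weighting of probabilities from the per-zone frequency trees can be sketched as a weighted average of the zone distributions; the weights and probabilities below are illustrative:

```python
import numpy as np

def combine_facies_probabilities(zone_probs, weights):
    """Combine the facies probability distributions retrieved from the
    per-zone frequency search trees for one simulation cell: a weighted
    average of the per-zone distributions, renormalized to sum to one."""
    p = np.average(np.asarray(zone_probs, dtype=float), axis=0,
                   weights=np.asarray(weights, dtype=float))
    return p / p.sum()

# a cell near the boundary of two zones mixes the statistics of both:
# zone A is sand-rich, zone B is mud-rich, and the cell is mostly in zone A
p = combine_facies_probabilities([[0.8, 0.2], [0.2, 0.8]],
                                 weights=[0.75, 0.25])
print(np.round(p, 2))
```

The weights would typically reflect the cell's position relative to the zone boundaries, so simulated facies transition smoothly between areas with different statistics.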

17.
Ensemble methods present a practical framework for parameter estimation, performance prediction, and uncertainty quantification in subsurface flow and transport modeling. In particular, the ensemble Kalman filter (EnKF) has received significant attention for its promising performance in calibrating heterogeneous subsurface flow models. Since an ensemble of model realizations is used to compute the statistical moments needed to perform the EnKF updates, large ensemble sizes are needed to provide accurate updates and uncertainty assessment. However, for realistic problems that involve large-scale models with computationally demanding flow simulation runs, the EnKF implementation is limited to small-sized ensembles. As a result, spurious numerical correlations can develop and lead to inaccurate EnKF updates, which tend to underestimate or even eliminate the ensemble spread. Ad hoc practical remedies, such as localization, local analysis, and covariance inflation schemes, have been developed and applied to reduce the effect of sampling errors due to small ensemble sizes. In this paper, a fast linear approximate forecast method is proposed as an alternative approach to enable the use of large ensemble sizes in operational settings, and thereby obtain improved sample statistics and EnKF updates. The proposed method first clusters a large number of initial geologic model realizations into a small number of groups. A representative member from each group is used to run a full forward flow simulation. The flow predictions for the remaining realizations in each group are approximated by a linearization around the full simulation results of the representative model (centroid) of the respective cluster. The linearization can be performed using either adjoint-based or ensemble-based gradients. Results from several numerical experiments with two-phase and three-phase flow systems suggest that the proposed method can be applied to improve the EnKF performance in large-scale problems where the number of full simulations is constrained.
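The cluster-then-linearize idea can be sketched with a toy scalar "simulator" whose gradient is known analytically. The k-means-style clustering, the quadratic test function, and its gradient are illustrative assumptions standing in for the flow simulator and its adjoint:

```python
import numpy as np

def approximate_forecasts(M, flow_sim, grad_sim, n_clusters, rng):
    """Cluster realizations (a few Lloyd iterations), run the full
    'flow simulator' only on each cluster centroid, and linearize the
    forecast for the remaining members around that centroid."""
    n = M.shape[0]
    centers = M[rng.choice(n, n_clusters, replace=False)].copy()
    for _ in range(10):                                   # Lloyd iterations
        lab = ((M[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for k in range(n_clusters):
            if np.any(lab == k):
                centers[k] = M[lab == k].mean(0)
    out = np.empty(n)
    for k in range(n_clusters):
        if not np.any(lab == k):
            continue
        fk, gk = flow_sim(centers[k]), grad_sim(centers[k])  # one full run per cluster
        for i in np.where(lab == k)[0]:
            out[i] = fk + gk @ (M[i] - centers[k])           # first-order forecast
    return out

f = lambda m: float(np.sum(m ** 2))   # toy nonlinear "simulator"
g = lambda m: 2.0 * m                 # its known (adjoint-like) gradient
rng = np.random.default_rng(5)
M = rng.standard_normal((200, 10))    # 200 realizations, 10 parameters
approx = approximate_forecasts(M, f, g, n_clusters=8, rng=rng)
exact = np.array([f(m) for m in M])
print(approx.shape)
```

Only n_clusters full simulations are needed for 200 forecasts; the linearization error grows with the within-cluster spread, which is why the clustering step matters.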

18.
Generation of correlated properties in heterogeneous porous media
The spatial distribution of rock properties in porous media, such as permeability and porosity, is often strongly variable. Therefore, these properties may usefully be considered as a random field. However, this variability is frequently correlated on length scales comparable to geological lengths (for example, the scales of sand bodies or facies). To solve various engineering problems (for example, in the oil recovery process), numerical models of a porous medium are often used. There is then a need to understand correlated random fields and to generate them over discretized numerical grids. The paper describes the general mathematical methods required to do this, with one particular method (the nearest neighbor model) described in detail. It is shown how the parameters of the nearest neighbor model may be related to rock property statistics. The method is described in detail in one, two, and three dimensions. Examples are given of how model parameters may be determined from real data.
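A 1D version of the nearest neighbor model can be sketched as a spatial autoregression solved as a linear system. The weight matrix and parameter values below are illustrative assumptions, not the paper's calibrated parameters:

```python
import numpy as np

def nearest_neighbor_field(n, alpha, sigma, rng):
    """Generate a 1D correlated random field with a nearest-neighbor model:
    each grid value is alpha times the average of its neighbors plus white
    noise, i.e., the solution of (I - alpha * W) z = sigma * eps, where W
    averages over nearest neighbors."""
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        W[i, nbrs] = 1.0 / len(nbrs)
    eps = rng.standard_normal(n)                  # uncorrelated driving noise
    return np.linalg.solve(np.eye(n) - alpha * W, sigma * eps)

rng = np.random.default_rng(6)
z = nearest_neighbor_field(200, alpha=0.95, sigma=1.0, rng=rng)
# lag-1 correlation should be strongly positive for alpha near 1
r1 = np.corrcoef(z[:-1], z[1:])[0, 1]
print(round(r1, 2))
```

Relating alpha and sigma to target rock-property statistics (variance and correlation length) is exactly the calibration question the paper addresses; here they are simply chosen by hand.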

19.
Multiple-point statistics (MPS) provides a flexible grid-based approach for simulating complex geologic patterns that contain high-order statistical information represented by a conceptual prior geologic model known as a training image (TI). While MPS is quite powerful for describing complex geologic facies connectivity, conditioning the simulation results on flow measurements that have a nonlinear and complex relation with the facies distribution is quite challenging. Here, an adaptive flow-conditioning method is proposed that uses a flow-data feedback mechanism to simulate facies models from a prior TI. The adaptive conditioning is implemented as a stochastic optimization algorithm that involves an initial exploration stage to find the promising regions of the search space, followed by a more focused search of the identified regions in the second stage. To guide the search strategy, a facies probability map that summarizes the common features of the accepted models in previous iterations is constructed to provide conditioning information about facies occurrence in each grid block. The constructed facies probability map is then incorporated as soft data into the single normal equation simulation (snesim) algorithm to generate a new candidate solution for the next iteration. As the optimization iterations progress, the initial facies probability map is gradually updated using the most recently accepted iterate. This conditioning process can be interpreted as a stochastic optimization algorithm with memory where the new models are proposed based on the history of the successful past iterations. The application of this adaptive conditioning approach is extended to the case where multiple training images are proposed as alternative geologic scenarios. The advantages and limitations of the proposed adaptive conditioning scheme are discussed and numerical experiments from fluvial channel formations are used to compare its performance with non-adaptive conditioning techniques.
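The memory mechanism, relaxing the facies probability map toward the most recently accepted model, can be sketched as follows; the relaxation step size and the indicator-based update are illustrative assumptions:

```python
import numpy as np

def update_probability_map(prob_map, accepted_facies, step=0.2):
    """Relax the facies probability map toward the indicator field of the
    most recently accepted model: p_new = (1 - step) * p + step * I(facies=1).
    Repeated over iterations, the map accumulates the common features of
    the accepted models and guides the next snesim-style proposal."""
    return (1.0 - step) * prob_map + step * (accepted_facies == 1).astype(float)

p = np.full((4, 4), 0.5)                              # uninformative initial map
accepted = np.zeros((4, 4), dtype=int)
accepted[1:3, :] = 1                                  # accepted model: channel band
p = update_probability_map(p, accepted)
print(np.round(p, 2))
```

Cells inside the accepted channel band move above 0.5 while the rest move below it, so channel placement in subsequent proposals is biased toward the historically successful configurations.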

20.
Many variogram (or covariance) models that are valid—or realizable—models of Gaussian random functions are not realizable indicator variogram (or covariance) models. Unfortunately, there is no known necessary and sufficient condition for a function to be the indicator variogram of a random set. Necessary conditions can be easily obtained for the behavior at the origin or at large distance. The power, Gaussian, cubic or cardinal-sine models do not fulfill these conditions and are therefore not realizable. These considerations are illustrated by a Monte Carlo simulation demonstrating nonrealizability over some very simple three-point configurations in two or three dimensions. No definitive result has been obtained about the spherical model. Among the commonly used models for Gaussian variables, only the exponential appears to be a realizable indicator variogram model in all dimensions. It can be associated with a mosaic, a Boolean or a truncated Gaussian random set. In one dimension, the exponential indicator model is closely associated with continuous-time Markov chains, which can also lead to more variogram models such as the damped oscillation model. One-dimensional random sets can also be derived from renewal processes, or mosaic models associated with such processes. This provides an interesting link between the geostatistical formalism, focused mostly on two-point statistics, and the approach of quantitative sedimentologists who compute the probability distribution function of the thickness of different geological facies. The last part of the paper presents three approaches for obtaining new realizable indicator variogram models in three dimensions. One approach consists of combining existing realizable models. Other approaches are based on the formalism of Boolean random sets and truncated Gaussian functions.
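One easily checked necessary condition, the triangle inequality along three aligned points (which implies gamma(2h) <= 2*gamma(h)), can be verified numerically. The sketch below shows an exponential model passing the check while a Gaussian model, with its parabolic behavior at the origin, fails, consistent with the abstract's conclusion; the test grid and unit sills are illustrative:

```python
import numpy as np

def satisfies_triangle_inequality(gamma, h_values):
    """Check a necessary condition for an indicator variogram of a random
    set: gamma(2h) <= 2 * gamma(h) for all h, a consequence of the triangle
    inequality applied to three aligned points x, x+h, x+2h."""
    h = np.asarray(h_values, dtype=float)
    return bool(np.all(gamma(2 * h) <= 2 * gamma(h) + 1e-12))

h = np.linspace(0.01, 2.0, 200)
exponential = lambda x: 1.0 - np.exp(-x)        # linear at the origin
gaussian = lambda x: 1.0 - np.exp(-x ** 2)      # parabolic at the origin
print(satisfies_triangle_inequality(exponential, h),
      satisfies_triangle_inequality(gaussian, h))  # -> True False
```

For the exponential model the inequality holds identically, since 2*(1 - e^(-h)) - (1 - e^(-2h)) = (1 - e^(-h))^2 >= 0, while near the origin the Gaussian model gives gamma(2h) ~ 4*gamma(h) > 2*gamma(h).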

