Similar Articles
 20 similar articles retrieved.
1.
This paper describes a new method for gradually deforming realizations of Gaussian-related stochastic models while preserving their spatial variability. The method consists in building a stochastic process whose state space is the ensemble of realizations of a spatial stochastic model. In particular, a stochastic process built by combining independent Gaussian random functions is proposed to perform the gradual deformation of realizations. The gradual deformation algorithm is then coupled with an optimization algorithm to calibrate realizations of stochastic models to nonlinear data. The method is applied to calibrate a continuous and a discrete synthetic permeability field to well-test pressure data. The examples illustrate the efficiency of the proposed method. Furthermore, we present some extensions of this method (multidimensional gradual deformation, gradual deformation with respect to structural parameters, and local gradual deformation) that are useful in practice. Although the method described in this paper is operational only in the Gaussian framework (e.g., lognormal model, truncated Gaussian model, etc.), the idea of gradually deforming realizations through a stochastic process remains general and therefore promising even for calibrating non-Gaussian models.
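To make the combination idea concrete, here is a minimal Python sketch (not the authors' code): two independent standard Gaussian realizations y1 and y2 are combined as y(t) = y1 cos(πt) + y2 sin(πt), which keeps the Gaussian distribution and covariance for every t, so calibration reduces to repeated one-dimensional searches over t. The objective function `mismatch` and the data `observed` are hypothetical stand-ins for a well-test pressure simulator and its measurements.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
observed = np.array([1.2, 2.1, 3.5, 4.0, 5.8])     # made-up calibration data

def gradual_deformation(y1, y2, t):
    """Combine two independent standard Gaussian realizations.

    For any t, the combination has the same mean, variance and covariance
    as the original model, so spatial variability is preserved.
    """
    return y1 * np.cos(np.pi * t) + y2 * np.sin(np.pi * t)

def mismatch(y):
    """Hypothetical objective: squared misfit to some nonlinear data.

    Stands in for, e.g., a well-test pressure simulator applied to the
    permeability field exp(y).
    """
    return np.sum((np.exp(y[:5]).cumsum() - observed) ** 2)

# One gradual-deformation chain per loop: optimize the single parameter t,
# then use the optimum as the starting realization of the next chain.
y1 = rng.standard_normal(100)                      # current realization
for _ in range(10):
    y2 = rng.standard_normal(100)                  # fresh independent realization
    res = minimize_scalar(lambda t: mismatch(gradual_deformation(y1, y2, t)),
                          bounds=(-1.0, 1.0), method="bounded")
    y1 = gradual_deformation(y1, y2, res.x)
```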

2.
Gradual deformation is a parameterization method that considerably reduces the unknown parameter space of stochastic models. It can be used in an iterative optimization procedure for constraining stochastic simulations to data that are complex, nonanalytical functions of the simulated variables. The method is based on the fact that linear combinations of multi-Gaussian random functions remain multi-Gaussian random functions. During the past few years, we developed the gradual deformation method by combining independent realizations. This paper investigates another alternative: the combination of dependent realizations. One of our motivations for combining dependent realizations was to improve the numerical stability of the gradual deformation method. Because of limitations both in the size of simulation grids and in the precision of simulation algorithms, numerical realizations of a stochastic model are never perfectly independent. It was shown that the accumulation of very small dependences between realizations might result in significant structural drift from the initial stochastic model. From the combination of random functions whose covariance and cross-covariance are proportional to each other, we derived a new formulation of the gradual deformation method that explicitly takes into account the numerical dependence between realizations. This new formulation allows us to reduce the structural deterioration during the iterative optimization. The problem of combining dependent realizations also arises when deforming conditional realizations of a stochastic model. As opposed to the combination of independent realizations, combining conditional realizations avoids the additional conditioning step during the optimization process. However, this procedure is limited to global deformations with fixed structural parameters.

3.
We present a two-step stochastic inversion approach for monitoring the distribution of CO2 injected into deep saline aquifers for the typical scenario of one single injection well and a database comprising a common suite of well logs as well as time-lapse vertical seismic profiling (VSP) data. In the first step, we compute several sets of stochastic models of the elastic properties using conventional sequential Gaussian co-simulations (SGCS) representing the considered reservoir before CO2 injection. All realizations within a set of models are then iteratively combined using a modified gradual deformation algorithm aiming at reducing the mismatch between the observed and simulated VSP data. In the second step, these optimal static models serve as input for a history matching approach using the same modified gradual deformation algorithm for minimizing the mismatch between the observed and simulated VSP data following the injection of CO2. At each gradual deformation step, the injection and migration of CO2 is simulated and the corresponding seismic traces are computed and compared with the observed ones. The proposed stochastic inversion approach has been tested for a realistic, and arguably particularly challenging, synthetic case study mimicking the geological environment of a potential CO2 injection site in the Cambrian-Ordovician sedimentary sequence of the St. Lawrence platform in southern Québec. The results demonstrate that the proposed two-step reservoir characterization approach is capable of adequately resolving and monitoring the distribution of the injected CO2. This is reflected in optimized models of P- and S-wave velocities, density, and porosity, which, compared to conventional stochastic reservoir models, exhibit a significantly improved structural similarity with regard to the corresponding reference models. The proposed approach is therefore expected to allow for an optimal injection forecast through quantitative assimilation of all available data from the appraisal stage of a CO2 injection site.

4.
5.
Compensating for estimation smoothing in kriging
Smoothing is a characteristic inherent to all minimum mean-square-error spatial estimators such as kriging. Cross-validation can be used to detect and model such smoothing. Inversion of the model produces a new estimator, compensated kriging. A numerical comparison based on an exhaustive permeability sampling of a 4-ft² slab of Berea Sandstone shows that the estimation surface generated by compensated kriging has properties intermediate between those generated by ordinary kriging and stochastic realizations resulting from simulated annealing and sequential Gaussian simulation. The frequency distribution is well reproduced by the compensated kriging surface, which also approximates the experimental semivariogram well, better than ordinary kriging, but not as well as stochastic realizations. Compensated kriging produces surfaces that are more accurate than stochastic realizations, but not as accurate as ordinary kriging.

6.
Stochastic simulation is increasingly used to map the spatial variability in the grades of elements of interest and to assess the uncertainty in the mineral resources and ore reserves. The practical implementation requires specifying a stochastic model, which describes the spatial distribution of the grades, and an algorithm to construct realizations of these grades, viewed as different possible outcomes or scenarios. In the case of the Gaussian random field model, a variety of algorithms have been proposed in the past decades, but their ability to reproduce the model statistics is often unequal. In this paper, we compare two such algorithms, namely the turning bands and the sequential algorithms. The comparison is carried out through a synthetic case study and a real case study of a porphyry copper deposit located in southeastern Iran, in which it is of interest to jointly simulate the copper, molybdenum, silver, lead, and zinc grades. Statistical testing and graphical validations are performed to check whether or not the realizations reproduce the features of the true grades, in particular their direct and cross variograms. Sequential simulation based on collocated cokriging turns out to poorly reproduce the cross variograms, while turning bands proves to be accurate in all the analyzed cases.

7.
A new approach based on principal component analysis (PCA) for the representation of complex geological models in terms of a small number of parameters is presented. The basis matrix required by the method is constructed from a set of prior geological realizations generated using a geostatistical algorithm. Unlike standard PCA-based methods, in which the high-dimensional model is constructed from a (small) set of parameters by simply performing a multiplication using the basis matrix, in this method the mapping is formulated as an optimization problem. This enables the inclusion of bound constraints and regularization, which are shown to be useful for capturing highly connected geological features and binary/bimodal (rather than Gaussian) property distributions. The approach, referred to as optimization-based PCA (O-PCA), is applied here mainly to binary-facies systems, in which case the requisite optimization problem is separable and convex. The solution of the optimization problem, as well as the derivative of the model with respect to the parameters, is obtained analytically. It is shown that the O-PCA mapping can also be viewed as a post-processing of the standard PCA model. The O-PCA procedure is applied both to generate new (random) realizations and for gradient-based history matching. For the latter, two- and three-dimensional systems, involving channelized and deltaic-fan geological models, are considered. The O-PCA method is shown to perform very well for these history matching problems, and to provide models that capture the key sand–sand and sand–shale connectivities evident in the true model. Finally, the approach is extended to generate bimodal systems in which the properties of both facies are characterized by Gaussian distributions. MATLAB code with the O-PCA implementation, and examples demonstrating its use, are provided online as Supplementary Materials.
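The following Python sketch illustrates the flavor of the O-PCA mapping under our own assumptions (placeholder prior realizations, a simple x(1−x) regularization, and bound constraints, written in the "post-processing of standard PCA" view mentioned above); it is not the paper's implementation, whose exact formulation and MATLAB code are given in its Supplementary Materials.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical prior ensemble: each column is one flattened binary facies
# realization (values in {0, 1}), normally produced by a geostatistical
# simulator; random placeholders are used here just to make the sketch run.
n_cells, n_real, n_pc = 400, 50, 10
prior = (rng.random((n_cells, n_real)) > 0.5).astype(float)

# Standard PCA basis from the centered ensemble (truncated SVD).
m_mean = prior.mean(axis=1)
u, s, _ = np.linalg.svd(prior - m_mean[:, None], full_matrices=False)
phi = u[:, :n_pc] * (s[:n_pc] / np.sqrt(n_real - 1))

def pca_model(xi):
    """Standard PCA mapping: a plain multiplication, generally not binary."""
    return m_mean + phi @ xi

def o_pca_model(xi, gamma=0.8):
    """O-PCA-style mapping (sketch): each cell solves
    min (x - y)^2 + gamma * x * (1 - x) subject to 0 <= x <= 1,
    whose separable closed-form solution stretches values towards 0/1
    (assumed regularization; see the paper for the exact form)."""
    y = pca_model(xi)
    x = (y - gamma / 2.0) / (1.0 - gamma)   # stationary point of the convex cell problem
    return np.clip(x, 0.0, 1.0)             # enforce the bound constraints

xi = rng.standard_normal(n_pc)               # low-dimensional parameters
model = o_pca_model(xi)                      # near-binary facies-like model
```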

8.
Optimization with the Gradual Deformation Method
Building reservoir models consistent with production data and prior geological knowledge is usually carried out through the minimization of an objective function. Such optimization problems are nonlinear and may be difficult to solve because they tend to be ill-posed and to involve many parameters. The gradual deformation technique was introduced recently to simplify these problems. Its main feature is the preservation of the spatial structure: perturbed realizations exhibit the same spatial variability as the starting ones. It is shown that optimizations based on gradual deformation converge exponentially to the global minimum, at least for linear problems. In addition, it appears that combining the gradual deformation parameterization with optimizations may remove, step by step, the structure-preservation capability of the gradual deformation method. This bias is negligible when deformation is restricted to a few realization chains, but grows as the number of chains tends to infinity. As, in practice, optimization of reservoir models is limited to a small number of iterations relative to the number of gridblocks, the spatial variability is preserved. Last, the optimization processes are implemented on the basis of the Levenberg–Marquardt method. Although the objective functions, written in terms of Gaussian white noises, are reduced to the data mismatch term, the conditional realization space can be properly sampled.

9.
10.
The prediction of fluid flows within hydrocarbon reservoirs requires the characterization of petrophysical properties. Such characterization is performed on the basis of geostatistics and history-matching; in short, a reservoir model is first randomly drawn, and then sequentially adjusted until it reproduces the available dynamic data. Two main concerns typical of the problem under consideration are the heterogeneity of rocks occurring at all scales and the use of data of distinct resolution levels. Therefore, referring to sequential Gaussian simulation, this paper proposes a new stochastic simulation method able to handle several scales for both continuous and discrete random fields. This method adds flexibility to history-matching as it boils down to the multiscale parameterization of reservoir models. In other words, reservoir models can be updated at either coarse or fine scales, or both. Parameterization adapts to the available data; the coarser the scale targeted, the smaller the number of unknown parameters, and the more efficient the history-matching process. This paper focuses on the use of variational optimization techniques driven by the gradual deformation method to vary reservoir models. Other data assimilation methods and perturbation processes could have been envisioned as well. Last, a numerical application case is presented in order to highlight the advantages of the proposed method for conditioning permeability models to dynamic data. For simplicity, we focus on two-scale processes. The coarse scale describes the variations in the trend while the fine scale characterizes local variations around the trend. The relationships between data resolution and parameterization are investigated.

11.
Constraining stochastic models of reservoir properties such as porosity and permeability can be formulated as an optimization problem. While an optimization based on random search methods preserves the spatial variability of the stochastic model, it is prohibitively computer intensive. In contrast, gradient search methods may be very efficient, but they do not preserve the spatial variability of the stochastic model. The gradual deformation method allows a reservoir model (i.e., a realization of the stochastic model) to be modified from a small number of parameters while preserving its spatial variability. It can be considered as a first step towards the merger of random and gradient search methods. The gradual deformation method yields chains of reservoir models that can be investigated successively to identify an optimal reservoir model. The investigation of each chain is based on gradient computations, but the building of chains of reservoir models is random. In this paper, we propose an algorithm that further improves the efficiency of the gradual deformation method. Contrary to the previous gradual deformation method, we also use gradient information to build chains of reservoir models. The idea is to combine the initial reservoir model or the previously optimized reservoir model with a compound reservoir model. This compound model is a linear combination of a set of independent reservoir models. The combination coefficients are calculated so that the search direction from the initial model is as close as possible to the gradient search direction. This new gradual deformation scheme allows us to reduce the number of optimization parameters while selecting an optimal search direction. The numerical example compares the performance of the new gradual deformation scheme with that of the traditional one.
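In formulas, the idea can be sketched as follows (the notation and the unit-norm constraint are our assumptions, not the paper's): with current realization $z_0$, independent realizations $z_1,\dots,z_n$, objective function $J$, and compound model $z_c$,

$$
z_c=\sum_{i=1}^{n}\alpha_i z_i,\qquad z(t)=z_0\cos(\pi t)+z_c\sin(\pi t),\qquad \left.\frac{\partial z}{\partial t}\right|_{t=0}=\pi z_c,
$$

$$
\alpha=\underset{\|\alpha\|_2=1}{\arg\min}\;\Bigl\|\sum_{i=1}^{n}\alpha_i z_i+\frac{\nabla_z J(z_0)}{\bigl\|\nabla_z J(z_0)\bigr\|}\Bigr\|^2,
$$

so that the one-dimensional search over $t$ starts along a direction as close as possible to the steepest-descent direction, while $z_c$ remains a realization of the same stochastic model.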

12.
Stochastic sequential simulation is a common modelling technique used in Earth sciences and an integral part of iterative geostatistical seismic inversion methodologies. Traditional stochastic sequential simulation techniques based on bi-point statistics assume stationarity of the spatial continuity pattern and a single probability distribution function for the entire study area, as revealed by a single variogram model and inferred from the available experimental data, respectively. In this paper, the traditional direct sequential simulation algorithm is extended to handle non-stationary natural phenomena. The proposed stochastic sequential simulation algorithm can take into consideration multiple regionalized spatial continuity patterns and probability distribution functions, depending on the spatial location of the grid node to be simulated. This work shows the application and discusses the benefits of the proposed stochastic sequential simulation as part of an iterative geostatistical seismic inversion methodology in two distinct geological environments in which non-stationary behaviour can be assessed by the simultaneous interpretation of the available well-log and seismic reflection data. The results show that the elastic models generated by the proposed stochastic sequential simulation are able to simultaneously reproduce the regional and global variogram models and the target distribution functions relative to the average volume of each sub-region. When used as part of a geostatistical seismic inversion procedure, the retrieved inverse models are more geologically realistic, since they incorporate the knowledge of the subsurface geology as provided, for example, by seismic and well-log data interpretation.

13.
Uncertainty quantification is typically accomplished by simulating multiple geological realizations, which can be very expensive computationally if the flow process is complicated and the models are highly resolved. Upscaling procedures can be applied to reduce computational demands, though it is essential that the resulting coarse-model predictions correspond to reference fine-scale solutions. In this work, we develop an ensemble-level upscaling (EnLU) procedure for compositional systems, which enables the efficient generation of multiple coarse models for use in uncertainty quantification. We apply a newly developed global compositional upscaling method to provide coarse-scale parameters and functions for selected realizations. This global upscaling entails transmissibility and relative permeability upscaling, along with the computation of a-factors to capture component fluxes. Additional features include near-well upscaling for all coarse parameters and functions, and iteration on the a-factors, which is shown to improve accuracy. In the EnLU framework, this global upscaling is applied for only a few selected realizations. For 90% or more of the realizations, upscaled functions are assigned statistically based on quickly computed flow and permeability attributes. A sequential Gaussian co-simulation procedure is incorporated to provide coarse models that honor the spatial correlation structure of the upscaled properties. The resulting EnLU procedure is applied for multiple realizations of two-dimensional models, for both Gaussian and channelized permeability fields. Results demonstrate that EnLU provides P10, P50, and P90 results for phase and component production rates that are in close agreement with reference fine-scale results. Less accuracy is observed in realization-by-realization comparisons, though the models are still much more accurate than those generated using standard coarsening procedures.

14.
A Bayesian linear inversion methodology based on Gaussian mixture models and its application to geophysical inverse problems are presented in this paper. The proposed inverse method is based on a Bayesian approach under the assumptions of a Gaussian mixture random field for the prior model and a Gaussian linear likelihood function. The model for the latent discrete variable is defined to be a stationary first-order Markov chain. In this approach, a recursive exact solution to an approximation of the posterior distribution of the inverse problem is proposed. A Markov chain Monte Carlo algorithm can be used to efficiently simulate realizations from the correct posterior model. Two inversion studies based on real well-log data are presented, and the main results are the posterior distributions of the reservoir properties of interest, the corresponding predictions and prediction intervals, and a set of conditional realizations. The first application is a seismic inversion study for the prediction of lithological facies and P- and S-impedance, where the root-mean-square error of the predictions improves by 30% compared to the traditional Gaussian inversion. The second application is a rock physics inversion study for the prediction of lithological facies, porosity, and clay volume, where predictions slightly improve compared to the Gaussian inversion approach.
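As a generic illustration of the model structure described above (the symbols are ours, not the paper's notation): with a facies indicator $\kappa$ following a stationary first-order Markov chain, reservoir properties $m$, and a linear forward operator $G$,

$$
m \mid \kappa \sim \mathcal{N}(\mu_\kappa, \Sigma_\kappa), \qquad d = Gm + e, \quad e \sim \mathcal{N}(0, \Sigma_e),
$$

so that, for a fixed facies configuration, the posterior is again Gaussian,

$$
m \mid d, \kappa \sim \mathcal{N}\!\Bigl(\mu_\kappa + \Sigma_\kappa G^{\mathsf T}\bigl(G\Sigma_\kappa G^{\mathsf T} + \Sigma_e\bigr)^{-1}(d - G\mu_\kappa),\; \Sigma_\kappa - \Sigma_\kappa G^{\mathsf T}\bigl(G\Sigma_\kappa G^{\mathsf T} + \Sigma_e\bigr)^{-1} G\Sigma_\kappa\Bigr),
$$

and the posterior for $m$ is a mixture of such Gaussians weighted by the posterior probabilities of the facies configurations, which the paper approximates recursively by exploiting the Markov-chain prior on $\kappa$.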

15.
A fast Fourier transform (FFT) moving average (FFT-MA) method for generating Gaussian stochastic processes is derived. Using discrete Fourier transforms makes the calculations easy and fast so that large random fields can be produced. On the other hand, the basic moving average framework allows us not only to uncouple the random numbers from the structural parameters (mean, variance, correlation length, etc.), but also to draw the random components in the spatial domain. Such features impart great flexibility to the FFT-MA generator. For instance, changing only the random numbers gives distinct realizations all having the same covariance function. Similarly, several realizations can be built from the same random number set, but from different structural parameters. Integrating the FFT-MA generator into an optimization procedure provides a tool theoretically capable of determining the random numbers identifying the Gaussian field as well as the structural parameters from dynamic data. Moreover, all or only some of the random numbers can be perturbed, so that realizations produced using the FFT-MA generator can be locally updated through an optimization process.
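A one-dimensional Python sketch of the FFT-MA construction as we understand it (the covariance model, grid, and embedding choices below are our own assumptions): white noise z is filtered through the square root of the spectrum of the covariance function, so the random numbers and the structural parameters can be varied independently of each other.

```python
import numpy as np

def fft_ma_1d(n, dx, sill, corr_len, z=None, mean=0.0, rng=None):
    """FFT moving-average sketch (1D): y = mean + ifft(sqrt(fft(c)) * fft(z)).

    c is a Gaussian covariance model sampled on a periodic grid twice the
    target size (simple embedding). The white-noise vector z can be supplied
    so that the same noise is reused with different structural parameters,
    or perturbed locally while the covariance stays fixed.
    """
    rng = rng or np.random.default_rng()
    n_pad = 2 * n                                        # crude periodic embedding
    lag = np.minimum(np.arange(n_pad), n_pad - np.arange(n_pad)) * dx
    cov = sill * np.exp(-3.0 * (lag / corr_len) ** 2)    # Gaussian covariance model
    spec = np.maximum(np.fft.fft(cov).real, 0.0)         # guard tiny negative values
    if z is None:
        z = rng.standard_normal(n_pad)                   # uncoupled random numbers
    y = np.fft.ifft(np.sqrt(spec) * np.fft.fft(z)).real
    return mean + y[:n], z

# Same structural parameters, different random numbers -> distinct realizations
# sharing one covariance; same random numbers, different correlation length ->
# the structure changes while the "randomness" is kept.
y1, z1 = fft_ma_1d(n=200, dx=1.0, sill=1.0, corr_len=20.0)
y2, _  = fft_ma_1d(n=200, dx=1.0, sill=1.0, corr_len=20.0)
y3, _  = fft_ma_1d(n=200, dx=1.0, sill=1.0, corr_len=40.0, z=z1)
```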

16.
Reservoir characterization requires the integration of various data through history matching, especially dynamic information such as production or four-dimensional seismic data. To update geostatistical realizations, the local gradual deformation method can be used. However, history matching is a complex inverse problem, and the computational effort, in terms of the number of reservoir simulations required in the optimization procedure, increases with the number of matching parameters. History matching large fields with a large number of parameters has been an ongoing challenge in reservoir simulation. This paper presents a new technique to improve history matching with the local gradual deformation method using gradient-based optimization. The new approach is based on approximate derivative calculations that exploit the partial separability of the objective function. The objective function is first split into local components, and only the most influential parameters in each component are used for the derivative computation. A perturbation design is then proposed to simultaneously compute all the derivatives with only a few simulations. This new technique makes history matching using the local gradual deformation method with large numbers of parameters tractable.

17.
Conditional Simulation with Patterns
An entirely new approach to stochastic simulation is proposed through the direct simulation of patterns. Unlike pixel-based (single grid cell) or object-based stochastic simulation, pattern-based simulation proceeds by pasting patterns directly onto the simulation grid. A pattern is a multi-pixel configuration identifying a meaningful entity (a puzzle piece) of the underlying spatial continuity. The methodology relies on the use of a training image from which the pattern set (database) is extracted. The use of training images is not new. The concept of a training image is extensively used in simulating Markov random fields or for sequentially simulating structures using multiple-point statistics. Both of these approaches rely on extracting statistics from the training image, then reproducing these statistics in multiple stochastic realizations while conditioning to any available data. The proposed approach does not rely, explicitly, on either a statistical or probabilistic methodology. Instead, a sequential simulation method is proposed that borrows heavily from the pattern-recognition literature and simulates by pasting, at each visited location along a random path, a pattern that is compatible with the available local data and any previously simulated patterns. This paper discusses the various implementation details needed to accomplish this idea. Several illustrative 2D as well as realistic and complex 3D examples are presented to showcase the versatility of the proposed algorithm.

18.
Estimation or simulation? That is the question
The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images presenting no smoothing and reproducing the covariance model. Consequently, these images reproduce both the sample histogram and the sample semivariogram. However, simulated images lack local accuracy. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, yet at the expense of providing only one deterministic estimate of the random function.

19.
Assessment of uncertainty in the performance of fluvial reservoirs often requires the ability to generate realizations of channel sands that are conditional to well observations. For channels with low sinuosity this problem has been effectively solved. When the sinuosity is large, however, the standard stochastic models for fluvial reservoirs are not valid, because the deviation of the channel from a principal direction line is multivalued. In this paper, I show how the method of randomized maximum likelihood can be used to generate conditional realizations of channels with large sinuosity. In one example, a Gaussian random field model is used to generate an unconditional realization of a channel with large sinuosity, and this realization is then conditioned to well observations. Channels generated in the second approach are less realistic, but may be sufficient for modeling reservoir connectivity in a realistic way. In the second example, an unconditional realization of a channel is generated by a complex geologic model with random forcing. It is then adjusted in a meaningful way to honor well observations. The key feature in the solution is the use of channel direction instead of channel deviation as the characteristic random function describing the geometry of the channel.

20.
An adequate representation of the detailed spatial variation of subsurface parameters for underground flow and mass transport simulation entails heterogeneous models. Uncertainty characterization generally calls for a Monte Carlo analysis of many equally likely realizations that honor both direct information (e.g., conductivity data) and information about the state of the system (e.g., piezometric head or concentration data). Thus, the problem faced is how to generate multiple realizations conditioned to parameter data and inverse-conditioned to dependent state data. We propose using a Markov chain Monte Carlo (MCMC) approach with block updating, combined with upscaling, to achieve this purpose. Our proposal presents an alternative block updating scheme that permits the application of MCMC to inverse stochastic simulation of heterogeneous fields and incorporates upscaling in a multi-grid approach to speed up the generation of the realizations. The main advantage of MCMC, compared to other methods capable of generating inverse-conditioned realizations (such as the self-calibrating or the pilot point methods), is that it does not require the solution of a complex optimization inverse problem, although it requires the solution of the direct problem many times.
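A heavily simplified Python sketch of block-updating MCMC in this spirit (everything below, including the small grid, the exponential prior covariance, and the linear operator standing in for the flow simulator, is our own toy setup, not the authors' implementation): each block is redrawn from its prior conditional on the remaining cells, so the prior terms cancel and the acceptance test involves only the state-data likelihood.

```python
import numpy as np

rng = np.random.default_rng(2)

# Small synthetic setting: a 10x10 log-conductivity field with an exponential
# covariance prior; "state data" come from a hypothetical linear operator A
# (a stand-in for a flow simulator producing piezometric heads).
nx = ny = 10
n = nx * ny
xg, yg = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
coords = np.column_stack([xg.ravel(), yg.ravel()])
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = np.exp(-dist / 5.0)                        # prior covariance of log-conductivity

A = rng.standard_normal((8, n)) / n            # hypothetical linear "flow" operator
truth = np.linalg.cholesky(C) @ rng.standard_normal(n)
obs = A @ truth + 0.01 * rng.standard_normal(8)
sigma2 = 0.01 ** 2

def loglik(m):
    r = obs - A @ m
    return -0.5 * np.sum(r * r) / sigma2

# Block-updating MCMC: propose a new value for one block by drawing it from
# the prior conditional on the rest of the field (so the prior ratio cancels),
# then accept or reject with the likelihood ratio only. Real implementations
# would use spatially contiguous blocks and a multi-grid/upscaled forward model.
blocks = np.array_split(np.arange(n), 4)
m = np.linalg.cholesky(C) @ rng.standard_normal(n)       # initial realization
for it in range(200):
    for b in blocks:
        rest = np.setdiff1d(np.arange(n), b)
        Cbb, Cbr, Crr = C[np.ix_(b, b)], C[np.ix_(b, rest)], C[np.ix_(rest, rest)]
        mu = Cbr @ np.linalg.solve(Crr, m[rest])          # conditional mean
        cov = Cbb - Cbr @ np.linalg.solve(Crr, Cbr.T)     # conditional covariance
        L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(b)))
        prop = m.copy()
        prop[b] = mu + L @ rng.standard_normal(len(b))
        if np.log(rng.random()) < loglik(prop) - loglik(m):
            m = prop                                      # stays prior-consistent
```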
