Similar Documents
20 similar documents found (search time: 31 ms)
1.
Geologic uncertainties and limited well data often render recovery forecasting a difficult undertaking in typical appraisal and early development settings. Recent advances in geologic modeling algorithms permit automation of the model generation process via macros and geostatistical tools. This allows rapid construction of multiple alternative geologic realizations. Despite the advances in geologic modeling, computation of the reservoir dynamic response via full-physics reservoir simulation remains a computationally expensive task. Therefore, only a few of the many probable realizations are simulated in practice. Experimental design techniques typically focus on a few discrete geologic realizations; they are inherently more suitable for continuous engineering parameters and can only crudely approximate the impact of geology. As an alternative, a flow-based pattern recognition algorithm (FPRA) has been developed for quantifying forecast uncertainty. The proposed algorithm relies on the rapid characterization of the geologic uncertainty space represented by an ensemble of sufficiently diverse static model realizations. FPRA characterizes the geologic uncertainty space by calculating connectivity distances, which quantify how different each individual realization is from all others in terms of recovery response. Fast streamline simulations are employed in evaluating these distances. By applying pattern recognition techniques to connectivity distances, a few representative realizations are identified within the model ensemble for full-physics simulation. In turn, the recovery factor probability distribution is derived from these intelligently selected simulation runs. Here, FPRA is tested on an example case where the objective is to accurately compute the recovery factor statistics as a function of geologic uncertainty in a channelized turbidite reservoir. Recovery factor cumulative distribution functions computed by FPRA compare well to the one computed via exhaustive full-physics simulations.
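The selection step described above — characterizing realizations by pairwise connectivity distances and picking a handful of representatives for full-physics simulation — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a precomputed distance matrix (in the paper, derived from fast streamline simulations) and uses a simple greedy k-medoid heuristic in place of the paper's pattern recognition step; the toy `responses` data are invented.

```python
import numpy as np

def select_representatives(distance, k):
    """Greedy k-medoid selection on a precomputed connectivity-distance
    matrix: repeatedly add the realization that most reduces the total
    distance from every realization to its nearest chosen medoid."""
    n = distance.shape[0]
    # start from the most central realization (smallest summed distance)
    chosen = [int(np.argmin(distance.sum(axis=1)))]
    while len(chosen) < k:
        best, best_cost = None, np.inf
        for cand in range(n):
            if cand in chosen:
                continue
            cost = np.minimum.reduce([distance[c] for c in chosen + [cand]]).sum()
            if cost < best_cost:
                best, best_cost = cand, cost
        chosen.append(best)
    return sorted(chosen)

# toy "connectivity distances": three clusters of similar realizations
rng = np.random.default_rng(0)
centers = np.array([0.0, 5.0, 10.0])
responses = np.concatenate([c + 0.1 * rng.standard_normal(10) for c in centers])
distance = np.abs(responses[:, None] - responses[None, :])
reps = select_representatives(distance, k=3)  # one realization per cluster
```

With one representative simulated per cluster, recovery statistics can then be estimated from far fewer full-physics runs than the ensemble size.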

2.
Uncertainty quantification is currently one of the leading challenges in the geosciences, in particular in reservoir modeling. A wealth of subsurface data as well as expert knowledge are available to quantify uncertainty and state predictions on reservoir performance or reserves. The geosciences component within this larger modeling framework is partially an interpretive science. Geologists and geophysicists interpret data to postulate the nature of the depositional environment, for example the type of fracture system, the nature of faulting, and the type of rock physics model. Often, several alternative scenarios or interpretations are offered, including some associated belief quantified with probabilities. In the context of facies modeling, this could result in various interpretations of facies architecture, associations, geometries, and the way they are distributed in space. A quantitative approach to specifying this uncertainty is to provide a set of alternative 3D training images from which several geostatistical models can be generated. In this paper, we consider quantifying uncertainty on facies models in the early development stage of a reservoir, when there is still considerable uncertainty on the nature of the spatial distribution of the facies. At this stage, production data are available to further constrain uncertainty. We develop a workflow that consists of two steps: (1) determining which training images are no longer consistent with production data and should be rejected, and (2) history matching with a given fixed training image. We illustrate our ideas and methodology on a test case derived from a real field case of predicting flow in a newly planned well in a turbidite reservoir off the African West coast.
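The first step of the workflow — falsifying training images against production data — can be illustrated with a small sketch. The screening rule used here (keep a training image if at least one realization generated from it matches the observed data within tolerance) and all numbers are invented for illustration; the paper's actual rejection criterion may differ.

```python
import numpy as np

def reject_training_images(simulated, observed, tol):
    """simulated[t, r, :] holds the proxy production response of realization r
    generated from training image t.  A training image is retained only if at
    least one of its realizations matches the observed data within tol (RMSE)."""
    rmse = np.sqrt(((simulated - observed) ** 2).mean(axis=-1))  # shape (t, r)
    return rmse.min(axis=1) <= tol

rng = np.random.default_rng(1)
observed = np.linspace(0.0, 1.0, 8)                        # observed production curve
good = observed + 0.02 * rng.standard_normal((5, 8))       # realizations from a consistent TI
bad = observed[::-1] + 0.02 * rng.standard_normal((5, 8))  # realizations from an inconsistent TI
keep = reject_training_images(np.stack([good, bad]), observed, tol=0.05)
```

History matching (step 2) is then carried out only with the training images that survive this screening.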

3.
Large scale geomechanical simulations are increasingly used to model the compaction of stress-dependent reservoirs, predict the long-term integrity of underground radioactive waste disposals, and analyse the viability of hot dry rock geothermal sites. These large scale simulations require the definition of homogeneous mechanical properties for each geomechanical cell, whereas the rock properties are expected to vary at a smaller scale. Therefore, this paper proposes a new methodology that makes it possible to define the equivalent mechanical properties of the geomechanical cells using the fine scale information given in the geological model. This methodology is implemented on a synthetic reservoir case, and two upscaling procedures providing the effective elastic properties of Hooke's law are tested. The first upscaling procedure is an analytical method for a perfectly stratified rock mass, whereas the second computes lower and upper bounds of the equivalent properties with no assumption on the small scale heterogeneity distribution. Both procedures are applied to one geomechanical cell extracted from the reservoir structure. The results show that the analytical and numerical upscaling procedures provide accurate estimations of the effective parameters. Furthermore, a large scale simulation using the homogenized properties of each geomechanical cell calculated with the analytical method demonstrates that the overall behaviour of the reservoir structure is well reproduced for two different loading cases. Copyright © 2004 John Wiley & Sons, Ltd.
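The second kind of procedure — bounds requiring no assumption on the fine-scale arrangement — can be made concrete with the classical Voigt (arithmetic) and Reuss (harmonic) averages, which bracket the effective modulus of a cell; the harmonic average is also the exact one-dimensional result for a perfectly stratified stack loaded normal to the layering. A minimal sketch with illustrative moduli (not values from the paper):

```python
import numpy as np

def voigt_reuss_bounds(moduli, fractions):
    """Upper (Voigt, arithmetic) and lower (Reuss, harmonic) bounds on the
    effective modulus of a cell built from constituents with given volume
    fractions."""
    f = np.asarray(fractions, float)
    m = np.asarray(moduli, float)
    voigt = float((f * m).sum())          # uniform-strain assumption
    reuss = float(1.0 / (f / m).sum())    # uniform-stress assumption
    return reuss, voigt

# two-layer cell: 60% stiff sandstone, 40% soft shale (GPa, illustrative)
reuss, voigt = voigt_reuss_bounds([30.0, 5.0], [0.6, 0.4])
```

For this cell the bounds are 10 GPa and 20 GPa; the true effective modulus of any microstructure with these constituents lies between them.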

4.
Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement to other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations that enables uncertainty assessment.

5.
Assessment of uncertainty due to inadequate data and imperfect geological knowledge is an essential aspect of the subsurface model building process. In this work, a novel methodology for characterizing complex geological structures is presented that integrates dynamic data. The procedure results in the assessment of uncertainty associated with the predictions of flow and transport. The methodology is an extension of a previously developed pattern search-based inverse method that models the spatial variation in flow parameters by searching for patterns in an ensemble of reservoir models. More specifically, the pattern-searching algorithm is extended in two directions: (1) state values (such as piezometric head) and parameters (such as conductivities) are simultaneously and sequentially estimated, which implies that real-time assimilation of dynamic data is possible as in ensemble filtering approaches; and (2) both the estimated parameter and state variables are considered when pattern searching is implemented. The new scheme results in two main advantages—better characterization of parameters, especially for delineating small scale features, and an ensemble of head states that can be used to update the parameter field using the dynamic data at the next instant, without running expensive flow simulations. An efficient algorithm for pattern search is developed, which works with a flexible search radius and can be optimized for the estimation of either large- or small-scale structures. Synthetic examples are employed to demonstrate the effectiveness and robustness of the proposed approach.

6.
Geomechanical models are often used to predict the impact on the land surface of fluid withdrawal from deep reservoirs, as well as to investigate measures for mitigation. The ability to accurately simulate surface displacements, however, is often impaired by limited information on the geomechanical parameters characterizing the geological formations of interest. In this study, we employ an ensemble smoother, a data assimilation algorithm, to provide improved estimates of reservoir parameters through assimilation of measurements of both horizontal and vertical surface displacement into geomechanical model results. The method leverages the demonstrated potential of remote sensing techniques developed in the last decade to provide accurate displacement data for large areas of the land surface. For evaluation purposes, the methodology is applied to the case of a disk-shaped reservoir embedded in a homogeneous, isotropic, and linearly elastic half space, subject to a uniform change in fluid pressure. Multiple sources of uncertainty are investigated, including the radius, R, the thickness, h, and the depth, c, of the reservoir; the pore pressure change, Δp; the porous medium's vertical uniaxial compressibility, cM, and Poisson's ratio, ν; and the ratio, s, between the compressibilities of the medium during loading and unloading cycles. Results from all simulations show that the ensemble smoother has the capability to effectively reduce the uncertainty associated with those parameters to which the variability and the spatial distribution of land surface displacements are most sensitive, namely R, c, cM, and s. These analyses demonstrate that the estimation of these parameter values depends on the number of measurements assimilated and the error assigned to the measurement values. Copyright © 2014 John Wiley & Sons, Ltd.
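A generic ensemble-smoother update of the kind used here can be sketched in a few lines. This is a textbook formulation (ensemble smoother with perturbed observations), with a linear toy forward model standing in for the geomechanical simulator; the ensemble size, noise level, and matrix `G` are all invented.

```python
import numpy as np

def ensemble_smoother_update(X, Y, d_obs, R):
    """One ensemble-smoother update.  X: (n_param, n_ens) prior parameter
    ensemble; Y: (n_obs, n_ens) predicted displacements; d_obs: (n_obs,)
    observed displacements; R: (n_obs, n_obs) measurement-error covariance."""
    n = X.shape[1]
    Xa = X - X.mean(axis=1, keepdims=True)
    Ya = Y - Y.mean(axis=1, keepdims=True)
    Cxy = Xa @ Ya.T / (n - 1)                 # parameter-data covariance
    Cyy = Ya @ Ya.T / (n - 1)                 # data covariance
    K = Cxy @ np.linalg.inv(Cyy + R)          # Kalman-type gain
    rng = np.random.default_rng(0)
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, size=n).T
    return X + K @ (D - Y)                    # updated (posterior) ensemble

# linear toy forward model: displacements = G @ parameters
rng = np.random.default_rng(42)
G = rng.standard_normal((20, 2))
truth = np.array([1.0, -2.0])
d_obs = G @ truth
X = rng.standard_normal((2, 200)) * 3.0       # broad prior ensemble
R = 0.01 * np.eye(20)
Xp = ensemble_smoother_update(X, G @ X, d_obs, R)
```

For this linear-Gaussian toy problem the update pulls the ensemble mean toward the true parameters and shrinks the ensemble spread, mirroring the uncertainty reduction reported in the abstract.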

7.
One of the important recent advances in the field of hurricane/storm modelling has been the development of high-fidelity numerical simulation models for reliable and accurate prediction of wave and surge responses. The computational cost associated with these models has simultaneously created an incentive for researchers to investigate surrogate modelling (i.e. metamodeling) and interpolation/regression methodologies to efficiently approximate hurricane/storm responses exploiting existing databases of high-fidelity simulations. Moving least squares (MLS) response surfaces were recently proposed as such an approximation methodology, providing the ability to efficiently describe different responses of interest (such as surge and wave heights) in a large coastal region that may involve thousands of points for which the hurricane impact needs to be estimated. This paper discusses further implementation details and focuses on optimization characteristics of this surrogate modelling approach. The approximation of different response characteristics is considered, and special attention is given to predicting the storm surge for inland locations, for which the possibility of the location remaining dry needs to be additionally addressed. The optimal selection of the basis functions for the response surface and of the parameters of the MLS character of the approximation is discussed in detail, and the impact of the number of high-fidelity simulations informing the surrogate model is also investigated. Different normalizations of the response as well as choices for the objective function for the optimization problem are considered, and their impact on the accuracy of the resultant (under these choices) surrogate model is examined. Details for implementation of the methodology for efficient coastal risk assessment are reviewed, and the influence in the analysis of the model prediction error introduced through the surrogate modelling is discussed. 
A case study is provided, utilizing a recently developed database of high-fidelity simulations for the Hawaiian Islands.
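The core of an MLS response surface — re-solving a weighted local polynomial fit at every prediction point — can be sketched as follows. This is a one-dimensional illustration with a Gaussian weight and quadratic basis chosen for simplicity; the bandwidth `h`, the synthetic "surge" response, and the basis are illustrative choices, not those of the paper.

```python
import numpy as np

def mls_predict(x_train, y_train, x_query, h=0.15):
    """Moving least squares: at each query point, solve a weighted local
    quadratic fit with Gaussian weights centered on that point."""
    out = []
    for xq in np.atleast_1d(x_query):
        w = np.exp(-((x_train - xq) / h) ** 2)      # locality weights
        B = np.vander(x_train - xq, 3)              # local quadratic basis
        W = np.diag(w)
        coef, *_ = np.linalg.lstsq(W @ B, W @ y_train, rcond=None)
        out.append(coef[-1])                        # basis at xq is (0, 0, 1)
    return np.array(out)

# high-fidelity "storm surge" samples along one storm parameter
x = np.linspace(0.0, 1.0, 25)
y = np.sin(2 * np.pi * x)
pred = mls_predict(x, y, [0.25, 0.75])
```

Because a fresh local fit is solved per query point, predictions at thousands of coastal locations remain cheap relative to rerunning the high-fidelity model.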

8.
Determination of the geomechanical parameters of a petroleum reservoir and the surrounding rock is important for coupled reservoir–geomechanical modeling, borehole stability analysis, and hydraulic fracturing design. A displacement back analysis technique based on a combination of an artificial neural network (ANN) and a genetic algorithm (GA) is investigated in this paper to identify reservoir geomechanical parameters from ground surface displacements. An ANN is used to map the nonlinear relationship between Young's modulus, E, Poisson's ratio, ν, internal friction angle, Φ, cohesion, c, and ground surface displacements. The necessary training and testing samples for the ANN are created using numerical analysis. The GA is used to search for the set of unknown reservoir geomechanical parameters. Results of the numerical experiment show that the displacement back analysis technique based on the ANN–GA combination can effectively identify reservoir geomechanical parameters from ground surface movements caused by oil and gas production.

9.
We develop a new computational methodology for solving two-phase flow in highly heterogeneous porous media incorporating geomechanical coupling subject to uncertainty in the poromechanical parameters. Within the framework of a staggered-in-time coupling algorithm, the numerical method proposed herein relies on a Petrov–Galerkin postprocessing approach projected on the Raviart–Thomas space to compute the Darcy velocity of the mixture, in conjunction with a locally conservative higher order finite volume discretization of the nonlinear transport equation for the saturation and an operator splitting procedure, based on the difference in the time scales of transport and geomechanics, to compute the effects of transient porosity upon saturation. Notable features of the numerical modeling proposed herein are the local conservation properties inherited by the discrete fluxes, which are crucial to correctly capture the fingering patterns arising from the interaction between heterogeneity and nonlinear viscous coupling. Water flooding in a poroelastic formation subject to an overburden is simulated, with the geology characterized by multiscale self-similar permeability and Young's modulus random fields with power-law covariance structure. Statistical moments of the poromechanical unknowns are computed within the framework of a high-resolution Monte Carlo method. Numerical results illustrate the necessity of adopting locally conservative schemes to obtain reliable predictions of secondary recovery and finger growth in strongly heterogeneous deformable reservoirs. Copyright © 2011 John Wiley & Sons, Ltd.

10.
The use of upscaled models is attractive in many-query applications that require a large number of simulation runs, such as uncertainty quantification and optimization. Highly coarsened models often display error in output quantities of interest, e.g., phase production and injection rates, so the direct use of these results for quantitative evaluations and decision making may not be appropriate. In this work, we introduce a machine-learning-based post-processing framework for modeling the error in coarse-model results in the context of uncertainty quantification. Coarse-scale models are constructed using an accurate global single-phase transmissibility upscaling procedure. The framework entails the use of high-dimensional regression (random forest in this work) to model error based on a number of error indicators or features. Many of these features are derived from approximations of the subgrid effects neglected in the coarse-scale saturation equation. These features are identified through volume averaging, and they are generated by solving a fine-scale saturation equation with a constant-in-time velocity field. Our approach eliminates the need for the user to hand-design a small number of informative (relevant) features. The training step requires the simulation of some number of fine and coarse models (in this work we perform either 10 or 30 training simulations), followed by construction of a regression model for each well. Classification is also applied for production wells. The methodology then provides a correction at each time step, and for each well, in the phase production and injection rates. Results are presented for two- and three-dimensional oil–water systems. 
The corrected coarse-scale solutions show significantly better accuracy than the uncorrected solutions, both in terms of realization-by-realization predictions for oil and water production rates, and for statistical quantities important for uncertainty quantification, such as P10, P50, and P90 predictions.

11.
12.
Obtaining accurate geological boundaries and assessing the uncertainty in these limits are critical for effective ore resource and reserve estimation. The uncertainty in the extent of an ore body can be the largest source of uncertainty in ore resource estimation when drilling is sparse. These limits are traditionally interpreted deterministically and it can be difficult to quantify uncertainty in the boundary and its impact on ore tonnage. The proposed methodology is to consider stochastic modeling of the ore boundary with a distance function recoding of the available data. This technique is modified to incorporate non-stationarities in the form of a locally varying anisotropy field used in kriging and sequential Gaussian simulation. Implementing locally varying anisotropy kriging retains the geologically realistic features of a deterministic model while allowing for a stochastic assessment of uncertainty. A case study of a gold deposit in Northern Canada is used to demonstrate the methodology. The proposed technique generates realistic, curvilinear geological boundary models and allows for an assessment of the uncertainty in the model.
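The distance-function recoding at the heart of this approach can be illustrated with a one-dimensional sketch: categorical in/out drillhole data are recoded as signed distances to the nearest sample of the opposite category, the signed distance is interpolated, and the boundary is placed at its zero crossing. Linear interpolation stands in here for the kriging step, and all coordinates are invented.

```python
import numpy as np

def signed_distance_recode(x, inside):
    """Distance-function recoding of categorical drillhole data: each sample
    gets the distance to the nearest sample of the opposite category,
    negative inside the ore body and positive outside."""
    x = np.asarray(x, float)
    d = np.empty_like(x)
    for i, xi in enumerate(x):
        opposite = x[inside != inside[i]]
        d[i] = np.abs(opposite - xi).min()
    return np.where(inside, -d, d)

def boundary_from_interpolant(x, sd, grid):
    """Interpolate the signed distance; the modeled ore boundary lies where
    the interpolant changes sign."""
    vals = np.interp(grid, x, sd)
    sign_change = np.nonzero(np.diff(np.sign(vals)))[0]
    return grid[sign_change]

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
inside = np.array([True, True, True, False, False])  # ore ends between holes at 2 and 3
sd = signed_distance_recode(x, inside)
grid = np.linspace(0.0, 4.0, 401)
boundary = boundary_from_interpolant(x, sd, grid)
```

Simulating perturbed signed-distance fields (rather than interpolating one) is what turns this deterministic boundary into an ensemble from which boundary uncertainty and its tonnage impact can be assessed.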

13.
Flow simulation studies require an accurate model of the reservoir in terms of its sedimentological architecture. Pixel-based reservoir modeling techniques are often used to model this architecture. There are, however, two problem areas with such techniques. First, several statistical parameters have to be provided whose influence on the resulting model is not readily inferable. Second, conditioning the models to relevant geological data that carry great uncertainty on their own adds to the difficulty of obtaining reliable models and assessing model reliability. The Sequential Indicator Simulation (SIS) method has been used to examine the impact of such uncertainties on the final reservoir model. The effects of varying variogram types, frequencies of lithology occurrence, and the gridblock model orientation with respect to the sedimentological trends are illustrated using different reservoir modeling studies. Results indicate, for example, that the choice of variogram type can have a significant impact on the facies model. Also, reproduction of sedimentological trends and large geometries requires careful parameter selection. By choosing the appropriate modeling strategy, sedimentological principles can be translated into the numerical model. Solutions for dealing with such issues and the geological uncertainties are presented. In conclusion, each reservoir modeling study should begin by developing a thorough quantitative sedimentological understanding of the reservoir under study, followed by detailed sensitivity analyses of relevant statistical and geological parameters.

14.
We present a model-driven uncertainty quantification methodology based on sparse grid sampling techniques in the context of a generalized polynomial chaos expansion (GPCE) approximation of a basin-scale geochemical evolution scenario. The approach is illustrated through a one-dimensional example involving the process of quartz cementation in sandstones and the resulting effects on the dynamics of the vertical distribution of porosity, pressure, and temperature. The proposed theoretical framework and computational tools allow performing an efficient and accurate global sensitivity analysis (GSA) of the system states (i.e., porosity, temperature, pressure, and fluxes) in the presence of uncertain key mechanical and geochemical model parameters as well as boundary conditions. GSA is grounded on the use of the variance-based Sobol indices. These allow discriminating the relative weights of uncertain quantities on the global model variance and can be computed through the GPCE of the model response. Evaluation of the GPCE of the model response is performed through the implementation of a sparse grid approximation technique in the space of the selected uncertain quantities. GPCE is then employed as a surrogate model of the system states to quantify uncertainty propagation through the model in terms of the probability distribution (and its statistical moments) of target system states.
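The link between GPCE coefficients and Sobol indices can be made concrete with a small sketch: in an orthonormal basis, the model variance is the sum of the squared non-mean coefficients, and the first-order index of variable i collects the terms involving only that variable. The two-variable toy model and the regression-based fit below are invented for illustration (the paper evaluates the expansion via sparse grids, not random regression points).

```python
import numpy as np
from itertools import product
from numpy.polynomial.legendre import legval

def sobol_from_pce(coeffs, multi_indices):
    """First-order Sobol indices from a PCE in an orthonormal basis."""
    var = sum(c * c for c, m in zip(coeffs, multi_indices) if any(m))
    S = {}
    for i in range(len(multi_indices[0])):
        num = sum(c * c for c, m in zip(coeffs, multi_indices)
                  if m[i] > 0 and all(mj == 0 for j, mj in enumerate(m) if j != i))
        S[i] = num / var
    return S

def legendre_orthonormal(n, x):
    """Legendre polynomial P_n normalized for the uniform density on [-1, 1]."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return np.sqrt(2 * n + 1) * legval(x, c)

rng = np.random.default_rng(3)
pts = rng.uniform(-1, 1, size=(400, 2))
f = 2.0 * pts[:, 0] + pts[:, 1]                    # toy "model response"
multi = [(i, j) for i, j in product(range(3), range(3)) if i + j <= 2]
A = np.column_stack([legendre_orthonormal(i, pts[:, 0]) *
                     legendre_orthonormal(j, pts[:, 1]) for i, j in multi])
coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)
S = sobol_from_pce(coeffs, multi)                  # S[0] ~ 0.8, S[1] ~ 0.2
```

For f = 2x + y with independent uniform inputs, variable x carries 4/5 of the variance, which the coefficient-based indices recover directly without any extra model runs.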

15.
Qiao, C. & Myers, A. T. Natural Hazards, 2022, 110(3): 1545–1563

Metocean conditions during hurricanes are defined by multiple parameters (e.g., significant wave height and surge height) that vary in time with significant auto- and cross-correlation. In many cases, the nature of the variation of these characteristics in time is important to design and assess the risk to offshore structures, but a persistent problem is that measurements are sparse and time history simulations using metocean models are computationally onerous. Surrogate modeling is an appealing approach to ease the computational burden of metocean modeling; however, modeling the time-dependency of metocean conditions using surrogate models is challenging because the conditions at one time instant are dependent on not only the conditions at that instant but also on the conditions at previous time instances. In this paper, time-dependent surrogate modeling of significant wave height, peak wave period, peak wave direction, and storm surge is explored using a database of metocean conditions at an offshore site. Three types of surrogate models, including Kriging, multilayer perceptron (MLP), and recurrent neural network with gated recurrent unit (RNN-GRU), are evaluated, with two different time-dependent structures considered for the Kriging model and two training set sizes for the MLP model, resulting in a total of five models evaluated in this paper. The performance of the models is compared in terms of accuracy and sensitivity toward hyperparameters, and the MLP and RNN-GRU models are demonstrated to have extraordinary prediction performance in this context.

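Of the surrogate types compared, Kriging is the simplest to sketch. The following minimal zero-mean Gaussian-process predictor with a squared-exponential covariance illustrates the idea on an invented one-dimensional response; the paper's surrogates are multivariate and time-dependent, which this sketch does not attempt to capture.

```python
import numpy as np

def rbf_kernel(a, b, ls=0.2, var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return var * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def kriging_predict(x_train, y_train, x_query, noise=1e-6):
    """Simple (zero-mean) Kriging / Gaussian-process mean prediction."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(x_query, x_train) @ alpha

# toy "significant wave height" response along one storm parameter
x = np.linspace(0.0, 1.0, 15)
y = np.sin(2 * np.pi * x) + 2.0
pred = kriging_predict(x, y - y.mean(), np.array([0.5])) + y.mean()
```

Extending such a model to the time-dependent setting of the paper requires feeding conditions at previous time instants back in as inputs, which is exactly what the recurrent (RNN-GRU) alternative handles natively.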

16.
This study illustrates a procedure conducive to a preliminary risk analysis of overpressure development in sedimentary basins characterized by alternating depositional events of sandstone and shale layers. The approach rests on two key elements: (1) forward modeling of fluid flow and compaction, and (2) application of a model-complexity reduction technique based on a generalized polynomial chaos expansion (gPCE). The forward model considers a one-dimensional vertical compaction process. The gPCE model is then used in an inverse modeling context to obtain efficient model parameter estimation and uncertainty quantification. The methodology is applied to two field settings considered in previous literature works, i.e. the Venture Field (Scotian Shelf, Canada) and the Navarin Basin (Bering Sea, Alaska, USA), relying on available porosity and pressure information for model calibration. It is found that the best result is obtained when porosity and pressure data are considered jointly in the model calibration procedure. Uncertainty propagation from unknown input parameters to model outputs, such as the pore pressure vertical distribution, is investigated and quantified. This modeling strategy enables one to quantify the relative importance of key phenomena governing the feedback between sediment compaction and fluid flow processes and driving the buildup of fluid overpressure in stratified sedimentary basins characterized by the presence of low-permeability layers. The results illustrated here (1) allow for diagnosis of the critical role played by the parameters of quantitative formulations linking porosity and permeability in compacted shales and (2) provide an explicit and detailed quantification of the effects of their uncertainty in field settings.

17.
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability at reproducing the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. 
Our implementation of the MCMC method provides the gold standard against which the aforementioned Gaussian approximations are assessed. We present numerical synthetic experiments where we quantify the capability of each of the ad hoc Gaussian approximations in reproducing the mean and the variance of the posterior distribution (characterized via MCMC) associated with a data assimilation problem. Both single-phase and two-phase (oil–water) reservoir models are considered so that fundamental differences in the resulting forward operators are highlighted. The main objective of our controlled experiments was to exhibit the substantial discrepancies of the approximation properties of standard ad hoc Gaussian approximations. Numerical investigations of the type we present here will lead to a greater understanding of the cost-efficient, but ad hoc, Bayesian techniques used for data assimilation in petroleum reservoirs and hence ultimately to improved techniques with more accurate uncertainty quantification.
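A fully resolved posterior characterization of the kind used as the gold standard can be illustrated with the simplest MCMC variant, random-walk Metropolis, on a one-dimensional Gaussian stand-in for the data-assimilation posterior. The paper uses a dimension-robust MCMC method, which this sketch does not reproduce; the target, step size, and chain length are illustrative.

```python
import numpy as np

def metropolis(logpost, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis: propose a Gaussian perturbation and accept
    with probability min(1, posterior ratio)."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    lp = logpost(x)
    out = []
    for _ in range(n_samples):
        xp = x + step * rng.standard_normal()
        lpp = logpost(xp)
        if np.log(rng.uniform()) < lpp - lp:
            x, lp = xp, lpp
        out.append(x)
    return np.array(out)

# Gaussian "posterior" with mean 1.0 and standard deviation 0.5
samples = metropolis(lambda t: -0.5 * ((t - 1.0) / 0.5) ** 2,
                     x0=0.0, n_samples=20000)[5000:]   # discard burn-in
```

Comparing the mean and variance of such chains against those of a Gaussian approximation is precisely the kind of check the controlled experiments perform, only in much higher dimension.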

18.
Updating of Population Parameters and Credibility of Discriminant Analysis
The uncertainty of classification in discriminant analysis may result from the original characteristics of the phenomena studied, the approach used to infer population parameters, and the credibility of the parameters estimated by the geologist or statistician. A credibility function and a significance function are proposed. Both can be used to appraise the uncertainty of classification. The former addresses the uncertainty resulting from errors in the reward–penalty matrix, while the latter addresses the uncertainty resulting from the original characteristics of the phenomena studied and the statistical approach. Inappropriate classification results may originate from biased estimates of the population parameters (mean vector and covariance matrix) computed from biased samples. These biased estimates can be updated by constraining the varying region of the mean vector. The equations for updating Bayesian estimates of the mean vector and the covariance matrix are derived for the case in which the mean vector is restricted to a subregion of the entire real space. Results for a gas reservoir indicate that discriminant rules based on the updated equations are more efficient than the traditional discriminant rules.

19.
Complicated sedimentary processes control the spatial distribution of geological heterogeneities, making the nature of fluid flow in hydrocarbon reservoirs immensely complex. Proper modeling of these heterogeneities and evaluation of their connectivity are crucial, as they affect all aspects of fluid flow. Since the natural variability of heterogeneity occurs over a myriad of length scales, accurate modeling of rock type connectivity requires a very fine scheme, which is computationally very expensive. This makes alternative methods such as the percolation approach attractive and necessary. The percolation approach rests on the hypothesis that a reservoir can be split into either permeable (sand/fracture) or impermeable rock (shale/matrix). In this approach, the connectivity of the permeable fraction governs the flow. This method links the global properties of the system to the density of the permeable objects distributed randomly in the system. Moreover, this approach reduces many results to a few simple master curves from which all possible outcomes can be predicted by simple algebraic transformations. The current study extends the applicability of the methodology to anisotropic systems and to more complicated and realistic sandbody shapes (for example, ellipsoids). This enables a better assessment of the connectivity, and the associated uncertainty, of complicated rock types. Furthermore, to validate the approach, the Burgan reservoir dataset of the Norouz offshore oil field in the south of Iran was used. The findings are consistent with the predictions of the percolation approach.
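The central percolation idea — global connectivity governed by the density of permeable bodies — can be demonstrated with a minimal isotropic site-percolation sketch on a 2-D grid. The study itself treats anisotropic systems and ellipsoidal sand bodies, which this sketch does not attempt; grid size and trial counts are arbitrary.

```python
import numpy as np

def spans(grid):
    """Whether the permeable cells (True) connect the left edge to the
    right edge of a 2-D grid (flood fill)."""
    ny, nx = grid.shape
    seen = np.zeros_like(grid, dtype=bool)
    stack = [(i, 0) for i in range(ny) if grid[i, 0]]
    for s in stack:
        seen[s] = True
    while stack:
        i, j = stack.pop()
        if j == nx - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < ny and 0 <= b < nx and grid[a, b] and not seen[a, b]:
                seen[a, b] = True
                stack.append((a, b))
    return False

def connection_probability(p, n=30, trials=200, seed=0):
    """Monte Carlo estimate of the probability that a random sand/shale
    grid with sand fraction p percolates left-to-right."""
    rng = np.random.default_rng(seed)
    return np.mean([spans(rng.random((n, n)) < p) for _ in range(trials)])

# below the ~0.593 site-percolation threshold spanning is rare; above, near-certain
low = connection_probability(0.3)
high = connection_probability(0.8)
```

The sharp jump of this connection probability across the threshold is exactly the kind of master-curve behavior the percolation approach exploits to predict connectivity without fine-scale flow simulation.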

20.
Reconstruction of geological structures has the potential to provide additional insight into the effect of the depositional history on the current-day geomechanical and hydro-geologic state. Accurate modeling of the reconstruction process is, however, complex, necessitating advanced procedures for the prediction of fault formation and evolution within fully coupled geomechanical, fluid flow and temperature fields. In this paper, a 3-D computational approach is presented that is able to forward model complex structural evolution with multiple intersecting faults that exhibit large relative movement within a coupled geomechanical/flow environment. The approach adopts the Lagrangian method, complemented by robust and efficient automated adaptive meshing techniques, an elasto-plastic constitutive model based on critical state concepts, and global energy dissipation regularized by inclusion of fracture energy in the equations governing state variable evolution. The proposed model is validated by comparison of 2-D plane strain and 3-D thin-slice predictions of a bench-scale experiment, and then applied to two conceptual coupled geomechanical/fluid flow field-scale benchmarks.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号