Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
We present a method to determine lower and upper bounds on the predicted production, or any other economic objective, from history-matched reservoir models. The method consists of two steps: 1) performing a traditional computer-assisted history match of a reservoir model with the objective of minimizing the mismatch between predicted and observed production data by adjusting the grid-block permeability values of the model; 2) performing two optimization exercises to minimize and maximize an economic objective over the remaining field life, for a fixed production strategy, by manipulating the same grid-block permeabilities without significantly changing the mismatch obtained in step 1. This is accomplished through a hierarchical optimization procedure that limits the solution space of a secondary optimization problem to the (approximate) null space of the primary optimization problem. We applied this procedure to two different reservoir models. We performed a history match based on synthetic data, starting from a uniform prior and using a gradient-based minimization procedure. After history matching, minimization and maximization of the net present value (NPV), using a fixed control strategy, were executed as secondary optimization problems by changing the model parameters while staying close to the null space of the primary optimization problem. In other words, we optimized the secondary objective functions while requiring that optimality of the primary objective (a good history match) was preserved. This method therefore provides a way to quantify the economic consequences of the well-known fact that history matching is a strongly ill-posed problem. We also investigated how this method can be used to assess the cost-effectiveness of acquiring different data types to reduce the uncertainty in the expected NPV.
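The null-space restriction in the second step can be sketched generically: take the SVD of the primary (history-match) Jacobian and move only along directions with negligible singular values, so the data mismatch is approximately preserved. This is an illustrative sketch with made-up numbers, not the authors' implementation.

```python
import numpy as np

def null_space_step(J_primary, g_secondary):
    """Project a secondary-objective gradient onto the (approximate)
    null space of the primary objective's Jacobian, so a step along it
    leaves the primary objective (the history match) nearly unchanged."""
    # SVD of the primary Jacobian; directions with negligible singular
    # values span the (approximate) null space.
    U, s, Vt = np.linalg.svd(J_primary, full_matrices=True)
    tol = max(J_primary.shape) * np.finfo(float).eps * (s[0] if s.size else 0.0)
    rank = int(np.sum(s > tol))
    N = Vt[rank:].T                     # orthonormal null-space basis
    return N @ (N.T @ g_secondary)      # projected gradient

# Toy example: 2 data constraints on 4 parameters -> a 2-D null space.
J = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
g = np.array([1.0, 1.0, 1.0, 1.0])
step = null_space_step(J, g)
print(step)  # components along the constrained directions are ~0
```

A gradient step of this form changes the secondary (economic) objective while keeping the primary misfit stationary to first order.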

2.
Interpretation of geophysical data or other indirect measurements provides large-scale soft secondary data for modeling hard primary data variables. Calibration allows such soft data to be expressed as prior probability distributions of nonlinear block averages of the primary variable; poorer-quality soft data lead to prior distributions with large variance, while better-quality soft data lead to prior distributions with low variance. Another important feature of most soft data is that their quality is spatially variable; soft data may be very good in some areas and poorer in others. The main aim of this paper is to propose a new method of integrating such soft data, which is large-scale and has locally variable precision. The technique of simulated annealing is used to construct stochastic realizations that reflect the uncertainty in the soft data. This is done by constraining the cumulative probability values of the block averages to follow a specified distribution. These probability values are determined by the local soft prior distribution and a nonlinear average of the small-scale simulated values within the block, all of which are known. For each realization to accurately capture the information contained in the soft data distributions, we show that the probability values should be uniformly distributed between 0 and 1. An objective function is then proposed for a simulated annealing based approach to enforce this uniform probability constraint. The theoretical justification of this approach is discussed, implementation details are considered, and an example is presented.
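A minimal sketch of such a uniformity objective (assuming the probability values have already been computed for each block; the annealing loop itself is omitted):

```python
import numpy as np

def uniformity_objective(p):
    """Objective to be minimized by simulated annealing: squared deviation
    of the sorted probability values from Uniform(0,1) plotting positions.
    p[i] is the prior-CDF value of the simulated block average in block i."""
    p = np.sort(np.asarray(p, dtype=float))
    n = p.size
    targets = (np.arange(1, n + 1) - 0.5) / n   # uniform plotting positions
    return float(np.sum((p - targets) ** 2))

rng = np.random.default_rng(0)
u = uniformity_objective(rng.uniform(size=1000))                 # near-uniform: small
v = uniformity_objective(rng.normal(0.5, 0.1, 1000).clip(0, 1))  # non-uniform: large
print(u < v)  # True
```

An annealing scheme would perturb the small-scale simulated values and accept perturbations that lower this objective (with the usual temperature-controlled tolerance for increases).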

3.
A geostatistically-based inverse technique, the sequential self-calibration (SSC) method, is used to update reservoir models so that they match observed pressure, water cut, and time-lapse water saturation derived from 4-D seismic. Within the SSC, a steady-state genetic algorithm (GA) is applied to search for the optimal master point locations, as well as the associated optimal permeability perturbations at those locations. The GA provides significant flexibility for the SSC to parameterize master point locations and to integrate different types of dynamic data, because it does not require sensitivity coefficients. We show that the coupled SSC/GA method is very robust. Integrating dynamic data can significantly improve the characterization of reservoir heterogeneity with reduced uncertainty. In particular, it can efficiently identify important large-scale spatial variation patterns (e.g., well connectivity, near-well averages, high-flow channels, and low-flow barriers) embedded in the reservoir heterogeneity. Dynamic data alone, however, may not suffice to reproduce the permeability values on a cell-by-cell basis for the entire model. This is important evidence that dynamic data carry information about large-scale spatial variation features, while they may not be sufficient to resolve the individual local values for the entire model. Through multiple-realization analysis, the large-scale spatial features carried by the dynamic data can be extracted and represented by the ensemble mean model. Furthermore, the region informed by the dynamic data can be identified as the area with significantly reduced variances in the ensemble variance model. Within this region, the cell-by-cell correlation between the true and updated permeability values can be significantly improved by integrating the dynamic data.

4.
Simulation-based optimization methods have recently been proposed for calibrating geotechnical models from laboratory and field tests. In these methods, geotechnical parameters are identified by matching model predictions to experimental data, i.e. by minimizing an objective function that measures the difference between the two. Expensive computational models, such as finite difference or finite element models, are often required to simulate laboratory or field geotechnical tests. In such cases, simulation-based optimization might prove demanding, since every evaluation of the objective function requires a new model simulation until the optimum set of parameter values is achieved. This paper introduces a novel simulation-based "hybrid moving boundary particle swarm optimization" (hmPSO) algorithm that enables calibration of geotechnical models from laboratory or field data. The hmPSO has proven effective in searching for model parameter values and, unlike other optimization methods, does not require information about the gradient of the objective function. Serial and parallel implementations of hmPSO have been validated in this work against a number of benchmarks, including numerical tests and a challenging geotechnical problem consisting of the calibration of a water infiltration model for unsaturated soils. The latter application demonstrates the potential of hmPSO for interpreting laboratory and field tests, and as a tool for general back-analysis of geotechnical case studies.
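As a rough illustration of the gradient-free idea, here is a minimal global-best PSO (not the authors' hmPSO: no moving boundaries, no hybridization) applied to a toy two-parameter calibration:

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer: gradient-free
    minimization of f over a box.  A generic sketch, not hmPSO."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])  # personal bests
    g = pbest[np.argmin(pval)]                           # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pval)]
    return g, float(pval.min())

# Hypothetical calibration: recover two parameters from a quadratic mismatch.
target = np.array([2.0, -1.0])
mismatch = lambda p: float(np.sum((p - target) ** 2))
best, val = pso(mismatch, ([-5, -5], [5, 5]))
print(best, val)
```

In a real calibration, `mismatch` would run a forward simulation (e.g. a finite element model of the test) and compare its output with the measurements.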

5.
To physically investigate permeability upscaling, over 13,000 permeability values were measured with four different sample supports (i.e., sample volumes) on a block of Berea Sandstone. At each sample support, spatially exhaustive permeability datasets were measured, subject to consistent flow geometry and boundary conditions, with a specially adapted minipermeameter test system. Here, we present and analyze a subset of the data consisting of 2304 permeability values collected from a single block face oriented normal to stratification. Results reveal a number of distinct and consistent trends (i.e., upscaling) relating changes in key summary statistics to an increasing sample support. Examples include the sample mean and semivariogram range that increase with increasing sample support and the sample variance that decreases. To help interpret the measured mean upscaling, we compared it to theoretical models that are only available for somewhat different flow geometries. The comparison suggests that the nonuniform flow imposed by the minipermeameter coupled with permeability anisotropy at the scale of the local support (i.e., smallest sample support for which data is available) are the primary controls on the measured upscaling. This work demonstrates, experimentally, that it is not always appropriate to treat the local-support permeability as an intrinsic feature of the porous medium, that is, independent of its conditions of measurement.
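The variance part of this upscaling trend is easy to reproduce with a toy block-averaging experiment (purely illustrative: simple arithmetic averaging keeps the mean fixed, so it does not reproduce the measured mean upscaling, which depends on the minipermeameter's flow geometry):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical fine-scale permeability field on a 256 x 256 grid.
field = rng.lognormal(mean=0.0, sigma=1.0, size=(256, 256))

def block_average(a, b):
    """Average a 2-D field over non-overlapping b-by-b blocks,
    a crude stand-in for measuring with a larger sample support."""
    n0, n1 = a.shape[0] // b, a.shape[1] // b
    return a[:n0 * b, :n1 * b].reshape(n0, b, n1, b).mean(axis=(1, 3))

stats = {}
for b in (1, 4, 16):
    up = block_average(field, b)
    stats[b] = (float(up.mean()), float(up.var()))
    print(b, round(stats[b][0], 3), round(stats[b][1], 3))
# The sample variance drops steadily as the support grows, while the
# arithmetic mean is unchanged.
```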

6.
In sandstone, there is a trend between porosity (φ) and permeability (k): a linear relationship of the form log(k) = a + b·φ. The slope, intercept, and degree of scatter of the log(k) versus φ trends vary from one formation to another. These variations are attributed to differences in initial grain size and sorting, diagenetic history, cementation, clay content, pore geometry, and compaction history. In the literature, permeability and porosity modeling based on laboratory experiments has been carried out using unconsolidated sandstone, sand packs, or synthetic particles. Such models cannot be applied to predict the flow properties of consolidated natural sandstone. Furthermore, in these models, sand grain size, shape, and sorting were considered the main factors affecting porosity and permeability. Hardly any attention was paid to the confining pressure and the fraction of cementing material that binds the grains into a coherent rock. If these two crucial aspects are not taken into consideration during model development, the model cannot be applied to natural consolidated sandstone. The main objective of the present paper is to develop a new model for porosity versus permeability that takes into account such important factors as sand grain size and sorting, compaction pressure, and the concentration of cementing material binding the sand grains. The effect of clay swelling or migration was, however, discarded, as the sand grains were washed prior to consolidation. The sand used in producing the sandstone cores consisted of medium- to fine-sized, well-sorted grains. The grain sphericity was measured to be in the range of 0.8–0.9, with little angularity. The fabricated cores have an average compressive strength of 5,700 psi, which is comparable with the strength of Berea sandstone. The produced cores were also stable in the fluid media, as they were subjected to 300 °C to allow the cementing material to crystallize.
One aspect of the present work was to analyze the dependence of both permeability and porosity on the study variables: grain size, cementation fraction, and confining pressure. Using the experimental data, a linear relationship in terms of each variable was developed that can eventually help researchers fabricate cores with desired properties. The second step was to generate more general models to serve as references for further work in this research field. Nonlinear regression analysis was carried out on all three variables to obtain two nonlinear correlations: one describing the behavior of permeability and the other describing porosity. In the third step, an advanced correlation that describes permeability versus porosity in a quantitative manner was developed using nonlinear regression analysis. Permeability was accordingly studied as a function of all three study variables as well as porosity. This step represents the main objective of this paper.
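A minimal sketch of fitting the linear log(k) = a + b·φ trend by least squares on synthetic data (the coefficients and scatter below are made up; the paper's actual models also include grain size, cementation fraction, and confining pressure):

```python
import numpy as np

# Synthetic (hypothetical) porosity/permeability pairs following
# log10(k) = a + b*phi with scatter, then recovery of a and b by
# ordinary least squares.
rng = np.random.default_rng(1)
a_true, b_true = -2.0, 15.0                    # made-up coefficients
phi = rng.uniform(0.05, 0.30, 200)             # porosity, fraction
log_k = a_true + b_true * phi + rng.normal(0.0, 0.1, phi.size)

b_fit, a_fit = np.polyfit(phi, log_k, 1)       # slope, intercept
k_at_20pu = 10.0 ** (a_fit + b_fit * 0.20)     # predicted k at phi = 0.20
print(round(a_fit, 2), round(b_fit, 2), round(k_at_20pu, 1))
```

The degree of scatter about the fitted line corresponds to the formation-to-formation variability described above.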

7.
This paper presents a consistent Bayesian solution for data integration and history matching for oil reservoirs while accounting for both model and parameter uncertainties. The developed method uses Gaussian process regression to build a permeability map conforming to collected data at well bores. Following that, an augmented Markov chain Monte Carlo (MCMC) sampler is used to condition the permeability map to dynamic production data. The selected proposal distribution for the MCMC sampler conforms to the Gaussian process regression output. The augmented MCMC sampler allows transition steps between different models of the covariance function, so that both the parameter and model spaces are effectively explored. In contrast to single-model MCMC samplers, the proposed augmented sampler eliminates the selection bias toward particular covariance structures of the inferred permeability field. The proposed algorithm can be used to account for general model and parameter uncertainties.
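A heavily simplified sketch of conditioning a parameter to production data with a plain random-walk Metropolis sampler (a single-model sampler, unlike the paper's augmented sampler, and with a hypothetical one-parameter forward model):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical forward model: predicted production rate is a known decline
# curve scaled by exp(theta); noisy observations are generated with
# theta_true = 0.5.
t = np.linspace(1.0, 10.0, 20)
base = 100.0 / t
theta_true, sigma = 0.5, 2.0
d_obs = np.exp(theta_true) * base + rng.normal(0.0, sigma, t.size)

def log_post(theta):
    # Standard-normal prior on theta plus a Gaussian likelihood.
    resid = d_obs - np.exp(theta) * base
    return -0.5 * theta ** 2 - 0.5 * float(np.sum(resid ** 2)) / sigma ** 2

theta, lp = 0.0, log_post(0.0)
samples = []
for i in range(5000):
    prop = theta + 0.05 * rng.normal()        # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    if i >= 1000:                             # discard burn-in
        samples.append(theta)
post_mean = float(np.mean(samples))
print(round(post_mean, 2))  # close to theta_true = 0.5
```

The augmented sampler in the paper additionally proposes jumps between covariance-function models, which this sketch does not attempt.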

8.
Reservoir characterization requires the integration of various data through history matching, especially dynamic information such as production or 4D seismic data. Although reservoir heterogeneities are commonly generated using geostatistical models, random realizations cannot generally match observed dynamic data. To constrain model realizations to reproduce measured dynamic data, an optimization procedure may be applied to minimize an objective function quantifying the mismatch between real and simulated data. Such assisted history matching methods require a parameterization of the geostatistical model to allow the updating of an initial model realization. However, only a few parameterization methods are available to update geostatistical models in a way consistent with the underlying geostatistical properties. This paper presents a local domain parameterization technique that updates geostatistical realizations using assisted history matching. The technique allows us to change model realizations locally through the variation of geometrical domains whose geometry and size can be easily controlled and parameterized. This approach provides a new way to parameterize geostatistical realizations in order to improve history matching efficiency.

9.
Distance-based stochastic techniques have recently emerged in the context of ensemble modeling, in particular for history matching, model selection, and uncertainty quantification. Starting with an initial ensemble of realizations, a distance between any two models is defined. This distance is defined such that the objective of the study is incorporated into the geological modeling process, thereby potentially enhancing the efficacy of the overall workflow. If the intent is to create new models that are constrained to dynamic data (history matching), the calculation of the distance requires flow simulation for each model in the initial ensemble. This can be very time-consuming, especially for high-resolution models. In this paper, we present a multi-resolution framework for ensemble modeling. A distance-based procedure is employed, with emphasis on the rapid construction of multiple models that have improved dynamic data conditioning. Our intent is to construct new high-resolution models constrained to dynamic data, while performing most of the flow simulations only on upscaled models. An error modeling procedure is introduced into the distance calculations to account for potential errors in the upscaling. Based on a few fine-scale flow simulations, the upscaling error is estimated for each model using a clustering technique. We demonstrate the efficiency of the method on two examples, one where the upscaling error is small and another where it is significant. Results show that the error modeling procedure can accurately capture the error in upscaling, and can thus reproduce the fine-scale flow behavior from coarse-scale simulations with sufficient accuracy (in terms of uncertainty predictions). As a consequence, an ensemble of high-resolution models constrained to dynamic data can be obtained with a minimum of flow simulations at the fine scale.

10.
The objective of this research was to use numerical models based on mechanical approaches to improve the integration of the protective role of forests against rockfall into block propagation models. A model based on the discrete element method (DEM) was developed to take into account the complex mechanical processes involved during the impact of a block on a tree. This modelling approach requires the definition of many input parameters and cannot be directly integrated into block propagation models. A global sensitivity analysis identified the leading parameters of the block kinematics after impact (i.e. block energy reduction, trajectory changes, and rotational velocity): the impact velocity, the tree diameter, and the impact point horizontal location (i.e. eccentricity). Comparisons with the previous experimental and numerical studies of block impacts on trees demonstrated the applicability of the DEM model and showed some of the limitations of earlier approaches. Our sensitivity analysis highlights the significant influence of the impact velocity on the reduction of the block’s kinetic energy. Previous approaches usually also focus on parameters such as impact height, impact vertical incidence, and tree species, whose importance is only minor according to the present results. This suggests that the integration of forest effects into block propagation models could be both improved and simplified. The DEM model can also be used as an alternative to classical approaches for the integration of forest effects by directly coupling it with block propagation models. This direct coupling only requires the additional definition of the location and the diameter of each tree. Indeed, the input parameters related to the mechanical properties of the stem and the block/stem interaction in the DEM model can be set to average values because they are not leading parameters. The other input parameters are already defined or calculated in the block propagation model.

11.
A key challenge in the oil and gas industry is the ability to predict key petrophysical properties such as porosity and permeability. The predictability of such properties is often complicated by the complex nature of geologic materials. This study is aimed at developing models that can estimate permeability in different reservoir sandstone facies types. This has been achieved by integrating geological characterization, regression models, and artificial neural network models, with porosity as the input and permeability as the output. The models have been developed, validated, and tested using samples from three wells, and their predictive accuracy tested by using them to predict the permeability in a fourth well which was excluded from the model development. The results indicate that developing the models on a facies basis provides better predictive capability and simpler models compared to developing a single model for all the facies combined. The model for the combined facies predicted permeability with a correlation coefficient of 0.41, which is significantly lower than the correlation coefficients of 0.97, 0.93, 0.99, 0.96, 0.96, and 0.85 for the massive coarse-grained sandstones, massive fine-grained sandstones (moderately sorted), massive fine-grained sandstones (poorly sorted), massive very fine-grained sandstones, parallel-laminated sandstones, and bioturbated sandstones, respectively. The models proposed in this paper can predict permeability at up to 99% accuracy. The lower correlation coefficient of the bioturbated sandstone facies compared to the other facies is attributed to the complex and variable nature of the bioturbation activities which control the petrophysical properties of highly bioturbated rocks.

12.
Analysis of the main geological factors controlling the drainage and production performance of coalbed methane wells   (Total citations: 3; self-citations: 0; citations by others: 3)
The drainage and production performance of coalbed methane (CBM) wells in the Shouyang and Shizhuang blocks of the Qinshui Basin differs markedly between the two blocks as a whole, and considerable differences also exist among wells within each block. This paper presents a comparative analysis of how the sedimentary facies of the coal-bearing strata, coal-seam permeability, fault structures, in-situ stress type and tectonic stress intensity, and the lithologic assemblages of the seam roof and floor affect drainage and production performance in the two blocks. An integrated study of the static geological conditions and the production data shows that differences in the sedimentary facies of the coal-bearing strata, coal-seam permeability, and in-situ stress type and tectonic stress intensity are the main causes of the contrast in well performance between the two blocks, whereas performance differences among wells within a single block are controlled by local factors such as faults, in-situ stress type, and the lithologic assemblage of the seam roof and floor. When selecting areas for CBM development and siting development wells, the combined influence of resources, permeability, and multiple local geological factors should be considered.

13.
Randomized maximum likelihood is known in the petroleum reservoir community as a Bayesian history matching technique based on minimizing a stochastic quadratic objective function. The algorithm is well established and has shown promising results in several applications. For linear models with a linear observation operator, the algorithm samples the posterior density accurately. To improve the sampling for nonlinear models, we introduce a generalized version in its simplest form by re-weighting the prior. The weight term is motivated by a sufficiency condition on the expected gradient of the objective function. Recently, an ensemble version of the algorithm was proposed that can be implemented with any simulator. Unfortunately, that method has some practical implementation issues due to the computation of low-rank pseudo-inverse matrices, and in practice only the data-mismatch part of the objective function is maintained. Here, we take advantage of the fact that the measurement space is often much smaller than the parameter space and project the prior uncertainty from the parameter space onto the measurement space to avoid overfitting the data. The proposed algorithms show good performance on synthetic test cases, including a 2D reservoir model.
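In the linear-Gaussian case mentioned above, randomized maximum likelihood reduces to minimizing a perturbed quadratic objective whose minimizer is available in closed form, and the resulting samples follow the posterior exactly. A scalar toy check (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
# Linear-Gaussian toy problem: prior m ~ N(0, 1), datum d = G*m + noise,
# noise ~ N(0, sd^2).  Here RML samples the posterior exactly.
G, sd = 2.0, 0.5
m_true = rng.normal()
d_obs = G * m_true + rng.normal(0.0, sd)

samples = []
for _ in range(20000):
    m_pert = rng.normal(0.0, 1.0)            # sample from the prior
    d_pert = d_obs + rng.normal(0.0, sd)     # perturb the data
    # Closed-form minimizer of (m - m_pert)**2 + (G*m - d_pert)**2 / sd**2:
    samples.append((m_pert + G * d_pert / sd ** 2) / (1.0 + G ** 2 / sd ** 2))

post_var = 1.0 / (1.0 + G ** 2 / sd ** 2)    # analytic posterior variance
post_mean = post_var * G * d_obs / sd ** 2   # analytic posterior mean
mean_err = float(np.mean(samples)) - post_mean
var_err = float(np.var(samples)) - post_var
print(round(mean_err, 3), round(var_err, 3))  # both near 0
```

For a nonlinear simulator the minimization is iterative and the samples are only approximately posterior, which is the situation the paper's generalization addresses.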

14.
The lower oil group of the Triassic in Block 9 of the Tahe Oilfield is a structurally controlled, massive, sandstone pore-type, undersaturated reservoir with a large bottom-water volume and thin oil columns. The geological modeling of this block was carried out with the Petrel software, making full use of drilling, seismic, well-log, and stratigraphic-correlation information together with interlayer interpretation results. On the basis of porosity-permeability curves, different modeling methods were selected, and through the evaluation of the various stochastic models, a fine three-dimensional geological model close to the actual geological characteristics of the reservoir was finally established.

15.
Many decision-making processes in the Earth sciences require the combination of multiple data originating from diverse sources. These data are often indirect and uncertain, and their combination calls for a probabilistic approach. These data are also partially redundant with each other, or with all others taken jointly. This overlap in information arises for a variety of reasons: the data arise from the same geology, originate from the same location, come from the same measurement device, and so on. The proposed tau model combines partially redundant data, each taking the form of a prior probability that the event being assessed occurs given that single datum. The parameters of the tau model measure the additional contribution brought by any single datum over that of all previously considered data; they are data-sequence-dependent and also data-value-dependent. Data redundancy depends on the sequence in which the data are considered and on the data values themselves. However, for a given sequence, averaging the tau model parameters over all possible data values leads to exact analytical expressions and corresponding approximations and inference avenues. Information on the multiple-point connectivity of permeability comes from core data, well-test data, and seismic data, which are defined over varying supports with complex redundancy between these information sources. In order to compute the tau weights for determining connectivity, one needs a model of data redundancy, here expressed as a vectorial training image (Ti) constructed using prior conceptual knowledge of geology and the physics of data measurement. From such a vectorial Ti, the tau weights can be computed exactly. Neglecting data redundancy leads to an over-compounding of individual data information and the possible risk of making extreme decisions.
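The basic tau-model combination can be sketched in a few lines (generic form with distance ratios x = (1 - p)/p; the tau weights below are hand-picked for illustration, not computed from a training image):

```python
def tau_combine(p0, probs, taus):
    """Tau-model combination of single-datum probabilities P(A|D_i) with the
    marginal P(A) = p0, via distances x = (1 - p) / p.  tau_i = 1 recovers
    the conditional-independence (permanence-of-ratios) result, while
    tau_i < 1 downweights a datum that is redundant with earlier ones."""
    x0 = (1.0 - p0) / p0
    x = x0
    for p, tau in zip(probs, taus):
        x *= (((1.0 - p) / p) / x0) ** tau
    return 1.0 / (1.0 + x)

p0 = 0.5                                         # marginal probability
print(tau_combine(p0, [0.8, 0.8], [1.0, 1.0]))   # two independent data reinforce
print(tau_combine(p0, [0.8, 0.8], [1.0, 0.0]))   # second datum fully redundant
```

With both tau weights at 1, the two 0.8 probabilities compound to about 0.94; setting the second tau to 0 discards the redundant datum and leaves the combined probability at 0.8, which is the over-compounding risk described above.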

16.
陈君, 刘明明, 李星, 陈益峰, 周创兵. 《岩土力学》 (Rock and Soil Mechanics), 2016, 37(6): 1706-1714
The permeability of fractured rock masses is controlled by the development, connectivity, and infilling of the fractures, and correlates strongly with the in-situ stress level. Starting from the factors that influence the permeability of fractured rock masses, and considering the shortcomings of existing models for estimating the permeability coefficient, a ZRF estimation model incorporating three indices, namely burial depth (Z), rock quality designation (RQD), and a filling-material index (FSD), was established using borehole water-pressure (packer) test data and borehole televiewer images, and was applied to estimating the permeability coefficient at the Yagen II hydropower station and other project sites. The results show that, compared with existing estimation models, the ZRF model better reflects the factors influencing rock-mass permeability; its parameters have clear physical meaning and are easy to obtain, so the model provides a useful engineering reference for analyzing the permeability of fractured rock masses.

17.
The well-established free-fluid model from the NMR technique provides continuous permeability values that match core permeabilities more closely than most theoretical models, especially when calibrated to core data for field-specific use. However, only a few wells in a field have NMR logs, and marginal fields may have none for economic reasons. This study explored means of achieving one of the overriding objectives of most marginal-field operators, which is to reduce the overall cost of production to the attainable minimum. The free-fluid model was modified into two simple and cost-effective models in order to optimize its applicability for predicting permeability in the absence of NMR data. The two new models, developed for the single- and double-porosity systems analyzed in this study, contain calibration parameters that can be determined empirically to account for variation in reservoir quality based on the rock-type profile of each field. A non-matrix parameter, α, was introduced into the model derived for tight gas sandstone, which is regarded as a double-porosity formation. This parameter represents the permeability contribution of natural fractures or any crack-like pores to the different flow units. Using this alternative version of the free-fluid model, continuous permeability curves that match experimental results were predicted without NMR logs.
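For reference, a commonly quoted form of the free-fluid (Coates) model, with porosity in percent and a field-calibrated constant c (often taken near 10). This is the standard single-porosity form, not the modified models proposed in the paper:

```python
def coates_permeability(phi, ffi, bvi, c=10.0):
    """Common form of the free-fluid (Coates) NMR permeability model:
    k [mD] = (phi_percent / c)**4 * (FFI / BVI)**2,
    where phi is porosity as a fraction, FFI and BVI are the free- and
    bound-fluid volumes from the T2 distribution, and c is a
    field-calibrated constant (often near 10)."""
    return (100.0 * phi / c) ** 4 * (ffi / bvi) ** 2

k = coates_permeability(0.25, ffi=0.18, bvi=0.06)  # FFI/BVI = 3
print(round(k, 1))  # 351.6 mD
```

Core calibration of c (and, in the paper's double-porosity variant, of the extra non-matrix parameter) is what ties such a curve to a specific field.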

18.
The factor of safety (FS) used in designing pile foundations for vertical load should depend on three things: prior information on load capacity, summarized by empirical correlations with load-capacity models; site-specific information derived from load tests; and an objective function reflecting economic and safety considerations. A statistical approach to factor-of-safety selection was developed in order to suggest improvements to current standards for driven pile design. This approach recognizes a distinction between the variability of pile load capacity within individual sites and the global variability upon which model correlations are based. Charts have been prepared for determining the FS required to achieve specified reliability indices, as a function of the number of load tests at a particular site and their outcomes.
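The qualitative behavior of such charts, i.e. that the required FS drops as site load tests accumulate, can be sketched with a toy lognormal-capacity model (the variance-reduction rule and all numbers here are illustrative assumptions, not the paper's calibration):

```python
from math import exp, sqrt

def required_fs(beta, sigma_global, sigma_site, n_tests):
    """Toy reliability-based FS: pile capacity is lognormal about its
    predicted value, and site load tests shrink the prediction uncertainty
    from the global (model-correlation) level toward the within-site level
    via a simple precision-weighting rule (an illustrative assumption)."""
    sigma_total = sqrt(sigma_site ** 2 + sigma_global ** 2 / (1 + n_tests))
    return exp(beta * sigma_total)  # FS achieving reliability index beta

for n in (0, 1, 3):
    print(n, round(required_fs(3.0, 0.4, 0.2, n), 2))
# The required FS decreases as load tests accumulate at the site.
```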

19.
陈科. 《地质与勘探》 (Geology and Exploration), 2021, 57(6): 1401-1407
Capillary pressure is of great significance for coalbed methane development. To quantitatively characterize the capillary pressure curves of medium- and low-permeability, high-rank coals, high-pressure mercury injection experiments were carried out on such coal samples. From the measured capillary pressure data, new two-parameter and single-parameter mathematical models of the capillary pressure curve were established by regression; the relationship between parameters a and b was studied, along with the influence of a and b on capillary pressure and the factors controlling them. The results show that the new capillary pressure model fits the data from the high-pressure mercury injection tests on the medium- and low-permeability, high-rank coal samples very well, with a goodness of fit above 98%, providing an empirical formula for quantitatively characterizing capillary pressure in high-rank coal reservoirs. According to the new empirical formula, in Cartesian coordinates with S_Hg on the horizontal axis and ln(Pc) - ln(Pmin) - ln(S_Hg^0.5) on the vertical axis, the two quantities follow the power relationship ln(Pc) - ln(Pmin) - ln(S_Hg^0.5) = a·S_Hg^b, so the parameters a and b can be computed by least-squares fitting, from which the capillary pressure model of the corresponding coal sample is obtained. Parameters a and b follow a negative power-function relationship, from which a single-parameter capillary pressure model can be further derived; ln(Pc) - ln(Pmin) - ln(S_Hg^0.5) is then linear in double-natural-logarithm coordinates, and the slope of the least-squares regression line gives the value of b, which further simplifies the model. Numerical simulation with the new model shows that the capillary pressure of the coal samples increases as parameters a and b increase. The relationships of a and b with porosity and permeability show that a decreases linearly with increasing permeability and porosity, while b increases exponentially with both.
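The least-squares fitting step described above can be sketched on synthetic data: since ln(Pc) - ln(Pmin) - ln(S^0.5) = a·S^b, taking logarithms once more makes the relation linear in ln(S) (the parameter values below are arbitrary):

```python
import numpy as np

# Synthetic mercury-injection data generated from the two-parameter form
#   ln(Pc) - ln(Pmin) - ln(S**0.5) = a * S**b
# (a, b, Pmin chosen arbitrarily), then recovery of a and b by linear
# least squares after one more log transform: ln(y) = ln(a) + b*ln(S).
a_true, b_true, p_min = 1.8, 0.6, 0.01
s = np.linspace(0.05, 0.95, 40)                   # mercury saturation S_Hg
pc = p_min * np.sqrt(s) * np.exp(a_true * s ** b_true)

y = np.log(pc) - np.log(p_min) - 0.5 * np.log(s)  # equals a * S**b exactly
b_fit, ln_a = np.polyfit(np.log(s), np.log(y), 1)
print(round(float(np.exp(ln_a)), 3), round(float(b_fit), 3))  # 1.8 0.6
```

With noisy laboratory data the same regression gives the best-fit a and b rather than an exact recovery.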

20.
Bayesian inference modeling may be applied to empirical stochastic prediction in geomorphology, where the outcomes of geomorphic processes can be expressed by probability density functions. Natural variations in process outputs are accommodated by the probability model. Uncertainty in the values of model parameters is reduced by considering statistically independent prior information on long-term parameter behavior. Formal combination of model and parameter information yields a Bayesian probability distribution that accounts for parameter uncertainty, but not for model uncertainty or systematic error, which is ignored here. Prior information is determined by ordinary objective or subjective methods of geomorphic investigation. Examples involving simple stochastic models are given, as applied to the prediction of shifts in river courses, alpine rock avalanches, and fluctuating river bed levels. Bayesian inference models may be applied spatially and temporally as well as to functions of a random variable. For a given short-term data set, they provide technically superior forecasts to those of extrapolation or stochastic simulation models. In applications, the contribution of the field geomorphologist is of fundamental quantitative importance.
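A minimal Poisson-gamma sketch of this kind of prediction (hypothetical prior and record; the conjugate update and negative-binomial predictive are standard results, not the paper's specific models):

```python
from math import comb

# Annual event counts (say, rock avalanches) modeled as Poisson(lam) with a
# Gamma(alpha, beta) prior on the rate lam from long-term prior information.
alpha, beta = 2.0, 4.0        # hypothetical prior: mean rate 0.5 events/yr
k_events, n_years = 6, 10     # hypothetical short-term record
a_post, b_post = alpha + k_events, beta + n_years   # conjugate update

def predictive_pmf(y):
    """Posterior predictive P(Y = y) for next year's count: a negative
    binomial, wider than a Poisson plug-in because it carries the
    remaining uncertainty in the rate (integer shape assumed here)."""
    p = b_post / (b_post + 1.0)
    return comb(int(a_post) + y - 1, y) * p ** a_post * (1.0 - p) ** y

probs = [predictive_pmf(y) for y in range(3)]
print([round(q, 3) for q in probs])   # P(0), P(1), P(2) events next year
```

The predictive distribution's extra spread relative to a plain Poisson fit is exactly the parameter uncertainty the abstract says the Bayesian combination accounts for.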
