Similar Literature
20 similar documents retrieved.
1.
In this paper, we propose multilevel Monte Carlo (MLMC) methods that use ensemble level mixed multiscale methods in the simulations of multiphase flow and transport. The contribution of this paper is twofold: (1) a design of ensemble level mixed multiscale finite element methods and (2) a novel use of mixed multiscale finite element methods within multilevel Monte Carlo techniques to speed up the computations. The main idea of ensemble level multiscale methods is to construct local multiscale basis functions that can be used for any member of the ensemble. In this paper, we consider two ensemble level mixed multiscale finite element methods: (1) the no-local-solve-online ensemble level method (NLSO) and (2) the local-solve-online ensemble level method (LSO). The first approach was proposed in Aarnes and Efendiev (SIAM J. Sci. Comput. 30(5):2319-2339, 2008), while the second approach is new. Both mixed multiscale methods use a number of snapshots of the permeability media in generating multiscale basis functions. As a result, in the off-line stage, we construct multiple basis functions for each coarse region, where the basis functions correspond to different realizations. In the no-local-solve-online ensemble level method, one uses the whole set of precomputed basis functions to approximate the solution for an arbitrary realization. In the local-solve-online ensemble level method, one uses the precomputed functions to construct a multiscale basis for a particular realization. With this basis, the solution corresponding to this particular realization is approximated with the LSO mixed multiscale finite element method (MsFEM). In both approaches, the accuracy of the method is related to the number of snapshots, computed from different realizations, that one uses to precompute the multiscale basis. In this paper, ensemble level multiscale methods are used in multilevel Monte Carlo methods (Giles 2008a, b; Oper. Res. 56(3):607-617). In multilevel Monte Carlo methods, more accurate (and expensive) forward simulations are run with fewer samples, while less accurate (and inexpensive) forward simulations are run with a larger number of samples. By selecting the number of expensive and inexpensive simulations based on the number of coarse degrees of freedom, one can show that MLMC methods can provide better accuracy at the same cost as Monte Carlo (MC) methods. The main objective of the paper is twofold. First, we would like to compare NLSO and LSO mixed MsFEMs. Second, we use both approaches in the context of MLMC to speed up MC calculations.
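The two-level structure of MLMC described above (many cheap coarse-grid samples plus a few expensive corrections on coupled samples) can be illustrated with a toy sketch. The snippet below is not the mixed MsFEM solver of the paper: `solve_coarse` and `solve_fine` are hypothetical stand-ins for inexpensive/inaccurate and expensive/accurate forward simulations, and the sample counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def solve_fine(xi):
    """Hypothetical expensive forward model for one input realization xi."""
    return np.sin(xi) + 0.05 * xi**2

def solve_coarse(xi):
    """Hypothetical cheap (coarse-grid) approximation of the same quantity."""
    return np.sin(xi)

def mlmc_two_level(n_coarse=10000, n_fine=200):
    xi0 = rng.standard_normal(n_coarse)                      # level 0: many cheap coarse solves
    level0 = solve_coarse(xi0).mean()
    xi1 = rng.standard_normal(n_fine)                        # level 1: few expensive corrections,
    level1 = (solve_fine(xi1) - solve_coarse(xi1)).mean()    # fine and coarse share the same samples
    return level0 + level1                                   # telescoping sum: E[P_f] = E[P_c] + E[P_f - P_c]

print("MLMC estimate:", mlmc_two_level())
print("Plain MC (fine only, 200 samples):", np.mean(solve_fine(rng.standard_normal(200))))
```

The correction term has a much smaller variance than the fine-level quantity itself, which is why it can be estimated with far fewer expensive solves.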

2.
Three-dimensional stochastic seepage field analysis based on the Monte Carlo stochastic finite element method
Wang Lin, Xu Qing. Rock and Soil Mechanics, 2014, 35(1): 287-292
A composite sampling method combining improved Latin hypercube sampling with antithetic (dual) sampling is established to improve the computational efficiency of the Monte Carlo method, and it is incorporated into the Monte Carlo stochastic finite element method (MCSFEM). Based on a three-dimensional finite element model, MCSFEM is used to perform a stochastic seepage field analysis of the Shanping earth-rock dam, studying the influence of the random characteristics of the permeability coefficient and of the head boundary conditions on the seepage field, and carrying out sensitivity analyses with respect to the coefficient of variation and the number of samples. Finally, a distributional analysis of the computed seepage field quantities is performed. The results show that the variability of the total head, flow velocity, and seepage body force increases as the randomness of the permeability coefficient grows; the composite sampling method both accelerates the convergence of the Monte Carlo simulation and reduces the statistical correlation between samples, demonstrating its practicality and effectiveness; and when the permeability coefficient follows a normal distribution, the head and hydraulic gradient at the selected nodes in the seepage field also follow normal distributions.
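The composite sampling idea, Latin hypercube stratification combined with antithetic pairing, can be sketched in a few lines. The snippet below is only an illustrative one-dimensional version for a lognormal permeability coefficient; the stratification scheme, parameter values, and function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def latin_hypercube_1d(n):
    """One uniform draw per equal-probability stratum, in shuffled order (1-D LHS)."""
    return (rng.permutation(n) + rng.random(n)) / n

def composite_sample_lognormal(n_strata, mean_lnk=-11.0, sd_lnk=0.5):
    """LHS strata paired with antithetic (dual) counterparts u and 1-u."""
    u = latin_hypercube_1d(n_strata)
    u_all = np.concatenate([u, 1.0 - u])      # antithetic pairing induces negative correlation
    z = norm.ppf(u_all)                       # uniform scores -> standard normal scores
    return np.exp(mean_lnk + sd_lnk * z)      # lognormal permeability coefficient samples

k = composite_sample_lognormal(500)
print("sample mean of ln k:", np.log(k).mean())   # sits essentially exactly at -11.0
```

Because each antithetic pair (u, 1-u) maps to (z, -z), the sample mean of ln k matches the target mean by construction, which is one source of the variance reduction the abstract reports.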

3.
In this paper we present error and performance analysis of quasi-Monte Carlo algorithms for solving multidimensional integrals (up to 100 dimensions) on the Grid using MPI. We take into account the fact that the Grid is a potentially heterogeneous computing environment, where the user does not know the specifics of the target architecture. Therefore, parallel algorithms should be able to adapt to this heterogeneity, providing automated load balancing. Monte Carlo algorithms can be tailored to such environments, provided parallel pseudorandom number generators are available. The use of quasi-Monte Carlo algorithms poses more difficulties. In both cases the efficient implementation of the algorithms depends on the functionality of the corresponding packages for generating pseudorandom or quasirandom numbers. We propose an efficient parallel implementation of the Sobol sequence for a Grid environment and we demonstrate numerical experiments on a heterogeneous Grid. To achieve high parallel efficiency we use a newly developed special Grid service called Job Track Service, which provides efficient management of available computing resources through reservations.
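One common way to parallelize a Sobol sequence across heterogeneous workers is to assign each worker a contiguous block of the sequence by fast-forwarding the generator. The sketch below uses SciPy's Sobol generator on a toy integrand; it is not the authors' Grid/MPI implementation, and the worker partitioning and Job Track Service logic are assumptions made purely for illustration.

```python
import numpy as np
from scipy.stats import qmc

def worker_block(worker_id, block_size, dim):
    """Each worker skips ahead to its own contiguous block of the Sobol sequence."""
    gen = qmc.Sobol(d=dim, scramble=False)
    gen.fast_forward(worker_id * block_size)     # deterministic, reproducible partition
    pts = gen.random(block_size)
    # toy integrand: product of coordinates; exact integral over [0,1]^dim is 2**(-dim)
    return np.prod(pts, axis=1).sum()

dim, n_workers, block = 10, 4, 4096
total = sum(worker_block(w, block, dim) for w in range(n_workers))
print("QMC estimate:", total / (n_workers * block), " exact:", 0.5**dim)
```

In a real distributed setting each `worker_block` call would run on a different node; the key point is that fast-forwarding keeps the union of all blocks identical to the sequential low-discrepancy sequence.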

4.
At present, the uncertainty of discharge measurements is determined either through error experiments or from empirical values, but these approaches are limited by the large workload involved or by insufficient uncertainty estimates. To address this problem, the interpolation variance estimation method, which is based on measured data and statistical theory, was validated under different flow measurement conditions. Uncertainty analyses of measured data were carried out for three gauging stations (Baihe, Xiangyang, and Shayang), and a Monte Carlo experiment was performed for the Baihe station to compare the uncertainty obtained by the interpolation variance estimation method with the true errors. The results show that the interpolation variance estimation method reflects the influence of stage (water level) variations reasonably well: the correlation coefficient between its uncertainty estimates and the true discharge measurement errors reaches 0.64, and the Spearman correlation coefficient with the cross-sectional stage variation reaches 0.79. The uncertainty estimates are reasonable at high and medium stages, but are biased high at low stages.

5.
Under the balanced loss risk function criterion, we derive necessary and sufficient conditions for the Stein estimator of the regression coefficients in a linear model to dominate the least squares (LS) estimator, and then compare the performance of the Stein estimator with that of the least squares estimator under the Pitman closeness (PC) criterion.

6.
A new uncertainty quantification framework is adopted for carbon sequestration to evaluate the effect of spatial heterogeneity of reservoir permeability on CO2 migration. Sequential Gaussian simulation is used to generate multiple realizations of permeability fields with various spatial statistical attributes. In order to deal with the computational difficulties, the following ideas/approaches are integrated. First, different efficient sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) are used to reduce the number of forward calculations, explore effectively the parameter space, and quantify the input uncertainty. Second, a scalable numerical simulator, extreme-scale Subsurface Transport Over Multiple Phases, is adopted as the forward modeling simulator for CO2 migration. The framework has the capability to quantify input uncertainty, generate exploratory samples effectively, perform scalable numerical simulations, visualize output uncertainty, and evaluate input-output relationships. The framework is demonstrated with a given CO2 injection scenario in heterogeneous sandstone reservoirs. Results show that geostatistical parameters for permeability have different impacts on CO2 plume radius: the mean parameter has positive effects at the top layers, but affects the bottom layers negatively. The variance generally has a positive effect on the plume radius at all layers, particularly at middle layers, where the transport of CO2 is highly influenced by the subsurface heterogeneity structure. The anisotropy ratio has weak impacts on the plume radius, but affects the shape of the CO2 plume.
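A minimal stand-in for generating spatially correlated permeability realizations (the role played by sequential Gaussian simulation above) is covariance-based simulation via a Cholesky factorization, sketched below. The exponential covariance model, grid size, and parameter values are hypothetical, and the CO2 transport simulator itself is of course not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def lognormal_field_realizations(nx, ny, corr_len, mean_lnk, sd_lnk, n_real):
    """Draw lognormal permeability fields with an exponential covariance of ln k."""
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)   # pairwise distances
    cov = sd_lnk**2 * np.exp(-d / corr_len)                          # exponential covariance model
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))           # small jitter for stability
    z = L @ rng.standard_normal((len(pts), n_real))                  # correlated Gaussian fields
    return np.exp(mean_lnk + z).reshape(nx, ny, n_real)

fields = lognormal_field_realizations(20, 20, corr_len=5.0, mean_lnk=-30.0, sd_lnk=1.0, n_real=50)
print(fields.shape)   # (20, 20, 50): fifty permeability realizations on a 20 x 20 grid
```

Varying `mean_lnk`, `sd_lnk`, and `corr_len` across realization ensembles mimics the study's exploration of how mean, variance, and correlation structure of permeability affect the predicted plume.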

7.
Development of subsurface energy and environmental resources can be improved by tuning important decision variables such as well locations and operating rates to optimize a desired performance metric. Optimal well locations in a discretized reservoir model are typically identified by solving an integer programming problem while identification of optimal well settings (controls) is formulated as a continuous optimization problem. In general, however, the decision variables in field development optimization can include many design parameters such as the number, type, location, short-term and long-term operational settings (controls), and drilling schedule of the wells. In addition to the large number of decision variables, field optimization problems are further complicated by the existing technical and physical constraints as well as the uncertainty in describing heterogeneous properties of geologic formations. In this paper, we consider simultaneous optimization of well locations and dynamic rate allocations under geologic uncertainty using a variant of simultaneous perturbation stochastic approximation (SPSA). In addition, by taking advantage of the robustness of SPSA against errors in calculating the cost function, we develop an efficient field development optimization under geologic uncertainty, where an ensemble of models is used to describe important flow and transport reservoir properties (e.g., permeability and porosity). We use several numerical experiments, including a channel layer of the SPE10 model and the three-dimensional PUNQ-S3 reservoir, to illustrate the performance improvement that can be achieved by solving a combined well placement and control optimization using the SPSA algorithm under known and uncertain reservoir model assumptions.
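The SPSA gradient approximation needs only two cost-function evaluations per iteration regardless of the number of decision variables, which is what makes it attractive for combined placement and control problems. The sketch below is a generic SPSA loop on a noisy toy objective; the gain sequences and the quadratic "cost" are assumptions and bear no relation to the reservoir objective of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def spsa_minimize(f, x0, n_iter=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101):
    """Simultaneous perturbation stochastic approximation: two cost evaluations per step."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak, ck = a / k**alpha, c / k**gamma
        delta = rng.choice([-1.0, 1.0], size=x.shape)                  # Bernoulli +/-1 perturbation
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck * delta)
        x = x - ak * ghat                                              # stochastic gradient step
    return x

# toy noisy objective with optimum at (3, -2, 1, 0), standing in for a negative NPV
target = np.array([3.0, -2.0, 1.0, 0.0])
cost = lambda u: np.sum((u - target) ** 2) + 0.01 * rng.standard_normal()
print(spsa_minimize(cost, np.zeros(4)))
```

The elementwise division by `delta` is the distinguishing feature of SPSA: all components of the decision vector are perturbed at once, so the per-iteration cost does not grow with dimension.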

8.
Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been the method of simulated annealing, in which practitioners attempt to minimize the difference between the “true” and simulated production data, and between the “true” and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.
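The Markov chain Monte Carlo idea referred to above can be sketched with a plain random-walk Metropolis sampler conditioning a two-parameter toy model to "pressure" data. The forward model, noise levels, and proposal scale below are hypothetical; the variogram-preserving perturbations and simulator-derived sensitivity coefficients of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def forward_pressure(theta):
    """Hypothetical forward model mapping two log-permeability parameters to three pressure data."""
    return np.array([theta[0] + theta[1], theta[0] - 0.5 * theta[1], 2.0 * theta[0]])

d_obs = np.array([1.0, 0.2, 1.6])              # observed pressures (invented)
sigma_d, sigma_prior = 0.05, 1.0

def log_post(theta):
    misfit = forward_pressure(theta) - d_obs
    return -0.5 * (misfit @ misfit) / sigma_d**2 - 0.5 * (theta @ theta) / sigma_prior**2

theta, chain = np.zeros(2), []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + 0.05 * rng.standard_normal(2)      # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:           # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

print("posterior mean after burn-in:", np.mean(chain[5000:], axis=0))
```

The accepted states visit parameter values in proportion to their posterior probability, which is exactly the property the abstract relies on to give the ensemble of conditioned realizations a well-defined meaning.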

9.
The propagation of database parameter uncertainty has been assessed for aqueous and mineral equilibrium calculations of uranium by Monte Carlo and quasi-Monte Carlo simulations in simple inorganic solution compositions. The concentration output distributions of individual chemical species vary greatly depending on the solution composition modelled. The relative uncertainty for a particular species is generally reduced in regions of solution composition for which it is predicted to be dominant, due to the asymptotic behaviour imposed by the mass balance constraint where the species concentration approaches the total element concentration. The relative uncertainties of minor species, in regions where another species comprising one or several of the same components is predicted to be dominant with a high probability, also appear to be reduced slightly. Composition regions where two or several species are equally important tend to produce elevated uncertainties for related minor species, although the uncertainties of the major species themselves tend to be reduced. The non-linear behaviour of the equilibrium systems can lead to asymmetric or bimodal output distributions; this is particularly evident close to equivalence points or solubility boundaries. Relatively conservative estimates of input uncertainty can result in considerable output uncertainty due to both the complexity of uranium solution chemistry and the system interdependencies. The results of this study suggest that for some modelling scenarios, "classical" speciation calculations based on mean value estimates of the thermodynamic data may result in predictions of a relatively low probability compared to an approach that considers the effects of uncertainty propagation.
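A minimal sketch of propagating thermodynamic database uncertainty is shown below for a single hypothetical complexation reaction at fixed free-ligand activity: the formation constant log K is sampled from a normal distribution and pushed through the speciation expression. With K times the ligand activity near 1 (an equivalence-like region), the output distribution becomes strongly asymmetric, echoing the behaviour described above. All parameter values are invented for illustration; this is not the uranium database or a full speciation solver.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy speciation: M + L = ML at fixed free-ligand activity, with an uncertain log K.
logK_mean, logK_sd = 9.7, 0.3        # hypothetical formation constant and its 1-sigma uncertainty
aL = 1e-9                            # fixed free-ligand activity (hypothetical)

n = 100000
logK = rng.normal(logK_mean, logK_sd, n)     # sample the database parameter
K = 10.0 ** logK
frac_ML = K * aL / (1.0 + K * aL)            # fraction of total metal present as ML

print("mean fraction ML     : %.3f" % frac_ML.mean())
print("2.5-97.5%% range      : %.3f - %.3f" % tuple(np.percentile(frac_ML, [2.5, 97.5])))
print("fraction at mean logK: %.3f" % (10**logK_mean * aL / (1 + 10**logK_mean * aL)))
```

The last line is the "classical" mean-value calculation; comparing it with the Monte Carlo distribution shows how a single plug-in speciation result can sit away from the bulk of the probability mass.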

10.
Coregionalization analysis has been presented as a method of multi-scale analysis for multivariate spatial data. Despite an increasing use of this method in environmental and earth sciences, the uncertainty associated with the estimation of parameters in coregionalization analysis (e.g., sills and functions of sills) is potentially high and has not yet been characterized. This article aims to discuss the theory underlying coregionalization analysis and assess the robustness and limits of the method. A theoretical framework is developed to calculate the ergodic and fluctuation variance-covariance matrices of least-squares estimators of sills in the linear model of coregionalization. To adjust for the positive semidefiniteness constraint on estimated coregionalization matrices, a confidence interval estimation procedure for sills and functions of sills is presented. Thereafter, the relative importance of uncertainty measures (bias and variance) for sills and structural coefficients of correlation and determination is assessed under different scenarios to identify factors controlling their uncertainty. Our results show that the sampling grid density, the choice of the least-squares estimator of sills, the positive semidefiniteness constraint, the presence of scale dependence in the correlations, and the number and range of variogram models, all affect the level of uncertainty, sometimes through multiple interactions. The asymptotic properties of variogram model parameter estimators in a bounded sampling domain impose a theoretical limit to their accuracy and precision. Because of this limit, the uncertainty was found to be high for several scenarios, especially with three variogram models, and was often more dependent on the ratio of variogram range to domain extent than on the sampling grid density. In practice, in the coregionalization analysis of a real dataset, the circular requirement for sill estimates in the calculation of uncertainty measures makes the quantification of uncertainty very problematic, if not impossible. The use of coregionalization analysis must be made with due knowledge of the uncertainty levels and limits of the method.

11.
The simulation of non-point source pollution in agricultural basins is a computationally demanding process due to the large number of individual sources and potential pollution receptors (e.g., drinking water wells). In this study, we present an efficient computational framework for parallel simulation of diffuse pollution in such groundwater basins. To derive a highly detailed velocity field, we employed algebraic multigrid (AMG) preconditioners to solve the groundwater flow equation. We compare two variants of AMG implementations, the multilevel preconditioning provided by Trilinos and the BoomerAMG provided by HYPRE. We also perform a sensitivity analysis on the configuration of AMG methods to evaluate the application of these libraries to groundwater flow problems. For the transport simulation of diffuse contamination, we use the streamline approach, which decomposes the 3D transport problem into a large number of 1D problems that can be executed in parallel. The proposed framework is applied to a 2,600 km² groundwater basin in California discretized into a grid with over 11 million degrees of freedom. Using a Monte Carlo approach with 200 nitrate loading realizations at the aquifer surface, we perform a stochastic analysis to quantify nitrate breakthrough prediction uncertainty at over 1,500 wells due to random, temporally distributed nitrate loading. The results show that there is a significant time lag between loading and aquifer response at production wells. Generally, typical production wells respond after 5–50 years depending on well depth and screen length, while the prediction uncertainty for nitrate in individual wells is very large—approximately twice the drinking water limit for nitrate.
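The streamline idea, decomposing the 3D transport problem into many independent 1D problems that run in parallel, can be sketched with Python's standard process pool. The 1D solver below is a trivial explicit upwind advection scheme on an invented velocity, not the actual streamline tracing, AMG flow solve, or nitrate loading model of the study.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def solve_streamline_1d(seed):
    """Toy 1-D advection of a continuous unit loading along one streamline (upwind scheme)."""
    rng = np.random.default_rng(seed)
    n_cells, n_steps = 100, 400
    v = 0.5 + 0.4 * rng.random()                    # random travel velocity on this streamline
    dt, dx = 0.4, 1.0
    courant = v * dt / dx                           # < 1, so the explicit scheme is stable
    c = np.zeros(n_cells)
    for _ in range(n_steps):
        c[1:] = c[1:] - courant * (c[1:] - c[:-1])  # upwind update
        c[0] = 1.0                                  # unit loading at the upstream end
    return c[-1]                                    # breakthrough value at the downstream (well) end

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:             # independent 1-D problems execute in parallel
        breakthrough = list(pool.map(solve_streamline_1d, range(100)))
    print("mean breakthrough across 100 streamlines:", np.mean(breakthrough))
```

Slow streamlines have not broken through after the simulated period while fast ones have, a crude analogue of the loading-to-response time lag the abstract reports.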

12.
As a GIS tool, visibility analysis is used in many areas to evaluate both visible and non-visible places. Visibility analysis builds on a digital surface model describing the terrain morphology, including the position and shapes of all objects that can act as visibility barriers. However, some barriers, for example vegetation, may be permeable to a certain degree. Despite extensive research and use of visibility analysis in different areas, standard GIS tools do not take permeability into account. This article presents a new method to calculate visibility through partly permeable obstacles. The method is based on a quasi-Monte Carlo simulation with 100 iterations of the visibility calculation. Each iteration result represents 1% of vegetation permeability, so that visibility behind vegetation obstacles can range from 1% to 100%. The main advantages of the method are the greater accuracy of the visibility results and easy implementation in any GIS software. Incorporating the proposed method into GIS software would facilitate work in many fields, such as architecture, archaeology, radio communication, and the military.
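The 100-iteration idea can be sketched for a single line of sight: in each iteration every vegetation cell is independently treated as transparent with a probability equal to an assumed permeability, and the fraction of iterations in which the target remains visible gives the visibility percentage. The elevations, vegetation heights, and blocking model below are hypothetical, and a full raster viewshed on a DSM is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)

# Cells along one line of sight: ground elevation plus vegetation height (0 where bare ground).
ground = np.array([10.0, 10.5, 11.0, 10.8, 10.2, 9.9, 9.5])
veg_height = np.array([0.0, 3.0, 0.0, 4.0, 0.0, 2.5, 0.0])
observer_z, target_z = 12.0, 10.0        # eye level above cell 0, target at the last cell
permeability = 0.4                       # each vegetation cell is transparent in ~40% of iterations

def visible(surface):
    """Simple 1-D line-of-sight test: no intermediate cell may rise above the sight line."""
    sight = np.linspace(observer_z, target_z, len(surface))
    return np.all(surface[1:-1] <= sight[1:-1])

hits = 0
for _ in range(100):                                     # 100 iterations -> 1% resolution
    keep = rng.random(len(veg_height)) > permeability    # vegetation blocks only when kept
    hits += visible(ground + veg_height * keep)

print("estimated visibility through vegetation: %d%%" % hits)
```

With bare ground the target is fully visible, so the result isolates the effect of partly permeable vegetation, which is the quantity standard binary viewshed tools cannot report.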

13.
A review of lognormal estimators for in situ reserves
The term “lognormal kriging” does not correspond to a single well defined estimator. In fact, several types of lognormal estimators for in situ reserves are available, and this may cause confusion. These estimators are based on different assumptions—that is, different models. This paper presents a review of these estimators.

15.
Upscaling Uncertain Permeability Using Small Cell Renormalization
Sedimentary rocks have structures on all length scales from the millimeter to the kilometer. These structures are generally associated with variations in rock permeability. These need to be modelled if we are to make predictions about fluid flow through the rock. However, existing computers are not powerful enough for us to be able to represent all scales of heterogeneity explicitly in our fluid flow models—hence, we need to upscale. Small cell renormalization is a fast method for upscaling permeability, derived from an analogue circuit of resistors. However, it assumes that the small scale permeability distribution is known. In practice, this is unlikely. The only information available about small scale properties is either qualitative, derived from the depositional setting of the reservoir, or local to the wells as a result of coring or logging. The influence of small scale uncertainty on large scale properties is usually modelled by the Monte Carlo method. This is time-consuming and inaccurate if not enough realisations are used. This paper describes a new implementation of renormalization, which enables the direct upscaling of uncertain small-scale permeabilities to produce the statistical properties of the equivalent coarse grid. This is achieved by using a perturbation expansion of the resistor-derived equation. The method is verified by comparison with numerical simulations using the Monte Carlo method. The predictions of the expected large-scale permeability and its standard deviation are shown to be accurate for small cell standard deviations of up to 40% of the mean cell value, using just the first nonzero term of the perturbation expansion. Inclusion of higher order terms allows larger standard deviations to be modelled accurately. Evaluation of cross-terms allows correlations between actual cell values to be captured, over and above the background structure of mean cell values. The perturbation method is significantly faster than conventional Monte Carlo simulation. It needs just two calculations, whereas the Monte Carlo method needs many thousands of realisations to be generated and renormalized to converge. This results in significant savings in computer time.
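The motivation for the perturbation approach can be seen in a two-cell toy example: the effective permeability of cells in series is the harmonic mean, a nonlinear function, so plugging mean cell values into the upscaling rule does not give the mean of the upscaled permeability. The Monte Carlo baseline below, with invented parameter values, is exactly the kind of computation the paper's perturbation expansion is designed to replace with a couple of direct calculations.

```python
import numpy as np

rng = np.random.default_rng(8)

def k_eff_series(k1, k2):
    """Exact upscaled permeability of two grid cells in series: the harmonic mean."""
    return 2.0 / (1.0 / k1 + 1.0 / k2)

mean_k, cv = 100.0, 0.4                                   # cell mean and 40% coefficient of variation
sigma2 = np.log(1.0 + cv**2)                              # lognormal parameters matching that mean/cv
mu = np.log(mean_k) - 0.5 * sigma2

n = 200000
k1 = rng.lognormal(mu, np.sqrt(sigma2), n)
k2 = rng.lognormal(mu, np.sqrt(sigma2), n)
k_eff = k_eff_series(k1, k2)

print("plug-in value at mean cells :", k_eff_series(mean_k, mean_k))
print("Monte Carlo mean of k_eff   :", k_eff.mean())
print("Monte Carlo std  of k_eff   :", k_eff.std())
```

The gap between the plug-in value and the Monte Carlo mean, together with the nonzero standard deviation, is the statistical information a perturbation expansion of the upscaling equation aims to deliver without sampling.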

16.
A recently developed Bayesian interpolation method (BI) and its application to the safety assessment of a flood defense structure are described in this paper. We use a one-dimensional Bayesian Monte Carlo method (BMC), proposed in Rajabalinejad (2009), to develop a weighted logical dependence between neighboring points. The concept of global uncertainty is explained in detail, and different uncertainty association models (UAMs) are presented for linking the local and global uncertainty. Based on the global uncertainty, a simplified approach is introduced. By applying the global uncertainty, we apply Gaussian error estimation to general models and the Generalized Beta (GB) distribution to monotonic models. Our main objective in this research is to simplify the newly developed BMC method and demonstrate that it can dramatically improve the simulation efficiency by using prior information from the outcomes of preceding simulations. We provide theory and numerical algorithms for the BI method geared to multi-dimensional problems, integrate it with a probabilistic finite element model, and apply the coupled models to the reliability assessment of a flood defense for the 17th Street Flood Wall system in New Orleans.

17.
In optical image registration, the reference control points (RCPs) used as explanatory variables in the polynomial regression model are generally assumed to be error free. However, this most frequently used assumption is often invalid in practice because RCPs always contain errors. In this situation, the extensively applied estimator, the ordinary least squares (LS) estimator, is biased and incapable of handling the errors in RCPs. Therefore, it is necessary to develop new feasible methods to address such a problem. This paper discusses the scaled total least squares (STLS) estimator, which is a generalization of the LS estimator in optical remote sensing image registration. The basic principle and the computational method of the STLS estimator and the relationship among the LS, total least squares (TLS) and STLS estimators are presented. Simulation experiments and real remotely sensed image experiments are carried out to compare LS and STLS approaches and systematically analyze the effect of the number and accuracy of RCPs on the performances in registration. The results show that the STLS estimator is more effective in estimating the model parameters than the LS estimator. Using this estimator based on the error-in-variables model, more accurate registration results can be obtained. Furthermore, the STLS estimator has superior overall performance in the estimation and correction of measurement errors in RCPs, which is beneficial to the study of error propagation in remote sensing data. The larger the RCP number and error, the more obvious are these advantages of the presented estimator.
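The errors-in-variables setting can be illustrated with classical (unscaled) total least squares, which takes the right singular vector of the augmented matrix [X | y] associated with the smallest singular value. The sketch below simulates RCP-like data with noise in both the explanatory coordinates and the response and compares LS with TLS; the scaling step that distinguishes STLS from plain TLS, and the polynomial registration model itself, are not reproduced, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def tls_fit(X, y):
    """Classical total least squares: smallest right singular vector of [X | y]."""
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                       # right singular vector of the smallest singular value
    return -v[:-1] / v[-1]

# Simulated RCP-like data: the true mapping is linear, and BOTH the explanatory
# coordinates and the response are observed with error (errors-in-variables).
n, beta_true = 200, np.array([1.5, -0.8])
X_true = rng.uniform(0.0, 100.0, (n, 2))
y_true = X_true @ beta_true
X_obs = X_true + rng.normal(0.0, 5.0, (n, 2))      # noisy RCP input coordinates
y_obs = y_true + rng.normal(0.0, 5.0, n)           # noisy response coordinates

print("LS  estimate:", np.linalg.lstsq(X_obs, y_obs, rcond=None)[0])
print("TLS estimate:", tls_fit(X_obs, y_obs))
print("true        :", beta_true)
```

LS attributes all of the misfit to the response and is therefore biased when the explanatory coordinates are noisy; TLS distributes the misfit over all columns, which is the behaviour STLS generalizes by scaling the response errors relative to the explanatory errors.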

18.
Significant uncertainties are associated with the definition of both the exploration targeting criteria and computational algorithms used to generate mineral prospectivity maps. In prospectivity modeling, the input and computational uncertainties are generally made implicit, by making a series of best-guess or best-fit decisions, on the basis of incomplete and imprecise information. The individual uncertainties are then compounded and propagated into the final prospectivity map as an implicit combined uncertainty which is impossible to directly analyze and use for decision making. This paper proposes a new approach to explicitly define uncertainties of individual targeting criteria and propagate them through a computational algorithm to evaluate the combined uncertainty of a prospectivity map. Applied to fuzzy logic prospectivity models, this approach involves replacing point estimates of fuzzy membership values by statistical distributions deemed representative of likely variability of the corresponding fuzzy membership values. Uncertainty is then propagated through a fuzzy logic inference system by applying Monte Carlo simulations. A final prospectivity map is represented by a grid of statistical distributions of fuzzy prospectivity. Such modeling of uncertainty in prospectivity analyses allows better definition of exploration target quality, as understanding of uncertainty is consistently captured, propagated and visualized in a transparent manner. The explicit uncertainty information of prospectivity maps can support further risk analysis and decision making. The proposed probabilistic fuzzy logic approach can be used in any area of geosciences to model uncertainty of complex fuzzy systems.
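A minimal version of this probabilistic fuzzy logic idea is sketched below for a single grid cell: the point fuzzy membership of each targeting criterion is replaced by a Beta distribution, Monte Carlo samples are drawn, and the samples are combined with a fuzzy operator (the product operator here) to give a distribution of prospectivity rather than a single value. The criteria names, Beta parameters, and choice of operator are assumptions for illustration, not the paper's inference system.

```python
import numpy as np

rng = np.random.default_rng(9)

# Point estimates of fuzzy membership for three targeting criteria at one grid cell,
# replaced by Beta distributions expressing the analyst's uncertainty about each value.
criteria = {
    "proximity_to_fault":   dict(a=8.0,  b=2.0),   # around 0.8, fairly confident
    "favourable_lithology": dict(a=4.0,  b=4.0),   # around 0.5, very uncertain
    "geochemical_anomaly":  dict(a=12.0, b=3.0),   # around 0.8, confident
}

n = 50000
memberships = np.column_stack([rng.beta(p["a"], p["b"], n) for p in criteria.values()])

# Fuzzy AND via the product operator (min or gamma operators would be alternatives).
prospectivity = memberships.prod(axis=1)

print("median prospectivity:", np.median(prospectivity))
print("90%% interval        : %.2f - %.2f" % tuple(np.percentile(prospectivity, [5, 95])))
```

Repeating this cell by cell yields the "grid of statistical distributions of fuzzy prospectivity" described in the abstract, from which target quality and its uncertainty can be mapped together.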

19.
When estimating the mean value of a variable, or the total amount of a resource, within a specified region it is desirable to report an estimated standard error for the resulting estimate. If the sample sites are selected according to a probability sampling design, it usually is possible to construct an appropriate design-based standard error estimate. One exception is systematic sampling for which no such standard error estimator exists. However, a slight modification of systematic sampling, termed 2-step tessellation stratified (2TS) sampling, does permit the estimation of design-based standard errors. This paper develops a design-based standard error estimator for 2TS sampling. It is shown that the Taylor series approximation to the variance of the sample mean under 2TS sampling may be expressed in terms of either a deterministic variogram or a deterministic covariance function. Variance estimation then can be approached through the estimation of a variogram or a covariance function. The resulting standard error estimators are compared to some more traditional variance estimators through a simulation study. The simulation results show that estimators based on the new approach may perform better than traditional variance estimators.

20.
The least squares Monte Carlo method is a decision evaluation method that can capture the effect of uncertainty and the value of flexibility of a process. The method is a stochastic approximate dynamic programming approach to decision making. It is based on a forward simulation coupled with a recursive algorithm that produces a near-optimal policy. It relies on Monte Carlo simulation to produce convergent results, which incurs a significant computational cost when the method is used to evaluate decisions for reservoir engineering problems, because many reservoir simulations must be run. The objective of this study was to enhance the performance of the least squares Monte Carlo method by improving the sampling method used to generate the technical uncertainties underlying the production profiles. The probabilistic collocation method has been proven to be a robust and efficient uncertainty quantification method. By using the sampling methods of the probabilistic collocation method to approximate the sampling of the technical uncertainties, it is possible to significantly reduce the computational requirement of running the decision evaluation method. Thus, we introduce the least squares probabilistic collocation method. The decision evaluation considered a number of technical and economic uncertainties. Three reservoir case studies were used: a simple homogeneous model, the PUNQ-S3 model, and a modified portion of the SPE10 model. The results show that using the sampling techniques of the probabilistic collocation method produced relatively accurate responses compared with the original method. Different possible enhancements were discussed in order to practically adapt the least squares probabilistic collocation method to more realistic and complex reservoir models. Furthermore, it would be desirable to apply the method to high-dimensional decision scenarios for different chemical enhanced oil recovery processes using real reservoir data.
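The backbone of the least squares Monte Carlo method, a forward simulation of uncertain paths followed by a backward recursion in which continuation values are estimated by least squares regression and compared against the value of acting now, is sketched below on a textbook American put rather than a reservoir decision problem. The price process, the quadratic regression basis, and all parameter values are assumptions chosen only to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy uncertain "asset" paths (geometric Brownian motion) and an American put payoff,
# standing in for uncertain production/price profiles and an abandonment-style decision.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.3, 1.0
n_steps, n_paths = 50, 20000
dt = T / n_steps

z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

payoff = lambda s: np.maximum(K - s, 0.0)
value = payoff(S[:, -1])                              # value at the final decision date

for t in range(n_steps - 2, -1, -1):                  # backward recursion (Longstaff-Schwartz)
    value *= np.exp(-r * dt)                          # discount one step back, to date t
    itm = payoff(S[:, t]) > 0                         # regress on in-the-money paths only
    coeff = np.polyfit(S[itm, t], value[itm], 2)      # least squares continuation value
    continuation = np.polyval(coeff, S[itm, t])
    exercise = payoff(S[itm, t]) > continuation       # near-optimal exercise rule
    idx = np.where(itm)[0][exercise]
    value[idx] = payoff(S[idx, t])                    # exercised paths take the immediate payoff

print("LSM value of the flexible (early-exercise) decision:", np.exp(-r * dt) * value.mean())
```

The regression step is where most of the sampling burden lies; replacing the raw Monte Carlo paths with probabilistic collocation samples, as the abstract proposes, aims to reach a comparable regression quality with far fewer forward simulations.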
