Similar documents
20 similar documents retrieved (search time: 31 ms)
1.
Despite their apparent high dimensionality, spatially distributed hydraulic properties of geologic formations can often be compactly (sparsely) described in a properly designed basis. Hence, the estimation of high-dimensional subsurface flow properties from dynamic performance and monitoring data can be formulated and solved as a sparse reconstruction inverse problem. Recent advances in statistical signal processing, formalized under the compressed sensing paradigm, provide important guidelines on formulating and solving sparse inverse problems, primarily for linear models and using a deterministic framework. Given the uncertainty in describing subsurface physical properties, even after integration of the dynamic data, it is important to develop a practical sparse Bayesian inversion approach to enable uncertainty quantification. In this paper, we use sparse geologic dictionaries to compactly represent uncertain subsurface flow properties and develop a practical sparse Bayesian method for effective data integration and uncertainty quantification. The multi-Gaussian assumption that is widely used in classical probabilistic inverse theory is not appropriate for representing sparse prior models. Following the results presented by the compressed sensing paradigm, the Laplace (or double exponential) probability distribution is found to be more suitable for representing sparse parameters. However, combining Laplace priors with the frequently used Gaussian likelihood functions leads to neither a Laplace nor a Gaussian posterior distribution, which complicates the analytical characterization of the posterior. Here, we first express the form of the Maximum A-Posteriori (MAP) estimate for Laplace priors and then use the Monte-Carlo-based Randomized Maximum Likelihood (RML) method to generate approximate samples from the posterior distribution.
The proposed Sparse RML (SpRML) approximate sampling approach can be used to assess the uncertainty in the calibrated model with a relatively modest computational complexity. We demonstrate the suitability and effectiveness of the SpRML formulation using a series of numerical experiments of two-phase flow systems in both Gaussian and non-Gaussian property distributions in petroleum reservoirs and successfully apply the method to an adapted version of the PUNQ-S3 benchmark reservoir model.
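The core of the SpRML idea, the MAP estimate under a Laplace prior with a Gaussian likelihood, is an l1-regularized least-squares problem. A minimal sketch (the linear forward model `G`, the sparse coefficient vector, and the weight `lam` are hypothetical stand-ins, not values from the paper), solved with the iterative soft-thresholding algorithm (ISTA):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_map(G, d, lam, n_iter=2000):
    """MAP estimate for a Laplace prior: argmin_m 0.5||G m - d||^2 + lam ||m||_1."""
    L = np.linalg.norm(G, 2) ** 2        # Lipschitz constant of the data-misfit gradient
    m = np.zeros(G.shape[1])
    for _ in range(n_iter):
        m = soft_threshold(m - G.T @ (G @ m - d) / L, lam / L)
    return m

rng = np.random.default_rng(0)
G = rng.standard_normal((50, 100))              # underdetermined forward model
m_true = np.zeros(100)
m_true[[3, 40, 77]] = [2.0, -1.5, 1.0]          # sparse coefficients in the "geologic dictionary"
d = G @ m_true + 0.01 * rng.standard_normal(50)  # noisy data
m_map = sparse_map(G, d, lam=0.1)
```

In the RML extension, this minimization would be repeated for many perturbed realizations of the data and prior to build an approximate posterior sample.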

2.
We perform global sensitivity analysis (GSA) through polynomial chaos expansion (PCE) on a contaminant transport model for the assessment of radionuclide concentration at a given control location in a heterogeneous aquifer, following a release from a near surface repository of radioactive waste. The aquifer hydraulic conductivity is modeled as a stationary stochastic process in space. We examine the uncertainty in the first two (ensemble) moments of the peak concentration, as a consequence of incomplete knowledge of (a) the parameters characterizing the variogram of hydraulic conductivity, (b) the partition coefficient associated with the migrating radionuclide, and (c) dispersivity parameters at the scale of interest. These quantities are treated as random variables and a variance-based GSA is performed in a numerical Monte Carlo framework. This entails solving groundwater flow and transport processes within an ensemble of hydraulic conductivity realizations generated upon sampling the space of the considered random variables. The Sobol indices are adopted as sensitivity measures to provide an estimate of the role of uncertain parameters on the (ensemble) target moments. Calculation of the indices is performed by employing PCE as a surrogate model of the migration process to reduce the computational burden. We show that the proposed methodology (a) allows identifying the influence of uncertain parameters on key statistical moments of the peak concentration, (b) enables extending the number of Monte Carlo iterations to attain convergence of the (ensemble) target moments, and (c) leads to considerable saving of computational time while keeping acceptable accuracy.
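The variance-based GSA step can be illustrated independently of the transport model. A small sketch of first-order Sobol index estimation with the pick-freeze Monte Carlo scheme (the linear test function and sample size are illustrative only; the paper evaluates a PCE surrogate of the migration process instead of the model directly):

```python
import numpy as np

def first_order_sobol(f, n_vars, n_samples=100_000, seed=1):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol indices S_i."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n_samples, n_vars))
    B = rng.standard_normal((n_samples, n_vars))
    yA = f(A)
    var = yA.var()
    S = np.empty(n_vars)
    for i in range(n_vars):
        AB = B.copy()
        AB[:, i] = A[:, i]                 # freeze variable i, resample all others
        S[i] = np.mean(yA * (f(AB) - f(B))) / var
    return S

# toy response whose exact indices are a_i^2 / sum_j a_j^2
a = np.array([1.0, 2.0, 0.5])
S = first_order_sobol(lambda X: X @ a, 3)
```

For this additive response the exact indices are [1, 4, 0.25] / 5.25, so the estimate can be checked directly.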

3.
The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality": its computational cost increases drastically with the number of parameters and the system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations for strongly nonlinear models due to its joint updating scheme. Motivated by recent developments in uncertainty quantification and EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF is shown to outperform EnKF at the same computational cost.
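For contrast with PCKF, the standard stochastic-EnKF analysis step that RAPCKF is benchmarked against can be sketched in a few lines (a scalar toy state with hypothetical prior and observation-error statistics, not the unsaturated flow model):

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step. X is (n_state, n_ens); y is the observation."""
    n_ens = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)          # state anomalies
    HX = H @ X
    HXp = HX - HX.mean(axis=1, keepdims=True)       # predicted-observation anomalies
    Pxy = Xp @ HXp.T / (n_ens - 1)
    Pyy = HXp @ HXp.T / (n_ens - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)                    # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T  # perturbed obs
    return X + K @ (Y - HX)

rng = np.random.default_rng(2)
X = rng.normal(0.0, 1.0, size=(1, 200))   # prior ensemble: mean 0, variance 1
H = np.array([[1.0]])
R = np.array([[1.0]])
Xa = enkf_update(X, np.array([2.0]), H, R, rng)
```

With prior variance 1 and observation variance 1, observing y = 2 should pull the ensemble mean to about 1 and halve the spread.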

4.
Optimal selection of key parameters in stochastic seismic inversion and analysis of its performance (in English)   Cited by: 2 (0 self-citations, 2 by others)
Stochastic seismic inversion combines geostatistical theory with seismic inversion: it merges seismic data, well-log data and geostatistical information into a posterior probability distribution of the subsurface model, samples that posterior with the Markov chain Monte Carlo (MCMC) method, and characterizes it by jointly analyzing many samples, thereby building a picture of subsurface conditions. This paper first introduces the principle of stochastic seismic inversion, then analyzes four key parameters that control its performance, namely the signal-to-noise ratio of the seismic data, the variogram, the number of samples drawn from the posterior distribution, and the well density, and gives principles for optimizing each. The analysis shows that the seismic signal-to-noise ratio controls how strongly the seismic data and the geostatistical model constrain the inversion result, the variogram governs the smoothness of the result, the number of posterior samples determines the reliability of the sample statistics, and the well density affects the uncertainty of the inversion. Finally, by comparing stochastic seismic inversion with model-based deterministic seismic inversion in a test area, we show that stochastic inversion yields models that better match actual subsurface conditions.
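The MCMC sampling at the heart of stochastic seismic inversion can be illustrated with a random-walk Metropolis sampler (a one-dimensional Gaussian stand-in for the posterior; the real posterior couples seismic, well-log and geostatistical terms):

```python
import numpy as np

def metropolis(log_post, x0, n_steps, step, seed=3):
    """Random-walk Metropolis sampler for a 1-D (log) posterior density."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_steps)
    for k in range(n_steps):
        xp = x + step * rng.standard_normal()       # symmetric proposal
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:         # accept with prob min(1, ratio)
            x, lp = xp, lpp
        samples[k] = x
    return samples

# toy posterior: Gaussian with mean 1.5 and standard deviation 0.5
samples = metropolis(lambda x: -0.5 * ((x - 1.5) / 0.5) ** 2, 0.0, 20_000, 0.8)
burned = samples[5_000:]            # discard burn-in before computing statistics
```

The chain's post-burn-in mean and spread approximate the posterior moments, which is how the abstract's "sample statistics" are obtained.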

5.
The Karhunen-Loève (KL) decomposition and the polynomial chaos (PC) expansion are elegant and efficient tools for uncertainty propagation in porous media. Over recent years, KL/PC-based frameworks have successfully been applied in several contributions for the flow problem in the subsurface context. It was also shown, however, that the accurate solution of the transport problem with KL/PC techniques is more challenging. We propose a framework that utilizes KL/PC in combination with sparse Smolyak quadrature for the flow problem only. In a subsequent step, a Lagrangian sampling technique is used for transport. The flow field samples are calculated based on a PC expansion derived from the solutions at relatively few quadrature points. To increase the computational efficiency of the PC-based flow field sampling, a new reduction method is applied. For advection dominated transport scenarios, where a Lagrangian approach is applicable, the proposed PC/Monte Carlo method (PCMCM) is very efficient and avoids accuracy problems that arise when applying KL/PC techniques to both flow and transport. The applicability of PCMCM is demonstrated for transport simulations in multivariate Gaussian log-conductivity fields that are unconditional and conditional on conductivity measurements.

6.
Hydrological Sciences Journal, 2013, 58(3): 582-595
Abstract

This paper explores the potential for seasonal prediction of hydrological variables that are potentially useful for reservoir operation of the Three Gorges Dam, China. The seasonal flow of the primary inflow season and the peak annual flow are investigated at Yichang hydrological station, a proxy for inflows to the Three Gorges Dam. Building on literature and diagnostic results, a prediction model is constructed using sea-surface temperatures and upland snow cover available one season ahead of the prediction period. A hierarchical Bayesian approach is used to estimate uncertainty in the parameters of the prediction model and to propagate these uncertainties to the predictand. The results show skill for both the seasonal flow and the peak annual flow. The peak annual flow model is then used to estimate a design flood (50-year flood or 2% exceedance probability) on a year-to-year basis. The results demonstrate the inter-annual variability in flood risk. The predictability of both the seasonal total inflow and the peak annual flow (or a design flood volume) offers potential for adaptive management of the Three Gorges Dam reservoir through modification of the operating policy in accordance with the year-to-year changes in these variables.
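Once a predictive distribution of the peak annual flow is available, the year-by-year design flood reduces to a quantile evaluation. A sketch assuming, purely for illustration, a Gumbel annual-maximum distribution with hypothetical parameters (the paper instead propagates hierarchical Bayesian parameter uncertainty through the predictand):

```python
import math

def gumbel_return_level(mu, beta, T):
    """T-year return level of a Gumbel(mu, beta) annual-maximum distribution:
    the flow exceeded with probability 1/T in any given year."""
    p = 1.0 - 1.0 / T                      # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# hypothetical location/scale parameters in m3/s; 50-year design flood (2% exceedance)
q50 = gumbel_return_level(mu=30_000.0, beta=5_000.0, T=50)
```

Re-evaluating this quantile with each year's updated parameters gives the inter-annual variability in flood risk described above.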

7.
The Markov chain Monte Carlo (MCMC) method is a heuristic global optimization algorithm that can be used to solve probabilistic inversion problems. MCMC-based inversion does not depend on an accurate initial model and can incorporate arbitrarily complex prior information: by sampling the prior probability density function it obtains a large number of samples from the posterior distribution, and while searching for the optimal solution it can escape local optima and reach the global optimum. Owing to its enormous computational cost, however, the MCMC method is difficult to apply in practice; in geo...

8.
Porous aquifer materials are often characterized by layered heterogeneities that influence groundwater flow and present complexities in contaminant transport modeling. Such flow variations also have the potential to impact the dissolution flux from dense nonaqueous phase liquid (DNAPL) pools. This study examined how these heterogeneous flow conditions affected the dissolution of a tetrachloroethene (PCE) pool in a two-dimensional intermediate-scale flow cell containing coarse sand. A steady-state mass-balance approach was used to calculate the PCE dissolution rate at three different flow rates. As expected, aqueous PCE concentrations increased along the length of the PCE pool and higher flow rates decreased the aqueous PCE concentration in the effluent. Nonreactive tracer studies at two flow rates confirmed the presence of a vertical flow gradient, with the most rapid velocity located at the bottom of the tank. These results suggest that flow focusing occurred near the DNAPL pool. Effluent PCE concentrations and pool dissolution flux rates were compared to model predictions assuming local equilibrium (LE) conditions at the DNAPL pool/aqueous phase interface and a uniform distribution of flow. The LE model did not describe the data well, even over a wide range of PCE solubility and macroscopic transverse dispersivity values. Model predictions assuming nonequilibrium mass-transfer-limited conditions and accounting for vertical flow gradients, however, resulted in a better fit to the data. These results have important implications for evaluating DNAPL pool dissolution in the field where subsurface heterogeneities are likely to be present.

9.
A stochastic thin-layer method is developed for the analysis of wave propagation in a layered half-space. A random field of shear moduli in the layered system is considered in terms of multiple correlated random variables. Expanding the random moduli and uncertain responses by means of Hermite polynomial chaos expansions and applying the Galerkin method in the spatial as well as stochastic domains, stochastic versions of thin-layer methods for a layered half-space in plane strain and antiplane shear are obtained. In order to represent the infinite half-space, continued-fraction absorbing boundary conditions are included in the thin-layer models of the half-space. Using these stochastic methods, dynamic responses of a layered half-space subjected to line loads are examined. Means, coefficients of variation, and probability density functions of the half-space responses with a varying correlation coefficient of the shear moduli are computed and verified by comparison with Monte Carlo simulations. It is demonstrated that accurate probabilistic dynamic analysis is possible using the developed stochastic thin-layer methods for a layered half-space.
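The Hermite polynomial chaos expansion used here can be demonstrated on a scalar textbook example rather than the thin-layer model: projecting the lognormal response f(Z) = exp(Z) of a standard normal input onto probabilists' Hermite polynomials He_k, then recovering its mean and variance from the coefficients:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(f, order, n_quad=40):
    """Project f(Z), Z ~ N(0,1), onto probabilists' Hermite polynomials He_k:
    c_k = E[f(Z) He_k(Z)] / k!, using that E[He_k(Z)^2] = k!."""
    x, w = hermegauss(n_quad)                  # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)               # normalize to the standard normal pdf
    fx = f(x)
    coeffs = []
    for k in range(order + 1):
        e_k = np.zeros(k + 1)
        e_k[k] = 1.0                           # coefficient vector selecting He_k
        coeffs.append(np.sum(w * fx * hermeval(x, e_k)) / math.factorial(k))
    return np.array(coeffs)

c = pce_coefficients(np.exp, order=8)
y_mean = c[0]                                  # mean is the zeroth coefficient
y_var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 9))
```

The analytic values for a standard lognormal, mean sqrt(e) and variance e(e - 1), provide an exact check on the truncated expansion.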

10.
Cone penetrometer test (CPT) based Raman spectroscopy was used to identify separate-phase tetrachloroethylene (PCE) and trichloroethylene (TCE) contamination in the subsurface at two locations during field tests conducted at the U.S. Department of Energy's (DOE) Savannah River Site. Clear characteristic Raman spectral peaks for PCE and TCE were observed at two sites and several depths during CPT deployment. Because of the uniqueness of a Raman spectrum for a given compound, these data are compelling evidence of the presence of the two compounds. The Raman spectral results correlated with high PCE and TCE concentrations in soil samples collected from the same subsurface zones, confirming that the method is a viable dense nonaqueous phase liquid (DNAPL) characterization technique. The Raman spectroscopic identification of PCE and TCE in these tests represents the first time that DNAPLs have been unequivocally located in the subsurface by an in situ technique.
The detection limit of Raman spectroscopy is related to the probability of contaminant droplets appearing on the optical window in the path of the probe light. Based on data from this fieldwork, the Raman technique may require a threshold quantity of DNAPL to provide an adequate optical cross section for a spectroscopic response. The low aqueous solubility of PCE and TCE and the relatively weak optical intensity of the Raman signal preclude the detection of aqueous-phase contaminants by this method, making it selective for DNAPL contaminants only.

11.
Data assimilation is widely used to improve flood forecasting capability, especially through parameter inference, which requires statistical information on the uncertain input parameters (upstream discharge, friction coefficient) as well as on the variability of the water level and its sensitivity with respect to the inputs. For a particle filter or ensemble Kalman filter, stochastically estimating probability density functions and covariance matrices from Monte Carlo random sampling requires a large ensemble of model evaluations, limiting their use in real-time applications. To tackle this issue, fast surrogate models based on polynomial chaos and Gaussian processes can be used to represent the spatially distributed water level in place of solving the shallow water equations. This study investigates the use of these surrogates to estimate probability density functions and covariance matrices at a reduced computational cost and without loss of accuracy, in the perspective of ensemble-based data assimilation. The study focuses on 1-D steady-state flow simulated with MASCARET over the Garonne River (South-West France). Results show that both surrogates match the performance of Monte Carlo random sampling at a much smaller computational budget: a few MASCARET simulations (on the order of 10–100) are sufficient to accurately retrieve covariance matrices and probability density functions all along the river, even where the flow dynamics are more complex due to heterogeneous bathymetry. This paves the way for the design of surrogate strategies suitable for representing unsteady open-channel flows in data assimilation.
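The Gaussian-process surrogate idea can be sketched with plain NumPy: condition a zero-mean GP with a squared-exponential kernel on a handful of "model runs" and query its posterior mean and variance. The one-dimensional input, sine response and kernel hyperparameters below are stand-ins for the MASCARET water-level response, not values from the study:

```python
import numpy as np

def rbf(a, b, ell=1.0, sig=1.0):
    """Squared-exponential covariance between 1-D input arrays a and b."""
    return sig ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(x_train, y_train, x_test, jitter=1e-6):
    """Posterior mean and pointwise variance of a zero-mean GP surrogate."""
    K = rbf(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    Kss = rbf(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# hypothetical "water level vs. friction coefficient" response from 8 model runs
x_train = np.linspace(0.0, 3.0, 8)
y_train = np.sin(x_train)                       # stand-in for the solver output
mean, var = gp_posterior(x_train, y_train, np.array([1.5]))
```

Once trained on a few runs, the surrogate is evaluated thousands of times at negligible cost to build the ensemble statistics needed for assimilation.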

12.
A reliability approach is used to develop a probabilistic model of two-dimensional non-reactive and reactive contaminant transport in porous media. The reliability approach provides two important quantitative results: an estimate of the probability that contaminant concentration is exceeded at some location and time, and measures of the sensitivity of the probabilistic outcome to likely changes in the uncertain variables. The method requires that each uncertain variable be assigned at least a mean and variance; in this work we also incorporate and investigate the influence of marginal probability distributions. Uncertain variables include the x and y components of average groundwater flow velocity, the x and y components of dispersivity, the diffusion coefficient, distribution coefficient, porosity and bulk density. The objective is to examine the relative importance of each uncertain variable, the marginal distribution assigned to each variable, and possible correlation between the variables. Results utilizing a two-dimensional analytical solution indicate that the probabilistic outcome is generally very sensitive to likely changes in the uncertain flow velocity. Uncertainty associated with dispersivity and diffusion coefficient is often not a significant issue with respect to the probabilistic analysis; therefore, dispersivity and diffusion coefficient can often be treated for practical analysis as deterministic constants. The probabilistic outcome is sensitive to the uncertainty of the reaction terms for early times in the flow event. At later times, when source contaminants are released at constant rate throughout the study period, the probabilistic outcome may not be sensitive to changes in the reaction terms. These results, although limited at present by assumptions and conceptual restrictions inherent to the closed-form analytical solution, provide insight into the critical issues to consider in a probabilistic analysis of contaminant transport. Such information concerning the most important uncertain parameters can be used to guide field and laboratory investigations.
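A first-order reliability calculation of the kind described, the probability that a concentration threshold is exceeded together with the reliability index, can be sketched as follows (a normal marginal and hypothetical moments for illustration; the paper treats eight correlated uncertain variables through an analytical transport solution):

```python
import math

def fosm_exceedance(mu_c, sd_c, c_limit):
    """First-order estimate of P[C > c_limit] for a concentration C with given
    mean and standard deviation, assuming (illustratively) a normal marginal."""
    beta = (c_limit - mu_c) / sd_c                       # reliability index
    p_exceed = 0.5 * math.erfc(beta / math.sqrt(2.0))    # 1 - Phi(beta)
    return beta, p_exceed

# hypothetical predicted concentration (mg/L) vs. regulatory limit
beta, p = fosm_exceedance(mu_c=8.0, sd_c=2.0, c_limit=10.0)
```

The sensitivity measures in the abstract correspond to how strongly beta shifts when each input mean or variance is perturbed.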

13.

14.
Global climate change models have predicted the intensification of extreme events, and this intensification is already being observed. For disaster management and adaptation to extreme events, it is essential to improve the accuracy of extreme-value statistical models. In this study, Bayes' theorem is used to estimate the parameters of a Generalized Pareto Distribution (GPD), which is then applied to model the distribution of minimum monthly runoff during dry periods in the mountainous areas of the Ürümqi River, Northwest China. Bayes' theorem treats the parameters as random variables and provides a robust way to convert their prior distribution into a posterior distribution; statistical inference based on the posterior gives a more comprehensive representation of the parameters. An improved Markov Chain Monte Carlo (MCMC) method, which can handle the high-dimensional integrals in Bayes' equation, is used to draw parameter samples from the posterior distribution. Model diagnostic plots are made to verify that the fitted GPD is appropriate. Based on the GPD with Bayesian parameter estimates, monthly runoff minima corresponding to different return periods are then calculated. The results show that the improved MCMC method makes the Markov chains converge faster. The monthly runoff minima corresponding to the 10-, 25-, 50- and 100-year return periods are 0.60 m3/s, 0.44 m3/s, 0.32 m3/s and 0.20 m3/s, respectively. The lower boundary of the 95% confidence interval of the 100-year return level is below zero, which implies that the Ürümqi River is likely to cease flowing when the 100-year return level occurs in a dry period. Copyright © 2015 John Wiley & Sons, Ltd.
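With GPD parameters in hand (fixed here for illustration, whereas the paper samples them from the posterior to get confidence intervals), the T-year return level is a closed-form quantile. A sketch, noting that for runoff minima the formula would be applied to the negated series; the threshold, scale, shape and exceedance-rate values below are hypothetical:

```python
import math

def gpd_return_level(u, sigma, xi, lam, T):
    """T-year return level for a Generalized Pareto fit above threshold u,
    where lam is the mean number of threshold exceedances per year."""
    if abs(xi) < 1e-12:                       # exponential (xi -> 0) limit
        return u + sigma * math.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

q_light = gpd_return_level(u=1.0, sigma=0.5, xi=0.0, lam=2.0, T=50)   # light tail
q_heavy = gpd_return_level(u=1.0, sigma=0.5, xi=0.1, lam=2.0, T=50)   # heavier tail
```

Evaluating this quantile over the whole posterior sample of (sigma, xi) is what yields the 95% interval whose lower bound falls below zero in the study.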

15.
Fluid-flow simulators used in the oil industry model the movement of fluids through a porous reservoir rock. These simulators either ignore coupling between the flow and concurring deformation of the solid rock frame or take it into account approximately, in the so-called loose or staggered-in-time mode. In contrast to existing simulators, the one we describe here fully couples two-phase (oil and water) flow to subsurface deformation and simultaneously accounts for all relevant physical phenomena. As such, our flow simulator inherently links time-dependent fluid pressures, saturations, permeabilities and flow velocities to stresses in the whole subsurface. These stresses relate to strains through the non-linear theory of elasticity, allowing us to model time-lapse changes in seismic velocities and anisotropy. The velocity variations manifest themselves in time shifts and reflection amplitudes that are conventionally measured from 4D seismic data. Changes in anisotropy produce time-dependent shear-wave splitting that can be used for monitoring the horizontal stresses.

16.
Advances in seismic acquisition and processing and the widespread use of 4D seismics have made available reliable production-induced subsurface deformation data in the form of overburden time-shifts. Inversion of these data is now beginning to be used as an aid to the monitoring of a reservoir's effective stress. Past solutions to this inversion problem have relied upon analytic calculations for an unrealistically simplified subsurface, which can lead to uncertainties. To enhance the accuracy of this approach, a method based on transfer functions is proposed in which the function itself is calibrated using numerically generated overburden strain deformation calculated for a small select group of reference sources. This technique proves to be a good compromise between the faster but more accurate history match of the overburden strain using a geomechanical simulator and the slower, less accurate analytic method. Synthetic tests using a coupled geomechanical and fluid flow simulator for the South Arne field confirm the efficacy of the method. Application to measured time-shifts from observed 4D seismics indicates compartmentalization in the Tor reservoir, more heterogeneity than is currently considered in the simulation model and moderate connectivity with the overlying Ekofisk formation.

17.
18.
Reservoir system reliability is the ability of a reservoir to perform its required functions under stated conditions for a specified period of time. In the classical method of reservoir system reliability analysis, the operation policy is used in a simple simulation model, considering a historical/synthetic inflow series and a number of physical bounds on the reservoir system. This type of reliability analysis assumes the reservoir system to be either fully failed or fully functioning, the so-called binary state assumption. Researchers from various backgrounds have shown that the binary state assumption of traditional reliability theory is not widely acceptable. Our approach to this problem is to implement the advanced first-order second-moment (AFOSM) method. In this method, the inflow and reservoir storage are treated as uncertain variables. The mean, variance and covariance of the uncertain variables are determined from moment values of the reservoir state variables; for this purpose, a stochastic optimization model based on the constraint state formulation is applied. The proposed reliability analysis model is applied to a real case study in Iran. Monthly probabilities of water allocation were computed with the AFOSM method, and the outputs were compared with those from the Monte Carlo method; the comparison shows that the two sets of results are similar. In terms of practical use, the proposed method is appropriate for determining the monthly probability of failure in water allocation without the aid of simulation.
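For a linear allocation margin, the AFOSM calculation reduces to a reliability index built from the first two moments and the covariance of the uncertain variables. A sketch with hypothetical storage/inflow moments (the paper obtains these moments from a stochastic optimization model, and AFOSM proper also handles non-normal, nonlinear cases iteratively):

```python
import math

def afosm_allocation_prob(mu_s, mu_q, demand, var_s, var_q, cov_sq):
    """First-order second-moment probability that storage S plus inflow Q
    meets the demand, P[S + Q >= demand], treating S and Q as jointly
    normal with the given moments (illustrative assumption)."""
    mu_g = mu_s + mu_q - demand                       # mean of margin g = S + Q - D
    sd_g = math.sqrt(var_s + var_q + 2.0 * cov_sq)    # std of the margin
    beta = mu_g / sd_g                                # reliability index
    return 0.5 * math.erfc(-beta / math.sqrt(2.0))    # Phi(beta)

# hypothetical monthly moments (volume units arbitrary)
p = afosm_allocation_prob(mu_s=40.0, mu_q=20.0, demand=50.0,
                          var_s=16.0, var_q=9.0, cov_sq=2.0)
```

The monthly failure probability in the abstract is simply 1 - p evaluated month by month.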

19.
Data on source conditions for the 14 April 2010 paroxysmal phase of the Eyjafjallajökull eruption, Iceland, have been used as inputs to a trajectory-based eruption column model, bent. This model has in turn been adapted to generate output suitable as input to the volcanic ash transport and dispersal model, puff, which was used to propagate the paroxysmal ash cloud toward and over Europe over the following days. Some of the source parameters, specifically vent radius, vent source velocity, mean grain size of ejecta, and standard deviation of ejecta grain size, have been assigned probability distributions based on our lack of knowledge of exact conditions at the source. These probability distributions for the input variables have been sampled in a Monte Carlo fashion using a technique that yields what we herein call the polynomial chaos quadrature weighted estimate (PCQWE) of output parameters from the ash transport and dispersal model. The advantage of PCQWE over Monte Carlo is that, since it intelligently samples the input parameter space, fewer model runs are needed to yield estimates of moments and probabilities for the output variables. At each of these sample points for the input variables, a model run is performed. Output moments and probabilities are then computed by properly summing the weighted values of the output parameters of interest. Use of a computational eruption column model coupled with known weather conditions as given by radiosonde data gathered near the vent allows us to estimate that the initial mass eruption rate on 14 April 2010 may have been as high as 10^8 kg/s and was almost certainly above 10^7 kg/s. This estimate is consistent with the probabilistic envelope computed by PCQWE for the downwind plume. The results furthermore show that statistical moments and probabilities can be computed in a reasonable time by using 9^4 = 6,561 PCQWE model runs, as opposed to the millions of model runs that might be required by standard Monte Carlo techniques.
The output mean ash cloud height plus three standard deviations, encompassing c. 99.7% of the probability mass, compares well with the four-dimensional ash cloud position retrieved from Meteosat-9 SEVIRI data for 16 April 2010 as the ash cloud drifted over north-central Europe. Finally, the ability to compute statistical moments and probabilities may allow for a better separation of science and decision-making, by making it possible for scientists to focus on error reduction and decision makers to focus on "drawing the line" for risk assessment.
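The efficiency argument behind PCQWE, replacing random sampling with weighted quadrature nodes, can be shown on a single uncertain input: a 9-node Gauss-Hermite rule recovers the moments of a smooth response from just nine "model runs" (the toy response below is hypothetical; the actual study uses a 4-dimensional tensor grid of 9^4 nodes driving bent/puff):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def quadrature_moments(f, n_nodes):
    """Mean and variance of f(X), X ~ N(0,1), from an n-node Gauss-Hermite rule.
    Each node is one 'model run'; outputs are combined with quadrature weights,
    as in the PCQWE idea."""
    x, w = hermegauss(n_nodes)            # nodes/weights for weight exp(-x^2/2)
    w = w / w.sum()                       # normalize to the standard normal pdf
    fx = f(x)
    pc_mean = np.sum(w * fx)
    pc_var = np.sum(w * (fx - pc_mean) ** 2)
    return pc_mean, pc_var

# toy stand-in for the ash model: a smooth response of one uncertain source parameter
pc_mean, pc_var = quadrature_moments(lambda z: np.exp(0.3 * z), 9)
```

For this lognormal toy case the exact mean and variance are known in closed form, so nine weighted runs reproduce what naive Monte Carlo would need many thousands of samples to match.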

20.
Kil Seong Lee, Sang Ug Kim, Hydrological Processes, 2008, 22(12): 1949-1964
This study employs the Bayesian Markov Chain Monte Carlo (MCMC) method with the Metropolis-Hastings algorithm, and maximum likelihood estimation (MLE) using a quadratic approximation of the likelihood function, for the evaluation of uncertainties in low-flow frequency analysis with a two-parameter Weibull distribution. Two types of prior distribution, a non-data-based distribution and a data-based distribution using regional information collected from neighbouring stations, are used to establish the posterior distribution. Eight case studies using synthetic data with a sample size of 100, generated from a two-parameter Weibull distribution, are performed to compare the results of MLE and Bayesian MCMC. Bayesian MCMC and MLE are also applied to 36 years of gauged data to validate the efficiency of the developed scheme. These examples illustrate the advantages of Bayesian MCMC and the limitations of MLE based on a quadratic approximation. From the point of view of uncertainty analysis, Bayesian MCMC is more effective than MLE with a quadratic approximation when the sample size is small. In particular, the Bayesian MCMC method is more attractive because the sample of low flows at the site of interest is usually too small to support a reliable low-flow frequency analysis. Copyright © 2007 John Wiley & Sons, Ltd.
