Similar documents (20 results)
1.
In many fields of study, and certainly in hydrogeology, uncertainty propagation is a recurring subject. Usually, parametrized probability density functions (PDFs) are used to represent data uncertainty, which limits their use to particular distributions. Often, this problem is solved by Monte Carlo simulation, with the disadvantage that one needs a large number of calculations to achieve reliable results. In this paper, a method is proposed based on a piecewise linear approximation of PDFs. Uncertainty propagation with these discretized PDFs is distribution independent. The method is applied to the upscaling of transmissivity data and is carried out in two steps: the vertical upscaling of conductivity values from borehole data to aquifer scale, and the spatial interpolation of the transmissivities. The results of the first step are complete PDFs of the transmissivities at borehole locations, reflecting the uncertainties of the conductivities and the layer thicknesses. The second step results in a spatially distributed transmissivity field with a complete PDF at every grid cell. We argue that the proposed method is applicable to a wide range of uncertainty propagation problems.
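A minimal sketch of the distribution-independent propagation idea, assuming two independent uncertain inputs represented as piecewise-linear PDFs on a common uniform grid and a simple sum Z = X + Y (the grid, the PDF shapes, and the variable names are illustrative choices, not the paper's upscaling workflow):

```python
import numpy as np

# Represent two independent uncertain inputs as piecewise-linear PDFs
# sampled on a uniform grid (illustrative shapes, not field data).
dx = 0.01
x = np.arange(0.0, 5.0, dx)

# Triangular PDF for X on [1, 3] and a trapezoidal PDF for Y on [0.5, 2.5].
f_x = np.interp(x, [1.0, 2.0, 3.0], [0.0, 1.0, 0.0])
f_y = np.interp(x, [0.5, 1.0, 2.0, 2.5], [0.0, 0.8, 0.8, 0.0])

# Normalize so each discretized density integrates to one.
f_x /= f_x.sum() * dx
f_y /= f_y.sum() * dx

# PDF of Z = X + Y for independent X and Y: numerical convolution of the
# discretized densities. This works for any PDF shape, which is the point
# of a distribution-independent propagation scheme.
f_z = np.convolve(f_x, f_y) * dx
z = np.arange(f_z.size) * dx + 2 * x[0]

print("area under f_z:", f_z.sum() * dx)
```

For independent inputs the density of the sum is the convolution of the two densities, which the discretized representation evaluates numerically regardless of the distribution type.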

2.
Water levels and water quality of open borehole wells in fractured bedrock are flow-weighted averages that are a function of the hydraulic heads and transmissivities of water-contributing fractures, properties that are rarely known. Without such knowledge, using water levels and water quality data from fractured bedrock wells to assess groundwater flow and contaminant conditions can be highly misleading. This study demonstrates a cost-effective single-packer method to determine the hydraulic heads and transmissivities of water-contributing fracture zones in crystalline bedrock wells. The method entails inflating a pipe plug to isolate sections of an open borehole at different depths and monitoring changes in the water level with time. At each depth, the change in water level with time was used to determine the sum of fracture transmissivities above the packer and then to solve for individual fracture transmissivities. Steady-state wellbore heads along with the transmissivities were used to determine individual fracture heads using the weighted-average head equation. The method was tested in five wells in crystalline bedrock located at the University of Connecticut in Storrs. The single-packer head and transmissivity results were found to agree closely with those determined using conventional logging methods and the dissolved oxygen alteration method. The method appears to be a simple and cost-effective alternative for obtaining important information on flow conditions in fractured crystalline bedrock wells.
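The flow-weighted average referred to above is commonly written as follows; this is a standard open-borehole mixing relation, with notation chosen here rather than taken from the paper:

```latex
h_{\mathrm{well}} = \frac{\sum_{i=1}^{n} T_i \, h_i}{\sum_{i=1}^{n} T_i}
```

where T_i and h_i are the transmissivity and far-field head of fracture i. Once the individual fracture transmissivities and the steady wellbore heads for different packer positions are known, the fracture heads h_i can be back-calculated from this relation.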

3.
The interactive multi-objective genetic algorithm (IMOGA) combines traditional optimization with an interactive framework that considers the subjective knowledge of hydro-geological experts in addition to quantitative calibration measures such as calibration errors and regularization to solve the groundwater inverse problem. The IMOGA is inherently a deterministic framework and identifies multiple large-scale parameter fields (typically head and transmissivity data are used to identify transmissivity fields). These large-scale parameter fields represent the optimal trade-offs between the different criteria (quantitative and qualitative) used in the IMOGA. This paper further extends the IMOGA to incorporate uncertainty both in the large-scale trends and in the small-scale variability (which cannot be resolved using the field data) in the parameter fields. The different parameter fields identified by the IMOGA represent the uncertainty in large-scale trends, and this uncertainty is modeled using a Bayesian approach in which calibration error, regularization, and the expert's subjective preference are combined to compute a likelihood metric for each parameter field. Small-scale (stochastic) variability is modeled using a geostatistical approach and added onto the large-scale trends identified by the IMOGA. This approach is applied to the Waste Isolation Pilot Plant (WIPP) case study. Results, with and without expert interaction, are analyzed and the impact that expert judgment has on predictive uncertainty at the WIPP site is discussed. It is shown that for this case, expert interaction leads to more conservative solutions as the expert compensates for some of the lack of data and modeling approximations introduced in the formulation of the problem.

4.
In this paper, we face the problem of upscaling transmissivity from the macroscopic to the megascopic scale; here the macroscopic scale is that of the continuous flow equations, whereas the megascopic scale is that of the flow models on a coarse grid. We introduce non-local inverse based scaling (NIBS) and compare it with simplified renormalization (SR). The latter is a classical technique that we adapt to compute internode transmissivities for a finite-difference flow model in a direct way. NIBS is implemented in three steps: in the first step, the macroscopic transmissivity, together with arbitrarily chosen auxiliary boundary conditions and sources, is used to solve forward problems (FPs) at the macroscopic scale; in the second step, the resulting heads are sampled at the megascopic scale; in the third step, the upscaled internode transmissivities are obtained by solving an inverse problem with the differential system method (DS), using the heads resulting from the second step. NIBS is a non-local technique, because the computation of the internode transmissivities relies upon the whole transmissivity field at the macroscopic scale. We test NIBS against SR in the case of synthetic, isotropic, confined aquifers under the assumptions of two-dimensional (2D) and steady-state flow; the aquifers differ in the degree of heterogeneity, which is represented by a normally distributed uncorrelated component of lnT. For the comparison, the reference heads and fluxes at the megascopic scale are computed from the solution of FPs at the macroscopic scale. These reference values are compared with the heads and the fluxes predicted by models at the megascopic scale using the upscaled parameters of SR and NIBS. For the class of aquifers considered in this paper, the results of SR are better than those of NIBS, which hints that non-local effects can be disregarded at the megascopic scale. The two techniques provide comparable results when the heterogeneity increases, when the megascopic scale is large with respect to the heterogeneity length scale, or when the source terms are relevant.

5.
Nonparametric method for transmissivity distributions along boreholes (Cited by 4: 0 self-citations, 4 by others)
Fransson A. Ground Water, 2002, 40(2): 201-204
The transmissivities of individual fractures along a borehole are difficult to obtain unless each fracture is tested. To estimate a fracture-transmissivity distribution from section transmissivities, a method was developed based on fixed-interval-length transmissivities and the corresponding number of fractures for each interval. The method is nonparametric and iterative, and the fractures are viewed as two-dimensional features, in which the total transmissivity of a borehole is equal to the sum of individual fracture transmissivities. Initially, a linear a priori assumption of the transmissivity distribution is made, and from this a so-called mean transmissivity function is derived. Subsequently, the mean transmissivity of the Nj fractures within a section, j, of the borehole is estimated, and the same value of the mean transmissivity function represents Nj possible fracture transmissivities from the initial distribution. This is repeated for each borehole section, and, eventually, all fracture transmissivities are sorted to give the next iteration's transmissivity distribution and the corresponding mean transmissivity function. Finally, the distributions converge, yielding a possible fracture-transmissivity distribution. The method was verified for a synthetic data sample and then tested on a sample from a borehole at the Äspö Hard Rock Laboratory, Sweden. For the synthetic data, the method gave a distribution that was fairly close to the original one; for the Äspö data, 15% of the fractures had a transmissivity larger than the measurement limit (1 × 10⁻⁹ m²/sec), and these transmissivities follow a log-normal distribution.
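A much-simplified sketch of the starting point of such a scheme, in which each fixed-interval transmissivity is split evenly among the fractures logged in that interval and the pooled values are sorted into an empirical distribution (the even split and the example numbers are illustrative assumptions; the paper's iterative reweighting with a mean transmissivity function is not reproduced here):

```python
import numpy as np

# Section (fixed-interval) transmissivities [m2/s] and the number of
# fractures logged in each section (illustrative values).
T_section = np.array([2.0e-7, 5.0e-9, 8.0e-8, 1.0e-9])
n_fracs   = np.array([4, 1, 3, 2])

# Crude first guess: split each section transmissivity evenly among its
# fractures, so the section total is preserved.
fracture_T = np.concatenate([
    np.full(n, T / n) for T, n in zip(T_section, n_fracs)
])

# Sort to obtain an empirical fracture-transmissivity distribution.
fracture_T.sort()
empirical_cdf = np.arange(1, fracture_T.size + 1) / fracture_T.size

for t, p in zip(fracture_T, empirical_cdf):
    print(f"T = {t:.2e} m2/s, F(T) = {p:.2f}")
```

The total transmissivity of each section is preserved by construction, which is the constraint the iterative method also honors at every step.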

6.
Determination of hydraulic head, H, as a function of spatial coordinates and time, in ground water flow is the basis for aquifer management and for prediction of contaminant transport. Several computer codes are available for this purpose. Spatial distribution of the transmissivity, T(x,y), is a required input to these codes. In most aquifers, T varies in an erratic manner, and it can be characterized statistically in terms of a few moments: the expected value, the variance, and the variogram. Knowledge of these moments, combined with a few measurements, permits one to estimate T at any point using geostatistical methods. In a review of transmissivity data from 19 unconsolidated aquifers, Hoeksema and Kitanidis (1985) identified two types of log-transmissivity Y = ln(T) variations: correlated variations with variance σ²_Yc and correlation scale I_Y on the order of kilometers, and uncorrelated variations with variance σ²_Yn. Direct identification of the log-transmissivity variogram, Γ_Y, from measurements is difficult because T data are generally scarce. However, many head measurements are commonly available. The aim of the paper is to introduce a methodology to identify the transmissivity variogram parameters (σ²_Yc, I_Y, and σ²_Yn) using head data in formations characterized by large log-transmissivity variance. The identification methodology uses a combination of precise numerical simulations (carried out using the analytic element method) and a theoretical model. The main objective is to demonstrate the application of the methodology to a regional ground water flow in Eagle Valley basin in west-central Nevada, for which abundant transmissivity and head measurements are available.
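One common way to write the log-transmissivity variogram implied by this two-component description, assuming (for illustration only) an exponential model for the correlated part:

```latex
\gamma_Y(h) = \sigma^2_{Yn} + \sigma^2_{Yc}\left[1 - \exp\!\left(-\frac{h}{I_Y}\right)\right], \qquad h > 0
```

so that the nugget σ²_Yn represents the uncorrelated variations, while the sill σ²_Yc and correlation scale I_Y describe the correlated component.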

7.
We analyze the impact of the choice of the variogram model adopted to characterize the spatial variability of natural log-transmissivity on the evaluation of leading (statistical) moments of hydraulic heads and contaminant travel times and trajectories within mildly (randomly) heterogeneous two-dimensional porous systems. The study is motivated by the fact that in several practical situations the differences between various variogram types and a typical noisy sample variogram are small enough to suggest that one would often have a hard time deciding which of the tested models provides the best fit. Likewise, choosing amongst a set of seemingly likely variogram models estimated by means of geostatistical inverse models of flow equations can be difficult due to lack of sensitivity of available model discrimination criteria. We tackle the problem within the framework of numerical Monte Carlo simulations for mean uniform and radial flow scenarios. The effect of three commonly used isotropic variogram models, i.e., Gaussian, Exponential and Spherical, is analyzed. Our analysis clearly shows that (ensemble) mean values of the quantities of interest are not considerably influenced by the variogram shape for the range of parameters examined. Contrariwise, prediction variances of the quantities examined are significantly affected by the choice of the variogram model of the log-transmissivity field. The spatial distribution of the largest/lowest values of the relative differences observed amongst the tested models depends on a combination of variogram shape and parameters and the relative distance from internal sources and the outer domain boundary. Our findings suggest the need to develop robust techniques to discriminate amongst a set of seemingly equally likely alternative variogram models in order to provide reliable uncertainty estimates of state variables.
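For reference, the three isotropic models compared in the study are commonly parameterized as below; the use of a "practical range" a, so that the exponential and Gaussian models are near their sill at h = a, is one common convention and an assumption of this sketch:

```python
import numpy as np

def spherical(h, sill, a):
    """Spherical variogram: reaches the sill exactly at range a."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def exponential(h, sill, a):
    """Exponential variogram: about 95% of the sill at the practical range a."""
    return sill * (1.0 - np.exp(-3.0 * np.asarray(h, dtype=float) / a))

def gaussian(h, sill, a):
    """Gaussian variogram: parabolic near the origin, yields smooth fields."""
    return sill * (1.0 - np.exp(-3.0 * (np.asarray(h, dtype=float) / a) ** 2))

h = np.linspace(0.0, 2.0, 5)
print(spherical(h, sill=1.0, a=1.0))
print(exponential(h, sill=1.0, a=1.0))
print(gaussian(h, sill=1.0, a=1.0))
```

The three models differ mainly in their behavior near the origin, which is what drives the differences in prediction variance reported above.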

8.
The performances of kriging, stochastic simulations and sequential self-calibration inversion are assessed when characterizing a non-multiGaussian synthetic 2D braided channel aquifer. The comparison is based on a series of criteria, such as the reproduction of the original reference transmissivity or head fields, but also the accuracy of flow and transport (capture zone) forecasts when the flow conditions are modified. We observe that the errors remain large even for a dense data network. In addition, some unexpected behaviours are observed when large transmissivity datasets are used. In particular, we observe an increase of the bias with the number of transmissivity data and an increasing uncertainty with the number of head data. This is interpreted as a consequence of the use of an inadequate multiGaussian stochastic model that is not able to reproduce the connectivity of the original field.

9.
A Monte Carlo approach is described for the quantification of uncertainty in travel time estimates. A real (non-synthetic) and exhaustive data set of natural origin is used for reference. Using an approach based on binary indicators, constraint interval data are easily accommodated in the modeling process. It is shown how the incorporation of imprecise data can drastically reduce the uncertainty in the estimates. It is also shown that unrealistic results are obtained when deterministic modeling is carried out using a kriging estimate of the transmissivity field. Problems related to using sequential indicator simulation for the generation of fields incorporating constraint interval data are discussed. The final results consist of 95% probability intervals of arrival times at selected control planes, reflecting the original uncertainty in the transmissivity maps.
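The final step described above reduces, in essence, to computing percentile bounds over the ensemble of simulated arrival times. A minimal sketch, assuming the travel times for one control plane are already collected in an array (the synthetic log-normal times below stand in for the transport simulations):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for Monte Carlo output: arrival times [days] at one control
# plane for 500 simulated transmissivity fields (log-normal surrogate).
arrival_times = rng.lognormal(mean=5.0, sigma=0.6, size=500)

# 95% probability interval of the arrival time at this control plane.
lower, upper = np.percentile(arrival_times, [2.5, 97.5])
print(f"95% interval: {lower:.1f} to {upper:.1f} days")
```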

11.
Abstract

To calculate the transmissivity from the inverse problem corresponding to groundwater flow in an isotropic horizontal aquifer, a conservative numerical approach is tested. The method is based on a triangulation of the domain and applies conservation of mass to the elements of the mesh, using the harmonic mean for internodal transmissivities. An optimal sweeping algorithm is used to evaluate nodal transmissivities from one element to another with minimal accumulation of relative error. The practical importance of the method is demonstrated first through two synthetic examples representative of field conditions, and then through application to a Moroccan aquifer. The computed hydraulic head closely fits the reference head, which confirms the validity of the identified transmissivity model.
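The harmonic-mean internodal transmissivity mentioned above is conventionally written, for two adjacent nodes i and j, as

```latex
T_{ij} = \frac{2\,T_i\,T_j}{T_i + T_j}
```

which is the natural choice when flow crosses the two cells in series, since it preserves the flux across the shared face for piecewise-constant transmissivity.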

12.
Stochastic delineation of capture zones: classical versus Bayesian approach (Cited by 1: 0 self-citations, 1 by others)
A Bayesian approach to characterize the predictive uncertainty in the delineation of time-related well capture zones in heterogeneous formations is presented and compared with the classical, or non-Bayesian, approach. The transmissivity field is modelled as a random space function and conditioned on distributed measurements of the transmissivity. In conventional geostatistical methods the mean value of the log-transmissivity and the functional form of the covariance and its parameters are estimated from the available measurements, and then entered into the prediction equations as if they were the true values. However, this classical approach accounts only for the uncertainty that stems from the lack of ability to exactly predict the transmissivity at unmeasured locations. In reality, the number of measurements used to infer the statistical properties of the transmissivity field is often limited, which introduces error in the estimation of the structural parameters. The method presented accounts for the uncertainty that originates from the imperfect knowledge of the parameters by treating them as random variables. In particular, we use Bayesian methods of inference so as to make proper allowance for the uncertainty associated with estimating the unknown values of the parameters. The classical and Bayesian approaches to stochastic capture zone delineation are detailed and applied to a hypothetical flow field. Two different sampling densities on a regular grid are considered to evaluate the effect of data density in both methods. Results indicate that the predictions of the Bayesian approach are more conservative.
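The contrast between the two approaches can be stated compactly: writing θ for the structural parameters (mean, covariance model and its parameters) and Z for the transmissivity data, the classical approach plugs in a point estimate of θ while the Bayesian approach integrates over its posterior. This is a schematic statement with our own notation, not the paper's exact formulation:

```latex
\underbrace{p\big(Y(\mathbf{x}_0)\mid Z,\hat{\theta}\big)}_{\text{classical (plug-in)}}
\qquad \text{versus} \qquad
\underbrace{p\big(Y(\mathbf{x}_0)\mid Z\big)
= \int p\big(Y(\mathbf{x}_0)\mid Z,\theta\big)\,p(\theta\mid Z)\,d\theta}_{\text{Bayesian}}
```

The additional spread contributed by p(θ | Z) is what makes the Bayesian capture zones wider, i.e., more conservative.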

13.
Abstract

This paper describes a study of groundwater flow in a coastal Miliolite limestone aquifer in western India. An examination of field information suggested that the transmissivity of the aquifer varies significantly between high and low groundwater heads. Pumping tests indicate that this is due to the development of major fissures in the upper part of the aquifer. A regional groundwater model with varying transmissivities is used to represent the field behaviour. The model is also used to examine the effect of artificial recharge on the alleviation of saline intrusion problems in the coastal area.

14.
Abstract

Characterization of heterogeneity at the field scale generally requires detailed aquifer properties such as transmissivity and hydraulic head. An accurate delineation of these properties is expensive and time consuming and, for many if not most groundwater systems, is not practical. As an alternative, a stochastic representation of random fields is presented in this paper. Specifically, an iterative stochastic conditional simulation approach was applied to a hypothetical and highly heterogeneous pre-designed aquifer system. The approach is similar to the classical co-kriging technique; it uses a linear estimator that depends on the covariance functions of transmissivity (T) and hydraulic head (h), as well as their cross-covariance. A linearized flow equation along with a conditional random field generator constitutes the iterative process of the conditional simulation. One hundred equally likely realizations of transmissivity fields with pre-specified geostatistical parameters were generated and conditioned to both limited transmissivity and head data. The successful implementation of the approach resulted in conditioned flow paths and travel-time distributions under different degrees of aquifer heterogeneity. This approach worked well for fields exhibiting small variances; for random fields exhibiting large variances (greater than 1.0), an iterative procedure was used. The results show that, as the variance of ln[T] increases, the flow paths tend to diverge, resulting in a wide spectrum of flow conditions, with no directly discernible relationship between the degree of heterogeneity and travel time. The applied approach indicates that large errors may result when the estimation of particle travel times in a heterogeneous medium is approximated by an equivalent homogeneous medium.

15.
In this study, we examine the effects on well-capture zones of conditioning spatially variable transmissivity fields on head and/or transmissivity measurements. To address the challenge posed by conditioning a flow model with spatially varying parameters, an innovative inverse algorithm, the Representers method, is employed. The method explicitly considers this spatial variability.

A number of uniform measurement grids with different densities are used to condition transmissivity fields using the Representers method. Deterministic and stochastic analyses of well-capture zones are then carried out. The deterministic study focuses on the comparison between reference well-capture zones and their estimated mean conditioned on head data. It shows that the effect of head conditioning on well-capture zone estimation is related to the pumping rate. At moderate pumping rates, transmissivity observations are more crucial for identifying effects arising from small-scale variations in pore-water velocity. However, with more aggressive pumping these effects are reduced, and consequently model performance improves markedly when head observations are incorporated. In the stochastic study, the effect of conditioning on head and/or transmissivity data on well-capture zone uncertainty is examined. The Representers method is coupled with the Monte Carlo method to propagate uncertainty in transmissivity fields to well-capture zones. For the scenario studied, the results showed that a combination of 48 head and transmissivity data could reduce the area of uncertainty (95% confidence interval) in well-capture zone location by over 50%, compared to a 40% reduction using either head or transmissivity data alone. This performance was comparable to that obtained by calibrating on three and a half times as many head observations alone.


16.
A new version of the computer program FLASH (Flow-Log Analysis of Single Holes) is presented for the analysis of borehole vertical flow logs to estimate fracture (or layer) transmissivities and far-field hydraulic heads. The program is written in R, an open-source environment. All previous features have been retained and new features incorporated, including more rigorous parameter estimation, uncertainty analysis, and improved data import. The program has a dynamic user interface compatible with most operating systems.

17.
Accurate estimation of aquifer parameters, especially in crystalline hard-rock areas, is of special significance for the management of groundwater resources. Aquifer parameters are usually estimated through pumping tests carried out on water wells. Because pumping tests at a large number of sites are costly and time consuming, geophysical methods combined with hydro-geochemical information offer a potentially cost-effective alternative for estimating aquifer parameters. Here, a method is presented to estimate aquifer parameters such as hydraulic conductivity, formation factor, porosity and transmissivity by utilizing electrical conductivity values obtained from hydro-geochemical analysis of existing wells and the corresponding vertical electrical sounding (VES) points in Sindhudurg district, western Maharashtra, India. Further, prior to interpolating the distribution of aquifer parameters in the study area, variogram modelling was carried out using the data-driven techniques of kriging, automatic relevance determination based Bayesian neural networks (ARD-BNN) and adaptive neuro-fuzzy neural networks (ANFIS). In total, four variogram model-fitting techniques (spherical, exponential, ARD-BNN and ANFIS) were compared. According to the obtained results, the spherical variogram model for interpolating transmissivity, the ARD-BNN variogram model for interpolating porosity, the exponential variogram model for interpolating aquifer thickness and the ANFIS variogram model for interpolating hydraulic conductivity outperformed the rest of the variogram models. Accordingly, aquifer parameter maps of the study area were produced using the best variogram model for each parameter. The present results suggest that there are relatively high values of hydraulic conductivity, porosity and transmissivity at Parule, Mogarne, Kudal, and Zarap, which is useful for characterizing the aquifer system over western Maharashtra.

18.
A method is presented for quantifying the uncertainty of the semivariogram of transmissivity and determining the required number of measurements. In this method, the estimated semivariogram and its 95% confidence limits are first determined from a finite number of measurements. The uncertainty of the estimated semivariogram is then quantified using the random field simulation technique. For a given value of the quantitative index of uncertainty, the required number of measurements can then be obtained. Actual transmissivity data from an existing groundwater monitoring network are used in the application of the proposed method. The required numbers of measurements of transmissivity for four different values of the quantitative index of uncertainty are provided, from which reliable semivariograms of the transmissivity can be obtained. Copyright © 2004 John Wiley & Sons, Ltd.
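The estimated semivariogram referred to above is typically the classical (Matheron) estimator, one half the average squared increment over data pairs separated by roughly the same lag. A minimal sketch for irregularly spaced transmissivity data, with the lag bins, tolerance and example values as illustrative assumptions:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) semivariogram estimator for scattered data."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # Pairwise separation distances and squared increments.
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1))
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)      # count each pair once
    d, sq = d[iu], sq[iu]
    gamma, npairs = [], []
    for h in lags:
        m = np.abs(d - h) <= tol
        gamma.append(0.5 * sq[m].mean() if m.any() else np.nan)
        npairs.append(int(m.sum()))
    return np.array(gamma), np.array(npairs)

# Illustrative log-transmissivity data at scattered monitoring wells.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1000.0, size=(30, 2))      # coordinates [m]
lnT = rng.normal(-9.0, 1.0, size=30)             # ln(T) values
lags = np.arange(100.0, 600.0, 100.0)
g, n = empirical_semivariogram(xy, lnT, lags, tol=50.0)
print(np.round(g, 3), n)
```

The number of pairs per lag bin is what drives the width of the confidence limits, which is why the required number of measurements can be tied to a chosen uncertainty index.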

19.
We focus on the Bayesian estimation of strongly heterogeneous transmissivity fields conditional on data sampled at a set of locations in an aquifer. Log-transmissivity, Y, is modeled as a stochastic Gaussian process, parameterized through a truncated Karhunen–Loève (KL) expansion. We consider Y fields characterized by a short correlation scale as compared to the size of the observed domain. These systems are associated with a KL decomposition that still requires a high number of parameters, thus hampering the efficiency of the Bayesian estimation of the underlying stochastic field. The distinctive aim of this work is to present an efficient approach for the stochastic inverse modeling of fully saturated groundwater flow in these types of strongly heterogeneous domains. The methodology is grounded in the construction of an optimal sparse KL decomposition, which is achieved by retaining only a limited set of modes in the expansion. Mode selection is driven by model selection criteria and is conditional on available data of hydraulic heads and (optionally) Y. Bayesian inversion of the optimal sparse KL expansion is then performed using Markov chain Monte Carlo (MCMC) samplers. As a test bed, we illustrate our approach by way of a suite of computational examples where noisy head and Y values are sampled from a given randomly generated system. Our findings suggest that the proposed methodology yields a globally satisfactory inversion of the stochastic head and Y fields. Comparison of reference values against the corresponding MCMC predictive distributions suggests that observed values are well reproduced in a probabilistic sense. In a few cases, reference values at some unsampled locations (typically far from measurements) are not captured by the posterior probability distributions. In these cases, the quality of the estimation could be improved, e.g., by increasing the number of measurements and/or the threshold for the selection of KL modes.
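A minimal sketch of a truncated KL representation of a log-transmissivity field on a 1D grid, using an exponential covariance and keeping the leading modes; the grid size, correlation scale, and variance-based truncation rule are illustrative, and the data-driven optimal mode selection of the paper is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1D grid and exponential covariance of Y = ln(T) (illustrative parameters).
n, L, sigma2, ell = 200, 1000.0, 1.0, 50.0   # nodes, domain [m], variance, scale [m]
x = np.linspace(0.0, L, n)
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# Discrete KL decomposition: eigenpairs of the covariance matrix,
# sorted by decreasing eigenvalue.
lam, phi = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, phi = lam[order], phi[:, order]

# Keep enough modes to capture ~95% of the variance (truncation choice).
M = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95) + 1)

# One realization of Y from M standard-normal KL coefficients.
xi = rng.standard_normal(M)
mean_Y = -9.0
Y = mean_Y + phi[:, :M] @ (np.sqrt(lam[:M]) * xi)
print(f"retained modes: {M}, field variance ~= {Y.var():.2f}")
```

In a Bayesian inversion, the retained KL coefficients (the entries of xi) become the unknowns that the MCMC sampler explores, which is why keeping their number small matters so much.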

20.
Optimization of key parameters in stochastic seismic inversion and analysis of its performance (in English) (Cited by 2: 0 self-citations, 2 by others)
Stochastic seismic inversion is an inversion method that combines geostatistical theory with seismic inversion. It fuses seismic data, well-log data, and geostatistical information into a posterior probability distribution of the subsurface model, samples this posterior distribution with the Markov chain Monte Carlo (MCMC) method, and studies the properties of the posterior by jointly analyzing many samples, thereby characterizing the subsurface. This paper first introduces the principles of stochastic seismic inversion and then analyzes four key parameters that control its performance, namely the signal-to-noise ratio of the seismic data, the variogram, the number of samples drawn from the posterior distribution, and the well density, and gives principles for optimizing them. The analysis shows that the signal-to-noise ratio of the seismic data controls how strongly the seismic data and the geostatistical constraints condition the inversion result, the variogram affects the smoothness of the inversion result, the number of posterior samples determines the reliability of the sample statistics, and the density of wells participating in the inversion affects the uncertainty of the inversion. Finally, by comparing stochastic seismic inversion with model-based deterministic seismic inversion in a test area, it is shown that stochastic seismic inversion can yield models that better match actual subsurface conditions.
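The MCMC sampling step mentioned above can be illustrated with a bare-bones random-walk Metropolis sampler for a generic un-normalized posterior; the Gaussian target, step size and parameter count below are placeholders, not the seismic-inversion posterior of the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def log_posterior(m):
    """Placeholder un-normalized log posterior over a 2-parameter model."""
    return -0.5 * np.sum((m - np.array([1.0, -2.0])) ** 2)

def metropolis(log_post, m0, n_samples, step):
    """Random-walk Metropolis: propose, then accept with prob min(1, ratio)."""
    m = np.asarray(m0, dtype=float)
    lp = log_post(m)
    samples = np.empty((n_samples, m.size))
    for i in range(n_samples):
        prop = m + step * rng.standard_normal(m.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            m, lp = prop, lp_prop
        samples[i] = m
    return samples

chain = metropolis(log_posterior, m0=[0.0, 0.0], n_samples=5000, step=0.5)
print("posterior mean estimate:", chain[1000:].mean(axis=0))  # drop burn-in
```

Summary statistics over many such samples (after discarding burn-in) are what the abstract refers to as jointly analyzing the sampling results, and their reliability improves with the number of retained posterior samples.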
