Similar Articles
20 similar articles found (search time: 31 ms)
1.
Stochastic environmental risk assessment considers the effects of numerous biological, chemical, physical, behavioral and physiological processes that involve elements of uncertainty and variability. A methodology for predicting health risks to individuals from contaminated groundwater is presented that incorporates the elements of uncertainty and variability in geological heterogeneity, physiological exposure parameters, and cancer potency. An idealized groundwater basin is used to perform a parametric sensitivity study to assess the relative impact of (a) geologic uncertainty, (b) behavioral and physiological variability in human exposure and (c) uncertainty in cancer potency on the prediction of increased cancer risk to individuals in a population exposed to contaminants in household water supplied from groundwater. A two-dimensional distribution (or surface) of human health risk was generated as a result of the simulations. Cuts in this surface (fractiles of variability and percentiles of uncertainty) are then used as a measure of the relative importance of various model components to total uncertainty and variability. A case study for perchloroethylene (PCE) shows that uncertainty and variability in hydraulic conductivity play an important role in predicting human health risk, on the same order of influence as uncertainty in cancer potency.

2.
Probabilistic-fuzzy health risk modeling (cited 3 times: 2 self-citations, 1 by others)
Health risk analysis of multi-pathway exposure to contaminated water involves the use of mechanistic models that include many uncertain and highly variable parameters. Currently, the uncertainties in these models are treated using statistical approaches. However, not all uncertainties in data or model parameters are due to randomness. Other sources of imprecision that may lead to uncertainty include scarce or incomplete data, measurement error, data obtained from expert judgment, or subjective interpretation of available information. Such non-random sources of uncertainty cannot be treated by statistical methods alone. In this paper we propose the use of fuzzy set theory together with probability theory to incorporate uncertainties into health risk analysis. We identify this approach as probabilistic-fuzzy risk assessment (PFRA). Based on the form of available information, fuzzy set theory, probability theory, or a combination of both can be used to incorporate parameter uncertainty and variability into mechanistic risk assessment models. In this study, tap water concentration is used as the source of contamination in the human exposure model. Ingestion, inhalation and dermal contact are considered as multiple exposure pathways. The tap water concentration of the contaminant and the cancer potency factors for ingestion, inhalation and dermal contact are treated as fuzzy variables, while the remaining model parameters are treated using probability density functions. Combined utilization of fuzzy and random variables produces membership functions of risk to individuals at different fractiles of risk, as well as probability distributions of risk for various alpha-cut levels of the membership function. The proposed method provides a robust approach for evaluating human health risk from exposure when there is both uncertainty and variability in model parameters. PFRA allows utilization of certain types of information which have not been used directly in existing risk assessment methods.
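To make the fuzzy/probabilistic combination concrete, the following minimal Python sketch propagates an alpha-cut interval of two triangular fuzzy variables (tap-water concentration and cancer potency) through a simple ingestion-risk equation while sampling body weight probabilistically. All numeric values are hypothetical, and the risk equation is a simplified illustration, not the paper's full multi-pathway model:

```python
import numpy as np

rng = np.random.default_rng(0)

def triangular_alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return low + alpha * (mode - low), high - alpha * (high - mode)

def ingestion_risk(conc, cpf, intake=2.0, bw=70.0):
    """Lifetime cancer risk: conc [mg/L] x intake [L/day] / body weight [kg]
    x cancer potency factor [(mg/kg-day)^-1] (averaging terms omitted)."""
    return conc * intake / bw * cpf

# Fuzzy variables (hypothetical triangular numbers); random body weight.
alpha = 0.5
c_lo, c_hi = triangular_alpha_cut(0.001, 0.005, 0.010, alpha)    # mg/L
cpf_lo, cpf_hi = triangular_alpha_cut(0.01, 0.05, 0.10, alpha)   # (mg/kg-day)^-1
bw = rng.normal(70.0, 10.0, 10_000).clip(40, 120)                # kg (variability)

# Because risk increases monotonically in both fuzzy inputs, the alpha-cut
# bounds propagate directly; each bound carries a variability distribution.
risk_lo = ingestion_risk(c_lo, cpf_lo, bw=bw)   # risk interval lower bound
risk_hi = ingestion_risk(c_hi, cpf_hi, bw=bw)   # risk interval upper bound
print(np.percentile(risk_hi, 95))   # e.g., 95th fractile of the upper bound
```

Repeating this over a grid of alpha levels yields the membership functions of risk at different fractiles described in the abstract.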

3.
The Karst Aquifer GIS-based model (KAGIS) is developed and applied to the Mela aquifer, a small karst aquifer located in a Mediterranean region (SE Spain). The model considers different variables, such as precipitation, land use, surface slope and lithology, and their geographical heterogeneity, to calculate both the run-off coefficients and the fraction of precipitation that contributes to filling the soil water reservoir above the aquifer. Evapotranspiration depletes water exclusively from this soil water reservoir, and aquifer recharge occurs when water in the soil reservoir exceeds the soil field capacity. The proposed model also obtains variations of the effective porosity along a vertical profile, an intrinsic consequence of karstification processes. A new formulation of the Nash–Sutcliffe efficiency index, adapted to arid environments, is presented and employed to evaluate the model's ability to predict water table oscillations. The uncertainty in the model parameters is determined by the Generalized Likelihood Uncertainty Estimation method. Once KAGIS is calibrated, wavelet analysis is applied to the resulting data in order to evaluate the variability in the aquifer behaviour. Wavelet analysis reveals that the rapid hydrogeological response, typical of a wide variety of karst systems, is the prevailing feature of the Mela aquifer. This study shows that KAGIS is a useful tool to quantify recharge and discharge rates of karst aquifers and can be effectively applied to support proper management of water resources in Mediterranean areas.
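The abstract does not reproduce the adapted index itself, but the standard Nash–Sutcliffe efficiency it builds on is straightforward to compute; a minimal sketch with made-up water-table observations:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash–Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NSE = 1 is a perfect fit; NSE <= 0 means the model predicts no better
    than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([10.2, 9.8, 9.5, 9.9, 10.4])   # hypothetical water-table levels [m]
sim = np.array([10.0, 9.9, 9.6, 9.8, 10.3])   # corresponding model output
print(nash_sutcliffe(obs, sim))                               # close to 1 for a good fit
print(nash_sutcliffe(obs, np.full_like(obs, obs.mean())))     # exactly 0 for the mean model
```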

4.
We present an efficient methodology for assessing leakage detectability at geologic carbon sequestration sites under parameter uncertainty. Uncertainty quantification (UQ) and risk assessment are integral and, in many countries, mandatory components of geologic carbon sequestration projects. A primary goal of risk assessment is to evaluate leakage potential from anthropogenic and natural features, which constitute one of the greatest threats to the integrity of carbon sequestration repositories. The backbone of our detectability assessment framework is the probability collocation method (PCM), an efficient, nonintrusive, uncertainty-quantification technique that can enable large-scale stochastic simulations that are based on results from only a small number of forward-model runs. The metric for detectability is expressed through an extended signal-to-noise ratio (SNR), which incorporates epistemic uncertainty associated with both reservoir and aquifer parameters. The spatially heterogeneous aquifer hydraulic conductivity is parameterized using Karhunen–Loève (KL) expansion. Our methodology is demonstrated numerically for generating probability maps of pressure anomalies and for calculating SNRs. Results indicate that the likelihood of detecting anomalies depends on the level of uncertainty and location of monitoring wells. A monitoring well located close to leaky locations may not always yield the strongest signal of leakage when the level of uncertainty is high. Therefore, our results highlight the need for closed-loop site characterization, monitoring network design, and leakage source detection.
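The intuition behind an SNR-based detectability metric can be sketched as follows. This is an illustrative form (mean pressure anomaly over the combined spread of parametric uncertainty and sensor noise), not the paper's exact extended-SNR definition, and the ensembles are hypothetical stand-ins for PCM-based stochastic simulations:

```python
import numpy as np

rng = np.random.default_rng(1)

def detectability_snr(p_with_leak, p_no_leak, sensor_noise_sd):
    """Mean pressure anomaly at a monitoring well divided by the combined
    standard deviation of parametric uncertainty (spread of the anomaly
    across realizations) and measurement noise."""
    anomaly = p_with_leak - p_no_leak                  # per stochastic realization
    total_sd = np.sqrt(anomaly.std(ddof=1) ** 2 + sensor_noise_sd ** 2)
    return abs(anomaly.mean()) / total_sd

# Hypothetical ensembles of simulated well pressure [kPa] over uncertain
# conductivity fields (e.g., realizations of a KL-expanded field).
p_leak = rng.normal(510.0, 6.0, 500)   # with leakage
p_base = rng.normal(500.0, 5.0, 500)   # without leakage
snr = detectability_snr(p_leak, p_base, sensor_noise_sd=2.0)
print(snr)   # an SNR well above 1 suggests the anomaly stands out
```

As the abstract notes, increasing the parametric spread (the realization-to-realization variance) drives this ratio down even at wells near the leak.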

5.
In response to recent activity and legislation concerning lead and its role in electric vehicle development, a model has been developed to assess the health risks to residents from environmental lead emissions. This model may be used to predict the risks to residents in the vicinity of facilities discharging lead into the air. The model is also important for risk management, allowing for risk-based regulations regarding limits on lead emissions. The model is comprehensive, linking together a source term, an air dispersion model, a household exposure model, a physiologically-based pharmacokinetic blood-lead model, and a determination of reference dose. Parameters are treated as distributions, and are considered either uncertain or variable. A range of physiological and behavioral parameters is used to distinguish between various age and gender groups, to reflect the variability in risk of adverse effect to these subsets of the exposed population. A sensitivity study is performed, including a case considering the uncertainty in reference dose, which is compared to the case of a deterministic reference dose. Different types of variability are investigated: the variability across sensitive sub-populations of age and gender, and the individual variability within these populations. We found that the differentiation between uncertainty and variability was important in predicting non-cancer human health risk, and that methods that combined uncertainty and variability were not expected to be protective of sensitive individuals within a sub-population.

6.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment (cited 15 times: 4 self-citations, 11 by others)
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of exposure variables and of the uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo Analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis the variables of the risk equation, along with the parameters of these variables (for example, the mean and standard deviation of a normal distribution), are described in terms of probability density functions (PDFs). A variable described in this way is called a second-order random variable. Significant data or considerable insight into the uncertainty associated with these variables is necessary to develop appropriate PDFs for these random parameters. Typically, the available data, and the accuracy and reliability of such data, are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods that propagate uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the utilization of incomplete information (vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we propose a 2D Fuzzy Monte Carlo Analysis (2D FMCA) to overcome this difficulty. In this approach, instead of describing the parameters of the PDFs used in defining the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. We provide a comparison of these two approaches relative to their computational requirements, data requirements and availability. For a hypothetical case, we also provide a comparative interpretation of the results generated.
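The 2D MCA structure described above (an outer loop over uncertain distribution parameters, an inner loop over inter-individual variability) can be sketched in a few lines. The risk equation and all numbers below are hypothetical placeholders:

```python
import numpy as np

rng = np.random.default_rng(42)
n_outer, n_inner = 200, 1000   # uncertainty loop, variability loop

# Outer loop: sample the uncertain *parameters* of the variability
# distribution (making body weight a second-order random variable).
mean_bw = rng.normal(70.0, 3.0, n_outer)   # uncertain mean body weight [kg]
sd_bw = rng.uniform(8.0, 12.0, n_outer)    # uncertain std of body weight [kg]
conc, cpf, intake = 0.005, 0.05, 2.0       # conc [mg/L], potency, intake [L/day]

risk_p95 = np.empty(n_outer)
for i in range(n_outer):
    # Inner loop: variability across individuals for one parameter draw.
    bw = rng.normal(mean_bw[i], sd_bw[i], n_inner).clip(40, 120)
    risk = conc * intake / bw * cpf
    risk_p95[i] = np.percentile(risk, 95)  # 95th fractile of variability

# The spread across the outer loop quantifies uncertainty about the
# 95th fractile of the risk variability distribution.
print(np.percentile(risk_p95, [5, 50, 95]))
```

The 2D FMCA of the paper replaces the outer probabilistic draws of `mean_bw` and `sd_bw` with fuzzy numbers evaluated at alpha-cut levels.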

7.
Many sites have contaminated groundwater because of inappropriate handling or disposal of hazardous materials or wastes. Health risk assessment is an important tool to evaluate the potential environmental and health impacts of these contaminated sites. It is also becoming an important basis for determining whether risk reduction is needed and what actions should be initiated. However, in research related to groundwater risk assessment and management, consideration of multimedia risk assessment and the separation of uncertainty due to lack of knowledge from variability due to natural heterogeneity are rare. This study presents a multimedia risk assessment framework integrating multimedia transfer and multi-pathway exposure of groundwater contaminants, and investigates whether multimedia risk assessment and the separation of uncertainty and variability can provide a better basis for risk management decisions. The results of the case study show that a decision based on multimedia risk assessment may differ from one based on risk resulting from groundwater only. In particular, the transfer from groundwater to air imposes a health threat to some degree. By using a methodology that combines Monte Carlo simulation, a rank correlation coefficient, and an explicit decision criterion to identify information important to the decision, the results obtained when uncertainty and variability are separated differ from those obtained without such separation. In particular, when higher percentiles of the uncertainty and variability distributions are considered, the method separating uncertainty and variability identifies TCE concentration as the single most important input parameter, while the method that does not distinguish the two identifies four input parameters as the important information that would influence a decision on risk reduction.

8.
This paper investigates the development of flood hazard and flood risk delineations that account for uncertainty as improvements to standard floodplain maps for coastal watersheds. Current regulatory floodplain maps for the Gulf Coastal United States present 1% flood hazards as polygon features developed using deterministic, steady-state models that do not consider data uncertainty or natural variability of input parameters. Using the techniques presented here, a standard binary deterministic floodplain delineation is replaced with a flood inundation map showing the underlying flood hazard structure. Additionally, the hazard uncertainty is further transformed to show flood risk as a spatially distributed probable flood depth using concepts familiar to practicing engineers and software tools accepted and understood by regulators. A case study of the proposed hazard and risk assessment methodology is presented for a Gulf Coast watershed, which suggests that storm duration and stage boundary conditions are important variable parameters, whereas rainfall distribution, storm movement, and roughness coefficients contribute less variability. The floodplain with uncertainty for this coastal watershed showed the highest variability in the tidally influenced reaches and showed little variability in the inland riverine reaches. Additionally, comparison of flood hazard maps to flood risk maps shows that they are not directly correlated, as areas of high hazard do not always represent high risk. Copyright © 2012 John Wiley & Sons, Ltd.

9.
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. We distinguish between two scales of heterogeneity: the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty; it neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.

10.
The selection of optimal management strategies for environmental contaminants requires detailed information on the risks imposed on populations. These risks are characterized both by inter-subject variability (different individuals having different levels of risk) and by uncertainty (there is uncertainty about the risk associated with the Yth percentile of the variability distribution). In addition, there is uncertainty introduced by the inability to agree fully on the appropriate decision criteria. This paper presents a methodology for incorporating uncertainty and variability into a multi-medium, multi-pathway, multi-contaminant risk assessment, and for placing this assessment into an optimization framework to identify optimal management strategies. The framework is applied to a case study of a sludge management system proposed for North Carolina, and the impact of stochasticity on selection of an optimal strategy is considered. Different sets of decision criteria, reflecting different ways of treating stochasticity, are shown to lead to different selections of optimal management strategies.

11.
We present a workflow to estimate geostatistical aquifer parameters from pumping test data using the Python package welltestpy. The procedure of pumping test analysis is exemplified for two data sets, from the Horkheimer Insel site and the Lauswiesen site, Germany. The analysis is based on a semi-analytical drawdown solution from the upscaling approach Radial Coarse Graining, which enables inference of the log-transmissivity variance and horizontal correlation length, in addition to mean transmissivity and storativity, from pumping test data. We estimate these parameters of aquifer heterogeneity from type-curve analysis and determine their sensitivity. This procedure, implemented in welltestpy, is a template for analyzing any pumping test. It goes beyond the possibilities of standard methods, for example those based on Theis' equation, which are limited to mean transmissivity and storativity. A sensitivity study showed the impact of observation well positions on the parameter estimation quality. The insights of this study help to optimize future test setups for geostatistical aquifer analysis and provide guidance for investigating pumping tests with regard to aquifer statistics using the open-source software package welltestpy.

12.
The variance of collapse capacity is an important constituent of probabilistic methodologies used to evaluate the probability of collapse of structures subjected to earthquake ground motions. This study evaluates the effect of ground motion randomness (i.e. record-to-record (RTR) variability) and uncertainty in the deterioration parameters of single-degree-of-freedom (SDOF) systems on the variance of collapse capacity. Collapse capacity is evaluated in terms of a relative intensity defined as the ratio of ground motion intensity to a structure strength parameter. The effect of RTR variability on the variance of collapse capacity is directly obtained by performing dynamic analyses of deteriorating hysteretic models for a set of representative ground motions. The first-order second-moment (FOSM) method is used to quantify the effect of deterioration parameter uncertainty. In addition to RTR variability, the results indicate that uncertainty in the displacement at the peak (cap) strength and the post-capping stiffness significantly contribute to the variance of collapse capacity. If large dispersion of these parameters exists, the effect of uncertainty in deterioration parameters on the variance of collapse capacity may be comparable to that caused by RTR variability. Copyright © 2011 John Wiley & Sons, Ltd.
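The FOSM method mentioned above propagates input uncertainty through a model using first-order gradients at the mean point: Var[f(X)] is approximated by the sum of (df/dx_i)^2 * sigma_i^2 for independent inputs. A minimal sketch with a hypothetical two-parameter toy model (not the paper's deterioration model):

```python
import numpy as np

def fosm_variance(f, x0, sigmas, eps=1e-6):
    """First-order second-moment estimate of Var[f(X)] for independent
    inputs: sum of (df/dx_i)^2 * sigma_i^2, with central-difference
    gradients evaluated at the mean point x0."""
    x0 = np.asarray(x0, float)
    grad = np.empty_like(x0)
    for i in range(x0.size):
        h = np.zeros_like(x0)
        h[i] = eps
        grad[i] = (f(x0 + h) - f(x0 - h)) / (2 * eps)
    return np.sum(grad ** 2 * np.asarray(sigmas, float) ** 2)

# Toy capacity model: capacity = (cap displacement) x (stiffness factor).
f = lambda x: x[0] * x[1]
var = fosm_variance(f, x0=[2.0, 1.5], sigmas=[0.4, 0.2])
print(var)   # analytic FOSM value: (1.5*0.4)^2 + (2.0*0.2)^2 = 0.52
```

For the product model the gradients at the mean are simply the other factor, so the FOSM variance is exactly (1.5·0.4)² + (2.0·0.2)² = 0.52, which the numerical sketch reproduces.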

13.
Two stochastic models are developed to describe the BOD output (i.e. effluent) variation of facultative aerated lagoons in series. One of the models uses the uncertainty analysis (UA) technique, and the other is based on the moment-equation solution methodology of stochastic differential equations (SDEs). The former considers a second-order approximation of the expectation (SOAE) and a first-order approximation of the variance (FOAV). The SDE model considers that output variability is accounted for by random variations in the rate coefficient. Comparisons are provided. Calibration and verification of the two models are achieved by using field observations from two different lagoon systems in series. The predictive performances of the two models are compared with each other and with another SDE model, presented in a previous paper, that considers input randomness. The three methods show similar predictive performances and provide good predictions of the mean and standard deviation of the lagoon effluent BOD concentrations, and thus are considered appropriate methodologies.

14.
Tomas Perina. Ground Water 2020, 58(6): 993–999
Hydraulic testing for aquifer characterization at contaminated sites often includes tests of short duration and of different types, such as slug tests and pumping tests, conducted at different phases of investigation. Tests conducted on a well cluster installed in a single aquifer can be combined in aggregate inverse analysis using an analytical model for groundwater flow near a test well. A genetic algorithm performs parallel search of the parameter space and provides starting parameter values for a Markov chain Monte Carlo simulation to estimate the parameter distribution. This sequence of inverse methods avoids guessing of the initial parameter vector and the often encountered difficult convergence of gradient-based methods, and estimates the parameter covariance matrix from a distribution rather than from a single point in the parameter space. Combination of different tests improves the resolution of the estimated aquifer properties and allows an assessment of the uniformity of the aquifer. Estimated parameter correlations and standard deviations are used as relative metrics to distinguish well resolved and poorly resolved parameters. The methodology is demonstrated on example field tests in unconfined and leaky aquifers.

15.
This study proposes an inverse solution algorithm through which both the aquifer parameters and the zone structure of these parameters can be determined based on a given set of observations of piezometric heads. The fuzzy c-means (FCM) clustering method is used for the zone-structure identification problem. The association of the zone structure with the transmissivity distribution is accomplished through an optimization model. The meta-heuristic harmony search (HS) algorithm, which is conceptualized using the musical process of searching for a perfect state of harmony, is used as the optimization technique. The optimum parameter zone structure is identified based on three criteria: residual error, parameter uncertainty, and structure discrimination. A numerical example from the literature is solved to demonstrate the performance of the proposed algorithm. A sensitivity analysis is also performed to test the performance of the HS algorithm for different sets of solution parameters. Results indicate that the proposed solution algorithm is effective for the simultaneous identification of aquifer parameters and their corresponding zone structures.
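The FCM step can be sketched with the standard alternating update of memberships and centroids (minimizing the weighted within-cluster distances). The 1D log-transmissivity values below are hypothetical, and this bare-bones loop omits the convergence checks a production implementation would use:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: alternately update centroids v and fuzzy
    memberships u to reduce sum_ik u_ik^m * ||x_k - v_i||^2."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(X)))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(n_iter):
        # Centroids: membership-weighted means of the data points.
        v = (u ** m @ X) / (u ** m).sum(axis=1, keepdims=True)
        # Distances from every centroid to every point (small guard vs. 0).
        d = np.linalg.norm(X[None, :, :] - v[:, None, :], axis=2) + 1e-12
        # Standard FCM membership update: u_ik ∝ d_ik^(-2/(m-1)).
        p = 2.0 / (m - 1.0)
        u = d ** (-p) / np.sum(d ** (-p), axis=0)
    return u, v

# Two hypothetical transmissivity zones (1D log-T "coordinates").
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
u, v = fuzzy_c_means(X, c=2)
print(np.round(v.ravel(), 2))   # centroids near the two cluster means, 0.1 and 5.1
```

In the paper this clustering is coupled with the HS optimizer, which searches the transmissivity values assigned to each identified zone.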

16.
Aquifers show troubling signs of irreversible depletion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. One strategy to sustain the groundwater supply is to recharge aquifers artificially with reclaimed water or stormwater via managed aquifer recharge and recovery (MAR) systems. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. This paper presents a laboratory-scale proof-of-concept study that demonstrates the capability of a real-time, simulation-based control optimization algorithm to ease the operational challenges of MAR systems. Central to the algorithm is a model that simulates water flow and transport of dissolved chemical constituents in the aquifer. The algorithm compensates for model parameter uncertainty by continually collecting data from a network of sensors embedded within the aquifer. At regular intervals the sensor data are fed into an inversion algorithm, which calibrates the uncertain parameters and generates the initial conditions required to model the system behavior. The calibrated model is then incorporated into a genetic algorithm that executes simulations and determines the best management action, for example, the optimal pumping policy for current aquifer management goals. Experiments to calibrate and validate the simulation-optimization algorithm were conducted in a small two-dimensional synthetic aquifer under both homogeneous and heterogeneous packing configurations. Results from initial experiments validated the feasibility of the approach and suggested that our system could improve the operation of full-scale MAR facilities.

17.
Issues in sediment toxicity and ecological risk assessment (cited 8 times: 0 self-citations, 8 by others)
This paper is based on a facilitated Workshop and Roundtable Discussion of key issues in sediment toxicology and ecological risk assessment (ERA) as applied to sediments, held at the Conference on Dredged Material Management: Options and Environmental Considerations. The issues addressed included how toxicity is defined and perceived, how it is measured, and how it should be used within the context of ERA to support management decisions. The following conclusions were reached regarding scientific considerations of these issues. Toxicity is a measure of hazard and not a risk per se; thus, toxicity testing is a means to, but not the end of, understanding the risks of sediments. Toxicity testing cannot presently be replaced by chemical analyses to define hazard. Toxicity test organisms need to be appropriate to the problem being addressed, and the results put into context relative to both reference and baseline comparisons to understand hazard. Use of toxicity tests in sediment ERAs requires appropriate endpoints and risk hypotheses, considering ecological rather than just statistical significance, and recognizing that hazard does not equate to risk. Toxicity should be linked to population and community response to support decision-making, assessing possible genotypic adaptations that can influence risk estimates, and addressing uncertainty. Additionally, several key scientific issues were identified to improve future sediment ERAs, including the need to improve basic understanding of ecological mechanisms and processes, recognition of variability in the assessment process, and an improved focus on and ability to assess risks to populations and communities.

18.
Stauffer F. Ground Water 2005, 43(6): 843–849
A method is proposed to estimate the uncertainty of the location of pathlines in two-dimensional, steady-state confined or unconfined flow in aquifers due to the uncertainty of the spatially variable unconditional hydraulic conductivity or transmissivity field. The method is based on concepts of the semianalytical first-order theory given in Stauffer et al. (2002, 2004), which allows estimates of the lateral second moment (variance) of the location of a moving particle. However, this method is reformulated in order to account for nonuniform recharge and nonuniform aquifer thickness. One prominent application is the uncertainty estimation of the catchment of a pumping well by considering the boundary pathlines starting at a stagnation point. In this method, the advective transport of particles is considered, based on the velocity field. In the case of a well catchment, backtracking is applied by using the reversed velocity field. Spatial variability of hydraulic conductivity or transmissivity is considered by taking into account an isotropic exponential covariance function of log-transformed values with parameters describing the variance and correlation length. The method allows postprocessing of results from ground water models with respect to uncertainty estimation. The code PPPath, which was developed for this purpose, provides a postprocessing of pathline computations under PMWIN, which is based on MODFLOW. In order to test the methodology, it was applied to results from Monte Carlo simulations for catchments of pumping wells. The results correspond well. Practical applications illustrate the use of the method in aquifers.

19.
Empirical fragility curves, constructed from databases of thousands of building-damage observations, are commonly used for earthquake risk assessments, particularly in Europe and Japan, where building stocks are often difficult to model analytically (e.g. old masonry structures or timber dwellings). Curves from different studies, however, display considerable differences, which lead to high uncertainty in the assessed seismic risk. One potential reason for this dispersion is the almost universal neglect of the spatial variability in ground motions and the epistemic uncertainty in ground-motion prediction. In this paper, databases of building damage are simulated using ground-motion fields that take account of spatial variability and a known fragility curve. These databases are then inverted, applying a standard approach for the derivation of empirical fragility curves, and the difference from the known curve is studied. A parametric analysis is conducted to investigate the impact of various assumptions on the results. By this approach, it is concluded that ground-motion variability leads to flatter fragility curves and that the epistemic uncertainty in the ground-motion prediction equation used can have a dramatic impact on the derived curves. Without dense ground-motion recording networks in the epicentral area, empirical curves will remain highly uncertain. Moreover, the use of aggregated damage observations appears to substantially increase uncertainty in the empirical fragility assessment. In contrast, the use of limited randomly-chosen un-aggregated samples in the affected area can result in good predictions of fragility.
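A common way to derive an empirical fragility curve from damage counts, consistent with the "standard approach" the abstract refers to, is a maximum-likelihood fit of a lognormal model P(damage | IM) = Φ((ln IM − ln θ)/β). The damage counts below are invented for illustration:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Hypothetical observations: at each ground-motion intensity level (e.g. PGA
# in g), n buildings surveyed, k found damaged.
im = np.array([0.1, 0.2, 0.3, 0.5, 0.8])
n = np.array([120, 95, 80, 60, 40])
k = np.array([2, 9, 20, 30, 32])

def neg_log_like(params):
    """Binomial negative log-likelihood of the lognormal fragility model
    P(damage | IM) = Phi((ln IM - ln theta) / beta)."""
    ln_theta, beta = params
    p = norm.cdf((np.log(im) - ln_theta) / beta).clip(1e-9, 1 - 1e-9)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

res = minimize(neg_log_like, x0=[np.log(0.4), 0.5], method="Nelder-Mead")
theta, beta = np.exp(res.x[0]), res.x[1]
print(theta, beta)   # median capacity [g] and lognormal dispersion
```

The paper's experiment amounts to generating synthetic `k` counts from spatially correlated ground-motion fields and a known (θ, β), refitting as above, and comparing the recovered curve with the true one.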

20.
Using semivariogram parameter uncertainty in hydrogeological applications (cited 1 time: 0 self-citations, 1 by others)
Geostatistical estimation (kriging) and geostatistical simulation are routinely used in ground water hydrology for optimal spatial interpolation and Monte Carlo risk assessment, respectively. Both techniques are based on a model of spatial variability (semivariogram or covariance) that generally is not known but must be inferred from the experimental data. Where the number of experimental data is small (say, several tens), as is not unusual in ground water hydrology, the model fitted to the empirical semivariogram entails considerable uncertainty. If all the practical results are based on this unique fitted model, the final results will be biased. We propose that, instead of using a unique semivariogram model, the full range of models that are inside a given confidence region should be used, and the weight that each semivariogram model has on the final result should depend on its plausibility. The first task, then, is to evaluate the uncertainty of the model, which can be efficiently done by using maximum likelihood inference. The second task is to use the range of plausible models in applications and to show the effect observed on the final results. This procedure is put forth here with kriging and simulation applications, where the uncertainty in semivariogram parameters is propagated into the final results (e.g., the prediction of ground water head). A case study using log-transmissivity data from the Vega de Granada aquifer, in southern Spain, is given to illustrate the methodology.
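The idea of fitting a semivariogram model and extracting parameter uncertainty from the fit can be sketched with a least-squares fit of an exponential model (the paper uses maximum likelihood, but the least-squares covariance plays the same illustrative role). The empirical semivariances below are hypothetical, not the Vega de Granada data:

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_semivariogram(h, sill, a):
    """Exponential semivariogram model: gamma(h) = sill * (1 - exp(-h/a))."""
    return sill * (1.0 - np.exp(-h / a))

# Hypothetical empirical semivariogram of log-transmissivity.
lags = np.array([50.0, 100, 200, 400, 800, 1600])        # separation distance [m]
gamma = np.array([0.15, 0.27, 0.42, 0.55, 0.60, 0.62])   # semivariance

(sill, a), cov = curve_fit(exp_semivariogram, lags, gamma, p0=[0.6, 300.0])
sill_sd, a_sd = np.sqrt(np.diag(cov))   # parameter standard errors from the fit
print(sill, a)          # best-fit sill and correlation-length parameter
print(sill_sd, a_sd)    # spread defining a family of plausible models
```

In the spirit of the paper, one would then sample (sill, a) pairs from within the confidence region, weight each model by its plausibility, and propagate the whole family through kriging or simulation rather than committing to the single best fit.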


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号