Similar Literature
20 similar documents found (search time: 171 ms)
1.
Methodology for comparing source and plume remediation alternatives
Falta RW. Ground Water. 2008;46(2):272-285
It is often difficult at contaminated sites to decide whether remediation effort should be focused on the contaminant source, the dissolved plume, or on both zones. The decision process at these sites is hampered by a lack of quantitative tools for comparing remediation alternatives. A new screening-level mass balance approach is developed for simulating the transient effects of simultaneous ground water source and plume remediation. The contaminant source model is based on a power function relationship between source mass and source discharge, and it can consider partial source remediation at any time after the initial release. The source model serves as a time-dependent mass flux boundary condition to a new analytical plume model, where flow is assumed to be one dimensional, with three-dimensional dispersion. The plume model simulates first-order sequential decay and production of several species, and the decay rates and parent/daughter yield coefficients are variable functions of time and distance. This new method allows for flexible simulation of natural attenuation or remediation efforts that enhance plume degradation. The plume remediation effort may be temporary or delayed in time, limited in space, and it may have different chemical effects on different contaminant species in the decay chain.
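The power-function source model summarized above can be sketched numerically. The explicit Euler integration, parameter names, and all values below are illustrative stand-ins, not taken from the paper; the source discharge is taken proportional to (M/M0)^gamma per the power-function relation:

```python
import numpy as np

def source_mass(t_end, M0, C0, Q, gamma, t_rem=None, frac=0.0, dt=0.01):
    """Integrate dM/dt = -Q*C0*(M/M0)**gamma, a power-function relation
    between source mass M and source discharge, with an optional
    instantaneous partial source removal (fraction `frac`) at time t_rem."""
    M, out = M0, []
    for tk in np.arange(0.0, t_end, dt):
        if t_rem is not None and abs(tk - t_rem) < dt / 2:
            M *= 1.0 - frac                      # partial source remediation
        M = max(M - Q * C0 * (M / M0) ** gamma * dt, 0.0)
        out.append(M)
    return np.array(out)

# gamma = 1 gives exponential source depletion
mass = source_mass(t_end=20.0, M0=1000.0, C0=1.0, Q=10.0, gamma=1.0)
```

With gamma = 1 the mass decays as M0·exp(-Q·C0·t/M0); gamma below 1 makes discharge persist as mass depletes, which is what makes partial source removal less effective in such cases.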

2.
Variable spatial and temporal weighting of the advective contaminant mole fraction term is explored as a means of reducing numerical dispersion of contaminant plumes in a multi-phase compositional simulator. The spatial schemes considered are upstream, central, and a non-linear flux limiter, while fully-implicit and Crank-Nicolson time weighting are examined. The performance of each weighting scheme, in terms of stability of the Newton iteration and computational cost, is assessed for simplified problems designed to be representative of various aspects of more complex subsurface remediation problems. Results indicate that for problems with a homogeneous permeability field, the non-linear flux limiter along with fully-implicit weighting gives superior performance to any other combination of spatial and temporal weighting schemes. For heterogeneous permeability fields, the macrodispersion imparted by heterogeneity dominates numerical dispersion, so smearing of contaminant mole fraction fronts does not appear to be a serious problem.
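The schemes being compared can be illustrated with a minimal 1D advection sketch. This uses explicit time stepping rather than the fully-implicit scheme the study favors, and the van Leer limiter stands in for the unspecified non-linear limiter; grid size, Courant number, and the step profile are illustrative:

```python
import numpy as np

def advect(c0, nu, steps, limiter=None):
    """Advect profile c0 at Courant number nu (0 < nu <= 1, flow to the
    right). limiter=None gives first-order upwind; otherwise limiter(r)
    scales the Lax-Wendroff antidiffusive correction (Sweby form)."""
    c = c0.astype(float).copy()
    for _ in range(steps):
        dc = np.diff(c)                    # c[i+1] - c[i]
        F = c[:-1].copy()                  # upwind flux at interface i+1/2
        if limiter is not None:
            denom = np.where(dc != 0.0, dc, 1.0)
            r = np.ones_like(dc)           # smoothness ratio of upwind slopes
            r[1:] = np.where(dc[1:] != 0.0, dc[:-1] / denom[1:], 0.0)
            F += 0.5 * (1.0 - nu) * limiter(r) * dc
        c[1:-1] -= nu * np.diff(F)         # update interior cells only
    return c

van_leer = lambda r: (r + np.abs(r)) / (1.0 + np.abs(r))

step = np.where(np.arange(100) < 20, 1.0, 0.0)
smeared = advect(step, nu=0.5, steps=40)                  # upwind
sharp = advect(step, nu=0.5, steps=40, limiter=van_leer)  # flux-limited
```

The upwind solution smears the front over many cells, while the limited scheme keeps it sharp without the over- and undershoot a plain central or Lax-Wendroff scheme would produce, illustrating why the abstract finds the limiter superior for homogeneous fields.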

3.
Desorption is one of the most critical processes affecting the effectiveness of soil and ground water remediation. None of the currently adopted desorption models can accurately quantify desorption of low-hydrophobicity organic chemicals, and thus could potentially mislead remediation design and decision-making. A recently developed dual-equilibrium desorption (DED) model was found to be much more accurate in quantifying desorption. A screening-level transport model, DED-Transport, was developed to simulate the DED effect on behaviors of organic contaminant plumes during remediation. DED-Transport requires only simple parameters, but is applicable to many remediation scenarios. DED-Transport can be used as a decision-support tool in site remediation to more precisely predict the time required for cleanup.

4.
We examine the effect of uncertainty due to limited information on the remediation design of a contaminated aquifer using the pump-and-treat method. Hydraulic conductivity and contaminant concentration distributions for a fictitious contaminated aquifer are generated assuming a limited number of sampling locations. Stochastic optimization with multiple realizations, driven by a genetic algorithm (GA), is used to account for aquifer uncertainty. As the number of realizations increases, a greater extraction rate and more wells are needed; total cost increases, but the optimal remediation designs become more reliable. The stochastic optimization analysis also determines the locations of extraction wells, the variation in extraction rates as a function of well locations, and the reliability of the optimal designs. The number of realizations (stack number) at which the design factors converge can be determined, allowing effective stochastic optimization with reduced computational resources. An increase in the variability of the conductivity distribution requires more extraction wells. Information about potential extraction wells can be used to prevent failure of the remediation task.
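The multiple-realization evaluation at the heart of such a stochastic optimization can be sketched as below. The cost model, the capture rule, and all numbers are hypothetical stand-ins; in the actual approach a GA would call an evaluation like this for every candidate design, with a flow-and-transport simulation replacing the toy "required rate" rule:

```python
import numpy as np

rng = np.random.default_rng(42)

def evaluate(design_rate, n_real, penalty=1e4):
    """Expected cost and reliability of a candidate pump-and-treat design
    over n_real conductivity realizations (toy model: each realization
    demands a different minimum extraction rate for plume capture)."""
    K = rng.lognormal(mean=0.0, sigma=0.5, size=n_real)  # conductivity multiplier
    required = 10.0 / K               # hypothetical rate needed for capture
    ok = design_rate >= required
    cost = design_rate * 100.0 + penalty * (~ok)  # pumping cost + failure penalty
    return cost.mean(), ok.mean()

cost_hi, rel_hi = evaluate(design_rate=50.0, n_real=2000)
cost_lo, rel_lo = evaluate(design_rate=5.0, n_real=2000)
```

Increasing n_real stabilizes the expected-cost estimate (the "stack number" convergence the abstract describes) at the price of more simulation calls per GA generation.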

5.
A new methodology is proposed to optimize monitoring networks for identification of the extent of contaminant plumes. The optimal locations for monitoring wells are determined as the points where maximal decreases are expected in the quantified uncertainty about contaminant existence after well installation. In this study, hydraulic conductivity is considered to be the factor that causes uncertainty. The successive random addition (SRA) method is used to generate random fields of hydraulic conductivity. The expected value of information criterion for the existence of a contaminant plume is evaluated based on how much the uncertainty of plume distribution reduces with increases in the size of the monitoring network. The minimum array of monitoring wells that yields the maximum information is selected as the optimal monitoring network. In order to quantify the uncertainty of the plume distribution, the probability map of contaminant existence is made for all generated contaminant plume realizations on the domain field. The uncertainty is defined as the sum of the areas where the probability of contaminant existence or nonexistence is uncertain. Results of numerical experiments for determination of optimal monitoring networks in heterogeneous conductivity fields are presented.
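The uncertainty measure described (the area where contaminant existence is neither confidently present nor absent) can be computed directly from a stack of plume realizations. The probability thresholds and the tiny example grid are illustrative; a candidate monitoring well's value of information would be the expected drop in this count after conditioning the realizations on the well's outcome:

```python
import numpy as np

def plume_uncertainty(realizations, lo=0.05, hi=0.95):
    """Probability map and uncertainty for a (n_real, ny, nx) boolean
    stack of simulated plumes: uncertainty = number of cells whose
    contamination probability is neither confidently low nor high."""
    p = realizations.mean(axis=0)
    return p, int(((p > lo) & (p < hi)).sum())

# 10 realizations on a 1 x 4 grid: column 0 always contaminated,
# column 1 in half of them, column 2 never, column 3 in two of them
stack = np.zeros((10, 1, 4), dtype=bool)
stack[:, 0, 0] = True
stack[:5, 0, 1] = True
stack[:2, 0, 3] = True
p_map, n_uncertain = plume_uncertainty(stack)
```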

6.
Sun AY. Ground Water. 2008;46(4):638-641
Model-based contaminant source identification plays an important role in effective site remediation. In this article, a contaminant source identification toolbox (CONSID) is introduced as a framework for solving contaminant source identification problems. The presence of various types of model uncertainty can severely undermine the performance of many existing source estimators. The current version of CONSID consists of two robust estimators for recovering source release histories under model uncertainty: one developed in the deterministic framework and the other in the stochastic framework. To use the robust estimators provided in CONSID, the user needs only modest prior knowledge about the model uncertainty and the ability to estimate the bound of model deviations resulting from it. The toolbox is designed so that other source estimators can be added easily. Step-by-step guidance for using CONSID is given, and an example is provided.

7.
A stochastic optimization model based on an adaptive feedback correction process and surrogate model uncertainty was proposed and applied to remediation strategy design at a dense non-aqueous phase liquid (DNAPL)-contaminated groundwater site. One hundred initial training samples were obtained using Latin hypercube sampling. A surrogate of a multiphase flow simulation model was constructed from these samples using the self-adaptive particle swarm optimization kriging (SAPSOKRG) method, and an optimization model was built with the SAPSOKRG surrogate as a constraint. An adaptive feedback correction process was then designed and applied to iteratively update the training samples, surrogate model, and optimization model. Results showed that all three were effectively improved. However, the surrogate model is only an approximation of the simulation model, and some degree of uncertainty remains even after improvement. Therefore, residuals between the surrogate model and the simulation model were calculated and an uncertainty analysis was conducted. Based on the uncertainty analysis results, a stochastic optimization model was constructed and solved to obtain optimal remediation strategies at different confidence levels (60, 70, 80, 90, and 95%) and under different remediation objectives (average DNAPL removal rate ≥ 70, ≥ 75, ≥ 80, ≥ 85, and ≥ 90%). The optimization results demonstrated that the higher the confidence level and remediation objective, the more expensive the remediation. Decision makers can therefore weigh remediation costs, confidence levels, and remediation objectives to make an informed choice; this also allows them to determine the reliability of a selected strategy and provides a new tool for DNAPL-contaminated groundwater remediation design.

8.
This paper presents a hybrid information fusion approach that integrates the cloud model and D–S evidence theory to perceive safety risks from sensor data under uncertainty. The cloud model provides a tool for transforming between qualitative concepts and their quantitative expressions under uncertainty, and uses a correlation measure to construct basic probability assignments. An improved evidence aggregation strategy that combines Dempster's rule and the weighted mean rule is developed to avoid the counter-intuitive results that arise when combining highly conflicting evidence. A three-layer information fusion framework consisting of sensor fusion, factor fusion, and area fusion is proposed to synthesize multi-source information into final fusion results. The developed cloud D–S approach is applied to assessing the safety of a real tailings dam in operation in China as a case study. Data acquired from 28 monitoring sensors are fused continuously to obtain the overall safety level of the tailings dam. Results indicate that the developed approach is capable of achieving multi-layer information fusion and identifying global sensitivities of input factors under uncertainty. The approach exhibits strong robustness and fault tolerance, and can be used by practitioners in the industry as a decision tool to perceive and anticipate safety risks in tailings dams.
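Dempster's rule, one half of the aggregation strategy described, can be sketched as follows (the weighted-mean fallback for high-conflict evidence is omitted, and the "safe"/"unsafe" hypotheses and masses are illustrative):

```python
from itertools import product

def dempster(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to mass) with Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y              # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

m1 = {frozenset({"safe"}): 0.6, frozenset({"safe", "unsafe"}): 0.4}
m2 = {frozenset({"safe"}): 0.5, frozenset({"unsafe"}): 0.5}
fused = dempster(m1, m2)
```

The division by (1 - conflict) is exactly what becomes counter-intuitive as conflict approaches 1, which motivates blending in the weighted mean rule as the abstract describes.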

9.
Traditional single-objective programs cannot handle the tradeoffs between decision makers who represent different perspectives and have inconsistent decision goals, while multi-objective ones can hardly represent a complex dominant-subordinate relationship between a leader and a follower. This study presents a new bilevel programming model that considers leader–follower health-risk and economic goals for optimal groundwater remediation management. The bilevel model is formulated by integrating health-risk assessment and environmental standards (the leader, or environmental concern) and remediation cost (the follower, or economic concern) into a general framework. In addition, stochastic uncertainty in health-risk assessment is incorporated into the decision-making process. The developed bilevel model is then applied to a petroleum-contaminated aquifer in Canada. Results indicate that bilevel programming can not only meet the follower's expectation of low remediation cost but also simultaneously satisfy the leader's expectation of a low contamination level. Furthermore, comparative analyses show that the bilevel model with two-level concerns has the advantage of maximizing the interests and satisfaction degrees of decision makers, avoiding the extreme results generated by single-level models.

10.
Methods are developed to use data collected during cyclic operation of soil vapor extraction (SVE) systems to help characterize the magnitudes and time scales of mass flux associated with vadose zone contaminant sources. Operational data collected at the Department of Energy’s Hanford site are used to illustrate the use of such data. An analysis was conducted of carbon tetrachloride vapor concentrations collected during and between SVE operations. The objective of the analysis was to evaluate changes in concentrations measured during periods of operation and nonoperation of SVE, with a focus on quantifying temporal dynamics of the vadose zone contaminant mass flux, and associated source strength. Three mass flux terms, representing mass flux during the initial period of an SVE cycle, during the asymptotic period of a cycle, and during the rebound period, were calculated and compared. It was shown that it is possible to use the data to estimate time frames for effective operation of an SVE system if a sufficient set of historical cyclic operational data exists. This information could then be used to help evaluate changes in SVE operations, including system closure. The mass flux data would also be useful for risk assessments of the impact of vadose zone sources on groundwater contamination or vapor intrusion.

11.
Site closure for soil vapor extraction (SVE) applications typically requires attainment of specified soil concentration standards, based on the premise that mass flux from the vadose zone to ground water must not result in levels exceeding maximum contaminant levels (MCLs). Unfortunately, realization of MCLs in ground water may not be attainable at many sites. This results in soil remediation efforts that may be in excess of what is necessary for future protection of ground water, and soil remediation goals that often cannot be achieved within a reasonable time period. Soil venting practitioners have attempted to circumvent these problems by basing closure on some predefined percent total mass removal, or on an approach to a vapor concentration asymptote. These approaches, however, are subjective and influenced by venting design. We propose an alternative strategy based on evaluation of five components: (1) site characterization, (2) design, (3) performance monitoring, (4) rate-limited vapor transport, and (5) mass flux to and from ground water. Demonstration of closure depends on satisfactory assessment of all five components. The focus of this paper is to support mass flux evaluation. We present a plan based on monitoring of three subsurface zones and develop an analytical one-dimensional vertical flux model we term VFLUX. VFLUX is a significant improvement over the well-known numerical one-dimensional model VLEACH, which is often used for estimation of mass flux to ground water, because it allows for the presence of nonaqueous phase liquids (NAPLs) in soil, degradation, and a time-dependent boundary condition at the water table interface. The time-dependent boundary condition is the centerpiece of our mass flux approach because it dynamically links the performance of ground water remediation to SVE closure. Progress or lack of progress in ground water remediation results in increasingly or decreasingly stringent closure requirements, respectively.

12.
This study investigates stochastic optimization of dense nonaqueous phase liquid (DNAPL) remediation design at Dover Air Force Base Area 5 using emulsified vegetable oil (EVO) injection. The Stochastic Cost Optimization Toolkit (SCOToolkit) is used for the study, which couples semianalytical DNAPL source depletion and transport models with parameter estimation, error propagation, and stochastic optimization modules that can consider multiple sources and remediation strategies. Model parameters are calibrated to field data, conditioned on prior estimates of the parameters and their uncertainty. Monte Carlo simulations are then performed to identify optimal remediation decisions that minimize the expected net present value (NPV) cleanup cost while maintaining concentrations at compliance wells below the maximum contaminant level (MCL). The results show that annual operating costs could be reduced by approximately 50% by implementing the identified optimal remediation strategy. We also show that recalibration and reoptimization after 50 years using additional monitoring data could further reduce annual operating cost by 60% and increase the reliability of the proposed remediation actions.
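The NPV objective such optimizations minimize can be sketched in a few lines; the discount rate, horizon, and cash flows below are illustrative, not figures from the study:

```python
def npv(cash_flows, rate):
    """Net present value of a stream of annual costs, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# hypothetical comparison: 30 years of O&M at $100k/yr vs. a strategy
# that halves the annual operating cost, both discounted at 5%
baseline = npv([100_000.0] * 30, rate=0.05)
optimized = npv([50_000.0] * 30, rate=0.05)
```

Because discounting weights near-term spending most heavily, deferring or reducing early operating costs is what drives the large NPV savings reported for the optimal strategies.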

13.
Most established methods to characterize aquifer structure and hydraulic conductivities of hydrostratigraphical units are not capable of delivering sufficient information in the spatial resolution that is desired for sophisticated numerical contaminant transport modeling and adapted remediation design. With hydraulic investigation methods based on the direct-push (DP) technology such as DP slug tests, DP injection logging, and the hydraulic profiling tool, it is possible to rapidly delineate hydrogeological structures and estimate their hydraulic conductivity in shallow unconsolidated aquifers without the need for wells. A combined application of these tools was used for the investigation of a contaminated German refinery site and for the setup of hydraulic aquifer models. The quality of DP investigation and the models was evaluated by comparisons of tracer transport simulations using these models and measured breakthroughs of two natural gradient tracer tests. Model scenarios considering the information of all tools together showed good reproduction of the measured breakthroughs, indicating the suitability of the approach and a minor impact of potential technical limitations. Using the DP slug tests alone yielded significantly higher deviations for the determined hydraulic conductivities compared to considering two or three of the tools. Realistic aquifer models developed on basis of such combined DP investigation approaches can help optimize remediation concepts or identify flow regimes for aquifers with a complex structure.

14.
Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971–1988.
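A simple Gaussian special case of such a Bayesian forecast processor (conjugate normal prior on seasonal runoff, normally distributed forecast error) can be sketched as below; this is a textbook simplification, not the paper's actual processor, and all numbers are illustrative:

```python
def bayes_update(mu0, s0, x, se):
    """Posterior mean and sd of seasonal runoff given a forecast x,
    with prior N(mu0, s0^2) and Gaussian forecast error sd `se`.
    Precision-weighted averaging of prior and forecast."""
    prec0, prec_f = 1.0 / s0**2, 1.0 / se**2
    mu = (prec0 * mu0 + prec_f * x) / (prec0 + prec_f)
    return mu, (prec0 + prec_f) ** -0.5

# equally informative prior and forecast -> posterior mean at the midpoint
mu, sd = bayes_update(mu0=100.0, s0=10.0, x=120.0, se=10.0)
```

The posterior standard deviation is always smaller than either input's, which is what lets the processed forecast convey reduced but honestly quantified uncertainty to users.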

15.
A robust risk analysis method (RRAM) is developed for water resource decision making under uncertainty. The method incorporates interval-parameter programming and robust optimization within a stochastic programming framework. In the RRAM formulation, penalties are exercised through recourse against any infeasibility, and robustness measures are introduced to examine the variability of second-stage costs that exceed expected levels. A number of weighting levels are considered, corresponding to different robustness levels of risk control. Generally, a plan with a higher robustness level better resists system risk, so a decision with a lower robustness level corresponds to a higher risk of system failure; there is a tradeoff between system cost and system reliability. The RRAM is applied to a water resource management case. The modeling results can help generate desired decision alternatives that are particularly useful for risk-averse decision makers handling high-variability conditions. The results give managers the opportunity to make decisions based on their own preferences regarding system stability and economy, and ensure that management policies and plans are made with reasonable consideration of both system cost and risk.

16.
Soil and groundwater contamination are often managed by establishing on-site cleanup targets within the context of risk assessment or risk management measures. Decision-makers rely on modeling tools to provide insight; however, it is recognized that all models are subject to uncertainty. This case study compares suggested remediation requirements using a site-specific numerical model and a standardized analytical tool to evaluate risk to a downgradient wetland receptor posed by on-site chloride impacts. The base case model, calibrated to observed non-pumping and pumping conditions, predicts a peak concentration well above regulatory criteria. Remediation scenarios are iteratively evaluated to determine a remediation design that adheres to practical site constraints, while minimizing the potential for risk to the downgradient receptor. A nonlinear uncertainty analysis is applied to each remediation scenario to stochastically evaluate the risk and find a solution that meets the site-owner risk tolerance, which in this case required a risk-averse solution. This approach, which couples nonlinear uncertainty analysis with a site-specific numerical model, provides an enhanced level of knowledge to foster informed decision-making (i.e., risk-of-success) and also increases stakeholder confidence in the remediation design.

17.
Funnel-and-Gate Performance in a Moderately Heterogeneous Flow Domain
The funnel-and-gate ground water remediation technology (Starr and Cherry 1994) has received increased attention and application as an in situ alternative to the typical pump-and-treat system. Understanding the effects of heterogeneity on system performance can mean the difference between a successful remediation project and one that fails to meet its cleanup goals.
In an attempt to characterize and quantify the effects of heterogeneity on funnel-and-gate system performance, a numerical modeling study of 15 simulated heterogeneous flow domains was conducted. Each realization was tested to determine if the predicted capture width met the capture width expected for a homogeneous flow domain with the same bulk properties. This study revealed that the capture width of the funnel-and-gate system varied significantly with the level of heterogeneity of the aquifer.
Two possible remedies were investigated for bringing systems with less than acceptable capture widths to acceptable levels of performance. First, it was determined that enlarging the funnel and gate via a factor of safety applied to the design capture width could compensate for the capture width variation in the heterogeneous flow domains. In addition, it was shown that the use of a pumping well downstream of the funnel and gate could compensate for the effects of aquifer heterogeneity on the funnel-and-gate capture width. However, if a pumping well is placed downstream of the funnel and gate to control the hydraulic gradient through the gate, consideration should be given to the gate residence time in relation to the geochemistry of the contaminant removal or destruction process in the gate.

18.
Fine-scale hydrostratigraphic features often play a critical role in controlling ground water flow and contaminant transport. Unfortunately, many conventional drilling- and geophysics-based approaches are rarely capable of describing these features at the level of detail needed for contaminant predictions and remediation designs. Previous work has shown that direct-push electrical conductivity (EC) logging can provide information about site hydrostratigraphy at a scale of relevance for contaminant transport investigations in many unconsolidated settings. In this study, we evaluate the resolution and quality of that information at a well-studied research site that is underlain by highly stratified alluvial sediments. Geologic and hydrologic data, conventional geophysical logs, and particle-size analyses are used to demonstrate the capability of direct-push EC logging for the delineation of fine-scale hydrostratigraphic features in saturated unconsolidated formations. When variations in pore-fluid chemistry are small, the electrical conductivity of saturated media is primarily a function of clay content, and hydrostratigraphic features can be described at a level of detail (<2.5 cm in thickness) that has not previously been possible in the absence of continuous cores. Series of direct-push EC logs can be used to map the lateral continuity of layers with non-negligible clay content and to develop important new insights into flow and transport at a site. However, in sand and gravel intervals with negligible clay, EC logging provides little information about hydrostratigraphic features. As with all electrical logging methods, some site-specific information about the relative importance of fluid and sediment contributions to electrical conductivity is needed. Ongoing research is directed at developing direct-push methods that allow EC logging, water sampling, and hydraulic testing to be done concurrently.

19.
We illustrate a method of global sensitivity analysis and we test it on a preliminary case study in the field of environmental assessment to quantify uncertainty importance in poorly-known model parameters and spatially referenced input data. The focus of the paper is to show how the methodology provides guidance to improve the quality of environmental assessment practices and decision support systems employed in environmental policy. Global sensitivity analysis, coupled with uncertainty analysis, is a tool to assess the robustness of decisions, to understand whether the current state of knowledge on input data and parametric uncertainties is sufficient to enable a decision to be taken. The methodology is applied to a preliminary case study, which is based on a numerical model that employs GIS-based soil data and expert consultation to evaluate an index that joins environmental and economic aspects of land depletion. The index is used as a yardstick by decision-makers involved in the planning of highways to identify the route that minimises the overall impact.

20.
Innovative remediation studies were conducted between 1994 and 2004 at sites contaminated by nonaqueous phase liquids (NAPLs) at Hill and Dover AFB, and included technologies that mobilize, solubilize, and volatilize NAPL: air sparging (AS), surfactant flushing, cosolvent flooding, and flushing with a complexing-sugar solution. The experiments proved that aggressive remedial efforts tailored to the contaminant can remove more than 90% of the NAPL-phase contaminant mass. Site-characterization methods were tested as part of these field efforts, including partitioning tracer tests, biotracer tests, and mass-flux measurements. A significant reduction in the groundwater contaminant mass flux was achieved despite incomplete removal of the source. The effectiveness of soil, groundwater, and tracer based characterization methods may be site and technology specific. Employing multiple methods can improve characterization. The studies elucidated the importance of small-scale heterogeneities on remediation effectiveness, and fomented research on enhanced-delivery methods. Most contaminant removal occurs in hydraulically accessible zones, and complete removal is limited by contaminant mass stored in inaccessible zones. These studies illustrated the importance of understanding the fluid dynamics and interfacial behavior of injected fluids on remediation design and implementation. The importance of understanding the dynamics of NAPL-mixture dissolution and removal was highlighted. The results from these studies helped researchers better understand what processes and scales are most important to include in mathematical models used for design and data analysis. Finally, the work at these sites emphasized the importance and feasibility of recycling and reusing chemical agents, and enabled the implementation and success of follow-on full-scale efforts.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号