Similar Articles
 20 similar articles found (search time: 62 ms)
1.
A groundwater model characterized by a lack of field data about hydraulic model parameters and boundary conditions, combined with many observation data sets available for calibration purposes, was investigated with respect to model uncertainty. Seven different conceptual models with a stepwise increase from 0 to 30 adjustable parameters were calibrated using PEST. Residuals, sensitivities, the Akaike information criterion (AIC and AICc), Bayesian information criterion (BIC), and Kashyap's information criterion (KIC) were calculated for the set of seven inverse-calibrated models of increasing complexity. Finally, the likelihood of each model was computed. Comparing only the residuals of the different conceptual models leads to overparameterization and a loss of confidence in the conceptual model approach. The model employing only uncalibrated hydraulic parameters, estimated from sedimentological information, obtained the worst AIC, BIC, and KIC values. Using only sedimentological data to derive hydraulic parameters introduces a systematic error into the simulation results and cannot be recommended for building a reliable model. For numerical investigations with large numbers of calibration data, the BIC and KIC select a simpler model as optimal than the AIC does. The model with 15 adjusted parameters was evaluated by the AIC as the best option and obtained a likelihood of 98%. The AIC disregards the potential model structure error, so selection by the KIC is more appropriate. Sensitivities to piezometric heads were highest for the model with only five adjustable parameters, and the sensitivity coefficients were directly influenced by the changes in extracted groundwater volumes.
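The information criteria compared in this abstract can be computed directly from a model's calibration residuals. The sketch below (Python/NumPy) uses the common least-squares forms of AIC, AICc, and BIC, plus Akaike weights for the per-model likelihoods; KIC additionally requires the posterior parameter covariance and is omitted. Function names are illustrative, not from the paper.

```python
import numpy as np

def information_criteria(sse, n, k):
    """Information criteria for a least-squares-calibrated model.

    sse: sum of squared (weighted) residuals, n: number of
    observations, k: number of adjustable parameters.
    """
    log_like = n * np.log(sse / n)               # up to an additive constant
    aic = log_like + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = log_like + k * np.log(n)               # stronger complexity penalty
    return aic, aicc, bic

def akaike_weights(aics):
    """Model likelihoods (Akaike weights) from a set of AIC values."""
    d = np.asarray(aics, float) - np.min(aics)
    w = np.exp(-0.5 * d)
    return w / w.sum()
```

Because BIC penalizes each parameter by ln(n) rather than 2, it favors simpler models than AIC whenever n > 7 or so, consistent with the behavior described above for large calibration data sets.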

2.
This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that the contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has a more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
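The split of predictive uncertainty into parametric (within-model) and model (between-model) components described here can be sketched as follows. The weights would come from information criteria or GLUE, and the per-model means and variances from Monte Carlo over each model's parameters; the function and argument names are illustrative.

```python
import numpy as np

def model_average(preds, variances, weights):
    """Model-averaged prediction and its variance decomposition.

    preds[i], variances[i]: mean and variance of the prediction under
    model i; weights[i]: the probability of model i. Total predictive
    variance = within-model (parametric) + between-model (model) term.
    """
    preds = np.asarray(preds, float)
    variances = np.asarray(variances, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    mean = np.sum(w * preds)
    within = np.sum(w * variances)               # parametric uncertainty
    between = np.sum(w * (preds - mean) ** 2)    # model uncertainty
    return mean, within + between, within, between
```

When the per-model predictions disagree strongly (large between-model term) while each model's Monte Carlo spread is narrow, model uncertainty dominates, which is the situation the abstract reports.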

3.
Soil heterogeneity and data sparsity combine to render estimates of infiltration rates uncertain. We develop reduced complexity models for the probabilistic forecasting of infiltration rates in heterogeneous soils during surface runoff and/or flooding events. These models yield closed-form semi-analytical expressions for the single- and multi-point infiltration-rate PDFs (probability density functions), which quantify predictive uncertainty stemming from uncertainty in soil properties. These solutions enable us to investigate the relative importance of uncertainty in various hydraulic parameters and the effects of their cross-correlation. At early times, the infiltration-rate PDFs computed with the reduced complexity models are in close agreement with their counterparts obtained from a full infiltration model based on the Richards equation. At all times, the reduced complexity models provide conservative estimates of predictive uncertainty.

4.
The level of model complexity that can be effectively supported by available information has long been a subject of many studies in hydrologic modelling. In particular, distributed parameter models tend to be regarded as overparameterized because of the numerous parameters used to describe spatially heterogeneous hydrologic processes. However, it is not clear how parameters and observations influence the degree of overparameterization, equifinality of parameter values, and uncertainty. This study investigated the impact of the numbers of observations and parameters on calibration quality including equifinality among calibrated parameter values, model performance, and output/parameter uncertainty using the Soil and Water Assessment Tool model. In the experiments, the number of observations was increased by expanding the calibration period or by including measurements made at inner points of a watershed. Similarly, additional calibration parameters were included in the order of their sensitivity. Then, unique sets of parameters were calibrated with the same objective function, optimization algorithm, and stopping criteria but different numbers of observations. The calibration quality was quantified with statistics calculated based on the 'behavioural' parameter sets, identified using 1% and 5% cut-off thresholds in a generalized likelihood uncertainty estimation framework. The study demonstrated that equifinality, model performance, and output/parameter uncertainty were responsive to the numbers of observations and calibration parameters; however, the relationship between these numbers, equifinality, and uncertainty was not always conclusive. Model performance improved with increased numbers of calibration parameters and observations, and substantial equifinality did not necessarily imply poor model performance or large uncertainty in the model outputs and parameters. Copyright © 2015 John Wiley & Sons, Ltd.
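A behavioural-set selection like the 1% and 5% cut-offs used above can be sketched as a simple ranking of sampled parameter sets by their likelihood measure (e.g. NSE). This is an illustrative reading of the GLUE step, not the study's exact implementation.

```python
import numpy as np

def glue_behavioural(likelihoods, cutoff=0.05):
    """Indices of 'behavioural' parameter sets.

    Keeps the top `cutoff` fraction of sampled parameter sets,
    ranked by a GLUE likelihood measure such as the Nash-Sutcliffe
    efficiency computed for each set.
    """
    likelihoods = np.asarray(likelihoods, float)
    n_keep = max(1, int(np.ceil(cutoff * likelihoods.size)))
    return np.argsort(likelihoods)[::-1][:n_keep]
```

Statistics on equifinality and output uncertainty would then be computed over the retained sets; a tighter cut-off keeps fewer, better-performing sets but does not necessarily narrow the parameter ranges, which is the study's point.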

5.
Information theory is the basis for understanding how information is transmitted as observations. Observation data can be used to compare uncertainty on parameter estimates and predictions between models. Jacobian Information (JI) is quantified as the determinant of the weighted Jacobian (sensitivity) matrix. Fisher Information (FI) is quantified as the determinant of the weighted FI matrix. FI measures the relative disorder of a model (entropy) in a set of models. One-dimensional models are used to demonstrate the relationship between JI and FI, and the resulting uncertainty on estimated parameter values and model predictions for increasing model complexity, different model structures, different boundary conditions, and over-fitted models. Greater model complexity results in increased JI accompanied by an increase in parameter and prediction uncertainty. FI generally increases with increasing model complexity unless model error is large. Models with lower FI have a higher level of disorder (increase in entropy) which results in greater uncertainty of parameter estimates and model predictions. A constant-head boundary constrains the heads in the area near the boundary, reducing sensitivity of simulated equivalents to estimated parameters. JI and FI are lower for this boundary condition as compared to a constant-outflow boundary in which the heads in the area of the boundary can adjust freely. Complex, over-fitted models, in which the structure of the model is not supported by the observation dataset, result in lower JI and FI because there is insufficient information to estimate all parameters in the model.
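The determinant-based information measure can be illustrated as follows, with FI computed as the log-determinant of the weighted Fisher matrix J^T W J. This is a plain reading of the abstract's definitions, not the paper's exact formulas; names are illustrative.

```python
import numpy as np

def fisher_information(J, w):
    """Log-determinant of the weighted Fisher matrix J^T W J.

    J: (n_obs, n_par) sensitivity (Jacobian) matrix of simulated
    equivalents with respect to parameters; w: observation weights.
    Larger values mean the observations carry more information about
    the parameters (lower entropy).
    """
    W = np.diag(np.asarray(w, float))
    F = J.T @ W @ J
    sign, logdet = np.linalg.slogdet(F)
    return logdet if sign > 0 else -np.inf
```

Note the over-fitting behavior the abstract describes falls out directly: with more parameters than informative observations, J^T W J is singular and the measure collapses.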

6.
This paper investigates the effects of uncertainty in rock-physics models on reservoir parameter estimation using seismic amplitude variation with angle and controlled-source electromagnetics data. The reservoir parameters are related to electrical resistivity by the Poupon model and to elastic moduli and density by the Xu-White model. To handle uncertainty in the rock-physics models, we consider their outputs to be random functions with modes or means given by the predictions of those rock-physics models and we consider the parameters of the rock-physics models to be random variables defined by specified probability distributions. Using a Bayesian framework and Markov Chain Monte Carlo sampling methods, we are able to obtain estimates of reservoir parameters and information on the uncertainty in the estimation. The developed method is applied to a synthetic case study based on a layered reservoir model and the results show that uncertainty in both rock-physics models and in their parameters may have significant effects on reservoir parameter estimation. When the biases in rock-physics models and in their associated parameters are unknown, conventional joint inversion approaches, which consider rock-physics models as deterministic functions and the model parameters as fixed values, may produce misleading results. The developed stochastic method in this study provides an integrated approach for quantifying how uncertainty and biases in rock-physics models and in their associated parameters affect the estimates of reservoir parameters and therefore is a more robust method for reservoir parameter estimation.
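The Markov Chain Monte Carlo sampling underlying such a Bayesian framework can be illustrated with a minimal random-walk Metropolis sampler (one-dimensional here; the study's sampler over reservoir parameters and rock-physics model parameters is of course higher-dimensional and more elaborate).

```python
import numpy as np

def metropolis(log_post, x0, step, n, seed=0):
    """Minimal random-walk Metropolis sampler.

    log_post: log posterior density (up to a constant), x0: starting
    point, step: proposal standard deviation, n: number of samples.
    """
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    out = np.empty(n)
    for i in range(n):
        cand = x + step * rng.standard_normal()   # symmetric proposal
        lp_c = log_post(cand)
        if np.log(rng.random()) < lp_c - lp:      # accept/reject
            x, lp = cand, lp_c
        out[i] = x                                # keep current state
    return out
```

The posterior samples then give both point estimates (posterior mean or mode) and credible intervals, which is how the study quantifies estimation uncertainty.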

7.
Multiple criteria decision making (MCDM) is a collection of methodologies to compare, select, or rank multiple alternatives that typically involve incommensurate attributes. MCDM is well-suited for eliciting and modeling the flood preferences of stakeholders and for improving the coordination among flood agencies, organizations and affected citizens. A flood decision support system (DSS) architecture is put forth that integrates the latest advances in MCDM, remote sensing, GIS, hydrologic models, and real-time flood information systems. The analytic network process (ANP) is discussed with application to short-term flood management options for the middle reaches of the Yangtze River. It is shown that DSS and MCDM can improve flood risk planning and management under uncertainty by providing data displays, analytical results, and model output to summarize critical flood information.

8.
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
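The analysis-of-variance decomposition used to split total streamflow uncertainty into forcing, parameter, and interaction contributions can be sketched as a two-way fixed-effects ANOVA over a grid of simulations; this is an illustrative reduction of the study's design to its sum-of-squares core.

```python
import numpy as np

def anova_decompose(Q):
    """Two-way ANOVA split of simulation variance.

    Q[i, j]: a streamflow statistic simulated with forcing ensemble
    member i and parameter set j. Returns sums of squares attributable
    to forcing, parameters, and their interaction (the residual in a
    two-way layout without replication).
    """
    grand = Q.mean()
    row = Q.mean(axis=1, keepdims=True)   # forcing (row) effects
    col = Q.mean(axis=0, keepdims=True)   # parameter (column) effects
    ss_forcing = Q.shape[1] * np.sum((row - grand) ** 2)
    ss_param = Q.shape[0] * np.sum((col - grand) ** 2)
    ss_inter = np.sum((Q - row - col + grand) ** 2)
    return ss_forcing, ss_param, ss_inter
```

The three sums of squares add up to the total variability of the ensemble, so each can be reported as a fractional contribution, which is how such decompositions are typically presented.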

9.
Parameter uncertainty in hydrologic modeling is crucial to flood simulation and forecasting. The Bayesian approach allows one to estimate parameters according to prior expert knowledge as well as observational data about model parameter values. This study assesses the performance of two popular uncertainty analysis (UA) techniques, i.e., generalized likelihood uncertainty estimation (GLUE) and the Bayesian method implemented with a Markov chain Monte Carlo sampling algorithm, in evaluating model parameter uncertainty in flood simulations. These two methods were applied to the semi-distributed Topographic hydrologic model (TOPMODEL), which includes five parameters. A case study was carried out for a small humid catchment in southeastern China. The performance assessment of the GLUE and Bayesian methods was conducted with advanced tools suited for probabilistic simulations of continuous variables such as streamflow. Graphical tools and scalar metrics were used to test several attributes of the simulation quality of selected flood events: deterministic accuracy and the accuracy of the 95% prediction probability uncertainty band (95PPU). Sensitivity analysis was conducted to identify sensitive parameters that largely affect the model output results. Subsequently, the GLUE and Bayesian methods were used to analyze the uncertainty of sensitive parameters and further to produce their posterior distributions. Based on their posterior parameter samples, TOPMODEL simulations and the corresponding UA results were produced. Results show that the form of exponential decline in conductivity and the overland flow routing velocity were sensitive parameters in TOPMODEL in our case. Small changes in these two parameters would lead to large differences in flood simulation results. Results also suggest that, for both UA techniques, most of the streamflow observations were bracketed by the 95PPU, with a containing ratio larger than 80%. In comparison, GLUE gave narrower prediction uncertainty bands than the Bayesian method. It was found that the mode estimates of the parameter posterior distributions yield better deterministic performance than the 50% percentiles for both the GLUE and Bayesian analyses. In addition, the simulation results calibrated with the Rosenbrock optimization algorithm show better agreement with the observations than the UA 50% percentiles, but slightly worse than the hydrographs from the mode estimates. The results clearly emphasize the importance of using model uncertainty diagnostic approaches in flood simulations.
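The containing ratio of the 95PPU reported above (the fraction of observations bracketed by the 95% prediction band) can be computed from an ensemble of behavioural or posterior simulations; a short sketch with illustrative names.

```python
import numpy as np

def containing_ratio(obs, sims, alpha=0.95):
    """Fraction of observations inside the alpha prediction band.

    sims: (n_samples, n_times) ensemble of simulated streamflow from
    behavioural/posterior parameter sets. With alpha=0.95, the band
    spans the 2.5th to 97.5th percentiles at each time step (95PPU).
    """
    lo = np.percentile(sims, 100 * (1 - alpha) / 2, axis=0)
    hi = np.percentile(sims, 100 * (1 + alpha) / 2, axis=0)
    inside = (obs >= lo) & (obs <= hi)
    return inside.mean()
```

A containing ratio near alpha with a narrow band indicates a well-calibrated, sharp predictive distribution; the trade-off between band width and containing ratio is exactly what the GLUE-versus-Bayesian comparison above examines.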

10.
Model uncertainty is rarely considered in the field of biogeochemical modeling. The standard biogeochemical modeling approach is to proceed based on one selected model with the "right" complexity level based on data availability. However, other plausible models can give dissimilar answers to the scientific question at hand using the same set of data. Relying on a single model can lead to underestimation of the uncertainty associated with the results and therefore to unreliable conclusions. A multi-model ensemble strategy is a means to exploit the diversity of skillful predictions from different models with multiple levels of complexity. The aim of this paper is twofold: first, to explore the impact of a model's complexity level on the accuracy of the end results, and second, to introduce a probabilistic multi-model strategy in the context of a process-based biogeochemical model. We developed three different versions of a biogeochemical model, TOUGHREACT-N, with various complexity levels. Each of these models was calibrated against observed data from a tomato field in Western Sacramento County, California, under two different weighting sets on the objective function. In this way we created a set of six ensemble members. The Bayesian Model Averaging (BMA) approach was then used to combine these ensemble members by the likelihood that an individual model is correct given the observations. Our results demonstrated that none of the models, regardless of their complexity level and under either weighting scheme, was capable of representing all the different processes within our study field. We also found BMA valuable for assessing the structural inadequacy inherent in each model. The performance of the BMA expected prediction is generally superior to the individual models included in the ensemble, especially when it comes to predicting gas emissions. The BMA-assessed 95% uncertainty bounds bracket 90-100% of the observations. The results clearly indicate the need to consider a multi-model ensemble strategy over single model selection in biogeochemical modeling studies.

11.
Highly detailed physically based groundwater models are often applied to make predictions of system states under unknown forcing. The required analysis of uncertainty is often infeasible due to the high computational demand. We combine two possible solution strategies: (1) the use of faster surrogate models; and (2) a robust data worth analysis combining quick first-order second-moment uncertainty quantification with null-space Monte Carlo techniques to account for parametric uncertainty. A structurally and parametrically simplified model and a proper orthogonal decomposition (POD) surrogate are investigated. Data worth estimations by both surrogates are compared against estimates by a complex MODFLOW benchmark model of an aquifer in New Zealand. Data worth is defined as the change in post-calibration predictive uncertainty of groundwater head, river-groundwater exchange flux, and drain flux data, compared to the calibrated model. It incorporates existing observations, potential new measurements of system states ("additional" data) as well as knowledge of model parameters ("parametric" data). The data worth analysis is extended to account for non-uniqueness of model parameters by null-space Monte Carlo sampling. Data worth estimates of the surrogates and the benchmark suggest good agreement for both surrogates in estimating the worth of existing data. The structural simplification surrogate only partially reproduces the worth of "additional" data and is unable to estimate "parametric" data, while the POD model is in agreement with the complex benchmark for both "additional" and "parametric" data. The variance of the POD data worth estimates suggests the need to account for parameter non-uniqueness, as presented here, for robust results.

12.
Eight one-dimensional steady-state models with different complexity, which describe the phosphate concentration as a function of the distance along a river, were examined with respect to accuracy and uncertainty of the model results and identifiability of the model parameters by means of combined calibration and sensitivity analysis using Monte Carlo simulations. In addition, the models were evaluated by the Akaike information criterion (AIC). All eight models were calibrated on the same data set from the Biebrza River, Poland. Although the accuracy increases with model complexity, the percentage of explained variance is not significantly improved in comparison with the model that describes the phosphate concentration by means of three parameters. This model also yields the minimum value of the AIC and the parameters could be well identified. Identification of the model parameters becomes poorer with increasing model complexity; in other words the parameters become increasingly correlated. This scarcely affects the uncertainty of the model results if correlation is taken into account. If correlation is not taken into account, the uncertainty of model results increases with model complexity. © 1997 by John Wiley & Sons, Ltd.

13.
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is applied to the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that combining the outputs of the hydrological models using the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate lead time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identifying the main factors affecting flood hazard analysis.
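The clustering-based combination of model outputs can be sketched as grouping time steps by flow regime with k-means and weighting each model within each group by its inverse error. This is a guess at the spirit of the weighting method described above, not the paper's exact scheme; all names are illustrative.

```python
import numpy as np

def kmeans_1d(x, k=2, iters=50, seed=0):
    """Tiny 1-D k-means used to group time steps by flow regime."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels

def combine_models(obs, sims, k=2):
    """Blend several models' simulations with per-cluster weights.

    sims: (n_models, n_times). Within each flow-regime cluster, each
    model is weighted by the inverse of its mean squared error there.
    """
    labels = kmeans_1d(obs, k)
    out = np.zeros_like(obs)
    for j in np.unique(labels):
        m = labels == j
        mse = ((sims[:, m] - obs[m]) ** 2).mean(axis=1)
        w = 1.0 / (mse + 1e-12)
        w /= w.sum()
        out[m] = w @ sims[:, m]
    return out
```

In the toy case below, one model is accurate for low flows and the other for high flows, so the per-cluster blend recovers the observations almost exactly, illustrating how a regime-aware combination can outperform any single model.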

14.
Hydrological Sciences Journal, 2013, 58(4): 685-695
Abstract

Employing 1-, 2-, 4-, 6-, 12- and 24-hourly data sets for two catchments (10.6 and 298 km2) in Wales, the calibrated parameters of a unit hydrograph-based model are shown to change substantially over that range of data time steps. For the smaller basin, each model parameter reaches, or approaches, a stable value as the data time step decreases, providing a straightforward method of estimating time-step independent model parameter values. For the larger basin, the model parameters also reach, or approach, stable values using hourly data, but, for reasons given in the paper, interpretation of the results is more difficult. Model parameter sensitivity analyses are presented that give insights into the relative precision on the parameters for both catchments. The paper discusses the importance of accounting for model parameter data time-step dependency in pursuit of a reduction in the uncertainty associated with estimates of flow in ungauged basins, and suggests that further work along these lines be undertaken using different catchments and models.

15.
Complexity
It is difficult to define complexity in modeling. Complexity is often associated with uncertainty, since modeling uncertainty is an intrinsically difficult task. However, modeling uncertainty does not necessarily require complex models, in the sense of a model requiring an unmanageable number of degrees of freedom to characterize the aquifer. The relationship between complexity, uncertainty, heterogeneity, and stochastic modeling is not simple. Aquifer models should be able to quantify the uncertainty of their predictions, which can be done using stochastic models that produce heterogeneous realizations of aquifer parameters. This is the type of complexity addressed in this article.

16.
17.
Environmental risk management is an integral part of risk analyses. The selection among different mitigating or preventive alternatives often involves competing and conflicting criteria, which requires sophisticated multi-criteria decision-making (MCDM) methods. The analytic hierarchy process (AHP) is one of the most commonly used MCDM methods, which integrates subjective and personal preferences in performing analyses. AHP works on the premise that complex decision problems can be handled by structuring the problem into a simple and comprehensible hierarchy. However, AHP involves human subjectivity, which introduces vagueness-type uncertainty and necessitates decision-making under uncertainty. In this paper, vagueness-type uncertainty is handled using fuzzy-based techniques. The traditional AHP is modified to fuzzy AHP using fuzzy arithmetic operations. The concepts of risk attitude and the associated confidence of a decision maker in the estimates of pairwise comparisons are also discussed. The methodology of the proposed technique is built on a hypothetical example and its efficacy is demonstrated through an application dealing with the selection of drilling fluid/mud for offshore oil and gas operations.
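The crisp AHP core that the fuzzy extension builds on derives priority weights from a pairwise-comparison matrix via its principal eigenvector and checks consistency. A sketch is below; the fuzzy variant replaces the crisp entries with fuzzy numbers and fuzzy arithmetic, which is beyond this illustration.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio for an AHP matrix.

    pairwise[i][j]: how much alternative i is preferred to j (Saaty's
    1-9 scale), with pairwise[j][i] = 1 / pairwise[i][j]. Weights come
    from the principal eigenvector; the consistency ratio CR compares
    the consistency index against Saaty's random index (here tabulated
    only up to n = 5).
    """
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w = w / w.sum()
    n = A.shape[0]
    ci = (vals.real[i] - n) / (n - 1)                     # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index
    cr = ci / ri if ri else 0.0
    return w, cr
```

A CR below about 0.1 is the conventional threshold for accepting the decision maker's pairwise judgments as consistent.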

18.
The success of modeling groundwater is strongly influenced by the accuracy of the model parameters that are used to characterize the subsurface system. However, the presence of uncertainty and possibly bias in groundwater model source/sink terms may lead to biased estimates of model parameters and model predictions when the standard regression-based inverse modeling techniques are used. This study first quantifies the levels of bias in groundwater model parameters and predictions due to the presence of errors in irrigation data. Then, a new inverse modeling technique called input uncertainty weighted least-squares (IUWLS) is presented for unbiased estimation of the parameters when pumping and other source/sink data are uncertain. The approach uses the concept of the generalized least-squares method with the weight of the objective function depending on the level of pumping uncertainty and iteratively adjusted during the parameter optimization process. We have conducted both analytical and numerical experiments, using irrigation pumping data from the Republican River Basin in Nebraska, to evaluate the performance of ordinary least-squares (OLS) and IUWLS calibration methods under different levels of uncertainty of irrigation data and calibration conditions. The result from the OLS method shows the presence of statistically significant (p < 0.05) bias in estimated parameters and model predictions that persists despite calibrating the models to different calibration data and sample sizes. However, by directly accounting for the irrigation pumping uncertainties during the calibration procedures, the proposed IUWLS is able to minimize the bias effectively without adding significant computational burden to the calibration processes.
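The core idea of down-weighting observations affected by uncertain source/sink data can be illustrated with a single linear weighted least-squares step. IUWLS itself iterates such weights inside a nonlinear groundwater-model calibration, so this is only the general concept, with illustrative names.

```python
import numpy as np

def weighted_least_squares(X, y, var_obs, var_input):
    """Linear WLS with input-uncertainty-aware weights.

    The weight on each observation shrinks as the variance of its
    associated source/sink input (e.g. pumping) grows, so observations
    corrupted by uncertain pumping contribute less to the fit.
    """
    w = 1.0 / (np.asarray(var_obs, float) + np.asarray(var_input, float))
    W = np.diag(w)
    # normal equations: (X'WX) beta = X'Wy
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

In the toy test, the third observation carries a huge input variance, so the fit essentially averages only the first two values, which is the debiasing effect the study reports.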

19.
International Journal of Sediment Research, 2020, 35(2): 157-170
Mitigation of sediment deposition in lined open channels is an essential issue in hydraulic engineering practice. Hence, the limiting velocity should be determined to keep the channel bottom clean of sediment deposits. Recently, sediment transport modeling using various artificial intelligence (AI) techniques has attracted the interest of many researchers. The current integrated study offers unique insight into the modeling of sediment transport in sewer and urban drainage systems. A novel methodology based on the combination of sensitivity and uncertainty analyses with a machine learning technique is proposed as a tool for selecting the best input combination for modeling non-deposition conditions of sediment transport. Utilizing one to seven dimensionless parameters, 127 models are developed in the current study. To evaluate the different parameter combinations and select the training and testing data, four strategies are considered. Considering the densimetric Froude number (Fr) as the dependent parameter, a model with independent parameters of volumetric sediment concentration (C_V) and relative particle size (d/R) gave the best results, with a mean absolute relative error (MARE) of 0.1 and a root mean square error (RMSE) of 0.67. Uncertainty analysis is applied with a machine learning technique to investigate the credibility of the proposed methods. The percentage of the observed sample data bracketed by the 95% predicted uncertainty bound (95PPU) is computed to assess the uncertainty of the best models.
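The two error metrics quoted above are standard and can be computed directly; a minimal sketch.

```python
import numpy as np

def mare(obs, pred):
    """Mean absolute relative error: mean of |pred - obs| / |obs|."""
    obs = np.asarray(obs, float)
    pred = np.asarray(pred, float)
    return float(np.mean(np.abs((pred - obs) / obs)))

def rmse(obs, pred):
    """Root mean square error."""
    obs = np.asarray(obs, float)
    pred = np.asarray(pred, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```

Note that MARE is scale-free while RMSE carries the units of the predicted quantity, which is why the study reports both.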

20.
In 1988, an important publication moved model calibration and forecasting beyond case studies and theoretical analysis. It reported on a somewhat idyllic graduate student modeling exercise where many of the system properties were known; the primary forecasts of interest were heads in pumping wells after a river was modified. The model was calibrated using manual trial-and-error approaches where a model's forecast quality was not related to how well it was calibrated. Here, we investigate whether tools widely available today obviate the shortcomings identified 30 years ago. A reconstructed version of the 1988 true model was tested using increasing parameter estimation sophistication. The parameter estimation demonstrated the inverse problem was non-unique because only head data were available for calibration. When a flux observation was included, current parameter estimation approaches were able to overcome all calibration and forecast issues noted in 1988. The best forecasts were obtained from a highly parameterized model that used pilot points for hydraulic conductivity and was constrained with soft knowledge. Like the 1988 results, however, the best calibrated model did not produce the best forecasts due to parameter overfitting. Finally, a computationally frugal linear uncertainty analysis demonstrated that the single-zone model was oversimplified, with only half of the forecasts falling within the calculated uncertainty bounds. Uncertainties from the highly parameterized models had all six forecasts within the calculated uncertainty. The current results outperformed those of the 1988 effort, demonstrating the value of quantitative parameter estimation and uncertainty analysis methods.
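A "computationally frugal" linear (first-order second-moment) uncertainty analysis like the one mentioned propagates a linearised posterior parameter covariance onto a forecast through its sensitivity vector. The sketch below is generic, with illustrative names, not the paper's exact implementation.

```python
import numpy as np

def fosm_forecast_variance(J_obs, J_pred, obs_var, prior_var):
    """Linear (FOSM) forecast variance after calibration.

    J_obs: (n_obs, n_par) sensitivities of observations to parameters;
    J_pred: (n_par,) sensitivities of the forecast; obs_var, prior_var:
    observation-error and prior parameter variances. The posterior
    parameter covariance comes from the linearised Bayesian update and
    is then projected onto the forecast.
    """
    prior_prec = np.diag(1.0 / np.asarray(prior_var, float))
    obs_prec = np.diag(1.0 / np.asarray(obs_var, float))
    post_cov = np.linalg.inv(J_obs.T @ obs_prec @ J_obs + prior_prec)
    return float(J_pred @ post_cov @ J_pred)
```

A forecast falls "within the calculated uncertainty bounds" when the residual between the forecast and the truth is small relative to the square root of this variance; adding informative observations (e.g. the flux measurement above) shrinks it.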
