Similar documents
Found 20 similar documents (search time: 31 ms)
1.
ABSTRACT

The calibration of hydrological models is formulated as a black-box optimization problem where the only information available is the objective function value. Distributed hydrological models are generally computationally intensive, and their calibration may require several hours or days, which can be an issue in many operational contexts. Different optimization algorithms have been developed over the years and exhibit different strengths when applied to the calibration of computationally intensive hydrological models. This paper shows how the dynamically dimensioned search (DDS) and the mesh adaptive direct search (MADS) algorithms can be combined to significantly reduce the computational time of calibrating distributed hydrological models while ensuring robustness and stability with regard to the final objective function values. Five transitional features are described to adequately merge both algorithms. The hybrid approach is applied to the distributed and computationally intensive HYDROTEL model on three different river basins located in Québec (Canada).
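The DDS half of the hybrid can be illustrated with a minimal sketch. This is not the paper's HYDROTEL/MADS setup: the sphere objective, bounds, and parameter values below are purely illustrative stand-ins for a model-error function.

```python
import math
import random

def dds(objective, x0, lower, upper, max_evals=500, r=0.2, seed=0):
    """Minimal dynamically dimensioned search (DDS) sketch.

    Greedy, single-solution search: at evaluation i, each decision variable
    is perturbed with probability 1 - ln(i)/ln(max_evals), so the search
    narrows from global to local as the budget is spent. Perturbations are
    Gaussian, scaled by r times the variable range, and reflected at bounds.
    """
    rng = random.Random(seed)
    n = len(x0)
    best, f_best = list(x0), objective(x0)
    for i in range(1, max_evals):
        p = 1.0 - math.log(i) / math.log(max_evals)
        dims = [j for j in range(n) if rng.random() < p] or [rng.randrange(n)]
        cand = list(best)
        for j in dims:
            step = r * (upper[j] - lower[j]) * rng.gauss(0.0, 1.0)
            x = cand[j] + step
            if x < lower[j]:
                x = min(2 * lower[j] - x, upper[j])   # reflect at the bound
            elif x > upper[j]:
                x = max(2 * upper[j] - x, lower[j])
            cand[j] = x
        f_cand = objective(cand)
        if f_cand <= f_best:                          # greedy acceptance
            best, f_best = cand, f_cand
    return best, f_best

# Toy "calibration": minimise a sphere function standing in for model error.
best, f_best = dds(lambda x: sum(v * v for v in x),
                   x0=[4.0, -3.0], lower=[-5.0, -5.0], upper=[5.0, 5.0])
```

Because acceptance is greedy, the final objective can never be worse than the starting point, which is one reason DDS is popular for tight calibration budgets.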

2.
This study compares formal Bayesian inference to the informal generalized likelihood uncertainty estimation (GLUE) approach for uncertainty-based calibration of rainfall-runoff models in a multi-criteria context. Bayesian inference is accomplished through Markov Chain Monte Carlo (MCMC) sampling based on an auto-regressive multi-criteria likelihood formulation. Non-converged MCMC sampling is also considered as an alternative method. These methods are compared along multiple comparative measures calculated over the calibration and validation periods of two case studies. Results demonstrate that there can be considerable differences in the hydrograph prediction intervals generated by formal and informal strategies for uncertainty-based multi-criteria calibration. Given identical computational budgets, the formal approach generates clearly preferable validation-period results compared to GLUE (i.e., tighter prediction intervals that show higher reliability). Moreover, the performance of non-converged MCMC (based on the standard Gelman–Rubin metric) is reasonably consistent with that of a formal and fully-converged Bayesian approach, even though fully-converged results require a significantly larger number of samples (model evaluations) for the two case studies. Therefore, research to define alternative and more practical convergence criteria for MCMC applications to computationally intensive hydrologic models may be warranted.
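The informal GLUE side of the comparison can be sketched in a few lines. This is a toy version, not the paper's auto-regressive multi-criteria formulation: the linear "rainfall-runoff" model, the Nash-Sutcliffe behavioural threshold of 0.5, and all data values are illustrative assumptions.

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def glue(simulate, sample_prior, obs, threshold=0.5, n_samples=2000, seed=1):
    """Toy GLUE sketch: Monte Carlo sample the prior, keep 'behavioural'
    parameter sets whose informal likelihood (here NSE) exceeds a threshold,
    and return those sets with normalised likelihood weights."""
    rng = random.Random(seed)
    behavioural = []
    for _ in range(n_samples):
        theta = sample_prior(rng)
        score = nse(obs, simulate(theta))
        if score > threshold:
            behavioural.append((theta, score))
    total = sum(s for _, s in behavioural)
    return [(theta, s / total) for theta, s in behavioural]

# Toy linear "rainfall-runoff" model: runoff = theta * rainfall.
rain = [1.0, 2.0, 3.0, 4.0]
obs = [0.5, 1.0, 1.5, 2.0]                       # generated with theta = 0.5
sets = glue(lambda th: [th * r for r in rain],
            lambda rng: rng.uniform(0.0, 1.0), obs)
```

Prediction intervals then come from the weighted spread of the behavioural simulations, which is why GLUE results are sensitive to the (subjective) threshold choice.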

3.
Abstract

The role of accuracy in the representation of infiltration on the effectiveness of real-time flood forecasting models was investigated. A simple semi-distributed model of conceptual type, with an adaptive estimate of the hydraulic characteristics included in the infiltration component, was selected. Infiltration was described by a very accurate approach recently formulated for complex rainfall patterns, or alternatively through a simpler formulation known as an extension of the classical time compression approximation. The results indicated that, for situations involving significant rainfall variability in space, the inaccuracy in the representation of infiltration cannot be corrected by the adaptive component of the rainfall–runoff model. A preliminary analysis of the role of an approximation of the saturated hydraulic conductivity to be used in each homogeneous area of the semi-distributed model, in both the non-adaptive version and in real time, is also presented.

4.
The estimation of missing rainfall data is an important problem for data analysis and modelling studies in hydrology. This paper develops a Bayesian method to address missing rainfall estimation from runoff measurements based on a pre-calibrated conceptual rainfall–runoff model. The Bayesian method assigns a posterior probability to rainfall estimates proportional to the likelihood function of the measured runoff flows and the prior rainfall information, which is represented by uniform distributions in the absence of rainfall data. The likelihood function of the measured runoff can be determined by testing different residual error models in the calibration phase. The application of this method to a French urban catchment indicates that the proposed Bayesian method is able to assess missing rainfall and its uncertainty based only on runoff measurements, which provides an alternative to the reverse model for missing rainfall estimates.
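The core idea, posterior proportional to runoff likelihood times a uniform rainfall prior, can be sketched on a one-dimensional grid. The linear runoff model, the Gaussian residual error with sigma = 0.5, and all numbers below are hypothetical, not the paper's calibrated model.

```python
import math

def rainfall_posterior(runoff_obs, grid, runoff_model, sigma):
    """Posterior probability of a missing rainfall value on a grid:
    proportional to the Gaussian likelihood of the measured runoff under a
    pre-calibrated runoff model, times a uniform prior (a constant, so it
    cancels after normalisation)."""
    logL = [-0.5 * ((runoff_obs - runoff_model(p)) / sigma) ** 2 for p in grid]
    m = max(logL)
    w = [math.exp(l - m) for l in logL]   # stabilised before normalising
    total = sum(w)
    return [v / total for v in w]

# Hypothetical linear model: runoff = 0.6 * rainfall; observed runoff 3.0.
grid = [i * 0.1 for i in range(101)]          # candidate rainfall, 0..10 mm
post = rainfall_posterior(3.0, grid, lambda p: 0.6 * p, sigma=0.5)
best = grid[post.index(max(post))]            # posterior mode
```

The width of `post` around the mode is exactly the rainfall uncertainty the abstract refers to; with the toy linear model the mode lands at 3.0 / 0.6 = 5.0.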

5.
Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and the observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation, which PCC reveals to be large.
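The efficient model-discrimination criteria are simple to compute once each model has been fit. A minimal sketch for least-squares fits with assumed i.i.d. Gaussian errors follows; the two "alternative models", their sum-of-squared-errors values, and the sample size are invented for illustration.

```python
import math

def aicc(sse, n, k):
    """Corrected Akaike information criterion for a least-squares fit with
    n observations and k parameters: AIC plus a small-sample correction."""
    aic = n * math.log(sse / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

def bic(sse, n, k):
    """Bayesian information criterion under the same Gaussian assumption;
    penalises extra parameters more heavily than AICc for moderate n."""
    return n * math.log(sse / n) + k * math.log(n)

# Two hypothetical alternative models fit to n = 30 observations:
# model A (3 parameters) fits slightly worse than model B (6 parameters).
n = 30
scores = {"A": (aicc(12.0, n, 3), bic(12.0, n, 3)),
          "B": (aicc(10.5, n, 6), bic(10.5, n, 6))}
```

Lower scores are better; here the small improvement in fit does not justify doubling the parameter count, so both criteria rank the parsimonious model A first, which mirrors the parsimony argument in the abstract.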

6.
Pump-and-treat systems can prevent the migration of groundwater contaminants, and candidate systems are typically evaluated with groundwater models. Such models should be rigorously assessed to determine predictive capabilities, and numerous tools and techniques for model assessment are available. While various assessment methodologies (e.g., model calibration, uncertainty analysis, and Bayesian inference) are well-established for groundwater modeling, this paper calls attention to an alternative assessment technique known as screening-level sensitivity analysis (SLSA). SLSA can quickly quantify first-order (i.e., main effects) measures of parameter influence in connection with various model outputs. Subsequent comparisons of parameter influence with respect to calibration vs. prediction outputs can suggest gaps in model structure and/or data. Thus, while SLSA has received little attention in the context of groundwater modeling and remedial system design, it can nonetheless serve as a useful and computationally efficient tool for preliminary model assessment. To illustrate the use of SLSA in the context of designing groundwater remediation systems, four SLSA techniques were applied to a hypothetical, yet realistic, pump-and-treat case study to determine the relative influence of six hydraulic conductivity parameters. The considered methods were: Taguchi design-of-experiments (TDOE); Monte Carlo statistical independence (MCSI) tests; average composite scaled sensitivities (ACSS); and elementary effects sensitivity analysis (EESA). In terms of performance, the various methods identified the same parameters as being the most influential for a given simulation output. Furthermore, the results indicate that the background hydraulic conductivity (KBK) is important for predicting system performance, but calibration outputs are insensitive to this parameter.
The observed insensitivity is attributed to a nonphysical specified-head boundary condition used in the model formulation which effectively "staples" head values located within the conductivity zone. Thus, potential strategies for improving model predictive capabilities include additional data collection targeting the KBK parameter and/or revision of the model structure to reduce the influence of the specified-head boundary.
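Of the four SLSA techniques, elementary effects screening (EESA) is the easiest to sketch. This is a simplified one-at-a-time version with an invented two-parameter "head model"; the real method uses trajectory designs over many model outputs.

```python
import random

def elementary_effects(model, lower, upper, n_traj=20, delta=0.1, seed=2):
    """Sketch of elementary-effects (Morris-style) screening: from each
    random base point, perturb one parameter at a time by a fixed fraction
    of its range and record the normalised output change. The mean absolute
    effect per parameter is a cheap first-order measure of influence."""
    rng = random.Random(seed)
    k = len(lower)
    effects = [[] for _ in range(k)]
    for _ in range(n_traj):
        # sample so the perturbed point always stays inside the bounds
        base = [rng.uniform(lower[j], upper[j] - delta * (upper[j] - lower[j]))
                for j in range(k)]
        f0 = model(base)
        for j in range(k):
            pert = list(base)
            pert[j] += delta * (upper[j] - lower[j])
            effects[j].append((model(pert) - f0) / delta)
    return [sum(abs(e) for e in es) / len(es) for es in effects]

# Hypothetical head model: strongly controlled by K1, barely by K2.
mu = elementary_effects(lambda x: 10.0 * x[0] + 0.1 * x[1],
                        lower=[0.0, 0.0], upper=[1.0, 1.0])
```

Comparing the resulting influence measures for calibration outputs against prediction outputs is exactly the kind of cheap screening comparison that exposed the insensitive KBK parameter in the case study.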

7.
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box–Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data from catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box–Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
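The AR(1)-with-Box-Cox building block can be sketched for a single site. All parameter values (lambda, the transformed-space mean, lag-1 coefficient, and noise scale) are invented; the paper's multi-site version adds a spatial correlation function and MCMC-sampled parameters.

```python
import random

def boxcox(x, lam):
    """Forward Box-Cox transformation (lam != 0 branch), applied to data
    before fitting so the transformed series is closer to Gaussian."""
    return (x ** lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Back-transformation used when generating synthetic flows."""
    return (lam * y + 1.0) ** (1.0 / lam)

def simulate_ar1_boxcox(n, lam=0.5, mu=2.0, rho=0.6, sigma=0.3, seed=3):
    """Simulate a lag-1 autoregressive series in Box-Cox-transformed space,
    then back-transform, so the generated 'annual flows' are skewed,
    strictly positive, and serially correlated."""
    rng = random.Random(seed)
    z, series = mu, []
    for _ in range(n):
        z = mu + rho * (z - mu) + sigma * rng.gauss(0.0, 1.0)
        series.append(inv_boxcox(z, lam))
    return series

flows = simulate_ar1_boxcox(5000)
```

Generating many such synthetic sequences and running them through a reservoir simulation is the usual route from this model to the yield and drought-vulnerability statistics discussed in the abstract.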

8.
9.
The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality": its computational cost increases drastically with the number of parameters and the degree of system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations due to the joint updating scheme for strongly nonlinear models. Motivated by recent developments in uncertainty quantification and the EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF has been demonstrated to be more accurate than the EnKF at the same computational cost.
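The EnKF baseline the paper compares against can be sketched for a scalar state with a direct observation. This is the stochastic (perturbed-observation) analysis step only, on a toy Gaussian problem with invented numbers, not an unsaturated flow model.

```python
import random

def enkf_update(ensemble, obs, obs_noise_std, h, rng):
    """Stochastic ensemble Kalman filter analysis step for a scalar state:
    the Kalman gain is built from ensemble (co)variances, and each member
    assimilates an independently perturbed copy of the observation."""
    n = len(ensemble)
    hx = [h(x) for x in ensemble]
    mx = sum(ensemble) / n
    mh = sum(hx) / n
    cov_xh = sum((x - mx) * (y - mh) for x, y in zip(ensemble, hx)) / (n - 1)
    var_h = sum((y - mh) ** 2 for y in hx) / (n - 1)
    gain = cov_xh / (var_h + obs_noise_std ** 2)
    return [x + gain * (obs + rng.gauss(0.0, obs_noise_std) - y)
            for x, y in zip(ensemble, hx)]

rng = random.Random(4)
prior = [rng.gauss(0.0, 1.0) for _ in range(500)]      # prior: N(0, 1)
post = enkf_update(prior, obs=2.0, obs_noise_std=1.0, h=lambda x: x, rng=rng)
```

For this linear Gaussian toy case the exact posterior is N(1, 0.5), so the update should pull the ensemble mean toward 1 and shrink its spread; the Monte Carlo sampling error that motivates large ensembles (and PCE alternatives) is visible in how the sample statistics scatter around those values.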

10.
While seasonal time-varying models should generally be used to predict the daily concentration of ground-level ozone, given its strong seasonal cycles, the sudden switching of models according to their designated period in an annual operational forecasting system may affect their performance, especially during the season's transitional period, whose starting date and duration can vary from year to year. This paper studies the effectiveness of an adaptive Bayesian Model Averaging scheme, supported by a transitional prediction model, in solving this problem. The scheme continuously evaluates the probabilities of all the ozone prediction models (ozone season, non-ozone season, and the transitional period) in a forecasting system, which are then used to provide a weighted average forecast. The scheme was applied to predicting the daily maximum of 8-h averaged ozone concentration in Macau over a period of 2 years (2008 and 2009), and the results proved satisfactory.
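The recursive reweighting at the heart of such a scheme can be sketched with two hypothetical seasonal models and Gaussian forecast errors. The models, their error standard deviations, and the data are all invented for illustration.

```python
import math

def update_weights(weights, models, y_obs, x):
    """Adaptive Bayesian Model Averaging step: each model's weight is
    multiplied by the Gaussian likelihood of the new observation under that
    model's forecast, then the weights are renormalised."""
    posterior = []
    for w, (predict, sigma) in zip(weights, models):
        err = y_obs - predict(x)
        posterior.append(w * math.exp(-0.5 * (err / sigma) ** 2) / sigma)
    total = sum(posterior)
    return [p / total for p in posterior]

def bma_forecast(weights, models, x):
    """Weighted average forecast across all candidate models."""
    return sum(w * predict(x) for w, (predict, _) in zip(weights, models))

# Two hypothetical seasonal models; the data are generated by the first one.
models = [(lambda x: 2.0 * x, 1.0), (lambda x: 0.5 * x, 1.0)]
w = [0.5, 0.5]
for x in [1.0, 2.0, 3.0]:
    w = update_weights(w, models, y_obs=2.0 * x, x=x)
```

Because the weights evolve smoothly with the evidence, the ensemble hands over from one seasonal model to another gradually instead of switching abruptly on a fixed calendar date, which is the behaviour the abstract targets.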

11.
Recent advances in compressed sensing for medical image reconstruction
Compressed sensing (CS) is an emerging theory of signal acquisition and processing. By reducing the amount of data required for signal reconstruction (below the minimum demanded by the Nyquist theorem), it shortens signal sampling time and reduces computation while largely preserving the reconstruction quality of the original image. These notable advantages have attracted wide attention in the medical imaging community and led to substantial progress. This paper reviews the development and state of the art of compressed sensing in medical imaging, describes in detail a new adaptive CS reconstruction algorithm based on dictionary learning, and presents a preliminary validation of the method through computer simulation experiments.
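The soft-thresholding iteration at the heart of many CS reconstruction solvers can be sketched on plain lists. This is a toy ISTA solver, not the dictionary-learning method the abstract describes; the identity sensing matrix and all values below are illustrative.

```python
def soft_threshold(v, lam):
    """Proximal operator of the L1 norm: shrink v toward zero by lam."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def ista(A, y, lam, step, iters=500):
    """Iterative soft-thresholding (ISTA) for the sparse recovery problem
    min_x 0.5 * ||Ax - y||^2 + lam * ||x||_1, the prototype of many
    compressed-sensing reconstruction algorithms (here on small lists,
    not images; step should not exceed 1 / ||A^T A||)."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # gradient of the smooth part: A^T (A x - y)
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step followed by the L1 proximal (shrinkage) step
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Tiny demonstration with an identity sensing matrix: the solution is just
# the observations shrunk toward zero by lam.
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
x = ista(A, y=[1.0, 0.05, -2.0], lam=0.1, step=1.0)
```

Small observation entries are driven exactly to zero while large ones are shrunk slightly, which is how the L1 penalty enforces sparsity in the reconstruction.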

12.
The igneous rocks of the Pongola Supergroup (PS) and Usushwana Intrusive Suite (UIS) represent a case of late Archaean continental magmatism in the southeastern part of the Kaapvaal craton of South Africa and Swaziland.

U-Pb dating of zircons from felsic volcanic rocks of the PS yields a concordia intercept age of 2940 ± 22 Ma, consistent with a Sm-Nd whole-rock age of 2934 ± 114 Ma determined on the PS basalt-rhyolite suite. The initial εNd of −2.6 ± 0.9 is the lowest value so far reported for Archaean mantle-derived rocks. Rb-Sr whole-rock dating of the PS yields a younger isochron age of 2883 ± 69 Ma, which is not significantly different from the accepted U-Pb zircon age.

An internal (cpx-opx-plag-whole rock) isochron for a pyroxenite from the younger UIS yields an age of 2871 ± 30 Ma and an initial 143Nd/144Nd that lies off the CHUR growth curve by εNd = −2.9 ± 0.2. However, Sm-Nd whole-rock data for the UIS yield an excessively high age of 3.1 Ga that conflicts with firm geological evidence showing the UIS to be intrusive into the PS.

The negative deviations of initial εNd from the chondritic Nd evolution curve suggest significant contamination of the PS and UIS melts by older continental crust. A mixing process with continental crust after magma segregation is supported by a high initial 87Sr/86Sr ratio of 0.703024 ± 24 for a clinopyroxene sample from a UIS pyroxenite, compared with an expected value of 0.701 for the 2.9 Ga mantle. We therefore interpret the linear array of data points for the UIS gabbros as a mixing line between 2.87 Ga old magma and older continental crust.

Parallel LREE-enriched REE patterns, negative Nb-Ti anomalies, a distinctive and uniform Ti/Zr ratio of 46, and a narrow span of initial εNd indicate a common source for both the PS and UIS suites which is different from primitive mantle.
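For readers unfamiliar with isochron geochronology, the arithmetic linking an isochron slope to the Sm-Nd ages quoted above is a one-liner. This is generic textbook math, not a recalculation of the paper's data; the 147Sm decay constant is the commonly used literature value.

```python
import math

LAMBDA_SM147 = 6.54e-12   # decay constant of 147Sm, per year (literature value)

def isochron_age(slope):
    """Age implied by the slope of a Sm-Nd isochron in a 143Nd/144Nd vs
    147Sm/144Nd diagram: slope = exp(lambda * t) - 1, solved for t."""
    return math.log(1.0 + slope) / LAMBDA_SM147

# A ~2.87 Ga suite (the UIS internal isochron age) corresponds to a slope of
# roughly exp(lambda * 2.87e9) - 1, i.e. about 0.019.
slope_287ga = math.exp(LAMBDA_SM147 * 2.87e9) - 1.0
```

The same relation, with the appropriate decay constant, underlies the Rb-Sr and U-Pb ages in the abstract; the "excessively high" UIS whole-rock age arises when mixing, not decay alone, controls the slope.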


13.
With the availability of spatially distributed data, distributed hydrologic models are increasingly used for simulation of spatially varied hydrologic processes to understand and manage natural and human activities that affect watershed systems. Multi-objective optimization methods have been applied to calibrate distributed hydrologic models using observed data from multiple sites. As the time consumed by running these complex models increases substantially, selecting efficient and effective multi-objective optimization algorithms is becoming a nontrivial issue. In this study, we evaluated a multi-algorithm, genetically adaptive multi-objective method (AMALGAM) for multi-site calibration of a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT), and compared its performance with two widely used evolutionary multi-objective optimization (EMO) algorithms: the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Non-dominated Sorted Genetic Algorithm II (NSGA-II). In order to provide insights into each method's overall performance, these three methods were tested in four watersheds with various characteristics. The test results indicate that AMALGAM can consistently provide competitive or superior results compared with the other two methods. The multi-method search framework of AMALGAM, which can flexibly and adaptively utilize multiple optimization algorithms, makes it a promising tool for multi-site calibration of the distributed SWAT. For practical use of AMALGAM, it is suggested to run this method in multiple trials with a relatively small number of model runs rather than once with long iterations. In addition, incorporating different multi-objective optimization algorithms and multi-mode search operators into AMALGAM deserves further research. Copyright © 2009 John Wiley & Sons, Ltd.  
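AMALGAM, SPEA2 and NSGA-II all rank candidate parameter sets by Pareto dominance across the per-site objectives. That shared building block is small enough to show directly; the two-site error values below are invented for illustration.

```python
def dominates(a, b):
    """Pareto dominance for minimisation: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated filtering: keep the points no other point dominates.
    This is the ranking primitive shared by NSGA-II, SPEA2 and AMALGAM
    when comparing multi-site calibration trade-offs."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical two-site calibration errors: (site A error, site B error).
front = pareto_front([(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)])
```

The point (3.0, 3.0) is dominated by (2.0, 2.0) and drops out, while the three remaining points form the trade-off front from which a calibration would be selected.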

14.
A simple local error estimator is presented for time integration schemes in dynamic analysis. This error estimator involves only a small computational cost. The time step size is adaptively adjusted so that the local error at each time step is within a prescribed accuracy. It is found that the estimator performs well under various circumstances and provides an economical adaptive process. Attempts to estimate the global time integration error are also reported.
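The adapt-to-local-error idea can be sketched with the cheapest estimator there is, step doubling, on a scalar test equation. This is a generic illustration with explicit Euler, not the paper's structural dynamics scheme; the tolerance, initial step, and controller constants are arbitrary choices.

```python
def integrate_adaptive(f, y0, t_end, tol=1e-5, dt=0.1):
    """Local-error-driven step control: estimate the local error by
    comparing one full Euler step with two half steps (step doubling),
    accept the step only if the estimate is within tolerance, and rescale
    dt either way so each accepted step runs near the tolerance."""
    t, y = 0.0, y0
    while t < t_end - 1e-12:
        dt = min(dt, t_end - t)
        full = y + dt * f(t, y)
        half = y + 0.5 * dt * f(t, y)
        two_half = half + 0.5 * dt * f(t + 0.5 * dt, half)
        err = abs(two_half - full)
        if err <= tol:
            t += dt
            y = 2.0 * two_half - full   # Richardson-extrapolated update
        # grow or shrink dt toward the largest step meeting the tolerance
        dt *= min(2.0, max(0.2, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
    return y

# dy/dt = -y, y(0) = 1: the exact solution at t = 1 is exp(-1).
y1 = integrate_adaptive(lambda t, y: -y, 1.0, 1.0)
```

The controller rejects the too-large initial step, settles on a step size near the tolerance, and keeps the accumulated error far below what a fixed step of 0.1 would give, which is the "economical adaptive process" the abstract describes.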

15.
A common way to simulate fluid flow in porous media is to use Lattice Boltzmann (LB) methods. Permeability predictions from such flow simulations are controlled by parameters whose settings must be calibrated in order to produce realistic modelling results. Herein we focus on the simplest and most commonly used implementation of the LB method: the single-relaxation-time BGK model. A key parameter in the BGK model is the relaxation time τ which controls flow velocity and has a substantial influence on the permeability calculation. Currently there is no rigorous scheme to calibrate its value for models of real media. We show that the standard method of calibration, by matching the flow profile of the analytic Hagen-Poiseuille pipe-flow model, results in a BGK-LB model that is unable to accurately predict permeability even in simple realistic porous media (herein, Fontainebleau sandstone). In order to reconcile the differences between predicted permeability and experimental data, we propose a method to calibrate τ using an enhanced Transitional Markov Chain Monte Carlo method, which is suitable for parallel computer architectures. We also propose a porosity-dependent τ calibration that provides an excellent fit to experimental data and which creates an empirical model that can be used to choose τ for new samples of known porosity. Our Bayesian framework thus provides robust predictions of permeability of realistic porous media, herein demonstrated on the BGK-LB model, and should therefore replace the standard pipe-flow based methods of calibration for more complex media.
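The standard calibration the abstract argues against rests on two textbook relations, which can be sketched in lattice units. The formulas assume a common lattice (sound speed squared of 1/3, as in D2Q9) and force-driven plane Poiseuille flow between parallel plates; the numeric values are illustrative, not from the paper.

```python
def bgk_viscosity(tau):
    """Kinematic viscosity implied by the BGK relaxation time in lattice
    units: nu = c_s^2 * (tau - 1/2), with c_s^2 = 1/3 assumed here."""
    return (tau - 0.5) / 3.0

def poiseuille_max_velocity(g, width, tau):
    """Analytic peak velocity of force-driven plane Poiseuille flow between
    no-slip plates a distance `width` apart: u_max = g * width^2 / (8 * nu).
    Matching a simulated profile against this value is the standard way
    of fixing tau that the paper shows is insufficient for real media."""
    return g * width ** 2 / (8.0 * bgk_viscosity(tau))

# Illustrative lattice-unit numbers: body force 1e-6, channel 10 nodes wide.
u_max = poiseuille_max_velocity(1e-6, 10.0, tau=1.0)
```

Because the analytic target depends on τ only through the viscosity, any τ reproducing the channel profile looks "calibrated", even when it mispredicts permeability in a real pore geometry, which motivates the paper's Bayesian, porosity-dependent alternative.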

16.
Fu Yongshuo, Li Xinxi, Zhou Xuancheng, Geng Xiaojun, Guo Yahui, Zhang Yaru. Science China Earth Sciences, 2020, 63(9): 1237-1247
Plant phenology is the study of the timing of recurrent biological events and the causes of that timing with regard to biotic and abiotic forces. Plant phenology affects the structure and function of terrestrial ecosystems and determines vegetation feedback to the climate system by altering the carbon, water and energy fluxes between the vegetation and the near-surface atmosphere. Therefore, an accurate simulation of plant phenology is essential to improve our understanding of the response of ecosystems to climate change and of the carbon, water and energy balance of terrestrial ecosystems. Phenological studies have developed rapidly under global change conditions, while research on phenology modelling has largely lagged behind. Inaccurate phenology modelling has become the primary limiting factor for the accurate simulation of terrestrial carbon and water cycles. Understanding the mechanism of phenological response to climate change and building process-based plant phenology models are thus important frontier issues. In this review, we first summarize the drivers of plant phenology and give an overview of the development of plant phenology models. Finally, we address the challenges in the development of plant phenology models and highlight that coupling machine learning and Bayesian calibration into process-based models could be a potential approach to improve the accuracy of phenology simulation and prediction under future global change conditions.

17.
Uncertainty is inherent in modelling studies. However, the quantification of uncertainties associated with a model is a challenging task, and hence such studies are somewhat limited. As distributed or semi-distributed hydrological models are being increasingly used to simulate hydrological processes, it is vital that these models be equipped with robust calibration and uncertainty analysis techniques. The goal of the present study was to calibrate and validate the Soil and Water Assessment Tool (SWAT) model for simulating streamflow in a river basin of Eastern India, and to evaluate the performance of salient optimization techniques in quantifying uncertainties. The SWAT model for the study basin was developed and calibrated using the Parameter Solution (ParaSol), Sequential Uncertainty Fitting Algorithm (SUFI-2) and Generalized Likelihood Uncertainty Estimation (GLUE) optimization techniques. The daily observed streamflow data from 1998 to 2003 were used for model calibration, and those for 2004–2005 were used for model validation. Modelling results indicated that all three techniques invariably yield better results for the monthly time step than for the daily time step during both calibration and validation. The model performances for the daily streamflow simulation using ParaSol and SUFI-2 during calibration are reasonably good, with a Nash–Sutcliffe efficiency and mean absolute error (MAE) of 0.88 and 9.70 m3/s for ParaSol, and 0.86 and 10.07 m3/s for SUFI-2, respectively. Among the three techniques, GLUE simulated daily streamflow during calibration with the highest accuracy (R2 = 0.88, MAE = 9.56 m3/s and root mean square error = 19.70 m3/s). The results of the uncertainty analyses by SUFI-2 and GLUE were compared in terms of parameter uncertainty. It was found that SUFI-2 is capable of estimating uncertainties in complex hydrological models like SWAT, but it warrants sound knowledge of the parameters and their effects on the model output. On the other hand, GLUE predicts more reliable uncertainty ranges (R-factor = 0.52 for daily calibration and 0.48 for validation) compared to SUFI-2 (R-factor = 0.59 for daily calibration and 0.55 for validation), though it is computationally demanding. Although both SUFI-2 and GLUE appear to be promising techniques for the uncertainty analysis of modelling results, further studies under varying agro-climatic conditions are required to assess their generic capability. Copyright © 2015 John Wiley & Sons, Ltd.
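The R-factor used to compare the SUFI-2 and GLUE bands (together with the companion p-factor) is straightforward to compute from a prediction band and the observations. The five observations and the constant-width band below are invented for illustration.

```python
def p_factor(obs, lower, upper):
    """Fraction of observations bracketed by the uncertainty band."""
    inside = sum(1 for o, l, u in zip(obs, lower, upper) if l <= o <= u)
    return inside / len(obs)

def r_factor(obs, lower, upper):
    """Average band width divided by the standard deviation of the
    observations: smaller values mean tighter (more informative) bands,
    which is how the SUFI-2 and GLUE ranges above are being compared."""
    n = len(obs)
    mean_width = sum(u - l for l, u in zip(lower, upper)) / n
    mean_obs = sum(obs) / n
    std_obs = (sum((o - mean_obs) ** 2 for o in obs) / n) ** 0.5
    return mean_width / std_obs

# Hypothetical daily flows (m3/s) with a +/- 1.0 m3/s band around each.
obs = [10.0, 12.0, 9.0, 14.0, 11.0]
lower = [o - 1.0 for o in obs]
upper = [o + 1.0 for o in obs]
```

A good band scores a p-factor near 1 with an R-factor as small as possible; the two statistics trade off against each other, which is why both are usually reported together.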

18.
Automatic calibration of complex subsurface reaction models involves numerous difficulties, including the existence of multiple plausible models, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study investigated a novel procedure for performing simultaneous calibration of multiple models (SCMM). By combining a hybrid global-plus-polishing search heuristic with a biased-but-random adaptive model evaluation step, the new SCMM method calibrates multiple models via efficient exploration of the multi-model calibration space. Central algorithm components are an adaptive assignment of model preference weights, mapping functions relating the uncertain parameters of the alternative models, and a shuffling step that efficiently exploits pseudo-optimal configurations of the alternative models. The SCMM approach was applied to two nitrate contamination problems involving batch reactions and one-dimensional reactive transport. For the chosen problems, the new method produced improved model fits (i.e. up to 35% reduction in objective function) at significantly reduced computational expense (i.e. 40–90% reduction in model evaluations), relative to previously established benchmarks. Although the method was effective for the test cases, SCMM relies on a relatively ad hoc approach to assigning intermediate preference weights and parameter mapping functions. Despite these limitations, the results of the numerical experiments are empirically promising, and the reasoning and structure of the approach provide a strong foundation for further development.

19.
20.
A key aim of most extreme value analyses is the estimation of the r-year return level; the wind speed, or sea-surge, or rainfall level (for example), we might expect to see once (on average) every r years. There are compelling arguments for working within the Bayesian setting here, not least the natural extension to prediction via the posterior predictive distribution. Indeed, for practitioners the posterior predictive return level has been cited as perhaps the most useful point summary from a Bayesian analysis of extremes, and yet little is known of the properties of this statistic. In this paper, we attempt to assess the performance of predictive return levels relative to their estimative counterparts obtained directly from the return level posterior distribution; in particular, we make comparisons with the return level posterior mean, mode and 95% credible upper bound. Differences between the predictive return level and standard summaries from the return level posterior distribution, for wind speed extremes observed in the UK, motivates this work. A large scale simulation study then reveals the superiority of the predictive return level over the other posterior summaries in many cases of practical interest.
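The estimative/predictive distinction can be made concrete for a GEV fit to annual maxima. This is a generic sketch, not the paper's simulation study: the hypothetical wind-speed parameters, the bracketing interval, and the one-sample "posterior" are all illustrative, and with more than one posterior sample the two levels would differ.

```python
import math

def gev_cdf(z, mu, sigma, xi):
    """GEV distribution function (xi != 0 branch)."""
    s = 1.0 + xi * (z - mu) / sigma
    if s <= 0.0:
        return 0.0 if xi > 0 else 1.0
    return math.exp(-s ** (-1.0 / xi))

def return_level(mu, sigma, xi, r):
    """Estimative r-year return level from a single parameter set:
    the level exceeded with probability 1/r in any year."""
    yp = -math.log(1.0 - 1.0 / r)
    return mu + (sigma / xi) * (yp ** (-xi) - 1.0)

def predictive_return_level(posterior, r, lo, hi, iters=80):
    """Posterior predictive return level: bisect for the level z whose
    non-exceedance probability, averaged over posterior parameter samples,
    equals 1 - 1/r. Averaging the CDF (not the level) is what makes the
    predictive summary differ from plugging in a posterior point estimate."""
    target = 1.0 - 1.0 / r
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        p = sum(gev_cdf(mid, *th) for th in posterior) / len(posterior)
        if p < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With a one-sample "posterior" the predictive and estimative levels agree.
theta = (30.0, 5.0, 0.1)            # hypothetical wind-speed GEV parameters
z_est = return_level(*theta, r=50)
z_pred = predictive_return_level([theta], r=50, lo=30.0, hi=200.0)
```

With genuine posterior spread, averaging exceedance probabilities inflates the predictive level relative to the plug-in estimate in the heavy upper tail, which is the behaviour the paper's simulation study examines.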
