Similar Literature
20 similar documents found.
1.
Regional flood frequency analysis (RFFA) is widely used in practice to estimate flood quantiles in ungauged catchments. Most commonly adopted RFFA methods, such as the quantile regression technique (QRT), assume a log-linear relationship between the dependent variable and a set of predictor variables. As non-linear models and universal approximators, artificial neural networks (ANN) have been widely adopted in rainfall–runoff modeling and hydrologic forecasting, but there have been relatively few studies involving the application of ANN to RFFA for estimating flood quantiles in ungauged catchments. This paper thus focuses on the development and testing of an ANN-based RFFA model using an extensive Australian database consisting of 452 gauged catchments. Based on independent testing, it has been found that an ANN-based RFFA model with only two predictor variables can provide flood quantile estimates that are more accurate than the traditional QRT. Seven different regions have been compared using the ANN-based RFFA model, and it has been shown that when the data from all the eastern Australian states are combined to form a single region, the ANN gives the best-performing RFFA model. This indicates that a relatively larger dataset is better suited for successful training and testing of ANN-based RFFA models.
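As a hedged illustration of the modelling step described above, the sketch below fits a small feed-forward ANN to synthetic catchment data and scores it on an independent test split. The two predictors (catchment area and design rainfall intensity), the network size and the synthetic log-quantile relationship are assumptions for illustration; they are not the study's actual dataset or architecture.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 452                                             # number of gauged catchments
area = rng.lognormal(5, 1, n)                       # catchment area (km^2), synthetic
intensity = rng.lognormal(4, 0.3, n)                # design rainfall intensity (mm/h), synthetic
X = np.log(np.column_stack([area, intensity]))      # two log-transformed predictors
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, n)   # log of a flood quantile (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(X_train, y_train)
print("independent-test R^2:", ann.score(X_test, y_test))
```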

2.
The index flood method is widely used in regional flood frequency analysis (RFFA) but explicitly relies on the identification of ‘acceptable homogeneous regions’. This paper presents an alternative RFFA method, which is particularly useful when ‘acceptably homogeneous regions’ cannot be identified. The new RFFA method is based on the region of influence (ROI) approach where a ‘local region’ can be formed to estimate statistics at the site of interest. The new method is applied here to regionalize the parameters of the log‐Pearson 3 (LP3) flood probability model using Bayesian generalized least squares (GLS) regression. The ROI approach is used to reduce model error arising from the heterogeneity unaccounted for by the predictor variables in the traditional fixed‐region GLS analysis. A case study was undertaken for 55 catchments located in eastern New South Wales, Australia. The selection of predictor variables was guided by minimizing model error. Using an approach similar to stepwise regression, the best model for the LP3 mean was found to use catchment area and 50‐year, 12‐h rainfall intensity as explanatory variables, whereas the models for the LP3 standard deviation and skewness only had a constant term for the derived ROIs. Diagnostics based on leave‐one‐out cross validation show that the regression model assumptions were not inconsistent with the data and, importantly, no genuine outlier sites were identified. Significantly, the ROI GLS approach produced more accurate and consistent results than a fixed‐region GLS model, highlighting the superior ability of the ROI approach to deal with heterogeneity. This method is particularly applicable to regions that show a high degree of regional heterogeneity. Copyright © 2014 John Wiley & Sons, Ltd.
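The following sketch illustrates the region-of-influence idea under stated assumptions: for a target site, the nearest catchments in predictor space form a local region, and a regression for a regional statistic (e.g. the LP3 mean) is fitted on that region only. Ordinary least squares stands in for the paper's Bayesian GLS, and the predictors, ROI size and synthetic data are illustrative.

```python
import numpy as np

def roi_predict(target_x, X, y, k=20):
    """Predict a regional statistic (e.g. the LP3 mean) for an ungauged site."""
    d = np.linalg.norm(X - target_x, axis=1)       # distance in predictor space
    idx = np.argsort(d)[:k]                        # k nearest sites form the ROI
    A = np.column_stack([np.ones(k), X[idx]])      # design matrix with intercept
    beta, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return np.r_[1.0, target_x] @ beta

rng = np.random.default_rng(0)
X = rng.normal(size=(55, 2))                       # e.g. log area, log rainfall intensity (synthetic)
y = 1.5 + 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.1, 55)
print(roi_predict(X[0], np.delete(X, 0, axis=0), np.delete(y, 0)))  # leave-one-out style check
```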

3.
Regression‐based regional flood frequency analysis (RFFA) methods are widely adopted in hydrology. This paper compares two regression‐based RFFA methods using a Bayesian generalized least squares (GLS) modelling framework; the two are quantile regression technique (QRT) and parameter regression technique (PRT). In this study, the QRT focuses on the development of prediction equations for a flood quantile in the range of 2 to 100 years average recurrence intervals (ARI), while the PRT develops prediction equations for the first three moments of the log Pearson Type 3 (LP3) distribution, which are the mean, standard deviation and skew of the logarithms of the annual maximum flows; these regional parameters are then used to fit the LP3 distribution to estimate the desired flood quantiles at a given site. It has been shown that using a method similar to stepwise regression and by employing a number of statistics such as the model error variance, average variance of prediction, Bayesian information criterion and Akaike information criterion, the best set of explanatory variables in the GLS regression can be identified. In this study, a range of statistics and diagnostic plots have been adopted to evaluate the regression models. The method has been applied to 53 catchments in Tasmania, Australia. It has been found that catchment area and design rainfall intensity are the most important explanatory variables in predicting flood quantiles using the QRT. For the PRT, a total of four explanatory variables were adopted for predicting the mean, standard deviation and skew. The developed regression models satisfy the underlying model assumptions quite well; of importance, no outlier sites are detected in the plots of the regression diagnostics of the adopted regression equations. Based on ‘one‐at‐a‐time cross validation’ and a number of evaluation statistics, it has been found that for Tasmania the QRT provides more accurate flood quantile estimates for the higher ARIs while the PRT provides relatively better estimates for the smaller ARIs. The RFFA techniques presented here can easily be adapted to other Australian states and countries to derive more accurate regional flood predictions. Copyright © 2011 John Wiley & Sons, Ltd.
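A minimal sketch of the PRT's final step follows: given regional estimates of the mean, standard deviation and skew of the log10 annual maxima (the values below are invented, not from the Tasmanian study), the LP3 quantile for a chosen ARI can be read off from a Pearson type 3 distribution fitted to the logarithms.

```python
from scipy.stats import pearson3

# Regional (regressed) moments of log10(annual maximum flow) for an ungauged
# site -- the values below are invented for illustration only.
mean_log, std_log, skew_log = 2.1, 0.35, -0.2

for ari in (2, 20, 100):                 # average recurrence interval (years)
    p = 1 - 1.0 / ari                    # annual non-exceedance probability
    q = 10 ** pearson3.ppf(p, skew_log, loc=mean_log, scale=std_log)
    print(f"Q{ari} ~ {q:.0f} m^3/s")
```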

4.
Regional flood frequency analysis (RFFA) was carried out on data for 55 hydrometric stations in Namak Lake basin, Iran, for the period 1992–2012. Flood discharge of specific return periods was computed based on the log Pearson Type III distribution, selected as the best regional distribution. Independent variables, including physiographic, meteorological, geological and land-use variables, were derived and, using three strategies – gamma test (GT), GT plus classification and expert opinion – the best input combination was selected. To select the best technique for regionalization, support vector regression (SVR), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and nonlinear regression (NLR) techniques were applied to predict peak flood discharge for 2-, 5-, 10-, 25-, 50- and 100-year return periods. The GT + ANFIS and GT + SVR models gave better performance than the ANN and NLR models in the RFFA. The results of the input variable selection showed that the GT technique improved the model performance.

5.
A generalized additive model (GAM) was used to model the spatial distribution of snow depth in the central Spanish Pyrenees. Statistically significant non‐linear relationships were found between distinct location and topographical variables and the average depth of the April snowpack at 76 snow poles from 1985 to 2000. The joint effect of the predictor variables explained more than 73% of the variance of the dependent variable. The performance of the model was assessed by applying a number of quantitative approaches to the residuals from a cross‐validation test. The relatively low estimated errors and the possibility of understanding the processes that control snow accumulation, through the response curves of each independent variable, indicate that GAMs may be a useful tool for interpolating local snow depth or other climate parameters. Copyright © 2005 John Wiley & Sons, Ltd.
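Below is a minimal GAM sketch in the spirit of the snow-depth model, using the third-party pyGAM library (one possible implementation; the paper does not prescribe one). The predictors (elevation, slope, easting) and the synthetic response are assumed stand-ins for the location and topographic variables at the 76 snow poles.

```python
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(1)
n = 76                                              # snow poles
elevation = rng.uniform(1500, 2500, n)              # m a.s.l. (synthetic)
slope = rng.uniform(0, 40, n)                       # degrees (synthetic)
easting = rng.uniform(0, 100, n)                    # km (synthetic)
X = np.column_stack([elevation, slope, easting])
depth = 0.002 * elevation + 0.01 * slope + rng.normal(0, 0.3, n)  # April snow depth (m)

gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, depth)   # one smooth term per predictor
gam.summary()                                       # explained variance and smooth-term significance
```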

6.
7.
8.
9.
Identifying physical catchment processes from streamflow data, such as quick- and slow-flow paths, remains challenging. This study is designed to explore whether a flexible nonparametric regression model (generalized additive model, GAM) can be used to infer different flow paths. This assumes that the data relationship in data-driven models is also a reflection of catchment physical processes. The GAM, using time-lagged flow covariates, was fitted to synthetic rainfall–runoff data simulated using simple linear reservoirs. Partial plots of the time-lagged covariates show that the model could differentiate simple and more complex flow paths in simulated synthetic data with short and long memory systems and varying between dry and wet climates. Further analysis of data from real catchments showed that the model could differentiate catchments dominated by slow flow and by quick flow. Therefore, this study indicates that GAM can be used to identify catchment storages and delay processes from streamflow data.
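The sketch below generates the kind of synthetic rainfall–runoff data described above, routing random rainfall through two parallel linear reservoirs (a quick and a slow store). The recession constants and rainfall statistics are illustrative assumptions; fitting a GAM with time-lagged flow covariates (as in item 5) would be the subsequent step.

```python
import numpy as np

rng = np.random.default_rng(2)
rain = rng.exponential(2.0, 365) * (rng.random(365) < 0.3)   # intermittent daily rainfall (mm)
k_quick, k_slow = 0.5, 0.05                                  # reservoir outflow constants (1/day)
s_quick = s_slow = 0.0
flow = np.empty(365)
for t, p in enumerate(rain):
    s_quick += 0.7 * p                     # 70% of rainfall routed to the quick store
    s_slow += 0.3 * p                      # 30% to the slow (baseflow) store
    q_quick, q_slow = k_quick * s_quick, k_slow * s_slow
    s_quick -= q_quick
    s_slow -= q_slow
    flow[t] = q_quick + q_slow             # total streamflow
# A GAM with time-lagged covariates flow[t-1], flow[t-2], ... would then be
# fitted to these series to recover the quick- and slow-flow signatures.
```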

10.
Regional frequency analysis is an important tool in estimating design floods for ungauged catchments. The index flood is an important component in regionalized flood formulas. In the past, many formulas have been developed based on various numbers of calibration catchments (e.g. from fewer than 20 to several hundred). However, there is a lack of systematic research on the model uncertainties caused by the number of calibration catchments (i.e. what is the minimum number of calibration catchments, and how should the calibration catchments be chosen?). This study uses a statistical resampling technique to explore the impact of the number of calibration catchments on index flood estimation. The study is based on 182 catchments in England, and an index flood formula has been developed using an input variable selection technique from the data mining field. The formula has been used to explore the model uncertainty due to a range of calibration catchment numbers (from 15 to 130). It is found that (1) as expected, the more catchments are used in calibration, the more reliable the developed models are (i.e. the narrower the band of uncertainty); (2) however, poor models are still possible with a large number of calibration catchments (e.g. 130). In contrast, good models are also achievable with a small number of calibration catchments (as few as 15). This indicates that the number of calibration catchments is only one of the factors influencing model performance. The hydrological community should explore why a smaller calibration data set can produce a better model than a larger one. It is clear from this study that the information content of the calibration data set is at least as important as the number of calibration catchments. Copyright © 2011 John Wiley & Sons, Ltd.
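A hedged sketch of the resampling experiment is given below: calibration subsets of a given size are drawn repeatedly, a one-predictor index flood regression is refitted each time, and its error on the remaining catchments is recorded. The data and the single-predictor formula are synthetic stand-ins for the 182 English catchments and the mined formula.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 182                                              # catchments, as in the study
log_area = rng.normal(5, 1, n)                       # synthetic predictor
log_qmed = 0.85 * log_area + rng.normal(0, 0.4, n)   # synthetic index flood (log space)

def rmse_over_subsets(k, trials=200):
    """Refit the formula on random calibration subsets of size k and validate on the rest."""
    errs = []
    for _ in range(trials):
        cal = rng.choice(n, size=k, replace=False)
        val = np.setdiff1d(np.arange(n), cal)
        slope, intercept = np.polyfit(log_area[cal], log_qmed[cal], 1)
        pred = slope * log_area[val] + intercept
        errs.append(np.sqrt(np.mean((pred - log_qmed[val]) ** 2)))
    return np.mean(errs), np.std(errs)               # spread reflects model uncertainty

for k in (15, 50, 130):
    print(k, rmse_over_subsets(k))
```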

11.
ABSTRACT

Flood peaks and volumes are essential design variables and can be simulated by precipitation–runoff (P–R) modelling. The high-resolution precipitation time series that are often required for this purpose can be generated by various temporal disaggregation methods. Here, we compare a simple method (M1, one parameter), focusing on the effective precipitation duration for flood simulations, with a multiplicative cascade model (M2, 32/36 parameters). While M2 aims at generating realistic characteristics of precipitation time series, M1 aims only at accurately reproducing flood variables by P–R modelling. Both disaggregation methods were tested on precipitation time series of nine Swiss mesoscale catchments. The generated high-resolution time series served as input for P–R modelling using a lumped HBV model. The results indicate that differences identified in precipitation characteristics of disaggregated time series vanish when introduced into the lumped hydrological model. Moreover, flood peaks were more sensitive than flood volumes to the choice of disaggregation method.
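The following is a minimal multiplicative-cascade sketch in the spirit of method M2: a daily precipitation total is split dyadically, each branch receiving a random share of its parent, until the target resolution is reached. The beta-distributed weights are illustrative assumptions, not the study's fitted 32/36 parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

def cascade_disaggregate(daily_total, levels=5):
    """Split a daily total dyadically; 5 levels give 32 sub-daily steps (~45 min)."""
    values = np.array([daily_total])
    for _ in range(levels):
        w = rng.beta(2, 2, size=values.size)          # random share to the first half
        values = np.column_stack([values * w, values * (1 - w)]).ravel()
    return values

fine = cascade_disaggregate(24.0)                     # 24 mm in one day
assert abs(fine.sum() - 24.0) < 1e-9                  # mass is conserved at every level
print(fine.round(2))
```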

12.
Flood forecasting in ungauged basins (PUB) remains a major unresolved challenge in hydrological science and engineering. Most small and medium-sized catchments in the humid mountainous areas of China are data-sparse, and estimation of hydrological model parameters is one of the key means of flood forecasting in such areas. An estimation method based on the physical meaning of the parameters (hereafter the physical estimation method) and two regionalization methods were studied and applied to the estimation and transfer of Xinanjiang model parameters. Twenty-nine small and medium-sized catchments in the mountainous area of southern Anhui…

13.
14.
Rainfall–runoff modelling, being stochastic in nature, depends on various climatological variables and catchment characteristics, and numerous hydrological models have therefore been developed to simulate this complex process. One approach to modelling this complex non-linear rainfall–runoff process is to combine the outputs of various models to obtain more accurate and reliable results. This multi-model combination approach relies on the fact that different models capture different features of the data, and hence a combination of these features would yield better results. This study presents, for the first time, a novel wavelet-based combination approach for estimating combined runoff. The simulated daily output (runoff) of five selected conventional rainfall–runoff models from seven catchments located in different parts of the world was used in the current study to estimate combined runoff for each time period. The five rainfall–runoff models comprised four data-driven models, namely the simple linear model, the linear perturbation model, the linearly varying variable gain factor model and the constrained linear systems with a single threshold, and one conceptual model, namely the soil moisture accounting and routing model. The multilayer perceptron neural network method was used to develop the combined wavelet-coupled models and to evaluate the effect of wavelet transformation (WT). The performance of the developed wavelet-coupled combination models was compared with that of their counterpart simple combination models developed without WT. It was concluded that the presented wavelet-coupled combination approach outperformed the existing approaches that combine different models without applying input WT. The study also recommends that the models in a combination approach be selected on the basis of their individual performance.
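As a hedged sketch of a wavelet-coupled combination, the code below decomposes each candidate model's simulated runoff with a stationary wavelet transform (PyWavelets) and feeds the sub-series jointly to a multilayer perceptron that learns to reproduce the observed runoff. The series, wavelet, decomposition level and network size are illustrative assumptions, not the study's configuration.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
t = np.arange(512)
obs = 5 + 3 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 0.2, 512)      # "observed" runoff (synthetic)
sims = [obs + rng.normal(0, s, 512) for s in (0.5, 1.0, 2.0)]           # outputs of three candidate models

features = []
for sim in sims:
    for cA, cD in pywt.swt(sim, "db4", level=2):     # stationary WT keeps the series length
        features.extend([cA, cD])
X = np.column_stack(features)                        # wavelet sub-series as combination inputs

mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
mlp.fit(X[:400], obs[:400])
print("validation R^2:", mlp.score(X[400:], obs[400:]))
```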

15.
Despite uncertainties and errors in measurement, observed peak discharges are the best estimate of the true peak discharge from a catchment. However, in ungauged catchments, the catchment response time is a fundamental input to all methods of estimating peak discharges; hence, errors in estimated catchment response time directly impact estimated peak discharges. In South Africa, this is particularly the case in ungauged medium to large catchments, where practitioners are limited to using empirical methods that were calibrated on small catchments not located in South Africa. The time to peak (TP), time of concentration (TC) and lag time (TL) are internationally the most frequently used catchment response time parameters and are normally estimated using either hydraulic or empirical methods. Almost 95% of all the time parameter estimation methods developed internationally are empirically based. This paper presents the derivation and verification of empirical TP equations in a pilot-scale study using 74 catchments located in four climatologically different regions of South Africa, with catchment areas ranging from 20 km2 to 35 000 km2. The objective is to develop unique relationships between observed TP values and key climatological and geomorphological catchment predictor variables in order to estimate catchment TP values in ungauged catchments. The results show that the derived empirical TP equation(s) meet the requirements of consistency and ease of application. Independent verification tests confirmed the consistency, while the statistically significant independent predictor variables included in the regressions provide a good estimation of catchment response times and are also easy for practitioners to determine when required for future applications in ungauged catchments. It is recommended that the methodology used in this study be expanded to other catchments to enable the development of a regional approach to improve estimation of time parameters at a national scale. Such a national‐scale application would not only increase confidence in using the suggested methodology and equation(s) in South Africa, but would also demonstrate that a similar approach could be adopted internationally. Copyright © 2016 John Wiley & Sons, Ltd.
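The sketch below shows one way such an empirical TP equation can be derived: a power-law form TP = a·A^b·S^c is fitted by ordinary least squares in log space. The predictors (area and main-channel slope), the synthetic observations and the resulting coefficients are illustrative assumptions, not the South African results.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 74                                                # catchments, as in the pilot study
A = rng.uniform(20, 35000, n)                         # catchment area (km^2), synthetic
S = rng.uniform(0.001, 0.05, n)                       # main-channel slope (m/m), synthetic
TP = 0.6 * A ** 0.35 * S ** -0.2 * rng.lognormal(0, 0.2, n)   # "observed" time to peak (h)

X = np.column_stack([np.ones(n), np.log(A), np.log(S)])       # log-linear design matrix
coef, *_ = np.linalg.lstsq(X, np.log(TP), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"TP ~ {a:.2f} * A^{b:.2f} * S^{c:.2f}")
```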

16.
Abstract

There has been a trend in recent years towards the development and popularity of physically-based deterministic models. However, the application of such models is not without difficulties. This paper investigates the usefulness of a conceptual single-event model for simulating floods from catchments covering a wide variety of climatic and physiographic areas. The model has been calibrated on a group of catchments and the calibrated parameter values related to physical catchment indices. The resulting quantitative relationships are assessed with respect to their value for estimating the parameter values of the model when calibration is not possible. The results indicate that the technique is likely to provide flood estimations for medium sized catchments (5–150 km2) that are more reliable than several flood estimation methods currently in use in South Africa.

17.
Due to the severity of extreme flood events, recent efforts have focused on the development of reliable methods for design flood estimation. Historical streamflow series are the most reliable information source for such estimation; however, they have temporal and spatial limitations that may be minimized by means of regional flood frequency analysis (RFFA). Several studies have emphasized that the identification of hydrologically homogeneous regions is the most important and challenging step in an RFFA. This study aims to identify state-of-the-art clustering techniques (e.g., K-means, partition around medoids, fuzzy C-means, K-harmonic means, and genetic K-means) with potential to form hydrologically homogeneous regions for flood regionalization in Southern Brazil. The applicability of several probability density functions, such as generalized extreme value, generalized logistic, generalized normal, and Pearson type 3, was evaluated based on the regions formed. Among all the 15 possible combinations of the aforementioned clustering techniques and the Euclidean, Mahalanobis, and Manhattan distance measures, the five best were selected. Several physiographic and climatological watershed attributes were chosen to derive multiple regression equations for all the combinations. The accuracy of the equations was quantified with respect to the adjusted coefficient of determination, root mean square error, and Nash–Sutcliffe coefficient, whereas a cross-validation procedure was applied to check their reliability. It was concluded that reliable results were obtained when using robust clustering techniques based on fuzzy logic (e.g., K-harmonic means), which have not been commonly used in RFFA. Furthermore, the probability density functions were capable of representing the regional annual maximum streamflows. Drainage area, main river length, and mean altitude of the watershed were the most recurrent attributes for modelling of mean annual maximum streamflow. Finally, an integration of all the five best combinations stands out as a robust, reliable, and simple tool for estimation of design floods.
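A minimal sketch of the region-forming step follows, clustering standardized catchment attributes with K-means and Euclidean distance (one of the fifteen combinations compared). The attributes, number of catchments and choice of k are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
attributes = np.column_stack([
    rng.lognormal(5, 1, 120),        # drainage area (synthetic)
    rng.uniform(10, 300, 120),       # main river length (synthetic)
    rng.uniform(200, 1500, 120),     # mean altitude of the watershed (synthetic)
])
Z = StandardScaler().fit_transform(attributes)        # remove scale effects before clustering
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
# Each candidate region (cluster) would then be screened with a homogeneity
# test before regional growth curves or regression equations are fitted.
print(np.bincount(labels))
```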

18.
Streamflow response of Belgian catchments to IPCC climate change scenarios
The IRMB (Integrated Runoff Model—F. Bultot) daily step conceptual model has been applied to eight Belgian catchments with areas ranging from 100 to 1200 km2. These catchments are characterized by various infiltration rates and groundwater storage capacities. The outputs of six GCMs (General Circulation Models) distributed by the IPCC (Intergovernmental Panel on Climate Change) and an earlier scenario have been used to perturb time series of hydrometeorological input data relevant to simulating the water cycle. This paper focuses on the impacts on streamflow and its surface and underground components, as well as on the occurrence of flood days and low flow days. Impacts are shown to be catchment and scenario dependent. Due to the scenario diversity, streamflow impacts are found to be either positive or negative. The trends are common to scenarios with the same patterns or to catchments with similar characteristics. For all but two scenarios, all the catchments present an increase of flood frequency. Nevertheless, for all the scenarios, catchments with prevailing surface flow undergo an increase in flood frequency during winter months.

19.
Understanding, analysing, and predicting the erosion mechanisms and sedimentary flows produced by catchments plays a key role in environmental conservation and restoration management and policies. Numerical case-testing studies are generally undertaken to analyse the sensitivity of flood and soil erosion processes to the physical characteristics of catchments. Most analyses are conducted on simple virtual catchments with physical characteristics that, unlike real catchments, are perfectly controlled. Virtual catchments generally correspond to V-shaped valley catchments. However, although these catchments are suitable for methodical analysis of the results, they do not provide a realistic representation of the spatial structures of the landscape and field conditions. They can, therefore, lead to potential modelling errors and can make it difficult to extend or generalize their results. Our proposed method bridges the gap between real and traditional virtual catchments by creating realistic virtual catchments with perfectly controllable physical characteristics. Our approach represents a real alternative to traditional test case procedures and provides a new framework for geomorphological and hydrological communities. It combines a field procedural generation approach, geographic information system processing procedures, and the CAESAR-Lisflood landscape evolution model. We illustrate how each of these components acts in the process of generating virtual catchments. Five physical parameters were adjusted and tested for each virtual catchment: drainage density, hypsometric integral, mean slope of the main channel, granulometry, and land use. One of our virtual catchments is compared with a real catchment and a virtual catchment produced by a standard method. This comparison indicates that our approach can produce more realistic virtual catchments than those produced by more traditional methods, while a high degree of controllability is maintained. This new method of generating virtual catchments therefore offers significant research potential to identify the impacts of the physical characteristics of catchments on hydro-sedimentary dynamics and responses.

20.
Hydrological Sciences Journal, 2013, 58(1): 86–87.

