Similar articles
1.
2.
Abstract

Pooling of flood data is widely used to provide a framework for estimating design floods by the Index Flood method. Design flood estimation with this approach involves derivation of a growth curve which shows the relationship between XT and the return period T, where XT = QT/QI and QI is the index flood at the site of interest. An implicit assumption of the Index Flood pooling procedure is that the XT–T relationship is the same at all sites in a homogeneous pooling group, although this assumption is generally violated to some extent in practical cases, i.e. some degree of heterogeneity exists. In fact, only in some cases is the homogeneity criterion effectively satisfied under Irish conditions. In this paper, the performance of index-flood pooling analysis is assessed in the Irish low-CV (coefficient of variation) hydrological context, with heterogeneity taken into account. It is found that the performance of the pooling method is satisfactory provided at least 350 station-years of data are included. It is also found that, in a highly heterogeneous group, it is more desirable to have many sites with short record lengths than a smaller number of sites with long record lengths. Increased heterogeneity decreases the advantage of pooling-group-based estimation over at-site estimation. Only when the heterogeneity measure (H1) is less than 4.0 is pooled estimation of the 100-year flood preferable to at-site estimation. In moderately to highly heterogeneous regions it is preferable to conduct at-site analysis for estimation of the 100-year flood if the record length at the site concerned exceeds 50 years.

Editor Z.W. Kundzewicz; Associate editor A. Carsteanu

Citation Das, S. and Cunnane, C., 2012. Performance of flood frequency pooling analysis in a low CV context. Hydrological Sciences Journal, 57 (3), 433–444.
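A minimal sketch of the index-flood pooling idea summarized above, assuming a GEV growth curve fitted to record-length-weighted regional L-moments of mean-standardized annual maxima; the site data are synthetic and only illustrative.

```python
# Index-flood pooled estimation sketch (hypothetical data): standardize each
# site's annual maxima by the at-site mean (the index flood Q_I), pool the
# L-moments, fit a GEV growth curve, and compute Q_T = X_T * Q_I.
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    """First three sample L-moments (l1, l2, t3) via probability weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """Hosking's approximate GEV parameter estimates (location, scale, shape k)."""
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
    xi = l1 - alpha * (1 - gamma(1 + k)) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, T):
    F = 1 - 1.0 / T
    return xi + alpha / k * (1 - (-log(F)) ** k)

# Hypothetical pooling group: annual maximum series (m3/s) at several sites.
rng = np.random.default_rng(1)
group = [rng.gumbel(100, 20, size=n) for n in (35, 42, 28, 55)]

# Standardize by the at-site mean and pool record-length-weighted L-moments.
std = [q / q.mean() for q in group]
n_rec = np.array([len(q) for q in std])
lmom = np.array([sample_lmoments(q) for q in std])
l1, l2, t3 = (np.average(lmom[:, i], weights=n_rec) for i in range(3))

xi, alpha, k = gev_from_lmoments(l1, l2, t3)
x100 = gev_quantile(xi, alpha, k, 100)      # growth factor X_100
q_index = group[0].mean()                   # index flood at the site of interest
print(f"X_100 = {x100:.2f},  Q_100 = {x100 * q_index:.1f} m3/s")
```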

3.
4.
Rainfall intensity–duration–frequency (IDF) curves are used in the design of urban infrastructure. Their estimation is based on rainfall frequency analysis, usually performed on rainfall records from a single gauged station. However, available at‐site record length is often too short to provide accurate estimates for long return periods. In the present study, a general framework for pooled rainfall frequency analysis based on the index‐event model is proposed for IDF estimation at gauged stations. Pooling group formation is defined by the region of influence approach on the basis of the geographical distance similarity measure. Several pooled approaches are defined and evaluated by a procedure through which quantile estimation and uncertainty are assessed. Alternate approaches for the definition of a pooling group are based on different criteria regarding initial pooling group size (and the relationship between size and return period), approaches for assessing pooling group homogeneity, and the use of macroregions in pooling group formation. The proposed framework is applied to identify the preferred approach for pooled rainfall intensity frequency analysis in Canada. Pooled approaches are found to provide more precise estimates than the at‐site approach, especially for long return periods. Pooled parent distribution selection supported the use of the generalized extreme value distribution across the country. Recommendations for pooling group formation include increasing the pooling group size with increases in return period and identifying an appropriate trade‐off between pooling group homogeneity and size for long return periods.
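A small sketch of region-of-influence pooling group formation as described above: stations are ranked by geographical distance from the target site and added until the pooled record length reaches a target number of station-years that grows with the return period. The 5T station-years rule and all station data below are assumptions for illustration, not values from the paper.

```python
# Region-of-influence pooling group sketch (hypothetical stations).
import numpy as np

def pooling_group(target_xy, station_xy, record_years, T, years_per_T=5):
    """Return indices of the pooling group for return period T."""
    d = np.hypot(*(station_xy - target_xy).T)   # geographical distances
    order = np.argsort(d)                       # nearest stations first
    needed = years_per_T * T                    # target pooled station-years
    group, total = [], 0
    for i in order:
        group.append(i)
        total += record_years[i]
        if total >= needed:
            break
    return group

station_xy = np.array([[0.0, 0.0], [10.0, 5.0], [30.0, 2.0], [12.0, 40.0], [55.0, 8.0]])
record_years = np.array([32, 18, 45, 27, 60])
print(pooling_group(np.array([5.0, 5.0]), station_xy, record_years, T=25))
```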

5.
Flood frequency analysis is usually based on the fitting of an extreme value distribution to the local streamflow series. However, when the local data series is short, frequency analysis results become unreliable. Regional frequency analysis is a convenient way to reduce the estimation uncertainty. In this work, we propose a regional Bayesian model for short record length sites. This model is less restrictive than the index flood model while preserving the formalism of “homogeneous regions”. The performance of the proposed model is assessed on a set of gauging stations in France. The accuracy of quantile estimates as a function of the degree of homogeneity of the pooling group is also analysed. The results indicate that the regional Bayesian model outperforms the index flood model and local estimators. Furthermore, it seems that working with relatively large and homogeneous regions may lead to more accurate results than working with smaller and highly homogeneous regions.

6.
ABSTRACT

Flood quantile estimation based on partial duration series (peak over threshold, POT) represents a noteworthy alternative to the classical annual maximum approach since it enlarges the available information spectrum. Here the POT approach is discussed with reference to its benefits in increasing the robustness of flood quantile estimations. The classical POT approach is based on a Poisson distribution for the annual number of exceedences, although this can be questionable in some cases. Therefore, the Poisson distribution is compared with two other distributions (binomial and Gumbel-Schelling). The results show that only rarely is there a difference from the Poisson distribution. In the second part we investigate the robustness of flood quantiles derived from different approaches in the sense of their temporal stability against the occurrence of extreme events. Besides the classical approach using annual maxima series (AMS) with the generalized extreme value distribution and different parameter estimation methods, two different applications of POT are tested. Both are based on monthly maxima above a threshold, but one also uses trimmed L-moments (TL-moments). It is shown how quantile estimations based on this “robust” POT approach (rPOT) become more robust than AMS-based methods, even in the case of occasional extraordinary extreme events.
Editor M.C. Acreman; Associate editor A. Viglione
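A sketch of the classical POT quantile described above: a Poisson model for the annual number of exceedances combined with a generalized Pareto distribution for the excesses (L-moment fit, Hosking parameterization). Declustering of dependent peaks is deliberately omitted and the daily series is synthetic; this is not the paper's rPOT/TL-moment variant.

```python
# Poisson–generalized Pareto POT quantile sketch.
import numpy as np

def gpd_from_lmoments(excess):
    """Fit GPD to excesses y = x - u: k and alpha in x(F) = u + alpha*(1-(1-F)**k)/k."""
    y = np.sort(np.asarray(excess, dtype=float))
    n = len(y)
    j = np.arange(1, n + 1)
    b0 = y.mean()
    b1 = np.sum((j - 1) / (n - 1) * y) / n
    l1, l2 = b0, 2 * b1 - b0
    k = l1 / l2 - 2.0
    alpha = l1 * (1.0 + k)
    return k, alpha

def pot_quantile(u, k, alpha, lam, T):
    """T-year flood: exceedance probability 1/(lam*T) within the POT population."""
    return u + alpha / k * (1.0 - (lam * T) ** (-k))

rng = np.random.default_rng(7)
daily = rng.gamma(2.0, 30.0, size=40 * 365)   # 40 years of synthetic daily flow
u = np.quantile(daily, 0.99)                  # threshold (its choice is a tuning step)
excess = daily[daily > u] - u
lam = len(excess) / 40.0                      # mean annual number of exceedances

k, alpha = gpd_from_lmoments(excess)
print(f"Q_100 = {pot_quantile(u, k, alpha, lam, 100):.1f}")
```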

7.
This study analyses the differences in significant trends in magnitude and frequency of floods detected in annual maximum flood (AMF) and peak over threshold (POT) flood peak series, for the period 1965–2005. Flood peaks are identified from European daily discharge data using a baseflow-based algorithm and significant trends in the AMF series are compared with those in the POT series, derived for six different exceedence thresholds. The results show that more trends in flood magnitude are detected in the AMF than in the POT series and for the POT series more significant trends are detected in flood frequency than in flood magnitude. Spatially coherent patterns of significant trends are detected, which are further investigated by stratifying the results into five regions based on catchment and hydro-climatic characteristics. All data and tools used in this study are open-access and the results are fully reproducible.
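A sketch of the kind of trend screening described above, assuming a Mann-Kendall-style test applied via Kendall's tau against time for an AMF magnitude series and an annual POT exceedance-count series. The data are synthetic, not the paper's European dataset, and the baseflow-based peak identification step is not reproduced.

```python
# Trend screening sketch: Kendall's tau of flood magnitude/frequency vs. time.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(3)
years = np.arange(1965, 2006)
amf = rng.gumbel(200, 40, size=years.size) + 0.8 * (years - 1965)  # weak upward trend
pot_count = rng.poisson(3 + 0.03 * (years - 1965))                 # POT frequency

for name, series in [("AMF magnitude", amf), ("POT frequency", pot_count)]:
    tau, p = kendalltau(years, series)
    print(f"{name}: tau = {tau:+.2f}, p = {p:.3f}, significant at 5%: {p < 0.05}")
```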

8.
Various regional flood frequency analysis procedures are used in hydrology to estimate hydrological variables at ungauged or partially gauged sites. Relatively few studies have been conducted to evaluate the accuracy of these procedures and to estimate the error induced in regional flood frequency estimation models. The objective of this paper is to assess the overall error induced in the residual kriging (RK) regional flood frequency estimation model. The two main error sources in estimating a specific flood quantile using RK are the error induced in the local quantile estimation procedure and the error resulting from the regional quantile estimation process. Therefore, for an overall error assessment, the corresponding errors associated with these two steps must be quantified. Results show that the main source of error in RK is the error induced by the regional quantile estimation method. Results also indicate that the accuracy of the regional estimates increases with decreasing return periods. Copyright © 2010 John Wiley & Sons, Ltd.

9.
Abstract

Flood frequency analysis can be made by using two types of flood peak series, i.e. the annual maximum (AM) and peaks-over-threshold (POT) series. This study presents a comparison of the results of both methods for data from the Litija 1 gauging station on the Sava River in Slovenia. Six commonly used distribution functions and three different parameter estimation techniques were considered in the AM analyses. The results showed a better performance for the method of L-moments (ML) when compared with the conventional moments and maximum likelihood estimation. The combination of the ML and the log-Pearson type 3 distribution gave the best results of all the considered AM cases. The POT method gave better results than the AM method. The binomial distribution did not offer any noticeable improvement over the Poisson distribution for modelling the annual number of exceedences above the threshold.
Editor D. Koutsoyiannis

Citation Bezak, N., Brilly, M., and Šraj, M., 2014. Comparison between the peaks-over-threshold method and the annual maximum method for flood frequency analysis. Hydrological Sciences Journal, 59 (5), 959–977.

10.
The log-Gumbel distribution is one of the extreme value distributions that has been widely used in flood frequency analysis. In this paper the distribution is examined with respect to quantile estimation and confidence intervals of quantiles. Specific estimation algorithms based on the methods of moments (MOM), probability weighted moments (PWM) and maximum likelihood (ML) are presented. The applicability of the estimation procedures, and a comparison among the methods, are illustrated with an application example using flood data from the St. Mary's River.

11.
12.
Abstract

The identification of flood seasonality is a procedure with many practical applications in hydrology and water resources management. Several statistical methods for capturing flood seasonality have emerged during the last decade. So far, however, little attention has been paid to the uncertainty involved in the use of these methods, as well as to the reliability of their estimates. This paper compares the performance of annual maximum (AM) and peaks-over-threshold (POT) sampling models in flood seasonality estimation. Flood seasonality is determined by the two most frequently used methods, one based on directional statistics (DS) and the other on the distribution of monthly relative frequencies of flood occurrence (RF). The performance is evaluated for the AM and three common POT sampling models depending on the estimation method, flood seasonality type and sample record length. The results demonstrate that the POT models outperform the AM model in most analysed scenarios. POT sampling provides significantly more information on flood seasonality than AM sampling. For certain flood seasonality types, POT samples can achieve an estimation uncertainty comparable to that of AM samples up to ten times longer. The performance of the RF method does not depend on the flood seasonality type as much as that of the DS method, which performs poorly on samples generated from complex seasonality distributions.
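A sketch of the directional-statistics (DS) seasonality measure mentioned above: each flood date is mapped to an angle on the annual circle; the mean direction gives the average flood date and the resultant length r (0 = no seasonality, 1 = all floods on one day) measures its strength. Flood dates below are synthetic day-of-year values.

```python
# Directional statistics of flood occurrence dates.
import numpy as np

def flood_seasonality(day_of_year, year_length=365.25):
    theta = 2.0 * np.pi * np.asarray(day_of_year) / year_length
    x, y = np.cos(theta).mean(), np.sin(theta).mean()
    mean_day = (np.arctan2(y, x) % (2.0 * np.pi)) * year_length / (2.0 * np.pi)
    r = np.hypot(x, y)                      # concentration of flood dates
    return mean_day, r

rng = np.random.default_rng(5)
days = rng.normal(150, 20, size=60) % 365   # spring-flood regime, 60 events
mean_day, r = flood_seasonality(days)
print(f"mean flood date = day {mean_day:.0f}, strength r = {r:.2f}")
```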

13.
Hydrological Sciences Journal, 2013, 58 (5), 974–991
Abstract

The aim is to build a seasonal flood frequency analysis model and estimate seasonal design floods. The importance of seasonal flood frequency analysis and the advantages of considering seasonal design floods in the derivation of reservoir planning and operating rules are discussed, recognising that seasonal flood frequency models have been in use for over 30 years. A set of non-identical models with non-constant parameters is proposed and developed to describe flows that reflect seasonal flood variation. The peak-over-threshold (POT) sampling method was used, as it is considered to provide significantly more information on flood seasonality than annual maximum (AM) sampling and has better performance in flood seasonality estimation. The number of exceedences is assumed to follow the Poisson distribution (Po), while the peak exceedences are described by the exponential (Ex) and generalized Pareto (GP) distributions and a combination of both, resulting in three models, viz. Po-Ex, Po-GP and Po-Ex/GP. Their performances are analysed and compared. The Geheyan and the Baiyunshan reservoirs were chosen for the case study. The application and statistical experiment results show that each model has its merits and that the Po-Ex/GP model performs best. Use of the Po-Ex/GP model is recommended in seasonal flood frequency analysis for the purpose of deriving reservoir operation rules.
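A sketch of the Po-Ex building block used above: exponential excesses over a threshold with a Poisson number of exceedances give the closed-form quantile Q_T = u + ln(λT)/β. Fitting separate parameters per season yields a simple seasonal design flood; all numerical values below are hypothetical, not the Geheyan or Baiyunshan data.

```python
# Seasonal Po-Ex design flood sketch (hypothetical per-season parameters).
from math import log

def po_ex_quantile(u, beta, lam, T):
    """T-year quantile for exponential excesses (rate beta) and Poisson rate lam."""
    return u + log(lam * T) / beta

seasons = {                      # threshold u (m3/s), excess rate beta, events/season
    "flood season": (800.0, 1 / 350.0, 4.2),
    "dry season":   (300.0, 1 / 120.0, 1.1),
}
for name, (u, beta, lam) in seasons.items():
    print(f"{name}: Q_100 = {po_ex_quantile(u, beta, lam, 100):.0f} m3/s")
```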

14.
The specific objective of this paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only a point estimator but also a confidence interval of the quantiles can be provided. Markov chain Monte Carlo is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that approaches considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling these two uncertainties should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators, but also a quantitative assessment of the uncertainties of flood frequency analysis.
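A sketch of the Bayesian idea described above, restricted to parameter uncertainty for a single GEV model: a random-walk Metropolis sampler draws from the parameter posterior and the samples are propagated into the 100-year quantile, giving a point estimate and a credible interval. Priors, proposal scales and data are illustrative assumptions, not the paper's Huai River setup, and model-selection uncertainty is not included here.

```python
# Metropolis sampling of GEV parameters and the induced Q_100 posterior.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
obs = genextreme.rvs(c=-0.1, loc=500, scale=150, size=45, random_state=rng)

def log_post(theta):
    loc, log_scale, c = theta
    ll = genextreme.logpdf(obs, c=c, loc=loc, scale=np.exp(log_scale)).sum()
    # weakly informative priors (assumed): flat on loc/log_scale, N(0, 0.3) on shape
    return ll - 0.5 * (c / 0.3) ** 2

theta = np.array([obs.mean(), np.log(obs.std()), 0.0])
step = np.array([20.0, 0.1, 0.05])
samples, lp = [], log_post(theta)
for i in range(15000):
    prop = theta + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 5000:                             # discard burn-in
        samples.append(theta)

samples = np.array(samples)
q100 = genextreme.ppf(0.99, c=samples[:, 2], loc=samples[:, 0],
                      scale=np.exp(samples[:, 1]))
lo, med, hi = np.percentile(q100, [2.5, 50, 97.5])
print(f"Q_100 posterior median = {med:.0f}, 95% interval = [{lo:.0f}, {hi:.0f}]")
```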

15.
The key problem in nonparametric frequency analysis of floods and droughts is the estimation of the bandwidth parameter, which defines the degree of smoothing. Most of the proposed bandwidth estimators have been based on the density function rather than the cumulative distribution function or the quantile, which are the primary interest in frequency analysis. We propose a new bandwidth estimator derived from properties of quantile estimators. The estimator builds on work by Altman and Léger (1995). The estimator is compared to the well-known method of least squares cross-validation (LSCV) using synthetic data generated from various parametric distributions used in hydrologic frequency analysis. Simulations suggest that our estimator performs at least as well as, and in many cases better than, the method of LSCV. In particular, the use of the proposed plug-in estimator reduces bias in the estimation compared to LSCV. When applied to data sets containing observations with identical values, typically the result of rounding or truncation, LSCV and most other techniques generally underestimate the bandwidth. The proposed technique performs very well in such situations.
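A sketch of a kernel-based quantile estimator of the kind discussed above: a Gaussian-kernel estimate of the CDF is inverted numerically for the quantile. The bandwidth below is a simple Silverman-type rule of thumb used only as a stand-in; it is not the plug-in estimator proposed in the paper.

```python
# Kernel CDF quantile estimator with a rule-of-thumb bandwidth (assumption).
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def kernel_quantile(x, p, h=None):
    x = np.asarray(x, dtype=float)
    if h is None:                               # Silverman-type bandwidth (assumed)
        h = 1.06 * x.std(ddof=1) * len(x) ** (-0.2)
    F = lambda q: norm.cdf((q - x) / h).mean()  # kernel estimate of the CDF
    lo, hi = x.min() - 10 * h, x.max() + 10 * h
    return brentq(lambda q: F(q) - p, lo, hi)   # invert F numerically

rng = np.random.default_rng(2)
flows = rng.gumbel(120, 30, size=50)
print(f"estimated 0.99 quantile: {kernel_quantile(flows, 0.99):.1f}")
```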

16.
17.
Abstract

Flood frequency estimation is crucial in both engineering practice and hydrological research. Regional analysis of flood peak discharges is used for more accurate estimates of flood quantiles in ungauged or poorly gauged catchments. This is based on the identification of homogeneous zones, where the probability distribution of annual maximum peak flows is invariant, except for a scale factor represented by an index flood. The numerous applications of this method have highlighted that obtaining accurate estimates of the index flood is a critical step, especially in ungauged or poorly gauged sections, where direct estimation by the sample mean of the annual flood series (AFS) is not possible or is inaccurate. In such cases, indirect methods have to be used. Most indirect methods are based upon empirical relationships that link the index flood to hydrological, climatological and morphological catchment characteristics, developed by means of multi-regression analysis or a simplified lumped representation of rainfall–runoff processes. The limits of these approaches become increasingly evident as the size and spatial variability of the catchment increase. In these cases, the use of a spatially-distributed, physically-based hydrological model and time-continuous simulation of discharge can improve estimation of the index flood. This work presents an application of the FEST-WB model for the reconstruction of 29 years of hourly streamflows for an Alpine snow-fed catchment in northern Italy, to be used for index flood estimation. To extend the length of the simulated discharge time series, meteorological forcings given by daily precipitation and temperature at ground automatic weather stations are disaggregated to hourly resolution and then fed to FEST-WB. The accuracy of the method in estimating the index flood as a function of the length of the simulated series is discussed, and suggestions for use of the methodology are provided.
Editor D. Koutsoyiannis

18.
Regression‐based regional flood frequency analysis (RFFA) methods are widely adopted in hydrology. This paper compares two regression‐based RFFA methods using a Bayesian generalized least squares (GLS) modelling framework; the two are quantile regression technique (QRT) and parameter regression technique (PRT). In this study, the QRT focuses on the development of prediction equations for a flood quantile in the range of 2 to 100 years average recurrence intervals (ARI), while the PRT develops prediction equations for the first three moments of the log Pearson Type 3 (LP3) distribution, which are the mean, standard deviation and skew of the logarithms of the annual maximum flows; these regional parameters are then used to fit the LP3 distribution to estimate the desired flood quantiles at a given site. It has been shown that using a method similar to stepwise regression and by employing a number of statistics such as the model error variance, average variance of prediction, Bayesian information criterion and Akaike information criterion, the best set of explanatory variables in the GLS regression can be identified. In this study, a range of statistics and diagnostic plots have been adopted to evaluate the regression models. The method has been applied to 53 catchments in Tasmania, Australia. It has been found that catchment area and design rainfall intensity are the most important explanatory variables in predicting flood quantiles using the QRT. For the PRT, a total of four explanatory variables were adopted for predicting the mean, standard deviation and skew. The developed regression models satisfy the underlying model assumptions quite well; of importance, no outlier sites are detected in the plots of the regression diagnostics of the adopted regression equations. Based on ‘one‐at‐a‐time cross validation’ and a number of evaluation statistics, it has been found that for Tasmania the QRT provides more accurate flood quantile estimates for the higher ARIs while the PRT provides relatively better estimates for the smaller ARIs. The RFFA techniques presented here can easily be adapted to other Australian states and countries to derive more accurate regional flood predictions. Copyright © 2011 John Wiley & Sons, Ltd.  
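A sketch of the quantile regression technique (QRT) idea: regress the logarithm of an at-site flood quantile on log catchment area and log design rainfall intensity. Ordinary least squares is used here as a stand-in for the paper's Bayesian GLS framework, and the catchment data are hypothetical, not the Tasmanian dataset.

```python
# QRT-style prediction equation for Q100 (OLS stand-in for Bayesian GLS).
import numpy as np

rng = np.random.default_rng(4)
n_sites = 53
area = rng.lognormal(4.5, 1.0, n_sites)           # catchment area, km2
rain_i = rng.lognormal(3.0, 0.3, n_sites)         # design rainfall intensity, mm/h
q100 = 2.5 * area ** 0.75 * rain_i ** 1.2 * rng.lognormal(0, 0.25, n_sites)

X = np.column_stack([np.ones(n_sites), np.log(area), np.log(rain_i)])
beta, *_ = np.linalg.lstsq(X, np.log(q100), rcond=None)
print("ln(Q100) =", " + ".join(f"{b:.2f}*{name}" for b, name in
                               zip(beta, ["1", "ln(area)", "ln(rain_i)"])))

# Prediction at an ungauged catchment (hypothetical 250 km2, 25 mm/h):
x_new = np.array([1.0, np.log(250.0), np.log(25.0)])
print(f"predicted Q100 = {np.exp(x_new @ beta):.0f} m3/s")
```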

19.
This paper studies the links between scaling properties of river flow time series by comparing the results of three techniques applied to an extended data set of 34 French discharge gauging stations. The three approaches used are based on different mathematical tools and hypotheses: (1) shape analysis of flood hydrographs; (2) a multifractal framework through spectral and moment analyses, and (3) flood frequency analysis through the fitting of flood duration frequency curves (QdF). The general aim is to test the hypothesis of scaling invariance of river flow and the shape invariance of the hydrographs, in order to investigate the link between scaling properties and flow dynamics. In particular, the coherence between different approaches widely used in the literature to describe these characteristics is evaluated through the estimation of parameters defining the range of time‐scales on which the scaling properties are valid. The results show that most of these timescale parameters are linked to the flow dynamics and suggest that the approaches applied are interrelated. Copyright © 2008 John Wiley & Sons, Ltd.  
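A sketch of a moment-scaling analysis of the kind referred to above: the flow series is aggregated over a range of durations, the q-th moments of the aggregated series are computed, and the log-log slope versus duration characterizes the scaling. The series below is synthetic and stands in for the French gauging records; this is a generic moment analysis, not the paper's full multifractal framework.

```python
# Moment-scaling analysis of an aggregated flow series (synthetic data).
import numpy as np

rng = np.random.default_rng(9)
flow = rng.lognormal(3.0, 0.8, size=2 ** 14)          # synthetic discharge series

durations = [1, 2, 4, 8, 16, 32, 64]
for q in (1.0, 2.0, 3.0):
    moments = []
    for d in durations:
        agg = flow[: len(flow) // d * d].reshape(-1, d).mean(axis=1)
        moments.append(np.mean(agg ** q))
    slope = np.polyfit(np.log(durations), np.log(moments), 1)[0]
    print(f"q = {q}: scaling exponent = {slope:+.3f}")
```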

20.
Studies have illustrated the performance of at-site and regional flood quantile estimators. For realistic generalized extreme value (GEV) distributions and short records, a simple index-flood quantile estimator performs better than two-parameter (2P) GEV quantile estimators with probability weighted moment (PWM) estimation using a regional shape parameter and at-site mean and L-coefficient of variation (L-CV), and full three-parameter at-site GEV/PWM quantile estimators. However, as regional heterogeneity or record lengths increase, the 2P-estimator quickly dominates. This paper generalizes the index flood procedure by employing regression with physiographic information to refine a normalized T-year flood estimator. A linear empirical Bayes estimator uses the normalized quantile regression estimator to define a prior distribution which is employed with the normalized 2P-quantile estimator. Monte Carlo simulations indicate that this empirical Bayes estimator does essentially as well as or better than the simpler normalized quantile regression estimator at sites with short records, and performs as well as or better than the 2P-estimator at sites with longer records or smaller L-CV.
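A sketch of the linear empirical Bayes combination described above: the normalized regression estimate acts as a prior mean and the at-site (2P) estimate as the data, and a precision-weighted combination gives the posterior estimate. The variances below are illustrative inputs, not values from the paper.

```python
# Precision-weighted empirical Bayes combination of regression and at-site estimates.
def empirical_bayes_quantile(q_regression, var_regression, q_at_site, var_at_site):
    """Weight each estimate by the inverse of its error variance."""
    w = (1.0 / var_regression) / (1.0 / var_regression + 1.0 / var_at_site)
    return w * q_regression + (1.0 - w) * q_at_site

# Short record: the at-site estimate is noisy, so the regression prior carries more weight.
print(empirical_bayes_quantile(q_regression=3.1, var_regression=0.2,
                               q_at_site=4.0, var_at_site=0.8))
```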
