Similar Articles
1.
Regression-based regional flood frequency analysis (RFFA) methods are widely adopted in hydrology. This paper compares two regression-based RFFA methods using a Bayesian generalized least squares (GLS) modelling framework: the quantile regression technique (QRT) and the parameter regression technique (PRT). In this study, the QRT focuses on the development of prediction equations for flood quantiles in the range of 2- to 100-year average recurrence intervals (ARIs), while the PRT develops prediction equations for the first three moments of the log Pearson Type 3 (LP3) distribution, which are the mean, standard deviation and skew of the logarithms of the annual maximum flows; these regional parameters are then used to fit the LP3 distribution to estimate the desired flood quantiles at a given site. It has been shown that, using a method similar to stepwise regression and by employing a number of statistics such as the model error variance, average variance of prediction, Bayesian information criterion and Akaike information criterion, the best set of explanatory variables in the GLS regression can be identified. In this study, a range of statistics and diagnostic plots have been adopted to evaluate the regression models. The method has been applied to 53 catchments in Tasmania, Australia. It has been found that catchment area and design rainfall intensity are the most important explanatory variables in predicting flood quantiles using the QRT. For the PRT, a total of four explanatory variables were adopted for predicting the mean, standard deviation and skew. The developed regression models satisfy the underlying model assumptions quite well; importantly, no outlier sites are detected in the regression diagnostic plots of the adopted regression equations. Based on ‘one-at-a-time cross validation’ and a number of evaluation statistics, it has been found that for Tasmania the QRT provides more accurate flood quantile estimates for the higher ARIs, while the PRT provides relatively better estimates for the smaller ARIs. The RFFA techniques presented here can easily be adapted to other Australian states and countries to derive more accurate regional flood predictions. Copyright © 2011 John Wiley & Sons, Ltd.
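To make the PRT step concrete, the sketch below fits an LP3 distribution from regional estimates of the mean, standard deviation and skew of the log10 annual maximum flows and reads off a design quantile. It is a minimal illustration only; the regional parameter values and the use of scipy's pearson3 on log10 flows are assumptions, not taken from the paper.

```python
# Minimal sketch: estimate a flood quantile from regional LP3 parameters
# (mean, standard deviation and skew of log10 annual maximum flows).
# The numeric values below are hypothetical, not taken from the paper.
from scipy.stats import pearson3

mean_log, std_log, skew_log = 2.1, 0.35, -0.2   # regional regression outputs (log10 m^3/s)
ari = 100                                        # average recurrence interval in years
aep = 1.0 / ari                                  # annual exceedance probability

# LP3: annual maxima are Pearson Type 3 distributed in log space.
z = pearson3.ppf(1.0 - aep, skew_log, loc=mean_log, scale=std_log)
q100 = 10.0 ** z
print(f"Estimated {ari}-year flood: {q100:.0f} m^3/s")
```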

2.
《水文科学杂志》2012,57(15):1867-1892
ABSTRACT

The flood peak is the dominant characteristic in nearly all flood-statistical analyses. Contrary to the general assumptions of design flood estimation, the peak is not closely related to other flood characteristics. Differentiation of floods into types provides a more realistic view. Often, different parts of the probability distribution function of annual flood peaks are dominated by different flood types, which raises the question of how shifts in flood regimes would modify the statistics of annual maxima. To answer this, a distinction into five flood types is proposed; then, temporal changes in flood-type frequencies are investigated. We show that the frequency of floods caused by heavy rain has increased significantly in recent years. A statistical model is developed that simulates peaks for each event type by type-specific peak-volume relationships. In a simulation study, we show how changes in the frequency of flood event types lead to changes in the quantiles of annual maximum series.
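A minimal simulation can illustrate the central point that shifts in flood-type frequencies change the quantiles of annual maxima. The flood types, their occurrence rates and the Gumbel peak distributions below are hypothetical stand-ins, not the study's fitted model.

```python
# Sketch: annual maximum as the largest peak over several flood types,
# each with its own occurrence frequency and peak distribution.
# Type names and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
types = {                          # (mean events/year, Gumbel loc, Gumbel scale)
    "heavy_rain":   (2.0, 80.0, 25.0),
    "snowmelt":     (1.0, 120.0, 30.0),
    "rain_on_snow": (0.5, 150.0, 40.0),
}

def annual_max(rain_factor=1.0):
    peaks = [0.0]
    for name, (rate, loc, scale) in types.items():
        n = rng.poisson(rate * (rain_factor if name == "heavy_rain" else 1.0))
        peaks.extend(rng.gumbel(loc, scale, size=n))
    return max(peaks)

amax = np.array([annual_max() for _ in range(10000)])
amax_shift = np.array([annual_max(2.0) for _ in range(10000)])  # heavy-rain floods twice as frequent
for p in (0.9, 0.99):
    print(p, round(np.quantile(amax, p), 1), round(np.quantile(amax_shift, p), 1))
```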

3.
This paper presents an approach to estimating the probability distribution of annual discharges Q based on rainfall-runoff modelling using multiple rainfall events. The approach is based on prior knowledge of the probability distribution of annual maximum daily totals of rainfall P in a natural catchment, random disaggregation of the totals into hourly values, and rainfall-runoff modelling. The presented Multi-Event Simulation of Extreme Flood method (MESEF) combines the design event method, based on single-rainfall-event modelling, with the continuous simulation method used for estimating the maximum discharges of a given exceedance probability with rainfall-runoff models. In the paper, the flood quantiles were estimated using the MESEF method and then compared to the flood quantiles estimated using a classical statistical method based on observed data.

4.
Clustering stochastic point process model for flood risk analysis
Since its introduction into flood risk analysis, the partial duration series method has gained increasing acceptance as an appealing alternative to the annual maximum series method. However, when the base flow is low, there is clustering in the flood peak or flow volume point process. In this case, the general stochastic point process model is not suitable for risk analysis. Therefore, two types of models for flood risk analysis are derived in this paper on the basis of clustering stochastic point process theory. The most remarkable characteristic of these models is that the flood risk is considered directly within the time domain. The acceptability of the different models is also discussed using the flood peak counting process over twenty years at Yichang station on the Yangtze River. The results show that both kinds of models are suitable for flood risk analysis and are more flexible than the traditional flood risk models derived from the annual maximum series method or from general stochastic point process theory. Received: September 29, 1997

6.
This article discusses the method of higher-order L-moment (LH-moment) estimation for the Wakeby distribution (WAD), and describes and formulates the details of parameter estimation using LH-moments for the WAD. A Monte Carlo simulation is performed to illustrate the performance of the LH-moment method for heavy-tail quantiles (over all quantiles) of the WAD. The LH-moment method proves as useful and effective as the L-moment approach in handling data that follow the WAD, and it is then applied to annual maximum flood and wave height data.
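For readers unfamiliar with the moment machinery, the sketch below computes ordinary sample L-moments from probability-weighted moments; LH-moments generalise these by weighting the largest observations more heavily. The synthetic data and the restriction to plain L-moments are simplifying assumptions.

```python
# Sketch: sample L-moments from probability-weighted moments (PWMs).
# LH-moments generalise this by putting more weight on the largest
# observations; only ordinary L-moments are shown here.
import numpy as np

def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b = np.zeros(4)
    for r in range(4):
        # unbiased PWM: b_r = mean of x_(i) * prod_{k<r} (i-k)/(n-1-k), 0-indexed
        w = np.array([np.prod([(i - k) / (n - 1 - k) for k in range(r)]) for i in range(n)])
        b[r] = np.mean(w * x)
    l1 = b[0]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

rng = np.random.default_rng(0)
flows = rng.gumbel(100.0, 30.0, size=60)    # stand-in for annual maximum floods
print(sample_l_moments(flows))
```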

7.
8.
Abstract

The effect of land-use change on the flood frequency curve (FFC) in a natural catchment is analysed. To achieve this, a simple methodology for the derivation of FFCs under land-use change scenarios is proposed. The adopted methodology, using a stochastic model in Monte Carlo simulation of FFCs, was found to provide a useful framework for detecting changes in flood magnitudes in both pre- and post-fire conditions. In particular, the importance of the antecedent soil moisture condition in the determination of the flood frequency distribution was analysed. The analysis of FFCs for pre- and post-fire conditions shows an increase in the average value of the Curve Number and a decrease in the catchment time lag. The derivation of FFCs shows a clear increase in flood quantiles. For post-fire conditions, the FFC exhibits higher quantiles of the peak discharges, indicating a reduction in their frequency of occurrence. This variation is more significant for low-return-period quantiles than for high-return-period quantiles. The results of the catchment studies reported here support the hypothesis that the hydrological response of the watershed changes as a result of fire, especially during the first years following a fire event.
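The derived-distribution idea, i.e. Monte Carlo simulation of rainfall and antecedent conditions fed through a rainfall-runoff transformation to obtain an FFC, can be sketched with the SCS Curve Number method. The CN values, rainfall statistics and runoff model below are illustrative assumptions and are not the stochastic model used in the study.

```python
# Sketch: derived flood frequency curve via Monte Carlo, using SCS Curve
# Number runoff with a randomly varying antecedent condition.
# CN values and rainfall parameters are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(1)

def scs_runoff(p_mm, cn):
    s = 25400.0 / cn - 254.0            # potential retention (mm)
    ia = 0.2 * s                        # initial abstraction (mm)
    return np.where(p_mm > ia, (p_mm - ia) ** 2 / (p_mm - ia + s), 0.0)

def simulate_ffc(cn_mean, n_years=5000):
    p = rng.gumbel(60.0, 20.0, size=n_years)                   # annual max storm depth (mm)
    cn = np.clip(rng.normal(cn_mean, 5.0, n_years), 40, 98)    # antecedent moisture effect
    q = scs_runoff(p, cn)                                      # event runoff depth (mm)
    t = np.array([2, 10, 50, 100])                             # return periods (years)
    return t, np.quantile(q, 1.0 - 1.0 / t)

for label, cn in (("pre-fire", 65.0), ("post-fire", 80.0)):
    t, q = simulate_ffc(cn)
    print(label, dict(zip(t, np.round(q, 1))))
```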

9.
The record length and quality of instantaneous peak flows (IPFs) have a great influence on flood design, but these high-resolution flow data are not always available. The primary aim of this study is to compare different strategies for deriving frequency distributions of IPFs using the Hydrologiska Byråns Vattenbalansavdelning (HBV) hydrologic model. The model is operated on a daily and an hourly time step for 18 catchments in the Aller-Leine basin, Germany. Subsequently, generalized extreme value (GEV) distributions are fitted to the simulated annual series of daily and hourly extreme flows. The resulting maximum mean daily flow (MDF) quantiles from daily simulations are transferred into IPF quantiles using a multiple regression model, which enables a direct comparison with the simulated hourly quantiles. Because long climate records with a high temporal resolution are not available, the hourly simulations require disaggregation of the daily rainfall. Additionally, two calibration strategies are applied: (1) a calibration on flow statistics; (2) a calibration on hydrographs. The results show that: (1) the multiple regression model is capable of predicting IPFs from the simulated MDFs; (2) both daily simulations with post-correction of flows and hourly simulations with pre-processing of precipitation enable a reasonable estimation of IPFs; (3) the best results are achieved using disaggregated rainfall for hourly modelling with calibration on flow statistics; and (4) if the IPF observations are not sufficient for model calibration on flow statistics, the transfer of MDFs via multiple regressions is a good alternative for estimating IPFs. Copyright © 2015 John Wiley & Sons, Ltd.
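A compact sketch of two of the ingredients, fitting a GEV to simulated annual maxima and transferring MDF quantiles to IPF quantiles with a regression, is given below. The data, the regression form and its coefficients are hypothetical; the paper's multiple regression and HBV simulations are not reproduced here.

```python
# Sketch: fit a GEV to simulated annual maximum mean daily flows (MDF)
# and transfer the resulting quantiles to instantaneous peak flows (IPF)
# with a regression model. Data and coefficients are hypothetical.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
mdf_annual_max = rng.gumbel(150.0, 40.0, size=40)     # simulated MDF annual maxima (m^3/s)

# GEV fit (scipy's genextreme uses shape c = -xi relative to the usual GEV convention).
c, loc, scale = genextreme.fit(mdf_annual_max)
return_periods = np.array([2, 10, 50, 100])
mdf_quantiles = genextreme.ppf(1.0 - 1.0 / return_periods, c, loc=loc, scale=scale)

# Hypothetical regression transferring MDF quantiles to IPF quantiles,
# e.g. fitted beforehand on catchments where both series are available.
def mdf_to_ipf(q_mdf, area_km2=250.0):
    return 1.15 * q_mdf + 0.05 * area_km2             # illustrative coefficients only

print(dict(zip(return_periods, np.round(mdf_to_ipf(mdf_quantiles), 1))))
```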

10.
Temporal and spatial variability in extreme quantile anomalies of seasonal and annual maximum river flows was studied for 41 gauging stations on rivers in the Upper Vistula River basin, Poland. Using the quantile perturbation method, the temporal variability in anomalies was analysed. Interdecadal oscillating components were extracted from the series of anomalies using the Hilbert-Huang transform method. The period length and share of variance of each component, as well as the share of unexplained variance, were assessed. Results show an oscillating pattern in the temporal occurrence of extreme flow quantiles, with clusters of high values in the 1960s-1970s and since the late 1990s, and of low values in the 1980s and at the beginning of the 1990s. The anomalies show a high variability on the right bank of the Upper Vistula River basin during the summer season, with the highest values in catchments located in the western and south-western parts of the basin. River flow extreme quantiles were found to be associated with large-scale climatic variables from the regions of the North Atlantic Ocean, Scandinavia, Eastern Europe, Asia, and, to a lesser extent, the Pacific Ocean. Similarities between the temporal variability of river flows and climatic factors were revealed. The results of the study are important for flood frequency analysis because a long observation period is necessary to capture clusters of high and low river flows.

11.
Large observed datasets are often non-stationary and/or depend on covariates, especially in the case of extreme hydrometeorological variables. This makes estimation with classical hydrological frequency analysis difficult. A number of non-stationary models have been developed using linear or quadratic polynomial functions or B-spline functions to estimate the relationship between parameters and covariates. In this article, we propose regularized generalized extreme value models with B-splines (GEV-B-splines models) in a Bayesian framework to estimate quantiles. Regularization is based on a penalty and aims to favour parsimonious models, especially in the case of a large-dimensional space. Penalties are introduced in a Bayesian framework and the corresponding priors are detailed. Five penalties are considered and the corresponding priors are developed for comparison purposes: the least absolute shrinkage and selection operator (Lasso), Ridge, and smoothly clipped absolute deviation (SCAD) methods (SCAD1, SCAD2 and SCAD3). Markov chain Monte Carlo (MCMC) algorithms have been developed for each model to estimate quantiles and their posterior distributions. These approaches are tested and illustrated using simulated data with different sample sizes. A first simulation was made on polynomial B-spline functions in order to choose the most efficient model in terms of the relative mean bias (RMB) and relative mean error (RME) criteria. A second simulation was performed with the SCAD1 penalty for sinusoidal dependence to illustrate the flexibility of the proposed approach. Results clearly show that the regularized approaches lead to a significant reduction of the bias and the mean square error, especially for small sample sizes (n < 100). A case study has been considered to model annual peak flows at the Fort-Kent catchment with total annual precipitation as a covariate. The conditional quantile curves are given for the regularized and maximum likelihood methods.
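The core idea of letting GEV parameters depend on covariates can be shown with a much simpler stand-in: a maximum-likelihood fit in which the location parameter is a linear function of annual precipitation. The B-spline basis, the penalties and the Bayesian MCMC machinery of the article are deliberately omitted; the data and starting values are synthetic assumptions.

```python
# Sketch: non-stationary GEV with the location parameter as a linear
# function of a covariate, fitted by maximum likelihood. A simplified
# stand-in for the paper's penalized B-spline Bayesian models.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(3)
precip = rng.normal(1000.0, 150.0, size=60)                        # covariate (mm/year)
peaks = genextreme.rvs(-0.1, loc=200 + 0.1 * precip, scale=40.0)   # synthetic annual peaks

def neg_log_lik(theta):
    b0, b1, log_scale, shape = theta
    loc = b0 + b1 * precip
    return -np.sum(genextreme.logpdf(peaks, shape, loc=loc, scale=np.exp(log_scale)))

res = minimize(neg_log_lik,
               x0=[np.mean(peaks), 0.0, np.log(np.std(peaks)), -0.1],
               method="Nelder-Mead")
b0, b1, log_scale, shape = res.x
q100 = genextreme.ppf(0.99, shape, loc=b0 + b1 * precip.mean(), scale=np.exp(log_scale))
print("conditional 100-year quantile at mean precipitation:", round(q100, 1))
```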

12.
Joint Monte Carlo and possibilistic simulation for flood damage assessment
A joint Monte Carlo and fuzzy possibilistic simulation (MC-FPS) approach was proposed for flood risk assessment. Monte Carlo simulation was used to evaluate parameter uncertainties associated with inundation modeling, and fuzzy vertex analysis was applied to propagate human-induced uncertainty in flood damage estimation. A case study was selected to show how to apply the proposed method. The results indicate that the outputs from MC-FPS are presented as fuzzy flood damage estimates and probabilistic-possibilistic damage contour maps. The stochastic uncertainty in the flood inundation model and the fuzziness in the derivation of the depth-damage functions cause similar levels of influence on the final flood damage estimate. Under the worst scenario (i.e. combined probabilistic and possibilistic uncertainty), the estimated flood damage could be 2.4 times higher than that computed from a conventional deterministic approach; considering only the pure stochastic effect, the flood loss would be 1.4 times higher. It was also found that uncertainty in the flood inundation modeling has a major influence on the standard deviation of the simulated damage, whereas uncertainty in the depth-damage function has a more notable impact on the mean of the fitted distributions. Through applying MC-FPS, rich information can be derived under various α-cut levels and cumulative probabilities, forming an important basis for supporting rational decision making in flood risk management under complex uncertainties.
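The combination of stochastic sampling with fuzzy vertex analysis can be illustrated as below: flood depth is sampled by Monte Carlo, while a triangular fuzzy damage rate is propagated through alpha-cut intervals. The depth distribution, damage rate and linear damage model are illustrative assumptions, not the study's inundation or depth-damage models.

```python
# Sketch: combined Monte Carlo / fuzzy alpha-cut propagation.
# Flood depth is sampled stochastically; the unit damage rate is a
# triangular fuzzy number propagated by vertex analysis at each alpha-cut.
# The depth-damage relationship below is purely illustrative.
import numpy as np

rng = np.random.default_rng(5)
depth = rng.lognormal(mean=0.3, sigma=0.4, size=2000)       # simulated flood depth (m)

low, mode, high = 80.0, 120.0, 200.0                        # fuzzy damage rate (currency/m)
for alpha in (0.0, 0.5, 1.0):
    # alpha-cut interval of the triangular fuzzy number
    rate_lo = low + alpha * (mode - low)
    rate_hi = high - alpha * (high - mode)
    # vertex analysis: evaluate the damage model at both interval endpoints
    damage_lo = np.mean(rate_lo * depth)
    damage_hi = np.mean(rate_hi * depth)
    print(f"alpha={alpha:.1f}: expected damage in [{damage_lo:.0f}, {damage_hi:.0f}]")
```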

13.
This work examines future flood risk within the context of integrated climate and hydrologic modelling uncertainty. The research questions investigated are (1) whether hydrologic uncertainties are a significant source of uncertainty relative to other sources such as climate variability and change, and (2) whether a statistical characterization of uncertainty from a lumped, conceptual hydrologic model is sufficient to account for hydrologic uncertainties in the modelling process. To investigate these questions, an ensemble of climate simulations is propagated through hydrologic models and then through a reservoir simulation model to delimit the range of flood protection under a wide array of climate conditions. Uncertainty in mean climate changes and internal climate variability are framed using a risk-based methodology and explored using a stochastic weather generator. To account for hydrologic uncertainty, two hydrologic models are considered: a conceptual, lumped parameter model and a distributed, physically based model. In the conceptual model, parameter and residual error uncertainties are quantified and propagated through the analysis using a Bayesian modelling framework. The approach is demonstrated in a case study for the Coralville Dam on the Iowa River, where recent, intense flooding has raised questions about potential impacts of climate change on flood protection adequacy. Results indicate that the uncertainty surrounding future flood risk from hydrologic modelling and internal climate variability can be of the same order of magnitude as that from climate change. Furthermore, statistical uncertainty in the conceptual hydrological model can capture the primary structural differences that emerge in flood damage estimates between the two hydrologic models. Copyright © 2014 John Wiley & Sons, Ltd.

14.
15.
Reservoirs are among the most important structures for water resources management and flood control. Great attention has been paid to the effects of a reservoir on the downstream area and to the differences between inflow floods and dam-site floods caused by changes in upstream flow generation and concentration conditions after the reservoir's impoundment. These differences result in an inconsistency between inflow quantiles and the reservoir design criteria derived from dam-site flood series, which poses a potential risk that must be evaluated quantitatively. In this study, flood frequency analysis (FFA) and flood control risk analysis (FCRA) methods are applied to a long reservoir inflow series derived from a multiple-input, single-output model and a copula-based inflow estimation model. The results of the FFA and FCRA are compared, and the implications for reservoir flood management are discussed. The Three Gorges Reservoir (TGR) in China is selected as a case study. Results show that the differences between the TGR inflow and dam-site floods are significant and change its flood control risk rates. The mean values of the TGR's annual maximum inflow peak discharge and 3-day flood volume are 5.58% and 3.85% higher than the dam-site values, while the annual maximum 7- and 15-day flood volumes have declined by 1.82% and 1.72%. The flood control risk rates of medium and small flood events have increased, while those of extreme flood events have declined. It is shown that the TGR can satisfy its flood control task under the current hydrologic regime, and the results offer a reference for better management of the TGR.
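The link between design quantiles and flood control risk can be made explicit with the standard hydrologic risk formula, R = 1 - (1 - 1/T)^n, the probability of at least one exceedance of a T-year flood in n years. The sketch below is a generic illustration; the return periods and project life are assumed values, not the TGR analysis.

```python
# Sketch: hydrologic risk of at least one exceedance of a T-year design
# flood during an n-year period, R = 1 - (1 - 1/T)**n. A shift in the
# flood quantiles (e.g. inflow vs dam-site floods) changes the effective
# return period T and hence the risk rate. Numbers are illustrative only.
def risk_rate(T, n):
    return 1.0 - (1.0 - 1.0 / T) ** n

for T in (100, 1000):
    print(f"T = {T:4d} yr: risk over 50 yr = {risk_rate(T, 50):.3f}")

# If larger inflow quantiles mean the nominal 1000-year design flood is
# actually exceeded with an effective return period of, say, 800 years:
print(f"effective T =  800 yr: risk over 50 yr = {risk_rate(800, 50):.3f}")
```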

16.
Abstract

The impulse response of a linear convective-diffusion analogy (LD) model used for flow routing in open channels is proposed as a probability distribution for flood frequency analysis. The flood frequency model has two parameters, which are derived using the methods of moments and maximum likelihood. Also derived are the errors in quantiles for these parameter estimation methods. The distribution shows that the two methods are equivalent in terms of producing mean values, an important property when the true distribution function is unknown. The flood frequency model is tested using annual peak discharges for the gauging sections of 39 Polish rivers, where the average value of the ratio of the coefficient of skewness to the coefficient of variation equals about 2.52, a value closer to the ratio of the LD model than to that of the gamma or the lognormal model. The likelihood ratio indicates a preference for the LD over the lognormal for 27 out of 39 cases. It is found that the proposed flood frequency model represents flood frequency characteristics well (measured by the moment ratio) when the LD flood routing model is likely to be the best of all linear flow routing models.

17.
There are two basic approaches for estimating flood quantiles: parametric and nonparametric methods. In this study, parametric and nonparametric models were compared for the annual maximum flood data of the Goan gauging station in Korea, based on Monte Carlo simulation. In order to account for uncertainties that can arise from model and data errors, kernel density estimation was chosen for fitting the sampling distributions and used to determine safety factors (SFs) that depend on the probability model used to fit the real data. The relative biases of the Sheather and Jones plug-in (SJ) bandwidth are the smallest in most cases among the seven bandwidth selectors applied. The relative root mean square errors (RRMSEs) of the Gumbel (GUM) model are smaller than those of any other model, regardless of the parent model considered. When the Weibull-2 is assumed as the parent model, the RRMSEs of kernel density estimation are relatively small, while they are much bigger than those of the parametric methods for the other parent models. However, the RRMSEs of kernel density estimation within the interpolation range are much smaller than those for the extrapolation range in comparison with those of the parametric methods. Among the applied distributions, the GUM model has the smallest SFs for all parent models, and the generalized extreme value model has the largest values for all parent models considered.
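The parametric/nonparametric contrast can be reproduced in miniature: fit a Gumbel distribution and, separately, invert a kernel-density-based CDF to estimate the same quantile. scipy's gaussian_kde with its default Scott bandwidth is used here instead of the plug-in selectors examined in the paper, and the data are synthetic.

```python
# Sketch: parametric (Gumbel) versus nonparametric (kernel density)
# quantile estimates for an annual maximum flood sample. gaussian_kde
# (Scott bandwidth) stands in for the bandwidth selectors compared in
# the paper; the data are synthetic.
import numpy as np
from scipy.stats import gumbel_r, gaussian_kde

rng = np.random.default_rng(11)
amax = rng.gumbel(500.0, 150.0, size=50)            # synthetic annual maxima (m^3/s)

# Parametric: fit Gumbel and read the quantile directly.
loc, scale = gumbel_r.fit(amax)
q100_param = gumbel_r.ppf(0.99, loc=loc, scale=scale)

# Nonparametric: accumulate the KDE into an approximate CDF, then invert.
kde = gaussian_kde(amax)
grid = np.linspace(amax.min() - 500, amax.max() + 2000, 4000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]
q100_kde = np.interp(0.99, cdf, grid)

print(f"100-year flood: Gumbel {q100_param:.0f}, KDE {q100_kde:.0f} m^3/s")
```

As the paper's results suggest, the nonparametric estimate is most trustworthy within the interpolation range of the data, while extrapolated quantiles are sensitive to the bandwidth choice.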

18.
Sewer inlet structures are vital components of urban drainage systems, and their operational conditions can largely affect the overall performance of the system. However, their hydraulic behaviour, and the way in which it is affected by clogging, is often overlooked in urban drainage models, thus leading to misrepresentation of system performance and, in particular, of flooding occurrence. In the present paper, a novel methodology is proposed to stochastically model stormwater urban drainage systems, taking into account the impact of sewer inlet operational conditions (e.g. clogging due to debris accumulation) on urban pluvial flooding. The proposed methodology comprises three main steps: (i) identification of the sewer inlets most prone to clogging, based upon a spatial analysis of their proximity to trees and an evaluation of sewer inlet locations; (ii) Monte Carlo simulation of the capacity of inlets prone to clogging and subsequent simulation of flooding for each sewer inlet capacity scenario; and (iii) delineation of stochastic flood hazard maps. The proposed methodology was demonstrated using, as case studies, design storms as well as two real storm events observed in the city of Coimbra (Portugal), which reportedly led to flooding in different areas of the catchment. The results show that sewer inlet capacity can indeed have a large impact on the occurrence of urban pluvial flooding and that it is essential to account for variations in sewer inlet capacity in urban drainage models. Overall, the stochastic methodology proposed in this study constitutes a useful tool for dealing with uncertainties in sewer inlet operational conditions and, compared with more traditional deterministic approaches, it allows a more comprehensive assessment of urban pluvial flood hazard, which in turn enables better-informed flood risk assessment and management decisions.

19.
Long flood series are required to accurately estimate the flood quantiles associated with high return periods, in order to design and assess the risk in hydraulic structures such as dams. However, observed flood series are commonly short. Flood series can be extended through hydro-meteorological modelling, yet the computational effort can be very demanding when a distributed model with a short time step is used to obtain an accurate flood hydrograph characterisation. Statistical models can also be used, and the copula approach is increasingly adopted for performing multivariate flood frequency analyses. Nevertheless, the selection of the copula to characterise the dependence structure of short data series involves a large uncertainty. In the present study, a methodology to extend flood series by combining both approaches is introduced. First, the minimum number of flood hydrographs required to be simulated by a spatially distributed hydro-meteorological model is identified in terms of the uncertainty of the quantile estimates obtained from both the copula and the marginal distributions. Second, a large synthetic sample is generated by a bivariate copula-based model, reducing the computation time required by the hydro-meteorological model. The hydro-meteorological modelling chain consists of the RainSim stochastic rainfall generator and the Real-time Interactive Basin Simulator (RIBS) rainfall-runoff model. The proposed procedure is applied to a case study in Spain. As a result, a large synthetic sample of peak-volume pairs is stochastically generated, keeping the statistical properties of the simulated series generated by the hydro-meteorological model while reducing the computation time consumed. The extended sample, consisting of the joint simulated and synthetic sample, can be used to improve flood risk assessment studies.
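A hedged sketch of the copula step is given below: a small sample of (peak, volume) pairs is enlarged with a Gaussian copula that preserves the rank correlation, with Gumbel marginals fitted to each variable. The copula family, marginals and all numbers are assumptions for illustration; the study's bivariate model may differ.

```python
# Sketch: enlarge a small sample of flood (peak, volume) pairs with a
# Gaussian copula that preserves their rank correlation, using Gumbel
# marginals fitted to each variable. All values are illustrative.
import numpy as np
from scipy.stats import gumbel_r, norm, spearmanr

rng = np.random.default_rng(21)

# Small "simulated" sample standing in for the hydro-meteorological model output.
peak = rng.gumbel(800.0, 200.0, size=50)
volume = 0.04 * peak + rng.gumbel(20.0, 6.0, size=50)

# Dependence: convert Spearman's rho to the Gaussian-copula correlation.
rho_s, _ = spearmanr(peak, volume)
rho = 2.0 * np.sin(np.pi * rho_s / 6.0)

# Marginals: fit a Gumbel distribution to each variable.
p_loc, p_scale = gumbel_r.fit(peak)
v_loc, v_scale = gumbel_r.fit(volume)

# Generate a large synthetic sample from the copula plus marginals.
n = 100_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)
peak_syn = gumbel_r.ppf(u[:, 0], loc=p_loc, scale=p_scale)
vol_syn = gumbel_r.ppf(u[:, 1], loc=v_loc, scale=v_scale)
print("synthetic Spearman rho:", round(spearmanr(peak_syn, vol_syn)[0], 3))
```

The Spearman-to-Pearson conversion used above is exact only for the Gaussian copula; other copula families, as possibly used in the study, would be parameterised differently.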

20.
Stochastic models have been widely used for simulation studies. However, reproducing the skewness of observed series has been difficult, and stochastic models aimed at preserving skewness have therefore appeared. While skewness in the residuals of the stochastic model has previously been considered for skewness preservation, this study uses a random resampling technique applied to the residuals of the stochastic models for the simulation study and for the investigation of the skewness coefficient. The main advantage of this resampling scheme, called the bootstrap method, is that it does not rely on an assumption about the population distribution; this study therefore uses a model that combines the stochastic and bootstrap approaches. The stochastic and bootstrapped stochastic (combined) models are used to investigate skewness preservation and the reproduction of the probability density function of the simulated series. The models are applied to the annual and monthly streamflows of the Yongdam site in Korea and the Yakima River, Washington, USA, and the statistics and probability density functions of the observed and simulated streamflows are compared. The results show that the bootstrapped stochastic model reproduces the skewness and the probability density function much better than the stochastic model. This evidence suggests that the bootstrapped stochastic model may be more appropriate than the stochastic model for preserving skewness and for simulation purposes.
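The bootstrapped stochastic idea can be sketched with a lag-1 autoregressive model whose innovations are resampled from the empirical residuals rather than drawn from a normal distribution. The AR(1) form, the synthetic "observed" series and the comparison below are illustrative assumptions, not the models fitted in the study.

```python
# Sketch: lag-1 autoregressive (AR(1)) simulation with bootstrapped
# residuals. Resampling the empirical residuals (instead of assuming
# normal noise) helps preserve the skewness of the observed series.
# The "observed" series here is synthetic.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(13)
obs = np.exp(rng.normal(3.0, 0.5, size=200))          # skewed "observed" annual flows

# Fit AR(1): x_t - mu = phi * (x_{t-1} - mu) + e_t
mu = obs.mean()
x = obs - mu
phi = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
resid = x[1:] - phi * x[:-1]

def simulate(n, boot=True):
    e = rng.choice(resid, size=n, replace=True) if boot else rng.normal(0.0, resid.std(), n)
    out = np.empty(n)
    prev = 0.0
    for t in range(n):
        prev = phi * prev + e[t]
        out[t] = mu + prev
    return out

print("observed skew:", round(skew(obs), 2))
print("bootstrap AR(1) skew:", round(skew(simulate(10000)), 2))
print("normal AR(1) skew:", round(skew(simulate(10000, boot=False)), 2))
```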
