Similar Documents
 20 similar documents retrieved (search time: 31 ms)
1.
Hydrological risk analysis is essential and provides useful information for dam safety management and decision-making. This study presents the application of bivariate flood frequency analysis to risk analysis of dam overtopping for Geheyan Reservoir in China. The dependence between the flood peak and volume is modelled with the copula function. A Monte Carlo procedure is conducted to generate 100,000 random flood peak-volume pairs, which are subsequently transformed to corresponding design flood hydrographs (DFHs) by amplifying the selected annual maximum flood hydrographs (AMFHs). These synthetic DFHs are routed through the reservoir to obtain the frequency curve of maximum water level and assess the risk of dam overtopping. A sensitivity analysis is performed to investigate the influence of different AMFH shapes and correlation coefficients of flood peak and volume on the estimated overtopping risk. The results show that a synthetic DFH whose AMFH shape is characterized by a delayed time to peak results in higher risk, which highlights the importance of including a range of possible AMFH shapes in dam risk analysis. It is also demonstrated that the overtopping risk increases as the correlation coefficient of flood peak and volume increases; the risk is underestimated in the independence case (i.e. the traditional univariate approach) and overestimated in the full dependence case. The bivariate statistical approach based on copulas can effectively capture the actual dependence between flood peak and volume and should be preferred in dam risk analysis practice.
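
As an illustration of the Monte Carlo step described above, the sketch below samples dependent peak-volume pairs through a copula; a Gaussian copula and Gumbel marginals with invented parameter values are assumed here, since the abstract does not state the copula family or the fitted marginals.

```python
# A minimal sketch of copula-based generation of flood peak-volume pairs,
# assuming a Gaussian copula and Gumbel (EV1) marginals; the paper's actual
# copula family, marginal distributions and parameter values are not given here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical marginal parameters for peak [m^3/s] and volume [10^8 m^3]
peak_marginal = stats.gumbel_r(loc=12000.0, scale=3000.0)
vol_marginal = stats.gumbel_r(loc=60.0, scale=15.0)
rho = 0.7          # assumed peak-volume dependence on the Gaussian-copula scale
n = 100_000        # same sample size as in the study

# 1) Sample correlated standard normals, 2) map them to uniforms (the copula),
# 3) apply the inverse marginal CDFs to obtain peak-volume pairs.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)
peaks = peak_marginal.ppf(u[:, 0])
volumes = vol_marginal.ppf(u[:, 1])

# Each (peak, volume) pair would then be used to scale a selected annual maximum
# flood hydrograph before routing it through the reservoir.
print(np.corrcoef(peaks, volumes)[0, 1])
```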

2.
Multiple breaches of a dam resulting from wind-generated waves and wave overtopping are studied for a hypothetical long non-cohesive earthen dam with an uneven crest. Both wind speed and direction affect breach locations and outflow for a particular reservoir surface geometry. Locations on the dam with longer fetches along the wind direction are more subject to wave overtopping and breaching than other locations. Higher wind speeds lead to wave overtopping and dam breaches under larger freeboards than lower wind speeds. For a specified inflow hydrograph and spillway configuration, there exists a location at which the smallest estimated peak outflow occurs among all possible breach locations and the pool drops too quickly for additional breaches to develop. Using this location for a fuse plug or a pilot channel could minimize downstream impact, perhaps as an interim or emergency measure for a dam with inadequate spillway capacity.

3.
Hydrologic risk analysis for dam safety relies on a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs. This is a complex problem in that the probability distributions of multiple independent and derived random variables need to be estimated in order to evaluate the probability of dam overtopping. Typically, parametric density estimation methods have been applied in this setting, and the exhaustive Monte Carlo simulation (MCS) of models is used to derive some of the distributions. Often, the distributions used to model some of the random variables are inappropriate relative to the expected behaviour of these variables, and as a result, simulations of the system can lead to unrealistic values of extreme rainfall or water surface levels and hence of the probability of dam overtopping. In this paper, three major innovations are introduced to address this situation. The first is the use of nonparametric probability density estimation methods for selected variables, the second is the use of Latin Hypercube sampling to improve the efficiency of MCS driven by the multiple random variables, and the third is the use of Bootstrap resampling to determine the initial water surface level. An application to the Soyang Dam in South Korea illustrates how the traditional parametric approach can lead to potentially unrealistic estimates of dam safety, while the proposed approach provides rather reasonable estimates and an assessment of their sensitivity to key parameters.
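
The following sketch illustrates the second of the three innovations, Latin Hypercube sampling, by comparing it with plain Monte Carlo for two input variables; the variable names, marginal distributions and the toy response function are assumptions for illustration only, not the Soyang Dam model.

```python
# A minimal sketch contrasting plain Monte Carlo with Latin Hypercube sampling (LHS)
# for driving two input variables of a dam-overtopping simulation.
import numpy as np
from scipy import stats
from scipy.stats import qmc

n = 2000
rng = np.random.default_rng(0)

# Assumed marginals: 24-h rainfall depth [mm] and initial water surface level [m]
rain = stats.gumbel_r(loc=180.0, scale=45.0)
level = stats.norm(loc=193.0, scale=1.5)

# Plain Monte Carlo: independent uniforms on the unit square
u_mc = rng.random((n, 2))

# Latin Hypercube: stratified uniforms covering the unit square more evenly
u_lhs = qmc.LatinHypercube(d=2, seed=0).random(n)

def peak_level(u):
    """Toy response: transform uniforms by inverse CDFs and combine them."""
    r = rain.ppf(u[:, 0])
    h0 = level.ppf(u[:, 1])
    return h0 + 0.02 * r          # placeholder surrogate for rainfall-runoff and routing

# With the same n, LHS typically gives a less noisy estimate of exceedance probabilities.
crest = 198.0
print("MC  P(overtop) =", np.mean(peak_level(u_mc) > crest))
print("LHS P(overtop) =", np.mean(peak_level(u_lhs) > crest))
```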

4.
The traditional and still prevailing approach to characterization of flood hazards to dams is the inflow design flood (IDF). The IDF, defined either deterministically or probabilistically, is necessary for sizing a dam, its discharge facilities and reservoir storage. However, within the risk-informed decision framework for dam safety, the IDF does not carry much relevance, no matter how accurately it is characterized. In many cases, the probability of the reservoir inflow tells us little about the probability of dam overtopping. Typically, the reservoir inflow and its associated probability of occurrence are modified by the interplay of a number of factors (reservoir storage, reservoir operating rules and various operational faults and natural disturbances) on the way to becoming the reservoir outflow and corresponding peak level, the two parameters that represent the hydrologic hazard acting upon the dam. To properly manage flood risk, it is essential to change the approach to flood hazard analysis for dam safety from the currently prevailing focus on reservoir inflows to a focus on reservoir outflows and the corresponding reservoir levels. To demonstrate these points, this paper presents stochastic simulation of floods on a cascade system of three dams and shows the progression from exceedance probabilities of reservoir inflow to exceedance probabilities of peak reservoir level depending on initial reservoir level, storage availability, reservoir operating rules and availability of discharge facilities on demand. The results show that dam overtopping is more likely to be caused by a combination of a smaller flood and a system component failure than by an extreme flood on its own.
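
A heavily simplified simulation in the spirit of the argument above is sketched below: it maps the exceedance probability of the inflow peak to the exceedance probability of the peak reservoir level for a single reservoir with a random initial level and a free-overflow spillway. All reservoir characteristics and distributions are invented; the paper's three-dam cascade, operating rules and component failures are not represented.

```python
# Sketch: from inflow exceedance probabilities to peak-level exceedance probabilities.
import numpy as np

rng = np.random.default_rng(4)
n_years = 5000
dt = 3600.0                                   # time step [s]
t = np.arange(72) * dt                        # 72-hour event window

area = 40.0e6                                 # reservoir surface area [m^2] (level pool)
crest = 105.0                                 # dam crest elevation [m]
sill = 100.0                                  # spillway sill elevation [m]
c_spill = 300.0                               # lumped spillway discharge coefficient

q_peak = rng.gumbel(2500.0, 900.0, n_years)   # annual inflow peaks [m^3/s]
h_init = rng.normal(98.0, 1.0, n_years)       # random initial reservoir levels [m]
peak_level = np.empty(n_years)

for i in range(n_years):
    # Triangular inflow hydrograph scaled to the sampled peak
    inflow = np.interp(t, [0.0, 24 * dt, 71 * dt], [0.0, q_peak[i], 0.0])
    h = h_init[i]
    h_max = h
    for q_in in inflow:
        q_out = c_spill * max(h - sill, 0.0) ** 1.5   # free overflow above the sill
        h += (q_in - q_out) * dt / area               # level-pool storage balance
        h_max = max(h_max, h)
    peak_level[i] = h_max

print("1% exceedance inflow peak [m^3/s]:", round(np.quantile(q_peak, 0.99)))
print("1% exceedance peak level [m]     :", round(np.quantile(peak_level, 0.99), 2))
print("annual overtopping probability   :", np.mean(peak_level > crest))
```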

5.
Abstract

The segmentation of flood seasons has both theoretical and practical importance in hydrological sciences and water resources management. The probability change-point analysis technique is applied to segmenting a defined flood season into a number of sub-seasons. Two alternative sampling methods, annual maximum and peaks-over-threshold, are used to construct the new flow series. The series is assumed to follow the binomial distribution and is analysed with the probability change-point analysis technique. A Monte Carlo experiment is designed to evaluate the performance of proposed flood season segmentation models. It is shown that the change-point based models for flood season segmentation can rationally partition a flood season into appropriate sub-seasons. China's new Three Gorges Reservoir, located on the upper Yangtze River, was selected as a case study since a hydrological station with observed flow data from 1882 to 2003 is located 40 km downstream of the dam. The flood season of the reservoir can be reasonably divided into three sub-seasons: the pre-flood season (1 June–2 July); the main flood season (3 July–10 September); and the post-flood season (11–30 September). The results of flood season segmentation and the characteristics of flood events are reasonable for this region.

Citation Liu, P., Guo, S., Xiong, L. & Chen, L. (2010) Flood season segmentation based on the probability change-point analysis technique. Hydrol. Sci. J. 55(4), 540–554.
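
A minimal sketch of the binomial change-point idea behind this segmentation approach is given below: daily flood-occurrence indicators are pooled over years and a single change point is located by maximizing the binomial log-likelihood. The data are synthetic and only one change point is scanned, whereas the paper partitions the flood season into three sub-seasons from observed records.

```python
# Single binomial change point in daily flood-occurrence rates (synthetic data).
import numpy as np

rng = np.random.default_rng(5)

days = 120                       # flood-season length in days (illustrative)
true_change = 32                 # synthetic "true" change point
years = 60
p_daily = np.where(np.arange(days) < true_change, 0.05, 0.25)
occur = rng.binomial(1, p_daily, size=(years, days))   # 1 = a POT flood occurred that day/year

k = occur.sum(axis=0)            # number of years with a flood on each calendar day

def segment_loglik(k_seg, n_years):
    """Binomial log-likelihood of a segment under a constant occurrence probability."""
    s = k_seg.sum()
    m = k_seg.size * n_years
    p = s / m
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return s * np.log(p) + (m - s) * np.log(1.0 - p)

best_tau, best_ll = None, -np.inf
for tau in range(1, days):       # scan all candidate change points
    ll = segment_loglik(k[:tau], years) + segment_loglik(k[tau:], years)
    if ll > best_ll:
        best_tau, best_ll = tau, ll

print("estimated change point at day index:", best_tau, "(true:", true_change, ")")
```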

6.
Dam overtopping risk assessment considering inspection program
Safety inspection of large dams in Taiwan is conducted every 5 years. The practice does not take into consideration uncertainty in dam conditions. The goal of this study is to determine the optimal dam inspection interval under consideration of overtopping risk, incorporating uncertainty in spillway gate availability. In earlier studies, assessment of overtopping risk only considered the uncertainties in reservoir properties and the natural randomness of hydrologic events, without giving much thought to the availability of spillway gates. As a result, the overtopping risk could be underestimated. In this study, an innovative concept is proposed to evaluate dam overtopping by taking into account spillway gate availability. The framework consists of three parts: (1) evaluation of conditional overtopping risk for different numbers of malfunctioning spillway gates; (2) evaluation of spillway gate availability; and (3) dam inspection scheduling. Furthermore, considerations are given to overtopping risk, inspection cost, and dam break cost for determining the optimal inspection schedule. The methodology is applied to the Shihmen Reservoir in Taiwan to evaluate its time-dependent overtopping risk. Results show that the overtopping risk considering the availability of the spillway gates is higher than the risk estimated without considering gate availability.
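
The numerical sketch below illustrates how spillway gate availability can be folded into the overtopping risk, as described above: the total risk is the availability-weighted sum of conditional risks for each number of unavailable gates. The gate count, availability and conditional risk values are invented placeholders, not Shihmen Reservoir results.

```python
# Total risk = sum over k of P(k gates unavailable) * P(overtopping | k gates unavailable).
from scipy.stats import binom

n_gates = 5
availability = 0.95          # assumed probability that a given gate operates on demand

# Hypothetical conditional annual overtopping risks given k unavailable gates
# (in practice obtained from separate hydrologic routing simulations for each k).
conditional_risk = {0: 1.0e-5, 1: 4.0e-5, 2: 2.0e-4, 3: 1.5e-3, 4: 8.0e-3, 5: 3.0e-2}

total_risk = sum(
    binom.pmf(k, n_gates, 1.0 - availability) * conditional_risk[k]
    for k in range(n_gates + 1)
)

print(f"risk assuming all gates always work : {conditional_risk[0]:.2e}")
print(f"risk accounting for gate availability: {total_risk:.2e}")
```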

7.
Reservoirs are among the most important constructions for water resources management and flood control. Much attention has been paid to the effects of a reservoir on the downstream area and to the differences between inflows and dam-site floods caused by changes in upstream flow generation and concentration conditions after the reservoir's impoundment. These differences result in inconsistency between inflow quantiles and the reservoir design criteria derived from dam-site flood series, which can be a potential risk and must be quantitatively evaluated. In this study, flood frequency analysis (FFA) and flood control risk analysis (FCRA) methods are used with a long reservoir inflow series derived from a multiple-input, single-output model and a copula-based inflow estimation model. The results of FFA and FCRA are compared and the influences on reservoir flood management are also discussed. The Three Gorges Reservoir (TGR) in China is selected as a case study. Results show that the differences between the TGR inflow and dam-site floods are significant and result in changes to its flood control risk rates. The mean values of the TGR's annual maximum inflow peak discharge and 3-day flood volume have increased by 5.58% and 3.85% relative to the dam-site values, while the annual maximum 7-day and 15-day flood volumes have declined by 1.82% and 1.72%. The flood control risk rates of small and medium flood events have increased, while those of extreme flood events have declined. It is shown that the TGR can satisfy its flood control task under the current hydrologic regime, and the results offer a reference for better management of the TGR.

8.
The cascading failure of multiple landslide dams can trigger a larger peak flood discharge than that caused by a single dam failure. Therefore, for an accurate numerical simulation, it is essential to elucidate the primary factors affecting the peak discharge of the flood caused by a cascading failure, which is the purpose of the current study. First, flume experiments were done on the cascading failure of two landslide dams under different upstream dam heights, downstream dam heights, and initial downstream reservoir water volumes. Then, the experimental results were reproduced using a numerical simulation model representing landslide dam erosion resulting from overtopping flow. Finally, the factors influencing the peak flood discharge caused by the cascading failure were analyzed using the numerical simulation model. Experimental results indicated that the inflow discharge into the downstream dam at the time when the downstream dam height began to rapidly erode was the main factor responsible for a cascading failure generating a larger peak flood discharge than that generated by a single dam failure. Furthermore, the results of a sensitivity analysis suggested that the upstream and downstream dam heights, initial water volume in the reservoir of the downstream dam, upstream and downstream dam crest lengths, and distance between two dams were among the most important factors in predicting the flood discharge caused by the cascading failure of multiple landslide dams.

9.
Existing riverbank riprap could face the risk of failure if the flood regime changes in the future. Additionally, changed sediment transport in rivers, as a possible result of climate change, affects the failure risk of flood protection measures. Evaluation of this potential failure is the primary issue in riprap stability and safety assessment. The consequences of bank failure are likely uncontrolled erosion and flooding, with disastrous consequences in residential areas or damage to infrastructure. Thus, a probabilistic analysis of riprap failure considering different mechanisms, and accounting for flood and sediment transport uncertainties, is required to assess embankment stability. In this article, a probabilistic assessment model based on the Monte Carlo simulation method, moment analysis methods, and the Rosenblueth point estimation method is presented to quantify the failure risk of riprap used as river bank protection. The probability of failure in different modes, namely direct block erosion, toe scouring and overtopping, is determined by taking into account river bed level variation driven by bedload transport, described with a probabilistic function of the peak discharge. The comparison of the three models revealed good agreement (an average deviation of less than 2%) in the estimation of riprap failure probability. The model is a strategic tool for identifying critical river reaches and helps in evaluating risk maps, so that it can cover the engineering aspects of environmental stability in rivers with riprap as bank protection.
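
The sketch below illustrates the Rosenblueth point estimation method mentioned above for a toy riprap limit-state function of two uncorrelated random variables; the limit-state relation and all parameter values are assumptions, not the article's riprap failure modes.

```python
# Rosenblueth's two-point estimation method for a toy riprap safety margin
# g(D50, v) = critical velocity - acting flow velocity (uncorrelated inputs assumed).
import itertools
import numpy as np
from scipy.stats import norm

def g(d50, v):
    """Toy safety margin: critical velocity for stone size d50 minus flow velocity."""
    v_crit = 2.3 * np.sqrt(9.81 * d50)     # simplified Isbash-type relation (assumption)
    return v_crit - v

# Means and standard deviations of the two random inputs
mu = np.array([0.60, 3.0])     # D50 [m], flow velocity [m/s]
sigma = np.array([0.08, 0.6])

# Rosenblueth (uncorrelated case): evaluate g at the 2^n points mu_i +/- sigma_i,
# each with weight 1/2^n, to estimate the first two moments of g.
points = list(itertools.product(*[(m - s, m + s) for m, s in zip(mu, sigma)]))
weight = 1.0 / len(points)
gvals = np.array([g(d, v) for d, v in points])

mean_g = weight * gvals.sum()
var_g = weight * (gvals ** 2).sum() - mean_g ** 2

# First-order failure probability estimate, assuming g is roughly normal
beta = mean_g / np.sqrt(var_g)
print("E[g] =", round(mean_g, 3), " Var[g] =", round(var_g, 3))
print("P(failure) ~", norm.cdf(-beta))
```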

10.
The specific objective of the paper is to propose a new flood frequency analysis method considering both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only a point estimator but also a confidence interval of the quantiles can be provided. Markov chain Monte Carlo is adopted to overcome the difficulties of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that an approach considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators but also a quantitative assessment of the uncertainties in flood frequency analysis.
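
A minimal sketch of the Bayesian machinery described above is given below for a single candidate distribution: a random-walk Metropolis sampler for Gumbel parameters, from which the posterior distribution of a design-flood quantile is summarized. The data are synthetic and flat priors are assumed; the paper's treatment of model-choice uncertainty across several distributions is not reproduced.

```python
# Random-walk Metropolis sampling of Gumbel parameters and the resulting
# posterior distribution of the 100-year quantile (parameter uncertainty only).
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(3)
data = gumbel_r.rvs(loc=5000.0, scale=1500.0, size=50, random_state=rng)  # synthetic AM series

def log_post(loc, scale):
    if scale <= 0:
        return -np.inf
    return gumbel_r.logpdf(data, loc=loc, scale=scale).sum()   # flat (improper) priors assumed

n_iter, step = 10_000, np.array([150.0, 100.0])
theta = np.array([data.mean(), data.std()])
samples = np.empty((n_iter, 2))
lp = log_post(*theta)
for i in range(n_iter):
    prop = theta + step * rng.standard_normal(2)
    lp_prop = log_post(*prop)
    if np.log(rng.random()) < lp_prop - lp:     # Metropolis acceptance
        theta, lp = prop, lp_prop
    samples[i] = theta
samples = samples[n_iter // 2:]                 # discard burn-in

T = 100
q100 = gumbel_r.ppf(1 - 1 / T, loc=samples[:, 0], scale=samples[:, 1])
print("posterior median Q100:", np.round(np.median(q100)))
print("90% credible interval:", np.round(np.percentile(q100, [5, 95])))
```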

11.
Hydrological Sciences Journal, 2012, 57(15): 1867-1892
ABSTRACT

The flood peak is the dominant characteristic in nearly all flood-statistical analyses. Contrary to the general assumptions of design flood estimation, the peak is not closely related to other flood characteristics. Differentiation of floods into types provides a more realistic view. Often, different parts of the probability distribution function of annual flood peaks are dominated by different flood types, which raises the question of how shifts in flood regimes would modify the statistics of annual maxima. To answer this, a distinction into five flood types is proposed; then, temporal changes in flood-type frequencies are investigated. We show that the frequency of floods caused by heavy rain has increased significantly in recent years. A statistical model is developed that simulates peaks for each event type through type-specific peak-volume relationships. In a simulation study, we show how changes in the frequency of flood event types lead to changes in the quantiles of the annual maximum series.
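
The simulation study summarized above can be illustrated with the toy experiment below, in which annual maxima arise from two stylized flood types with assumed type-specific peak distributions and Poisson event counts; increasing the heavy-rain event rate shifts the quantiles of the annual maximum series. All distributions and rates are invented.

```python
# Annual maxima from a two-type flood mixture: changing type frequencies changes quantiles.
import numpy as np

rng = np.random.default_rng(11)

def simulate_annual_maxima(rate_snowmelt, rate_rain, n_years=20_000):
    """Annual maximum = largest peak among all events of both types in a year."""
    am = np.zeros(n_years)
    n_snow = rng.poisson(rate_snowmelt, n_years)
    n_rain = rng.poisson(rate_rain, n_years)
    for i in range(n_years):
        peaks = np.concatenate([
            rng.gumbel(300.0, 60.0, n_snow[i]),     # assumed snowmelt-peak distribution
            rng.gumbel(250.0, 140.0, n_rain[i]),    # assumed heavy-rain distribution (heavier tail)
        ])
        am[i] = peaks.max() if peaks.size else 0.0
    return am

am_past = simulate_annual_maxima(rate_snowmelt=2.0, rate_rain=1.0)
am_future = simulate_annual_maxima(rate_snowmelt=2.0, rate_rain=2.0)   # more heavy-rain floods

for T in (10, 100):
    p = 1 - 1 / T
    print(f"T={T:>3}:  past {np.quantile(am_past, p):7.1f}   "
          f"more-rain {np.quantile(am_future, p):7.1f}")
```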

12.
Hydrological Sciences Journal, 2013, 58(5): 974-991
Abstract

The aim is to build a seasonal flood frequency analysis model and estimate seasonal design floods. The importance of seasonal flood frequency analysis and the advantages of considering seasonal design floods in the derivation of reservoir planning and operating rules are discussed, recognising that seasonal flood frequency models have been in use for over 30 years. A set of non-identical models with non-constant parameters is proposed and developed to describe flows that reflect seasonal flood variation. The peak-over-threshold (POT) sampling method was used, as it is considered to provide significantly more information on flood seasonality than annual maximum (AM) sampling and to perform better in flood seasonality estimation. The number of exceedances is assumed to follow the Poisson distribution (Po), while the peak exceedances are described by the exponential (Ex) and generalized Pareto (GP) distributions and a combination of both, resulting in three models, viz. Po-Ex, Po-GP and Po-Ex/GP. Their performances are analysed and compared. The Geheyan and Baiyunshan reservoirs were chosen for the case study. The application and statistical experiment results show that each model has its merits and that the Po-Ex/GP model performs best. Use of the Po-Ex/GP model is recommended in seasonal flood frequency analysis for the purpose of deriving reservoir operation rules.
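
For the Po-Ex model named above, the seasonal maximum has a closed-form distribution that can be inverted for design floods; the short sketch below applies that inversion with invented threshold, rate and scale values (the Po-GP and mixed Po-Ex/GP variants are not shown).

```python
# Po-Ex model: Poisson(lam) exceedance counts and Exp(scale) exceedance magnitudes over
# threshold u give a seasonal-maximum CDF F(x) = exp(-lam * exp(-(x - u) / scale)), x >= u,
# which is inverted below for the T-year seasonal design flood.
import numpy as np

def po_ex_quantile(T, lam, u, scale):
    """T-year quantile of the seasonal maximum under the Po-Ex model."""
    p = 1.0 - 1.0 / T
    return u + scale * np.log(lam / (-np.log(p)))

# Hypothetical POT parameters for one sub-season
lam, u, scale = 3.2, 8000.0, 2500.0       # events/season, threshold, mean exceedance [m^3/s]
for T in (10, 50, 100, 1000):
    print(f"T = {T:>4} yr  ->  design flood ~ {po_ex_quantile(T, lam, u, scale):8.0f} m^3/s")
```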

13.
Risk analysis for clustered check dams due to heavy rainfall
Check dams are commonly constructed around the world for alleviating soil erosion and preventing sedimentation of downstream rivers and reservoirs. Check dams are more vulnerable to failure than other dams because of their less stringent flood control standards. Determining the critical precipitation that will result in overtopping of a dam is a useful approach for assessing the risk of failure on a probabilistic basis and for providing early warning in case of an emergency. However, many check dams are built in groups, spreading across several tributaries in cascade form and comprising a complex network. Determining the critical precipitation for overtopping of a dam requires knowledge of whether its upstream dams survived or were overtopped during the same storm, and these upstream dams in turn require the same information about their own upstream dams. The current paper presents an approach that decomposes the dam cluster into (1) the heading dam, (2) border dams, and (3) intermediate dams. The algorithm begins with the border dams, which have no upstream dams, and proceeds with updated maps that exclude the previous border dams until all the dams have been checked. This approach is considered applicable to small-scale check dam systems in which the time lag of flood routing can be neglected. As a pilot study, the current paper presents analytical results for the Wangmaogou Check Dam System, which has 22 dams connected in series and parallel. The algorithm clearly identified 7 surviving dams, with the remaining ones being overtopped, for a storm of 179.6 mm in 12 h, which corresponds to a 200-year return period.
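
The decomposition and upstream-to-downstream sweep described above can be sketched as a simple topological traversal of the dam network, as below; the toy network, storm volumes and the pass-on rule for breached dams are illustrative assumptions, not the Wangmaogou system or the paper's hydrological model.

```python
# Process a check-dam cluster from border dams (no remaining upstream dams) downstream,
# passing excess water from overtopped dams to their downstream neighbours.
from collections import deque

# downstream[d] = the dam immediately downstream of d (None for the heading dam)
downstream = {"B1": "I1", "B2": "I1", "B3": "I2", "I1": "H", "I2": "H", "H": None}
capacity = {"B1": 3.0, "B2": 2.0, "B3": 4.0, "I1": 6.0, "I2": 3.0, "H": 12.0}   # 10^4 m^3
local_inflow = {"B1": 2.5, "B2": 2.6, "B3": 1.5, "I1": 1.0, "I2": 1.0, "H": 2.0}

# Count upstream dams so that the sweep can start from the border dams (in-degree zero)
n_upstream = {d: 0 for d in downstream}
for d, dn in downstream.items():
    if dn is not None:
        n_upstream[dn] += 1

queue = deque(d for d, k in n_upstream.items() if k == 0)   # border dams
routed = {d: 0.0 for d in downstream}                       # volume arriving from upstream
status = {}

while queue:
    d = queue.popleft()
    total = local_inflow[d] + routed[d]
    overtopped = total > capacity[d]
    status[d] = "overtopped" if overtopped else "survived"
    dn = downstream[d]
    if dn is not None:
        # Simplification: an overtopped dam passes on the excess volume; a surviving dam retains it.
        routed[dn] += max(total - capacity[d], 0.0) if overtopped else 0.0
        n_upstream[dn] -= 1
        if n_upstream[dn] == 0:
            queue.append(dn)

print(status)
```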

14.
This study analyses the differences in significant trends in the magnitude and frequency of floods detected in annual maximum flood (AMF) and peak-over-threshold (POT) flood peak series for the period 1965–2005. Flood peaks are identified from European daily discharge data using a baseflow-based algorithm, and significant trends in the AMF series are compared with those in the POT series derived for six different exceedance thresholds. The results show that more trends in flood magnitude are detected in the AMF than in the POT series, and that for the POT series more significant trends are detected in flood frequency than in flood magnitude. Spatially coherent patterns of significant trends are detected, which are further investigated by stratifying the results into five regions based on catchment and hydro-climatic characteristics. All data and tools used in this study are open access and the results are fully reproducible.
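
The abstract does not state which significance test was applied; the Mann-Kendall test sketched below is a standard choice for detecting monotonic trends in AMF or POT series and is shown here only as a representative tool.

```python
# Mann-Kendall trend test (no tie correction) applied to a synthetic annual maximum series.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the MK S statistic, the normal-approximation Z and the two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S without tie correction
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(14)
series = rng.gumbel(1000, 300, 41) + 8.0 * np.arange(41)   # synthetic 1965-2005 AMF with a trend
s, z, p = mann_kendall(series)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")
```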

15.
Clustering stochastic point process model for flood risk analysis
Since its introduction into flood risk analysis, the partial duration series method has gained increasing acceptance as an appealing alternative to the annual maximum series method. However, when the base flow is low, there is clustering in the flood peak or flow volume point process. In this case, the general stochastic point process model is not suitable for risk analysis. Therefore, two types of models for flood risk analysis are derived in this paper on the basis of clustering stochastic point process theory. The most remarkable characteristic of these models is that the flood risk is considered directly within the time domain. The acceptability of the different models is also discussed using the flood peak counting process over twenty years at Yichang station on the Yangtze River. The results show that the two kinds of models are suitable for flood risk analysis and are more flexible than traditional flood risk models derived from the annual maximum series method or general stochastic point process theory. Received: September 29, 1997

16.
Abstract

The impulse response of a linear convective-diffusion analogy (LD) model used for flow routing in open channels is proposed as a probability distribution for flood frequency analysis. The flood frequency model has two parameters, which are derived using the method of moments and the method of maximum likelihood. Also derived are the errors in quantiles for these parameter estimation methods. The distribution shows that the two methods are equivalent in terms of producing mean values, an important property when the true distribution function is unknown. The flood frequency model is tested using annual peak discharges for the gauging sections of 39 Polish rivers, where the average value of the ratio of the coefficient of skewness to the coefficient of variation is about 2.52, a value closer to the ratio of the LD model than to that of the gamma or lognormal model. The likelihood ratio indicates a preference for the LD over the lognormal model for 27 out of 39 cases. It is found that the proposed flood frequency model represents flood frequency characteristics well (as measured by the moment ratio) when the LD flood routing model is likely to be the best of all linear flow routing models.

17.
Prediction of the peak break-up water level, which is the maximum instantaneous stage during ice break-up, is desirable to allow effective ice flood mitigation, but traditional hydrologic flood routing techniques are not efficient in addressing the large uncertainties caused by the numerous factors driving the peak break-up water level. This research provides a probability prediction framework based on vine copulas. The predictor variables of the peak break-up water level are first chosen, the pair-copula structure is then constructed using vine copulas, the conditional density distribution function is derived to perform a probability prediction, and the peak break-up water level can then be estimated from the conditional density distribution function given the conditional probability and fixed values of the predictor variables. This approach is exemplified using data from 1957 to 2005 for the Toudaoguai and Sanhuhekou stations, which are located in the Inner Mongolia Reach of the Yellow River, with the calibration and validation periods divided at 1986. The mean curve of the peak break-up water level estimated from the conditional distribution function can capture the tendency of the observed series at both the Toudaoguai and Sanhuhekou stations, and more than 90% of the observed values fall within the 90% prediction uncertainty bands, which are approximately twice the standard deviation of the observed series. The probability prediction results for the validation period are consistent with those for the calibration period when the nonstationarity of the marginal distributions for the Sanhuhekou station is considered. Compared with multiple linear regression results, the uncertainty bands from the conditional distribution function are much narrower; moreover, the conditional distribution function is more capable of addressing the nonstationarity of the predictor variables, and these conclusions are confirmed by jackknife analysis. Scenario predictions for cases in which the peak break-up water level is likely to be higher than the bankfull water level can also be conducted based on the conditional distribution function, with good performance for the two stations.
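
The conditional-distribution idea at the core of this framework can be sketched with a single predictor and a bivariate Gaussian copula, as below; the paper's multi-predictor vine-copula construction, marginals and ice-regime predictors are not reproduced, and all data here are synthetic.

```python
# Conditional quantile prediction of a response given one predictor under a Gaussian copula.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(18)

# Synthetic "observed" predictor x (e.g. an index variable) and response y (peak level)
rho_true = 0.8
z = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=200)
x = 10 + 2 * z[:, 0]
y = 1004 + 0.8 * z[:, 1]

def ecdf(v, sample):
    """Rank-based empirical CDF value(s) in (0, 1)."""
    return np.searchsorted(np.sort(sample), v, side="right") / (len(sample) + 1)

# Estimate the Gaussian-copula parameter from normal scores of the pseudo-observations
rho = np.corrcoef(norm.ppf(ecdf(x, x)), norm.ppf(ecdf(y, y)))[0, 1]

def conditional_quantile(x_new, q):
    """q-quantile of Y given X = x_new under the fitted Gaussian copula."""
    zx = norm.ppf(ecdf(x_new, x))
    zy = rho * zx + np.sqrt(1 - rho ** 2) * norm.ppf(q)   # conditional normal score
    u_y = norm.cdf(zy)
    return np.quantile(y, u_y)                            # empirical inverse CDF of the response

x_new = 13.0
print("median prediction:", round(conditional_quantile(x_new, 0.5), 2))
print("90% band:", [round(conditional_quantile(x_new, q), 2) for q in (0.05, 0.95)])
```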

18.
Conventional flood frequency analysis is concerned with providing an unbiased estimate of the magnitude of the design flow exceeded with probability p, but sampling uncertainties imply that such estimates will, on average, be exceeded more frequently. An alternative approach, therefore, is to derive an estimator which gives an unbiased estimate of flow risk: the difference between the two magnitudes reflects uncertainties in parameter estimation. An empirical procedure has been developed to estimate the mean true exceedance probabilities of conventional estimates made using a GEV distribution fitted by probability weighted moments, and adjustment factors have been determined to enable the estimation of flood magnitudes exceeded with, on average, the desired probability.
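
The premise above, that quantile estimates from finite samples are on average exceeded more often than their nominal probability, can be checked with a small Monte Carlo experiment such as the sketch below; for brevity it uses a Gumbel parent and method-of-moments fitting rather than the GEV/probability-weighted-moments setting of the paper.

```python
# Monte Carlo check: the average true exceedance probability of an estimated T-year
# quantile is larger than the nominal 1/T because of sampling uncertainty.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(19)
T = 100                      # target return period
n = 30                       # record length
n_rep = 5000                 # Monte Carlo repetitions
parent = gumbel_r(loc=1000.0, scale=300.0)
euler_gamma = 0.5772

true_exceed = np.empty(n_rep)
for i in range(n_rep):
    sample = parent.rvs(size=n, random_state=rng)
    # Gumbel method-of-moments estimators
    scale_hat = sample.std(ddof=1) * np.sqrt(6) / np.pi
    loc_hat = sample.mean() - euler_gamma * scale_hat
    q_hat = gumbel_r.ppf(1 - 1 / T, loc=loc_hat, scale=scale_hat)   # estimated T-year flood
    true_exceed[i] = parent.sf(q_hat)                               # its true exceedance probability

print("nominal exceedance probability  :", 1 / T)
print("mean true exceedance probability:", true_exceed.mean())      # typically noticeably > 1/T
```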

19.
There are two basic approaches for estimating flood quantiles: parametric and nonparametric methods. In this study, comparisons of parametric and nonparametric models for the annual maximum flood data of the Goan gauging station in Korea were performed based on Monte Carlo simulation. In order to consider uncertainties that can arise from model and data errors, kernel density estimation was chosen for fitting the sampling distributions and used to determine safety factors (SFs) that depend on the probability model used to fit the real data. The relative biases of the Sheather and Jones plug-in (SJ) selector are the smallest in most cases among the seven bandwidth selectors applied. The relative root mean square errors (RRMSEs) of the Gumbel (GUM) distribution are smaller than those of any other model regardless of the parent model considered. When the Weibull-2 is assumed as the parent model, the RRMSEs of kernel density estimation are relatively small, while for other parent models they are much larger than those of the parametric methods. However, the RRMSEs of kernel density estimation within the interpolation range are much smaller than those for the extrapolation range in comparison with those of the parametric methods. Among the applied distributions, the GUM model has the smallest SFs for all parent models, and the generalized extreme value model has the largest values for all parent models considered.
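
The parametric-versus-nonparametric comparison described above can be illustrated with the sketch below, which estimates a 100-year quantile from synthetic annual maxima both by fitting a Gumbel distribution and by numerically inverting a kernel density estimate; note that scipy's gaussian_kde offers Scott's and Silverman's bandwidth rules rather than the Sheather-Jones plug-in selector discussed in the paper.

```python
# Parametric (Gumbel) versus nonparametric (Gaussian KDE) estimation of a flood quantile.
import numpy as np
from scipy.stats import gaussian_kde, gumbel_r

rng = np.random.default_rng(20)
sample = gumbel_r.rvs(loc=500.0, scale=150.0, size=40, random_state=rng)   # synthetic AM floods

# Parametric quantile: fit Gumbel by maximum likelihood and invert its CDF
loc_hat, scale_hat = gumbel_r.fit(sample)
q_par = gumbel_r.ppf(0.99, loc=loc_hat, scale=scale_hat)

# Nonparametric quantile: build a KDE, integrate it to a CDF on a grid and invert numerically
kde = gaussian_kde(sample, bw_method="silverman")
grid = np.linspace(sample.min() - 3 * sample.std(), sample.max() + 6 * sample.std(), 2000)
cdf = np.array([kde.integrate_box_1d(-np.inf, g) for g in grid])
q_np = np.interp(0.99, cdf, grid)

print("100-year flood, parametric (Gumbel):", round(q_par, 1))
print("100-year flood, nonparametric (KDE):", round(q_np, 1))
```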
