Similar Literature
1.
During flood events, breaching of flood defences along a river system can significantly reduce downstream water levels and flood risks. This paper presents a Monte Carlo based flood risk framework for policy decision making which takes this retention effect into account. The framework is developed to estimate societal flood risk in terms of potential numbers of fatalities and associated probabilities. It is tested on the Rhine–Meuse delta system in the Netherlands, where floods can be caused by high flows in the Rhine and Meuse rivers and/or high sea water levels in the North Sea. Importance sampling is applied in the Monte Carlo procedure to increase the computational efficiency of the flood risk computations. This paper focuses on the development and testing of efficient importance sampling strategies for the framework. Developing an efficient importance sampling strategy for river deltas is more challenging than for non-tidal rivers, where only discharges are relevant, because the relative influence of river discharge and sea water level on flood levels differs from location to location. As a consequence, sampling methods that are efficient and accurate for one location may be inefficient for other locations or, worse, may introduce errors in computed design water levels. Nevertheless, in the case study described in this paper the required simulation time was reduced by a factor of 100 after the introduction of an efficient importance sampling method in the Monte Carlo framework, while at the same time the accuracy of the Monte Carlo estimates was improved.
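As a rough illustration of the tail-sampling idea described above, the following Python sketch estimates a small exceedance probability with and without importance sampling. The Gumbel population, the shifted proposal and the critical level q_crit are all hypothetical choices for illustration, not the paper's actual delta model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative setup: annual maximum discharge ~ Gumbel; we want
# P(Q > q_crit), a rare event that plain Monte Carlo samples poorly.
true_dist = stats.gumbel_r(loc=2000.0, scale=400.0)   # hypothetical fit
q_crit = 5500.0                                       # hypothetical failure level
n = 20_000

# Crude Monte Carlo: few (or no) samples fall in the tail.
q_mc = true_dist.rvs(n, random_state=rng)
p_mc = np.mean(q_mc > q_crit)

# Importance sampling: draw from a proposal shifted toward the tail and
# re-weight each sample by the likelihood ratio f(q)/g(q).
proposal = stats.gumbel_r(loc=5200.0, scale=400.0)
q_is = proposal.rvs(n, random_state=rng)
w = np.exp(true_dist.logpdf(q_is) - proposal.logpdf(q_is))
p_is = np.mean(w * (q_is > q_crit))

print(f"exact   : {true_dist.sf(q_crit):.3e}")
print(f"crude MC: {p_mc:.3e}")
print(f"IS      : {p_is:.3e}  (same n, far lower variance)")
```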

2.
The parametric method of flood frequency analysis (FFA) involves fitting a probability distribution to the observed flood data at the site of interest. When the record length at a given site is relatively long and the flood data exhibit skewness, a distribution having more than three parameters, such as the log-Pearson type 3 distribution, is often used in FFA. This paper examines the suitability of the five-parameter Wakeby distribution for annual maximum flood data in eastern Australia. We adopt a Monte Carlo simulation technique to select an appropriate plotting position formula and to derive a probability plot correlation coefficient (PPCC) test statistic for the Wakeby distribution. The Weibull plotting position formula has been found to be the most appropriate for the Wakeby distribution. Regression equations for the PPCC test statistics associated with the Wakeby distribution at different levels of significance have been derived. Furthermore, a power study to estimate the rejection rate associated with the derived PPCC test statistics has been undertaken. Finally, an application using annual maximum flood series data from 91 catchments in eastern Australia is presented. Results show that the developed regression equations can be used with a high degree of confidence to test whether the Wakeby distribution fits the annual maximum flood series data at a given station. The methodology developed in this paper can be adapted to other probability distributions and to other study areas. Copyright © 2014 John Wiley & Sons, Ltd.
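The PPCC statistic is simply the correlation between the ordered sample and the fitted quantiles at the chosen plotting positions. A minimal sketch with a hypothetical Wakeby parameter set (the quantile function follows Hosking's parameterization; the critical values for the test would come from the paper's regression equations):

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Wakeby quantile function x(F), Hosking's parameterization."""
    F = np.asarray(F, dtype=float)
    return (xi + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
               - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

def ppcc_wakeby(sample, params):
    """PPCC: correlation between ordered data and Wakeby quantiles
    evaluated at Weibull plotting positions p_i = i / (n + 1)."""
    x = np.sort(sample)
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1.0)       # Weibull plotting positions
    q = wakeby_quantile(p, *params)
    return np.corrcoef(x, q)[0, 1]

# Hypothetical parameter set and a synthetic sample drawn from it
# by inverse-CDF sampling:
params = (0.0, 5.0, 2.0, 1.0, 0.1)            # xi, alpha, beta, gamma, delta
rng = np.random.default_rng(1)
sample = wakeby_quantile(rng.uniform(size=80), *params)

r = ppcc_wakeby(sample, params)
print(f"PPCC test statistic r = {r:.4f}")      # compare with a critical value
```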

3.
There is a significant spatial sampling mismatch between radar and rain gauge data. Using rain gauge data to estimate radar-rainfall error variance requires partitioning the variance of the radar and rain gauge difference to account for this sampling mismatch. A key assumption in the literature on the error variance separation method used to partition the variance is that the covariance between the radar-rainfall error and the error of rain gauges in representing the radar sampling domain is negligible. Our study presents the results of an extensive test of this assumption. The test is based on empirical data and covers temporal scales ranging from 0.25 to 24 h and spatial scales ranging from 1 to 32 km. We used a two-year data set from two high quality and high density rain gauge networks in Oklahoma and excluded the winter months. The results obtained using a resampling procedure show that the covariance can be considerable at large scales owing to its significant variability. As the variability of the covariance increases rapidly at larger spatial and shorter temporal scales, applications of the error variance separation method at those scales require more caution. The variability of the covariance and of one of its constituent variables, the variance ratio of radar and gauge errors, shows simple scaling behavior well characterized by a power law.
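The identity behind the error variance separation method is easy to demonstrate with synthetic data: if radar and gauge estimates share the same true areal rainfall, the variance of their difference decomposes into the two error variances minus twice their covariance. A toy sketch with an intentionally non-negligible covariance (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5_000

# Synthetic areal rainfall A, radar error eR, and gauge representativeness
# error eG; the two errors are given a nonzero correlation on purpose.
A = rng.gamma(shape=2.0, scale=3.0, size=n)
cov = np.array([[4.0, 1.2],
                [1.2, 1.0]])                 # Cov(eR, eG) = 1.2 (not negligible)
eR, eG = rng.multivariate_normal([0, 0], cov, size=n).T
R, G = A + eR, A + eG                        # radar and gauge estimates

# Error variance separation: Var(R-G) = Var(eR) + Var(eG) - 2 Cov(eR, eG).
var_diff = np.var(R - G)
var_eR_naive = var_diff - np.var(eG)         # assumes the covariance is zero
var_eR_full = var_diff - np.var(eG) + 2 * np.cov(eR, eG)[0, 1]

print(f"true Var(eR)      : {np.var(eR):.2f}")
print(f"EVS, cov ignored  : {var_eR_naive:.2f}  (biased low by 2*Cov)")
print(f"EVS, cov included : {var_eR_full:.2f}")
```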

4.
A basic problem in hydrology is computing confidence levels for the value of the T-year flood when it is obtained from a Log Pearson III distribution in terms of the estimated mean, estimated standard deviation, and estimated skew. In an important paper, Chowdhury and Stedinger [1991] suggest a possible formula for approximate confidence levels, involving two functions previously used by Stedinger [1983] and a third function, λ, for which asymptotic estimates are given. This formula is tested [Chowdhury and Stedinger, 1991] by means of simulations, but these simulations assume a distribution for the sample skew which is not, for a single site, the distribution that the sample skew is forced to have by the basic hypothesis underlying all of the analysis, namely that the maximum discharges have a Log Pearson III distribution. Here we test these approximate formulas for the case of data from a single site by means of simulations in which the sample skew has the distribution which arises when sampling from a Log Pearson III distribution. The formulas are found to be accurate for zero skew but increasingly inaccurate for larger common values of skew. Work in progress indicates that a better choice of λ can improve the accuracy of the formula.
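To make the sampling issue concrete, the sketch below draws records directly from an LP3 parent (log10 of discharge following a Pearson III distribution, as the basic hypothesis requires) and tabulates the resulting distribution of the sample skew. Population parameters and record length are illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical LP3 population: log10(Q) ~ Pearson type III.
mu, sigma, gamma = 3.0, 0.25, 0.5        # mean, sd, skew of log-discharges
log_dist = stats.pearson3(skew=gamma, loc=mu, scale=sigma)

n, n_rep = 30, 10_000                    # record length, Monte Carlo replicates

# Sampling distribution of the (bias-corrected) sample skew when the data
# really come from an LP3 parent -- the distribution the text argues the
# verification simulations should have used.
g_hat = np.array([
    stats.skew(log_dist.rvs(n, random_state=rng), bias=False)
    for _ in range(n_rep)
])

print(f"population skew     : {gamma:.2f}")
print(f"mean of sample skew : {g_hat.mean():.3f}  (biased toward zero)")
print(f"std  of sample skew : {g_hat.std():.3f}")
```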

5.
A nitrate sensor was set up to measure the nitrate signal every 10 min in a stream draining a small agricultural catchment dominated by fertilized crops during a 2-year study period (2006–2008) in the south-west of France. An in situ sampling protocol using an automatic sampler to monitor flood events was used to provide a point-to-point calibration of the sensor values. The nitrate concentration exhibits nonsystematic concentration and dilution effects during flood events. We demonstrate, using the Nyquist–Shannon sampling theorem, that the calibrated nitrate sensor signal gathered at the outlet can be considered a continuous signal. The objectives of this study are to quantify the errors generated by a typical infrequent sampling protocol and to design an appropriate sampling strategy according to the sampling objectives. The nitrate concentration signal and flow data are numerically subsampled to simulate common sampling frequencies. The total fluxes calculated from the simulated samples are compared with the reference value computed on the continuous signal. Uncertainty increases as the sampling interval increases, and methods that do not use continuous discharge to compute nitrate fluxes bring larger uncertainty. The dispersion and bias computed for each sampling interval are used to evaluate the uncertainty during each hydrological period. Loads are strongly underestimated during flood periods when a high-concentration period is overlooked. On the contrary, high sampling frequencies (from 3 h to 1 day) lead to a systematic overestimation (bias around 3%), because the highest concentrations are overweighted by the interpolation of the concentration in that case. The in situ sampling protocol generates less than 1% of load estimation error and samples the highest concentration peaks. We consider such newly emerging field technologies useful for assessing short-term variations of water quality parameters, minimizing the number of samples to be analysed and assessing the quality state of the stream at any time. Copyright © 2012 John Wiley & Sons, Ltd.
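A minimal version of the subsampling experiment can be reproduced with synthetic data: compute a reference load from a dense record, then decimate the concentration series, interpolate, and recompute. The signal model below is invented and much simpler than the field data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 10-min "continuous" record over one year: baseflow plus a few
# flood pulses, with nitrate concentration reacting to discharge.
t = np.arange(0, 365 * 24 * 6)                      # 10-min steps
q = np.ones(t.size)
for peak in rng.integers(0, t.size, 15):            # 15 random flood events
    q += 8.0 * np.exp(-0.5 * ((t - peak) / 200.0) ** 2)
c = 5.0 + 2.0 * np.log1p(q) + rng.normal(0, 0.2, t.size)   # mg/L

dt = 600.0                                          # seconds per step
true_load = np.sum(c * q * dt)                      # reference flux

# Simulate coarser sampling: keep every k-th concentration, interpolate
# linearly back to 10 min, and recompute the load with continuous discharge.
for hours in (3, 24, 168):                          # 3 h, 1 day, 1 week
    k = hours * 6
    c_interp = np.interp(t, t[::k], c[::k])
    err = (np.sum(c_interp * q * dt) - true_load) / true_load
    print(f"sampling every {hours:4d} h: load error = {err:+.2%}")
```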

6.
A method for quantifying inflow forecasting errors and their impact on reservoir flood control operations is proposed. This approach requires identifying the probability distributions and the uncertainty transfer scheme for the inflow forecasting errors. Accordingly, the probability distributions of the errors are inferred by deducing the relationship between their standard deviation and the forecasting accuracy quantified by the Nash–Sutcliffe efficiency coefficient. The traditional deterministic flood routing process is treated as a diffusion stochastic process. The diffusion coefficient is related to the forecasting accuracy, through which the forecasting errors are indirectly related to the sources of reservoir operation risk. The associated risks are derived by solving the stochastic differential equation of reservoir flood routing via the forward Euler method. The Geheyan reservoir in China is selected as a case study. The hydrological forecasting model for this basin is established and verified. The flood control operation risks in the forecast-based pre-release operation mode for different forecasting accuracies are estimated by the proposed approach. Application results show that the proposed method can provide a useful tool for reservoir operation risk estimation and management.
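A sketch of the core computation: solve the stochastic flood-routing equation dS = (I − Q) dt + σ dW with the forward Euler (Euler–Maruyama) scheme and count storage exceedances across sample paths. The release rule, inflow hydrograph, σ and reservoir constants below are all assumed for illustration, not the Geheyan values:

```python
import numpy as np

rng = np.random.default_rng(5)

def discharge(storage):
    """Hypothetical release rule: outflow grows with storage (m^3/s)."""
    return 800.0 * (storage / 2.0e9) ** 1.5

def inflow(t_h):
    """Hypothetical forecast inflow hydrograph (m^3/s)."""
    return 1000.0 + 4000.0 * np.exp(-0.5 * ((t_h - 48.0) / 12.0) ** 2)

# Euler-Maruyama integration of dS = (I - Q) dt + sigma dW; sigma encodes
# the inflow forecasting accuracy (larger sigma for poorer forecasts).
dt_h, n_steps, n_paths = 1.0, 96, 2_000
dt_s = 3600.0 * dt_h                          # drift uses seconds
sigma = 1.5e6                                 # m^3 per sqrt(hour), assumed

S = np.full(n_paths, 1.8e9)                   # initial storage, m^3
S_max = 2.2e9                                 # flood-control limit, m^3
exceeded = np.zeros(n_paths, dtype=bool)
for i in range(n_steps):
    drift = (inflow(i * dt_h) - discharge(S)) * dt_s
    S = S + drift + sigma * np.sqrt(dt_h) * rng.normal(size=n_paths)
    exceeded |= S > S_max

print(f"estimated risk of exceeding flood-control storage: {exceeded.mean():.3f}")
```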

7.
The identification of flood seasonality is a procedure with many practical applications in hydrology and water resources management. Several statistical methods for capturing flood seasonality have emerged during the last decade. So far, however, little attention has been paid to the uncertainty involved in the use of these methods, or to the reliability of their estimates. This paper compares the performance of annual maximum (AM) and peaks-over-threshold (POT) sampling models in flood seasonality estimation. Flood seasonality is determined by the two most frequently used methods, one based on directional statistics (DS) and the other on the distribution of monthly relative frequencies of flood occurrence (RF). The performance is evaluated for the AM model and three common POT sampling models depending on the estimation method, flood seasonality type and sample record length. The results demonstrate that the POT models outperform the AM model in most analysed scenarios. POT sampling provides significantly more information on flood seasonality than AM sampling. For certain flood seasonality types, POT samples can achieve an estimation uncertainty that is otherwise found only in AM samples up to ten times longer. The performance of the RF method does not depend on the flood seasonality type as much as that of the DS method, which performs poorly on samples generated from complex seasonality distributions.
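The DS method mentioned above reduces to circular statistics on flood dates. A minimal sketch, assuming occurrence dates are available as day-of-year values:

```python
import numpy as np

def flood_seasonality(day_of_year):
    """Mean flood date and seasonal concentration via directional statistics.

    Each date is mapped to an angle on the annual circle; the resultant
    vector gives the mean date (direction) and a strength-of-seasonality
    index r in [0, 1] (length)."""
    theta = 2.0 * np.pi * np.asarray(day_of_year) / 365.25
    x, y = np.cos(theta).mean(), np.sin(theta).mean()
    mean_day = (np.arctan2(y, x) % (2.0 * np.pi)) * 365.25 / (2.0 * np.pi)
    r = np.hypot(x, y)
    return mean_day, r

# Hypothetical AM series: spring snowmelt floods clustered around day 120.
rng = np.random.default_rng(11)
days = (120 + rng.normal(0, 20, size=40)) % 365.25

mean_day, r = flood_seasonality(days)
print(f"mean flood date: day {mean_day:.0f}, concentration r = {r:.2f}")
```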

8.
Expressions for the expected values of GEV order statistics have been derived in simple summation form and in terms of probability weighted moments. Using exact plotting positions from GEV order statistics, a new unbiased plotting position formula has been developed for the General Extreme Value distribution. The formula can explicitly take into account the coefficient of skewness (or the shape parameter, k) of the underlying distribution. The developed formula approximates the exact plotting positions better than other existing formulae and is quite easy to use.
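Without the paper's closed-form expressions, the exact plotting positions p_i = F(E[X_(i:n)]) can at least be approximated by brute-force Monte Carlo, as in this sketch (the shape parameter value and record length are arbitrary, and sign conventions for the GEV shape parameter differ between scipy and the hydrology literature):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Monte Carlo approximation of the expected GEV order statistics
# E[X_(i:n)] and of the exact plotting positions p_i = F(E[X_(i:n)]).
# NB: scipy's shape parameter c has the opposite sign to the k commonly
# used in hydrology; c = -0.15 gives a heavy upper tail here.
c, n, n_rep = -0.15, 20, 50_000
gev = stats.genextreme(c)

samples = np.sort(gev.rvs((n_rep, n), random_state=rng), axis=1)
e_order = samples.mean(axis=0)              # E[X_(i:n)] for i = 1..n
p_exact = gev.cdf(e_order)                  # exact (unbiased) positions

i = np.arange(1, n + 1)
p_weibull = i / (n + 1)                     # shape-free formula, for contrast
print(" i   exact   Weibull")
for j in (0, n // 2, n - 1):
    print(f"{i[j]:2d}   {p_exact[j]:.4f}  {p_weibull[j]:.4f}")
```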

9.
This paper compares two Monte Carlo sequential data assimilation methods based on the Kalman filter, for estimating the effect of measurements on simulations of state error variance made by a one-dimensional hydrodynamic model. The first method used an ensemble Kalman filter (EnKF) to update state estimates, which were then used as initial conditions for further simulations. The second method used an ensemble transform Kalman filter (ETKF) to quickly estimate the effect of measurement error covariance on forecast error covariance without the need to re-run the simulation model. The ETKF gave an unbiased estimate of EnKF analysed error variance, although differences in the treatment of measurement errors meant the results were not identical. Estimates of forecast error variance could also be made, but their accuracy deteriorated as the time from measurements increased, due in part to model non-linearity and the decreasing signal variance. The motivation behind the study was to assess the ability of the ETKF to target possible measurements, as part of an adaptive sampling framework, before they are assimilated by an EnKF-based forecasting model on the River Crouch, Essex, UK. The ETKF was found to be a useful tool for quickly estimating the error covariance expected after assimilating measurements into the hydrodynamic model. It thus provided a means of quantifying the 'usefulness' (in terms of error variance) of possible sampling schemes.
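For reference, a generic stochastic (perturbed-observation) EnKF analysis step looks like the sketch below; the hydrodynamic model is replaced by a toy state vector, and all dimensions are invented:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One EnKF analysis step with perturbed observations.

    X : (n_state, n_ens) forecast ensemble
    y : (n_obs,) observation vector
    H : (n_obs, n_state) observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                      # forecast error covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=n_ens).T          # perturbed observations
    return X + K @ (Y - H @ X)

# Toy use: 10 water-level states, one gauge observing state 3.
rng = np.random.default_rng(4)
X = 2.0 + rng.normal(0, 0.5, size=(10, 50))
H = np.zeros((1, 10))
H[0, 3] = 1.0
R = np.array([[0.05 ** 2]])
Xa = enkf_update(X, np.array([2.6]), H, R, rng)
print(f"state 3: forecast var {X[3].var():.4f} -> analysed var {Xa[3].var():.4f}")
```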

10.
A new unbiased plotting position formula for Gumbel distribution
Probability plots (the graphical approach) are used to fit a probability distribution to a given series, to identify outliers and to assess goodness of fit. The graphical approach requires the probability of exceedance or non-exceedance of various events, which is obtained through a plotting position formula. Many plotting position formulae have been reported in the literature. The existing formulae provide different results, particularly at the tails of the distribution, and hence there is a need for unbiased plotting position formulae for different distributions. An expression for the expected value of the largest order statistic is found in a simple form. Using exact plotting positions from Gumbel order statistics, a new unbiased plotting position formula has been developed for the Gumbel distribution. The developed formula approximates the exact plotting positions better than other existing formulae.

11.
Flow-stratified sampling is a new method for sampling water quality constituents such as suspended sediment to estimate loads. As with selection-at-list-time (SALT) and time-stratified sampling, flow-stratified sampling is a statistical method requiring random sampling and yielding unbiased estimates of load and variance. It can be used to estimate event yields or to estimate mean concentrations in flow classes for detecting change over time or differences from water quality standards. Flow-stratified sampling is described and its variance compared with those of SALT and time-stratified sampling. Time-stratified sampling generally gives the smallest variance of the three methods for estimating storm yields. Flow-stratified sampling of individual storms may fail to produce estimates in some short-lived strata because they may have sample sizes of zero. SALT will tend to give small samples and relatively high variances for small storms. For longer and more complex hydrographs with numerous peaks, flow-stratified sampling gives the lowest variance, and the SALT variance is lower than that of time-stratified sampling unless the sample size is very large. A desirable feature of flow-stratified sampling is that the variance can be reduced after sampling by splitting strata, particularly high-flow strata that have been visited just once, and recalculating the total and variance. SALT has the potential to produce the lowest variance, but cannot be expected to do so with an auxiliary variable based on stage.
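A stratified-sampling load estimator of the kind described can be sketched in a few lines: stratify the record by flow class, sample concentrations at random within each stratum, and combine stratum means with the standard stratified variance formula (with finite population correction). All data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic "population": discharge q (known continuously) and suspended-
# sediment concentration c (known only when sampled).
n = 10_000
q = np.exp(rng.normal(0.0, 1.0, n))
c = 20.0 * q ** 0.8 * np.exp(rng.normal(0.0, 0.3, n))   # rating-like relation
flux = c * q                                            # load per interval

# Stratify by flow class and draw a simple random sample in each stratum.
edges = np.quantile(q, [0.0, 0.5, 0.8, 0.95, 1.0])
total, var_total = 0.0, 0.0
for lo, hi in zip(edges[:-1], edges[1:]):
    idx = np.where((q >= lo) & (q <= hi))[0]
    Nh, nh = len(idx), 12                               # stratum / sample size
    pick = rng.choice(idx, size=nh, replace=False)
    yh = flux[pick]
    total += Nh * yh.mean()                             # stratum load estimate
    var_total += Nh * (Nh - nh) * yh.var(ddof=1) / nh   # with FPC

print(f"true load      : {flux.sum():.3e}")
print(f"stratified est.: {total:.3e} +/- {np.sqrt(var_total):.2e} (1 s.e.)")
```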

12.
There are two basic approaches for estimating flood quantiles: parametric and nonparametric methods. In this study, comparisons of parametric and nonparametric models for the annual maximum flood data of the Goan gauging station in Korea were performed based on Monte Carlo simulation. In order to consider uncertainties that can arise from model and data errors, kernel density estimation for fitting the sampling distributions was chosen to determine safety factors (SFs) that depend on the probability model used to fit the real data. The relative biases of the Sheather and Jones plug-in (SJ) are the smallest in most cases among the seven bandwidth selectors applied. The relative root mean square errors (RRMSEs) of the Gumbel (GUM) are smaller than those of any other model regardless of the parent model considered. When the Weibull-2 is assumed as the parent model, the RRMSEs of kernel density estimation are relatively small, while for other parent models they are much bigger than those of the parametric methods. However, the RRMSEs of kernel density estimation within the interpolation range are much smaller than those for the extrapolation range in comparison with those of the parametric methods. Among the applied distributions, the GUM model has the smallest SFs for all parent models, and the general extreme value model has the largest values for all parent models considered.
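A minimal sketch of the nonparametric route: fit a Gaussian KDE to annual maxima and invert its CDF numerically for a design quantile. scipy only ships Scott's and Silverman's bandwidth rules, so Scott's rule stands in here for the Sheather–Jones plug-in (available in, e.g., the KDEpy package); the record is synthetic, not the Goan data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Hypothetical annual-maximum record.
am = stats.gumbel_r(loc=500.0, scale=150.0).rvs(60, random_state=rng)

# Nonparametric fit: Gaussian KDE with Scott's bandwidth rule.
kde = stats.gaussian_kde(am, bw_method="scott")

# 100-year quantile: invert the KDE's CDF numerically on a grid.
grid = np.linspace(am.min() - 200, am.max() + 800, 4000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]
q100_kde = np.interp(1 - 1 / 100, cdf, grid)

# Parametric counterpart for comparison (Gumbel, as in the study).
loc, scale = stats.gumbel_r.fit(am)
q100_gum = stats.gumbel_r(loc, scale).ppf(1 - 1 / 100)

print(f"100-yr flood  KDE: {q100_kde:.0f}   Gumbel: {q100_gum:.0f}")
```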

13.
We have developed a flood water level estimation method that employs only satellite images and a DEM. The method involves three steps: (1) discriminating flood areas and identifying clumps of each flood area, (2) extracting the edges of the identified flood area using a buffering technique, and (3) performing spatial interpolation to transform the extracted elevations into flood water levels. We compared the estimated flood water levels with the observed ones. The RMSE using RADARSAT was 1.99 and 1.30 m at the river and floodplain points, respectively, whereas the RMSE using MODIS was 4.33 and 1.33 m at the river and floodplain points, respectively. Given that most errors are attributable to the DEM, the method exhibited good performance. Furthermore, the method reproduced the flow directions and flood water level changes during the flooding period. Thus, we demonstrated that the characteristics of flood inundation can be understood even when ground observation data cannot be obtained.
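Steps (2)–(3) can be illustrated with a toy edge extraction plus inverse-distance interpolation from flood-edge elevations, a simple stand-in for whatever interpolator the authors used. Mask, DEM and grid are synthetic:

```python
import numpy as np

def flood_water_levels(flood_mask, dem, power=2.0):
    """Estimate water levels inside a flood area from its edge elevations.

    1) find edge cells of the flood mask, 2) read their DEM elevations,
    3) inverse-distance-weight those elevations to every flooded cell."""
    m = flood_mask.astype(bool)
    # Edge = flooded cell with at least one non-flooded 4-neighbour.
    pad = np.pad(m, 1, constant_values=False)
    interior = (pad[:-2, 1:-1] & pad[2:, 1:-1]
                & pad[1:-1, :-2] & pad[1:-1, 2:])
    edge = m & ~interior

    er, ec = np.nonzero(edge)
    fr, fc = np.nonzero(m)
    d2 = (fr[:, None] - er) ** 2 + (fc[:, None] - ec) ** 2
    w = 1.0 / np.maximum(d2, 1e-9) ** (power / 2.0)
    levels = np.full(dem.shape, np.nan)
    levels[fr, fc] = (w @ dem[er, ec]) / w.sum(axis=1)
    return levels

# Tiny synthetic scene: a tilted DEM and a flood patch in its lower corner.
dem = np.add.outer(np.linspace(10, 14, 40), np.linspace(0, 2, 40))
flood = dem < 11.0
wl = flood_water_levels(flood, dem)
print(f"estimated water levels: {np.nanmin(wl):.2f}-{np.nanmax(wl):.2f} m")
```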

14.
The specific objective of this paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only point estimators but also confidence intervals of the quantiles can be provided. Markov chain Monte Carlo is adopted in order to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that approaches considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainties in quantile estimation; instead, a method coupling these two uncertainties should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators but also a quantitative assessment of the uncertainties of flood frequency analysis.
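A minimal sketch of the parameter-uncertainty part: random-walk Metropolis sampling of a GEV posterior and the resulting credible interval for the 100-year flood. Extending this with a prior over candidate distributions would add the model-uncertainty layer. Data, priors and tuning constants are all assumed:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical annual maxima (not the Huai River record).
data = stats.genextreme(c=-0.1, loc=800, scale=250).rvs(50, random_state=rng)

def log_post(theta):
    """Log-posterior for GEV with flat priors -- parameter uncertainty only."""
    c, loc, scale = theta
    if scale <= 0:
        return -np.inf
    return stats.genextreme.logpdf(data, c, loc, scale).sum()

# Random-walk Metropolis sampling of the posterior.
theta = np.array([0.0, data.mean(), data.std()])
lp = log_post(theta)
step = np.array([0.05, 20.0, 15.0])
chain = []
for _ in range(20_000):
    prop = theta + step * rng.normal(size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5_000:])                     # discard burn-in

# Posterior of the 100-year flood: point estimate and credible interval.
q100 = stats.genextreme.ppf(0.99, chain[:, 0], chain[:, 1], chain[:, 2])
lo, med, hi = np.percentile(q100, [2.5, 50, 97.5])
print(f"100-yr flood: {med:.0f} m3/s  (95% CI {lo:.0f}-{hi:.0f})")
```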

15.
This study investigates the uncertainty in the estimation of the design flood induced by errors in flood data. We initially describe and critically discuss the main sources of uncertainty affecting river discharge data when they are derived using stage-discharge rating curves. Then, different error structures are used to investigate the effects of flood data errors on design flood estimation. Annual maximum values of river discharge observed on the Po River (Italy) at Pontelagoscuro are used as an example. The study demonstrates that observation errors may have a significant impact on the uncertainty of design floods, especially when the rating curve is affected by systematic errors.
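The kind of experiment described can be sketched as a small Monte Carlo: perturb a discharge record with purely random rating-curve errors and with an added systematic (multiplicative) bias, and compare the spread of the resulting design-flood estimates. The distribution, error magnitudes and record length are invented, not the Po data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Hypothetical "true" annual maxima.
q_true = stats.gumbel_r(loc=4000, scale=1200).rvs(80, random_state=rng)

def design_flood(sample, T=100):
    """Fit a Gumbel distribution and return the T-year quantile."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r(loc, scale).ppf(1 - 1 / T)

# Propagate two rating-curve error structures through the estimation:
# purely random errors vs. random plus a systematic multiplicative bias.
est_rand, est_sys = [], []
for _ in range(1_000):
    noise = rng.normal(1.0, 0.10, q_true.size)        # 10% random error
    est_rand.append(design_flood(q_true * noise))
    bias = rng.normal(1.0, 0.05)                      # one systematic factor
    est_sys.append(design_flood(q_true * noise * bias))

for name, e in (("random only", est_rand), ("random + systematic", est_sys)):
    e = np.asarray(e)
    print(f"{name:20s}: Q100 = {e.mean():.0f} +/- {e.std():.0f}")
```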

16.
The acoustic Doppler velocimeter (ADV) measures three-dimensional velocities in a small, remote sampling volume at high frequencies; however, these measurements incorporate errors that are intrinsic to the measurement technique. This paper demonstrates a new method for calculating the total measurement errors, including sampling errors, Doppler noise and errors due to velocity shear in the sampling volume, associated with single-point ADV measurements. The procedure incorporates both the effects of instrument configuration and the distribution of errors between velocity components for any probe orientation. It is shown that the ADV can characterize turbulent velocity fluctuations at frequencies up to the maximum sampling rate and that Reynolds shear stress errors are very small. Copyright © 2000 John Wiley & Sons, Ltd.

17.
The design storm approach, in which the criterion variable of interest is evaluated using a synthetic storm pattern composed of identical return frequencies of storm pattern input, is shown to be an effective approximation to a considerably more complex probabilistic model. The single-area unit hydrograph technique is shown to be an accurate mathematical model of a highly discretized catchment with linear routing for channel flow approximation and with effective rainfalls in subareas that are linear with respect to effective rainfall output for a selected "loss" function. The use of a simple "loss" function which relates directly to the distribution of rainfall depth-duration statistics (such as a constant fraction of rainfall, or a φ-index model) is shown to allow the pooling of data and thereby provide a higher level of statistical significance (in estimating T-year outputs for a hydrologic criterion variable) than the use of an arbitrary "loss" function. The design storm unit hydrograph approach described above is shown to provide the T-year estimate of a criterion variable when using rainfall data to estimate runoff.
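The chain "design storm → simple loss function → unit hydrograph convolution" is easy to show end-to-end; with a linear system, the T-year input maps to the T-year output. A sketch with an invented hyetograph, φ-index and unit hydrograph ordinates:

```python
import numpy as np

# Design storm -> effective rainfall (phi-index loss) -> unit hydrograph
# convolution. All numbers are illustrative.
rain = np.array([2.0, 8.0, 25.0, 15.0, 6.0, 1.0])   # hyetograph, mm/h
phi = 4.0                                           # phi-index, mm/h
eff = np.maximum(rain - phi, 0.0)                   # effective rainfall

# 1-hour unit hydrograph ordinates, m^3/s per mm of effective rain.
uh = np.array([0.1, 0.5, 1.0, 0.7, 0.4, 0.2, 0.1])

# Linearity is what makes the design-storm shortcut work here: runoff is
# the discrete convolution of effective rainfall with the UH.
q = np.convolve(eff, uh)
print(f"peak discharge: {q.max():.1f} m^3/s at hour {q.argmax()}")
```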

18.
To account for the spatial variation of flood forecasting errors, a method for correcting runoff generation by individual sub-basins, based on differential response, is proposed. The method establishes a differential response relationship between the runoff generated in each sub-basin and the discharge at the basin outlet, and solves the inverse problem with regularized least squares combined with stepwise approximation; the estimated runoff errors are then allocated to the corresponding sub-basins to correct runoff sub-basin by sub-basin. The method was applied to runoff correction of the Xinanjiang model in the Dapoling and Qilijie basins, and the sub-basin runoff correction was compared with basin-averaged runoff correction and autoregressive correction. The results show that sub-basin runoff correction outperforms basin-averaged runoff correction, and that as the lead time increases, the differential-response runoff correction outperforms autoregressive correction. By decomposing the discharge information at the basin outlet through the routing system and using it to correct the runoff of each sub-basin, the method helps improve the accuracy of real-time flood forecasting.
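The inversion step can be sketched as Tikhonov-regularized least squares: given a routing response matrix U mapping sub-basin runoff errors e to the outlet discharge residual d, solve min ||Ue − d||^2 + λ||e||^2, i.e. e = (U'U + λI)^-1 U'd, and allocate e back to the sub-basins. The response shapes and λ are invented, and the stepwise-approximation refinement is simplified away here:

```python
import numpy as np

rng = np.random.default_rng(12)

# Toy routing system: 4 sub-basins, each with its own unit response at the
# outlet (lagged, attenuated pulses); columns of U stack these responses.
n_t, n_sub = 48, 4
U = np.zeros((n_t, n_sub))
for j, (lag, k) in enumerate([(2, 0.5), (5, 0.4), (9, 0.3), (14, 0.25)]):
    t = np.arange(n_t - lag)
    U[lag:, j] = k * t * np.exp(-k * t)            # Nash-like response

# Simulated truth: per-sub-basin runoff errors and the outlet residual
# they produce (plus observation noise).
e_true = np.array([3.0, -1.5, 2.0, 0.5])
d = U @ e_true + rng.normal(0, 0.2, n_t)

# Regularized (Tikhonov) least squares: e = (U'U + lam*I)^-1 U'd.
lam = 0.1
e_hat = np.linalg.solve(U.T @ U + lam * np.eye(n_sub), U.T @ d)

print("true  errors:", np.round(e_true, 2))
print("estimated   :", np.round(e_hat, 2))        # allocate back to sub-basins
```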
