Similar Literature
20 similar documents found.
1.
2.
The log-Gumbel distribution is one of the extreme value distributions that have been widely used in flood frequency analysis. This paper examines the distribution with regard to quantile estimation and confidence intervals of quantiles. Specific estimation algorithms based on the method of moments (MOM), probability weighted moments (PWM) and maximum likelihood (ML) are presented. The applicability of the estimation procedures, and a comparison among the methods, are illustrated with an application to flood data of the St. Mary's River.
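To make the PWM route above concrete, here is a minimal Python sketch (not the authors' exact algorithm): fit a Gumbel distribution to the log-transformed flows via the first two probability weighted moments, then read off a T-year quantile. The sample flows and the function name are hypothetical.

```python
import numpy as np

def log_gumbel_quantile_pwm(flows, T):
    """T-year quantile from a log-Gumbel fit: Gumbel fitted by PWM to ln(flows)."""
    y = np.sort(np.log(flows))
    n = len(y)
    b0 = y.mean()                                   # sample PWM beta_0
    b1 = np.sum(np.arange(n) / (n - 1) * y) / n     # unbiased estimator of beta_1
    alpha = (2 * b1 - b0) / np.log(2)               # Gumbel scale
    xi = b0 - 0.5772156649 * alpha                  # Gumbel location (Euler-Mascheroni)
    p = 1 - 1 / T                                   # non-exceedance probability
    return np.exp(xi - alpha * np.log(-np.log(p)))  # back-transform to flow units

flows = np.array([812., 945., 1210., 676., 1530., 890., 1100., 990., 760., 1405.])
print(log_gumbel_quantile_pwm(flows, T=100))        # 100-year flood estimate
```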

3.
Studies have illustrated the performance of at-site and regional flood quantile estimators. For realistic generalized extreme value (GEV) distributions and short records, a simple index-flood quantile estimator performs better than two-parameter (2P) GEV quantile estimators that use probability weighted moment (PWM) estimation with a regional shape parameter and the at-site mean and L-coefficient of variation (L-CV), and better than full three-parameter at-site GEV/PWM quantile estimators. However, as regional heterogeneity or record lengths increase, the 2P estimator quickly dominates. This paper generalizes the index flood procedure by employing regression with physiographic information to refine a normalized T-year flood estimator. A linear empirical Bayes estimator uses the normalized quantile regression estimator to define a prior distribution, which is employed with the normalized 2P quantile estimator. Monte Carlo simulations indicate that this empirical Bayes estimator does essentially as well as or better than the simpler normalized quantile regression estimator at sites with short records, and performs as well as or better than the 2P estimator at sites with longer records or smaller L-CV.
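The baseline index-flood estimator referred to above can be sketched in a few lines: the at-site mean flood scales a dimensionless regional growth curve, here a GEV in Hosking's parameterization. The parameter values below are placeholders, not values from the paper.

```python
import numpy as np

def index_flood_quantile(site_mean, T, xi, alpha, kappa):
    """Index-flood estimate: at-site mean times the regional GEV growth factor.

    (xi, alpha, kappa) are Hosking-convention GEV parameters of the regional
    growth curve, i.e. the GEV fitted to flows normalized by at-site means.
    """
    p = 1 - 1 / T                                            # non-exceedance probability
    growth = xi + alpha / kappa * (1 - (-np.log(p)) ** kappa)
    return site_mean * growth

# Hypothetical at-site mean flow and regional growth-curve parameters:
print(index_flood_quantile(site_mean=850.0, T=100, xi=0.84, alpha=0.27, kappa=-0.10))
```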

4.
Halphen laws have been proposed as a complete system of distributions with sufficient statistics that lead to estimation with minimum variance. The Halphen system provides the flexibility to fit a large variety of data sets from natural events. In this paper we present the method of moments (MM) for estimating the parameters of the Halphen type B and IB distributions. Their computation is very fast compared with that of the maximum likelihood (ML) method. Furthermore, this estimation method is very easy to implement, since the formulae are explicit. Simulations show the equivalence of both methods when estimating quantiles for finite sample sizes.
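The explicit Halphen moment formulas involve ratios of Bessel functions and are not reproduced in the abstract, so the sketch below illustrates the same idea (closed-form method of moments versus iterative maximum likelihood) on a two-parameter gamma distribution as a stand-in.

```python
import numpy as np
from scipy.stats import gamma

def gamma_mm(sample):
    """Method-of-moments fit of a two-parameter gamma: solves mean = k*theta,
    variance = k*theta**2 explicitly, with no iteration."""
    m, v = sample.mean(), sample.var(ddof=1)
    return m**2 / v, v / m                       # shape k, scale theta

rng = np.random.default_rng(1)
x = gamma(a=3.0, scale=200.0).rvs(100, random_state=rng)
print(gamma_mm(x))                               # near (3.0, 200.0)
print(gamma.fit(x, floc=0))                      # iterative ML fit, for comparison
```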

5.
The index flood method is widely used in regional flood frequency analysis (RFFA) but explicitly relies on the identification of 'acceptably homogeneous regions'. This paper presents an alternative RFFA method, which is particularly useful when acceptably homogeneous regions cannot be identified. The new RFFA method is based on the region of influence (ROI) approach, where a 'local region' can be formed to estimate statistics at the site of interest. The new method is applied here to regionalize the parameters of the log-Pearson 3 (LP3) flood probability model using Bayesian generalized least squares (GLS) regression. The ROI approach is used to reduce the model error arising from the heterogeneity unaccounted for by the predictor variables in the traditional fixed-region GLS analysis. A case study was undertaken for 55 catchments located in eastern New South Wales, Australia. The selection of predictor variables was guided by minimizing model error. Using an approach similar to stepwise regression, the best model for the LP3 mean was found to use catchment area and the 50-year, 12-h rainfall intensity as explanatory variables, whereas the models for the LP3 standard deviation and skewness had only a constant term for the derived ROIs. Diagnostics based on leave-one-out cross validation show that the regression model assumptions were not inconsistent with the data and, importantly, no genuine outlier sites were identified. Significantly, the ROI GLS approach produced more accurate and consistent results than a fixed-region GLS model, highlighting the superior ability of the ROI approach to deal with heterogeneity. The method is particularly applicable to regions that show a high degree of regional heterogeneity. Copyright © 2014 John Wiley & Sons, Ltd.
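At the core of the method is a generalized least squares fit; a minimal sketch of that step alone (not the full ROI/Bayesian machinery) is given below, with the error covariance combining a model-error variance and a sampling covariance. All names and shapes are illustrative.

```python
import numpy as np

def gls_fit(X, y, Lambda):
    """GLS regression: beta = (X' L^-1 X)^-1 X' L^-1 y.

    X:      (n, p) predictors, e.g. [1, log(area), log(rainfall intensity)]
    y:      (n,) at-site statistic, e.g. mean of the log-flows per catchment
    Lambda: (n, n) covariance = model-error variance * I + sampling covariance
    """
    Li = np.linalg.inv(Lambda)
    A = X.T @ Li @ X
    beta = np.linalg.solve(A, X.T @ Li @ y)
    return beta, np.linalg.inv(A)                # coefficients and their covariance
```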

6.
This study proposes an improved nonstationary model for flood frequency analysis by investigating the relationship between flood peak and flood volume, using the Three Gorges Dam (TGD), China, for verification. First, the generalized additive model for location, scale and shape (GAMLSS) is used as the prior distribution. Then, under Bayesian theory, the prior distribution is updated using the conditional distribution, which is derived from the copula function. The results show that the improvement of the proposed model over the GAMLSS-based prior distribution is significant. Meanwhile, the selection of a suitable prior distribution has a significant effect on the degree of improvement. For application to the TGD, the nonstationary model can considerably increase engineering management benefits and reduce the perceived risks of large floods. This study provides guidance for the dynamic management of hydraulic engineering under nonstationary conditions.
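The Bayesian update step described above can be sketched with a Gaussian copula: for any copula density c, the conditional density factorizes as f(x | y) = c(F_X(x), F_Y(y)) f_X(x), so the marginal (prior) of the flood peak is reweighted by the observed flood volume. The Gaussian copula, the marginals and all numbers below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np
from scipy.stats import norm, gumbel_r

def gaussian_copula_density(u, v, rho):
    """Bivariate Gaussian copula density at (u, v)."""
    a, b = norm.ppf(u), norm.ppf(v)
    return np.exp(-(rho**2 * (a**2 + b**2) - 2 * rho * a * b)
                  / (2 * (1 - rho**2))) / np.sqrt(1 - rho**2)

def conditional_density(x, y_obs, marg_x, marg_y, rho):
    """f(x | y): prior marginal of the flood peak updated by the flood volume."""
    return gaussian_copula_density(marg_x.cdf(x), marg_y.cdf(y_obs), rho) * marg_x.pdf(x)

peak = gumbel_r(loc=50000, scale=8000)           # hypothetical peak marginal (m3/s)
vol = gumbel_r(loc=30, scale=6)                  # hypothetical volume marginal (km3)
x = np.linspace(30000, 90000, 5)
print(conditional_density(x, y_obs=40.0, marg_x=peak, marg_y=vol, rho=0.7))
```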

7.
In this study, a dynamic flood-frequency analysis model considering the storm coverage effect is proposed and applied to six sub-basins in the Pyungchang River basin, Korea. The proposed model is composed of the rectangular pulse Poisson process model for rainfall, the Soil Conservation Service curve number method for infiltration, and the geomorphoclimatic instantaneous unit hydrograph for runoff estimation. The model developed by Marco and Valdes is adopted for quantifying the storm-coverage characteristics. By comparing results from the same model with and without the storm-coverage effect, we could quantify its influence on the flood-frequency analysis. As a result, we found the storm-coverage effect to be so significant that overestimation of the design flood is unavoidable when it is neglected. This becomes more serious for larger basins, where the probability of complete storm coverage is quite low. For smaller basins, however, the limited number of rain gauges is found to hamper proper quantification of the storm-coverage characteristics. Provided with a relationship curve between basin size and storm coverage (as in this study), this problem can be overcome with an acceptable level of accuracy. Copyright © 2003 John Wiley & Sons, Ltd.
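Of the three model components, the Soil Conservation Service curve number step has a compact closed form; a sketch in SI units with the standard initial-abstraction ratio of 0.2 (the rainfall depth and curve number are hypothetical):

```python
def scs_runoff_mm(P, CN):
    """SCS curve number direct runoff depth (mm) for storm rainfall P (mm)."""
    S = 25400.0 / CN - 254.0                 # potential maximum retention (mm)
    Ia = 0.2 * S                             # initial abstraction
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

print(scs_runoff_mm(P=80.0, CN=75))          # runoff from an 80 mm storm
```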

8.
The paper presents an analysis of 17 long annual maximum series (AMS) of flood flows for Swiss Alpine basins, aimed at checking for changes in the frequency regime of annual maxima. We apply Pettitt's change point test, the nonparametric sign test and Sen's test on trends. We also apply a parametric goodness-of-fit test for assessing the suitability of distributions estimated from annual maxima collected up to a certain year for describing the frequency regime of later observations. For a number of series the tests yield consistent indications of significant changes in the frequency regime of annual maxima and increasing trends in the intensity of annual maximum discharges. In most cases, these changes cannot be explained by anthropogenic causes alone (e.g. streamflow regulation, construction of dams). Instead, we observe a statistically significant relationship between the year of change and the elevation of the catchment outlet. This evidence is consistent with the findings of recent studies that explain increasing discharges in alpine catchments by an increase in the temperature controlling the portion of mountain catchments above the freezing point. Finally, we analyse the differences in return periods (RPs) estimated for a given flood flow on the basis of recent and past observations. For a large number of the study AMS, we observe that, on average, the 100-year flood for past observations corresponds to a RP of approximately 10 to 30 years on the basis of more recent observations. From a complementary perspective, we also notice that the estimated RP-year flood (i.e. the flood quantile (FQ) associated with RP) increases on average by approximately 20% for the study area, irrespective of the RP. Practical implications of the observed changes are illustrated and discussed in the paper. Copyright © 2011 John Wiley & Sons, Ltd.
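Pettitt's test, the first of the tests listed, is compact enough to sketch: the statistic is the maximum over candidate change years of a shifted Mann-Whitney sum, with a standard approximate significance level. The series below is synthetic.

```python
import numpy as np

def pettitt(x):
    """Pettitt change-point test: returns the change index and approximate p-value."""
    x = np.asarray(x, float)
    n = len(x)
    # U_t = sum over i <= t, j > t of sign(x_j - x_i)
    U = np.array([np.sign(x[t + 1:, None] - x[:t + 1][None, :]).sum()
                  for t in range(n - 1)])
    K = np.abs(U).max()
    p = 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2))   # approximate significance
    return int(np.abs(U).argmax()), min(p, 1.0)

rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(100, 15, 30), rng.normal(130, 15, 25)])
print(pettitt(series))                              # change index near 30, small p
```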

9.
10.
11.
12.
Conventional flood frequency analysis is concerned with providing an unbiased estimate of the magnitude of the design flow exceeded with probability p, but sampling uncertainties imply that such estimates will, on average, be exceeded more frequently. An alternative approach, therefore, is to derive an estimator which gives an unbiased estimate of flow risk: the difference between the two magnitudes reflects uncertainties in parameter estimation. An empirical procedure has been developed to estimate the mean true exceedance probabilities of conventional estimates made using a GEV distribution fitted by probability weighted moments, and adjustment factors have been determined to enable the estimation of flood magnitudes exceeded with, on average, the desired probability.
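The empirical procedure can be sketched as a Monte Carlo loop: fit a GEV by probability weighted moments (through the L-moments, using Hosking's shape approximation) to each synthetic sample, take the conventional quantile estimate, and average its true exceedance probability under the parent distribution. The parent parameters below are hypothetical; scipy's genextreme shape matches Hosking's k.

```python
import numpy as np
from math import gamma, log
from scipy.stats import genextreme

def gev_pwm_fit(x):
    """GEV fit by probability weighted moments (Hosking's approximation)."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, t3 = b0, 2 * b1 - b0, (6 * b2 - 6 * b1 + b0) / (2 * b1 - b0)
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c**2                   # shape, Hosking convention
    a = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))  # scale
    return l1 - a * (1 - gamma(1 + k)) / k, a, k     # location, scale, shape

def mean_true_exceedance(xi, a, k, n, p=0.01, nsim=500, seed=0):
    """Average true exceedance probability of the conventional p-quantile estimate."""
    rng = np.random.default_rng(seed)
    parent = genextreme(k, loc=xi, scale=a)
    probs = []
    for _ in range(nsim):
        xi_h, a_h, k_h = gev_pwm_fit(parent.rvs(n, random_state=rng))
        q_hat = xi_h + a_h / k_h * (1 - (-log(1 - p)) ** k_h)  # estimated quantile
        probs.append(parent.sf(q_hat))               # its true exceedance probability
    return float(np.mean(probs))                     # exceeds p on average

print(mean_true_exceedance(xi=1000.0, a=300.0, k=-0.1, n=30))
```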

13.
The parametric method of flood frequency analysis (FFA) involves fitting a probability distribution to the observed flood data at the site of interest. When the record length at a given site is relatively long and the flood data exhibit skewness, a distribution having more than three parameters is often used in FFA, such as the log-Pearson type 3 distribution. This paper examines the suitability of the five-parameter Wakeby distribution for annual maximum flood data in eastern Australia. We adopt a Monte Carlo simulation technique to select an appropriate plotting position formula and to derive a probability plot correlation coefficient (PPCC) test statistic for the Wakeby distribution. The Weibull plotting position formula has been found to be the most appropriate for the Wakeby distribution. Regression equations for the PPCC test statistics associated with the Wakeby distribution for different levels of significance have been derived. Furthermore, a power study to estimate the rejection rate associated with the derived PPCC test statistics has been undertaken. Finally, an application using annual maximum flood series data from 91 catchments in eastern Australia is presented. Results show that the developed regression equations can be used with a high degree of confidence to test whether the Wakeby distribution fits the annual maximum flood series data at a given station. The methodology developed in this paper can be adapted to other probability distributions and to other study areas. Copyright © 2014 John Wiley & Sons, Ltd.
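Since the Wakeby distribution is defined by an explicit quantile function, the PPCC statistic reduces to the correlation between the ordered sample and the Wakeby quantiles at Weibull plotting positions p_i = i/(n+1). A sketch (parameter values would come from a separate fit; the critical values for the test are what the paper's regression equations supply):

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gam, delta):
    """Wakeby quantile function x(F) in the usual five-parameter form."""
    return (xi + alpha / beta * (1 - (1 - F) ** beta)
               - gam / delta * (1 - (1 - F) ** (-delta)))

def ppcc_weibull(sample, params):
    """PPCC: correlation of ordered data with quantiles at p_i = i/(n+1)."""
    x = np.sort(sample)
    p = np.arange(1, len(x) + 1) / (len(x) + 1)      # Weibull plotting positions
    return np.corrcoef(x, wakeby_quantile(p, *params))[0, 1]   # near 1 => good fit
```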

14.
Abstract

Flood frequency analysis based on a set of systematic data and a set of historical floods is applied to several Mediterranean catchments. After identification and collection of data on historical floods, several hydraulic models were constructed to account for geomorphological changes. Recent and historical rating curves were constructed and applied to reconstruct flood discharge series, together with their uncertainty. This uncertainty stems from two types of error: (a) random errors related to the water-level readings; and (b) systematic errors related to over- or under-estimation of the rating curve. A Bayesian frequency analysis is performed to take both sources of uncertainty into account. It is shown that the uncertainty affecting discharges should be carefully evaluated and taken into account in the flood frequency analysis, as it can widen the confidence intervals of the quantiles. The quantiles are found to be consistent with those obtained with empirical methods for two of the four catchments.

Citation Neppel, L., Renard, B., Lang, M., Ayral, P.-A., Coeur, D., Gaume, E., Jacob, N., Payrastre, O., Pobanz, K. & Vinet, F. (2010) Flood frequency analysis using historical data: accounting for random and systematic errors. Hydrol. Sci. J. 55(2), 192–208.

15.
The pushover analysis (POA) procedure is difficult to apply to high-rise buildings, as it cannot account for the contributions of higher modes. To overcome this limitation, a modal pushover analysis (MPA) procedure was proposed by Chopra et al. (2001). However, invariant lateral force distributions are still adopted in the MPA. In this paper, an improved MPA procedure is presented to estimate the seismic demands of structures, considering the redistribution of inertia forces after the structure yields. The improved procedure is verified with numerical examples of 5-, 9- and 22-story buildings. It is concluded that the improved MPA procedure is more accurate than either the POA procedure or the original MPA procedure. In addition, the proposed procedure avoids a large computational effort by adopting a two-phase lateral force distribution.
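The invariant force pattern that MPA pushes with is s_n = M phi_n for each mode n; a minimal sketch for a generic shear building (story masses and stiffnesses are hypothetical, and the improved procedure's post-yield redistribution is not reproduced here):

```python
import numpy as np
from scipy.linalg import eigh

def modal_force_patterns(M, K, n_modes=3):
    """Mode shapes of K phi = w^2 M phi and force patterns s_n = M @ phi_n."""
    w2, Phi = eigh(K, M)                         # generalized eigenproblem, ascending
    return np.sqrt(w2[:n_modes]), M @ Phi[:, :n_modes]

m = np.eye(5) * 200e3                            # 200 t per floor (kg)
k = 1.5e8                                        # story stiffness (N/m)
K = np.diag([2 * k] * 4 + [k]) - np.diag([k] * 4, 1) - np.diag([k] * 4, -1)
freqs, patterns = modal_force_patterns(m, K)
print(freqs)                                     # circular frequencies (rad/s)
print(patterns[:, 0])                            # first-mode lateral force pattern
```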

16.
The expression for the equi-risk line derived by the authors represents the relationship between the discharge capacity $y_0$ and the storage capacity $z_0$ that keeps the flood frequency under a certain risk level, represented by the return period $T$: $z_0 / z_0^u = \{(y_0^u - y_0)/y_0^u\}^S$, where $y_0^u$ and $z_0^u$ are the $T$-year probability peak discharge and total volume of a hydrograph. The shape parameter $S$ is evaluated in this paper for various release rules of the storage facilities and for various correlations between the durations and peaks of hydrographs. $S$ is expressed in terms of $S_0$ and $S_\infty$, its values for $p = 0$ and $p \to \infty$, where $p$ is the exponent of a general storage-release relation $q = a z'^p$, in which $a$ is the storage constant, and $z'$ and $q$ are the volume of stored water and the corresponding release. The values of $S_0$ and $S_\infty$ are expressed in terms of the correlation coefficient between the durations and peaks of inflow hydrographs.
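Read as a design tool, the equi-risk line trades storage against discharge capacity at a fixed return period; a worked sketch with hypothetical numbers (using the relation as reconstructed above):

```python
def equi_risk_storage(y0, y0_u, z0_u, S):
    """Required storage z0 from the equi-risk line z0/z0_u = ((y0_u - y0)/y0_u)**S."""
    return z0_u * ((y0_u - y0) / y0_u) ** S

# Hypothetical T-year hydrograph: peak 1200 m3/s, volume 40 hm3, shape S = 1.5.
# A channel sized for only 900 m3/s then needs:
print(equi_risk_storage(y0=900.0, y0_u=1200.0, z0_u=40.0, S=1.5))   # 5.0 hm3
```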

17.
18.
A simple and accurate traveltime approximation is important in many applications at the seismic data processing, inversion and modelling stages. The generalized moveout approximation is an explicit equation that approximates reflection traveltimes in general two-dimensional models. Its five parameters can be defined from properties of finite-offset rays, for general models, or by explicit calculation from model properties, for specific models. Two versions of the classical finite-offset parameterization use the traveltimes and traveltime derivatives of two rays to define the five parameters, which makes them asymmetrical. Using a third ray, we propose a balance between the number of rays and the order of traveltime derivatives. Our tests using different models also show the higher accuracy of the proposed method. For acoustic transversely isotropic media with a vertical symmetry axis, we calculate a new moveout approximation in the generalized moveout approximation functional form, explicitly defined by three independent parameters: zero-offset two-way time, normal moveout velocity and the anellipticity parameter. Our test shows that the maximum error of the proposed transversely isotropic moveout approximation is about 1/6 to 1/8 of that of the approximation previously reported as the most accurate in these media. The higher accuracy is the result of a novel parameterization that does not add any computational complexity. We show a simple example of its application on synthetic seismic data.
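For orientation, the benchmark three-parameter moveout in such media is the Alkhalifah-Tsvankin nonhyperbolic equation in t0, Vnmo and the anellipticity eta; the sketch below implements that standard formula, not the paper's new GMA-form approximation.

```python
import numpy as np

def traveltime_at(x, t0, vnmo, eta):
    """Alkhalifah-Tsvankin nonhyperbolic moveout t(x) for acoustic VTI media."""
    t2 = (t0**2 + x**2 / vnmo**2
          - 2 * eta * x**4
          / (vnmo**2 * (t0**2 * vnmo**2 + (1 + 2 * eta) * x**2)))
    return np.sqrt(t2)

x = np.linspace(0.0, 4000.0, 5)                  # offsets (m)
print(traveltime_at(x, t0=1.0, vnmo=2000.0, eta=0.1))
```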

19.
We use residual moveouts measured along continuous full-azimuth reflection angle gathers to obtain effective horizontal transversely isotropic model parameters. The angle gathers are generated through a special angle domain imaging system, for a wide range of reflection angles and a full range of phase velocity azimuths. The estimation of the effective model parameters is performed in two stages. First, the background horizontal transversely isotropic (HTI)/vertical transversely isotropic (VTI) layered model is used, along with the values of the reflection angles, to convert the measured residual moveouts (or traveltime errors) into azimuthally dependent normal moveout (NMO) velocities. Then we apply a digital Fourier transform to convert the NMO velocities into the azimuthal wavenumber domain, in order to obtain the effective HTI model parameters: vertical time, vertical compression velocity, Thomsen parameter delta and the azimuth of the medium axis of symmetry. The method also provides a reliability criterion for the HTI assumption. The criterion shows whether the medium possesses the HTI type of symmetry, or whether the azimuthal dependence of the residual traveltime indicates a more complex azimuthal anisotropy. The effective model used in this approach is defined for a 1D structure with a set of HTI, VTI and isotropic layers (with at least one HTI layer). We describe and analyse the reduction of a multi-layer structure into an equivalent effective HTI model. The equivalent model yields the same NMO velocity and the same offset azimuth on the Earth's surface as the original layered structure, for any azimuth of the phase velocity. The effective model approximates the kinematics of an HTI/VTI layered structure using only a few parameters. Under the hyperbolic approximation, the proposed effective model is exact.
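The Fourier step can be sketched for the classic azimuthal NMO ellipse: sampled uniformly over azimuth, Vnmo^2 contains a constant and a cos 2(phi) harmonic whose phase gives the symmetry-axis azimuth. The data below are synthetic, and this two-harmonic model is a simplification of the paper's parameter estimation.

```python
import numpy as np

def nmo_ellipse_params(vnmo):
    """Mean Vnmo, cos(2 phi) amplitude and axis azimuth from uniform azimuth samples."""
    coeffs = np.fft.rfft(vnmo**2) / len(vnmo)
    a0 = coeffs[0].real                          # azimuthal mean of Vnmo^2
    a2 = 2 * np.abs(coeffs[2])                   # bin 2 = cos(2 phi) harmonic over 0..2pi
    phi0 = (-np.angle(coeffs[2]) / 2.0) % np.pi  # symmetry-axis azimuth
    return np.sqrt(a0), a2, phi0

phi = np.linspace(0, 2 * np.pi, 36, endpoint=False)
vnmo = np.sqrt(4.0e6 + 0.4e6 * np.cos(2 * (phi - 0.5)))  # synthetic ellipse, axis at 0.5 rad
print(nmo_ellipse_params(vnmo))                  # ~ (2000.0, 4.0e5, 0.5)
```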

20.
Anisotropy in subsurface geological models is primarily caused by two factors: sedimentation in shale/sand layers and fractures. The sedimentation factor is mainly modelled by vertical transverse isotropy (VTI), whereas the fractures are modelled by a horizontal transversely isotropic medium (HTI). In this paper we study hyperbolic and non-hyperbolic normal reflection moveout for a package of HTI/VTI layers, considering arbitrary azimuthal orientation of the symmetry axis at each HTI layer. We consider a local 1D medium, whose properties change vertically, with flat interfaces between the layers. In this case, the horizontal slowness is preserved; thus, the azimuth of the phase velocity is the same for all layers of the package. In general, however, the azimuth of the ray velocity differs from the azimuth of the phase velocity. The ray azimuth depends on the layer properties and may be different for each layer. In this case, the use of the Dix equation requires projection of the moveout velocity of each layer on the phase plane. We derive an accurate equation for hyperbolic and high-order terms of the normal moveout, relating the traveltime to the surface offset, or alternatively, to the subsurface reflection angle. We relate the azimuth of the surface offset to its magnitude (or to the reflection angle), considering short and long offsets. We compare the derived approximations with analytical ray tracing.
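The layer-to-effective reduction rests on Dix-type averaging: the effective NMO velocity of a layered package is the vertical-time-weighted rms of the interval NMO velocities. The sketch below shows the scalar (azimuth-free) version, omitting the projection onto the phase plane that the HTI case requires; all numbers are hypothetical.

```python
import numpy as np

def effective_vnmo(v_int, dt):
    """Dix-type rms average: Vrms^2 = sum(v_i^2 * dt_i) / sum(dt_i)."""
    v_int, dt = np.asarray(v_int), np.asarray(dt)
    return np.sqrt(np.sum(v_int**2 * dt) / np.sum(dt))

# Three-layer package: interval NMO velocities (m/s) and vertical times (s):
print(effective_vnmo([1800.0, 2200.0, 2600.0], [0.4, 0.6, 0.5]))
```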
