Similar Articles (20 results)
1.
Abstract

Applicability of the log-Gumbel (LG) and log-logistic (LL) probability distributions in hydrological studies is critically examined under realistic conditions, where the assumed distribution differs from the true one. The set of alternative distributions consists of five two-parameter distributions with zero lower bound, including LG and LL as well as the lognormal (LN), linear diffusion analogy (LD) and gamma (Ga) distributions. The log-Gumbel distribution is considered both as a false and as a true distribution. The model error of upper quantiles and of the first two moments is derived analytically for three estimation methods: the method of moments (MOM), the linear moments method (LMM) and the maximum likelihood method (MLM). These estimation methods serve as methods of approximating one distribution by another. Consistent with the recommendations of the first paper in this two-part series, MLM turns out to be the worst method if the assumed LG or LL distribution is not the true one: it produces a bias of upper quantiles that is at least one order of magnitude larger than that of the other two methods. However, in the reverse case, i.e. accepting LN, LD or Ga as the hypothetical distribution while the LG or LL distribution is the true one, MLM gives a bias of reasonable magnitude in the upper quantiles. Therefore, one should avoid choosing the LG and LL distributions in flood frequency analysis, especially if MLM is to be applied.
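The quantile bias under a misspecified model can be illustrated with a small Monte Carlo sketch. This is not the paper's analytical derivation: it simply takes a lognormal (LN) as the hypothetical true distribution with made-up parameters, fits a log-Gumbel (LG) by MLM (equivalent to a Gumbel MLE on the log-transformed data, since the Jacobian does not involve the parameters), and compares 99% quantiles:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# True distribution: lognormal (LN); assumed (false) model: log-Gumbel (LG).
mu, sigma = 1.0, 0.5          # hypothetical LN parameters
true_q99 = stats.lognorm.ppf(0.99, s=sigma, scale=np.exp(mu))

biases = []
for _ in range(200):
    x = rng.lognormal(mean=mu, sigma=sigma, size=100)
    # MLM fit of LG == Gumbel MLE on log-data (Jacobian is parameter-free).
    loc, scale = stats.gumbel_r.fit(np.log(x))
    q99_lg = np.exp(stats.gumbel_r.ppf(0.99, loc=loc, scale=scale))
    biases.append(q99_lg / true_q99 - 1.0)

print(f"mean relative bias of the LG 99% quantile: {np.mean(biases):+.2%}")
```

With these (arbitrary) settings, the falsely assumed LG systematically overestimates the upper quantile, in line with the abstract's warning.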

2.
Ad hoc techniques for estimating the quantiles of the Generalized Pareto (GP) and the Generalized Extreme Value (GEV) distributions are introduced. The estimators proposed are based on new estimators of the position and scale parameters recently introduced in the literature. They provide valuable estimates of the quantiles of interest both when the shape parameter is known and when it is unknown (the latter case being of great relevance in practical applications). In addition, weakly consistent estimators are introduced, whose calculation does not require knowledge of any parameter. The procedures are tested on simulated data, and comparisons with other techniques are shown. The research was partially supported by Contract n. ENV4-CT97-0529 within the project "FRAMEWORK" of the European Community – D.G. XII. Grants by "Progetto Giovani Ricercatori" are also acknowledged.
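The known-shape versus unknown-shape distinction can be sketched with standard scipy MLM fits (these are not the paper's ad hoc estimators; the GP parameters below are arbitrary stand-ins):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
xi, scale = 0.3, 1.0   # hypothetical "true" GP shape and scale
sample = stats.genpareto.rvs(xi, scale=scale, size=2000, random_state=rng)

# Shape known: only the scale is estimated (fc fixes the shape parameter).
c1, loc1, scale1 = stats.genpareto.fit(sample, fc=xi, floc=0)
q99_known = stats.genpareto.ppf(0.99, c1, loc=loc1, scale=scale1)

# Shape unknown: shape and scale estimated jointly (the practically
# relevant and harder case).
c2, loc2, scale2 = stats.genpareto.fit(sample, floc=0)
q99_unknown = stats.genpareto.ppf(0.99, c2, loc=loc2, scale=scale2)

q99_true = stats.genpareto.ppf(0.99, xi, scale=scale)
print(q99_true, q99_known, q99_unknown)
```

Quantile estimates with an estimated shape are typically noticeably more variable than those with the shape known, which is why the unknown-shape case receives special attention.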

3.
Abstract

Statistical analysis of extremes is often used for predicting the higher return-period events. In this paper, the trimmed L-moments with the one smallest value trimmed—TL-moments (1,0)—are introduced as an alternative way to estimate floods for high return periods. The TL-moments (1,0) have the ability to reduce the undesirable influence that a small value in the statistical sample might have on the estimation of large return-period events. The main objective of this study is to derive the TL-moments (1,0) for the generalized Pareto (GPA) distribution. The performance of the TL-moments (1,0) was compared with L-moments through Monte Carlo simulation based on the streamflow data of northern Peninsular Malaysia. The results show that, in some cases, the use of TL-moments (1,0) is a better option than L-moments in modelling those series.

Citation Ahmad, U.N., Shabri, A. & Zakaria, Z.A. (2011) Trimmed L-moments (1,0) for the generalized Pareto distribution. Hydrol. Sci. J. 56(6), 1053–1060.
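The first two sample TL-moments with trimming (1,0) can be written directly from Elamir and Seheult's order-statistic weights; the paper's GPA parameter relations are not reproduced here, only the generic estimators:

```python
from math import comb

def tl_moments_10(data):
    """First two sample TL-moments with trimming (t1, t2) = (1, 0):
    the smallest observation gets zero weight in lambda_1.
    Requires at least three observations."""
    x = sorted(data)
    n = len(x)
    lam1 = sum((i - 1) * xi for i, xi in enumerate(x, start=1)) / comb(n, 2)
    lam2 = sum((comb(i - 1, 2) - (i - 1) * (n - i)) * xi
               for i, xi in enumerate(x, start=1)) / (2 * comb(n, 3))
    return lam1, lam2

# Tiny worked example: for the sample {1, 2, 3},
# lambda_1^(1,0) = (0*1 + 1*2 + 2*3) / C(3,2) = 8/3 and lambda_2^(1,0) = 1/2.
l1_demo, l2_demo = tl_moments_10([1.0, 2.0, 3.0])
print(l1_demo, l2_demo)
```

Note how the smallest observation contributes nothing to lambda_1, which is precisely the robustness property the abstract describes.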

4.
The generalized Pareto distribution has gained much popularity as a model for extreme events in the hydrological sciences. In this note, the important problem of the sum of two independent generalized Pareto random variables is considered. Exact analytical expressions for the probability distribution of the sum are derived, and a detailed application to drought data from Nebraska is provided. Copyright © 2007 John Wiley & Sons, Ltd.

5.
The generalized gamma (GG) distribution has a density function that can take many of the forms commonly encountered in hydrologic applications. This fact has led many authors to study the properties of the distribution and to propose various estimation techniques (method of moments, mixed moments, maximum likelihood, etc.). We discuss some of the most important properties of this flexible distribution and present a flexible method of parameter estimation, called the generalized method of moments (GMM), which combines any three moments of the GG distribution. The main advantage of this general method is that it includes many of the previously proposed estimation methods as special cases. We also give a general formula for the variance of the T-year event X_T obtained by the GMM, along with general formulae for the parameter estimates and for the covariances and correlation coefficients between any pair of such estimates. By applying the GMM and carefully choosing the orders of the moments used in the estimation, one can significantly reduce the variance of T-year events for the range of return periods of interest.
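The moment-combining idea can be sketched numerically. In scipy's gengamma(a, c, scale) parameterization the k-th raw moment is scale^k · Γ(a + k/c)/Γ(a); the sketch below matches three chosen moment orders by root-finding. The parameter values are hypothetical, and exact population moments are used in place of sample moments so the recovery can be checked deterministically:

```python
import numpy as np
from math import gamma
from scipy.optimize import fsolve

def gg_moment(k, a, c, scale):
    # k-th raw moment of the generalized gamma in scipy's gengamma(a, c) form
    return scale**k * gamma(a + k / c) / gamma(a)

a0, c0, s0 = 2.0, 1.5, 3.0          # hypothetical "true" parameters
orders = (1, 2, 3)                  # GMM: any three moment orders may be used
m_obs = [gg_moment(k, a0, c0, s0) for k in orders]

def equations(log_theta):
    a, c, scale = np.exp(log_theta)  # log-parameterization keeps params > 0
    return [gg_moment(k, a, c, scale) - m for k, m in zip(orders, m_obs)]

sol = np.exp(fsolve(equations, x0=np.log([1.5, 1.2, 2.5])))
print(sol)
```

In practice `m_obs` would be sample moments, and the choice of `orders` is exactly the tuning knob the abstract says can reduce the variance of T-year events.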

6.
Abstract

Two probability density functions (pdf), popular in hydrological analyses, namely the log-Gumbel (LG) and log-logistic (LL), are discussed with respect to (a) their applicability to hydrological data and (b) the drawbacks resulting from their mathematical properties. This paper—the first in a two-part series—examines the classical problem in which the considered pdf is assumed to be the true distribution. The most significant drawback is that the statistical moments of LG and LL exist only for a very limited range of parameters. For these parameters, a very rapid increase of the skewness coefficient as a function of the coefficient of variation is observed (especially for the log-Gumbel distribution), which is seldom seen in hydrological data. These probability distributions can be applied with confidence only to extreme situations. In other cases, there is an important disagreement between empirical data and the theoretical distributions in their tails, which matters greatly for characterizing the asymmetry of a distribution. The limited range of shape parameters of both distributions makes analyses that rely on the interpretation of moments (such as the method of moments) inconvenient. It is also shown that the often-used L-moments are not sufficient for characterizing the location, scale and shape parameters of pdfs, particularly where attention is paid to the tail of the probability distribution. The maximum likelihood method guarantees asymptotic convergence of the estimators beyond the domain of existence of the first two moments (or L-moments), but it is not sensitive enough to the shape of the upper tail.
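The limited moment-existence range is easy to make concrete for the log-Gumbel. Writing X = exp(Y) with Y ~ Gumbel(0, sigma), the k-th raw moment is the Gumbel MGF at k, i.e. Γ(1 − k·sigma), which exists only for sigma < 1/k; the skewness therefore needs sigma < 1/3 and blows up as that bound is approached:

```python
from math import gamma, sqrt

def lg_moments(sigma):
    """CV and skewness of the log-Gumbel X = exp(Y), Y ~ Gumbel(0, sigma).
    The k-th raw moment Gamma(1 - k*sigma) exists only for sigma < 1/k,
    so the skewness exists only for sigma < 1/3."""
    if sigma >= 1 / 3:
        raise ValueError("skewness requires sigma < 1/3")
    m1, m2, m3 = (gamma(1 - k * sigma) for k in (1, 2, 3))
    cv = sqrt(m2 / m1**2 - 1)
    skew = (m3 - 3 * m1 * m2 + 2 * m1**3) / (m2 - m1**2) ** 1.5
    return cv, skew

for s in (0.05, 0.15, 0.25, 0.30):
    cv, sk = lg_moments(s)
    print(f"sigma={s:.2f}  CV={cv:.3f}  skew={sk:.2f}")
```

Running this shows the very rapid growth of the skewness relative to the CV that the abstract describes.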

7.
A consistent approach to the frequency analysis of hydrologic data in arid and semiarid regions, i.e. data series containing several zero values (e.g. monthly precipitation in dry seasons, annual peak flow discharges, etc.), requires the use of discontinuous probability distribution functions. Such an approach has received relatively limited attention. Along the lines of physically based models, extensions of the Muskingum-based models to three-parameter forms are considered. Using 44 peak flow series from the USGS data bank, the fitting ability of four three-parameter models was investigated: (1) the Dirac delta combined with the gamma distribution; (2) the Dirac delta combined with the two-parameter generalized Pareto distribution; (3) the Dirac delta combined with the two-parameter Weibull (DWe) distribution; and (4) the kinematic diffusion model with one additional parameter that controls the probability of the zero event (KD3). The goodness of fit of the models was assessed and compared both by evaluating the discrepancies between the results of the two estimation methods (the method of moments (MOM) and the maximum likelihood method (MLM)) and by using the log-likelihood function as a criterion. In most cases, the DWe distribution with MLM-estimated parameters showed the best fit of all the three-parameter models. Copyright © 2005 John Wiley & Sons, Ltd.
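For a Dirac-delta-at-zero mixture the MLM conveniently factorizes: the point-mass probability is estimated by the zero fraction, and the continuous part is fitted to the positive values alone. A sketch for the DWe case on synthetic data (all parameter values invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic record: probability p0 of a zero (dry) event, Weibull otherwise.
p0_true, c_true, scale_true = 0.3, 1.5, 2.0   # hypothetical values
n = 5000
flow = np.where(rng.random(n) < p0_true,
                0.0,
                stats.weibull_min.rvs(c_true, scale=scale_true,
                                      size=n, random_state=rng))

# MLM for the Dirac delta + Weibull (DWe) mixture factorizes:
# p0_hat is the zero fraction; the Weibull part is fitted to positive values.
p0_hat = np.mean(flow == 0.0)
c_hat, _, scale_hat = stats.weibull_min.fit(flow[flow > 0], floc=0)
print(p0_hat, c_hat, scale_hat)
```

The same two-step recipe applies with the gamma or generalized Pareto distribution in place of the Weibull.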

8.
Hydrologists use the generalized Pareto (GP) distribution in peaks-over-threshold (POT) modelling of extremes. A model with similar uses is the two-parameter kappa (KAP) distribution. KAP has had fewer hydrological applications than GP, but some studies have shown it to merit wider use. The problem of choosing between GP and KAP arises quite often in frequency analyses. By comparing several methods of discrimination between these two models, this study aims to show which method(s) can be recommended. Three specific methods are considered: one uses the Anderson-Darling goodness-of-fit (GoF) statistic, another uses the ratio of maximized likelihoods (closely related to the Akaike information criterion and the Bayesian information criterion), and the third employs a normality transformation followed by application of the Shapiro-Wilk statistic. We show this last method to be the most recommendable, due to its advantages with sample sizes typically encountered in hydrology. We apply the simulation results to some flood POT datasets.
EDITOR D. Koutsoyiannis; ASSOCIATE EDITOR E. Volpi
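The mechanics of the ratio-of-maximized-likelihoods method are easy to sketch. Since the two-parameter kappa distribution is not available in scipy, the example below uses the gamma distribution as a stand-in two-parameter alternative; with both candidates having two free parameters, the maximized log-likelihoods can be compared directly:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical POT exceedances, actually drawn from a GP with shape 0.4.
exc = stats.genpareto.rvs(0.4, size=800, random_state=rng)

def max_loglik(dist, data):
    params = dist.fit(data, floc=0)      # MLM with the lower bound at zero
    return np.sum(dist.logpdf(data, *params))

ll_gp = max_loglik(stats.genpareto, exc)
ll_alt = max_loglik(stats.gamma, exc)    # stand-in 2-parameter alternative

# Equal parameter counts, so the log of the ratio of maximized likelihoods
# (equivalently, the AIC difference) decides directly.
chosen = "GP" if ll_gp > ll_alt else "alternative"
print(chosen, ll_gp - ll_alt)
```

Here the heavy-tailed GP sample is (correctly) attributed to the GP model; the study's point is how reliably such a rule behaves at realistic sample sizes.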

9.
A bivariate Pareto model for drought
Univariate Pareto distributions have been widely used in hydrology. However, bivariate and multivariate Pareto distributions have not yet found applications in hydrology, especially with respect to drought. In this note, a drought application is described by assuming a bivariate Pareto model for the joint distribution of drought duration and drought severity in the State of Nebraska. Based on this model, exact distributions are derived for the interarrival time, magnitude and proportion of droughts. Estimates of the 2-, 5-, 10-, 20-, 50- and 100-year return periods are derived for drought duration and drought severity, and for the pairwise combinations (drought duration, drought severity), (interarrival time of drought, proportion of drought) and (drought duration, drought magnitude). These return-period estimates could play an important role in hydrology, for example with respect to measures of vegetation water stress for plants in water-controlled ecosystems.

10.
Extreme value theory has important applications in seismic hazard analysis: the distribution of earthquake magnitude excesses over a threshold can be approximated by the generalized Pareto distribution. Based on the generalized Pareto distribution, estimation formulae are given for several seismicity parameters, including the distribution of strong-earthquake magnitudes, earthquake recurrence periods and return levels, the expected return magnitude, seismic hazard probabilities and the upper bound of potential magnitude. Using magnitude data from the Yunnan region as the base data, threshold selection, model-fit diagnostics and parameter estimation are discussed, and on this basis the seismicity parameters of the region are computed. The results show that the generalized Pareto distribution describes the distribution of strong-earthquake magnitudes well; the recurrence periods computed with the peaks-over-threshold (POT) model are basically consistent with the statistics of the observed recurrence intervals; and the high-quantile estimates are stable within a certain range of thresholds, providing a way to determine the upper bound of potential magnitude for engineering seismic design.
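The POT return-level calculation described here can be sketched with scipy. The threshold, exceedance rate and GP parameters below are hypothetical stand-ins (not the Yunnan values); a negative fitted shape gives the finite upper magnitude bound u − sigma/xi mentioned in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical POT setting: threshold magnitude u, an average of lam
# exceedances per year, and GP-distributed excess sizes (all made up).
u, lam = 5.0, 2.0
excesses = stats.genpareto.rvs(-0.2, scale=0.6, size=500, random_state=rng)
xi_hat, _, sigma_hat = stats.genpareto.fit(excesses, floc=0)

def return_level(T, u, lam, xi, sigma):
    """Magnitude exceeded on average once every T years under the POT model."""
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

for T in (50, 100, 475):
    print(T, round(return_level(T, u, lam, xi_hat, sigma_hat), 3))

# Negative shape implies a finite upper bound on magnitude.
m_max = u - sigma_hat / xi_hat
print("upper bound:", round(m_max, 3))
```

The formula is just the GP quantile at non-exceedance probability 1 − 1/(lam·T), shifted by the threshold.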

11.
Tail characteristics of the earthquake magnitude distribution based on the generalized Pareto distribution

12.
Abstract

A new theoretically based distribution for frequency analysis is proposed. The extended three-parameter Burr XII distribution includes as special cases the generalized Pareto distribution, which is used to model exceedances over a threshold; the log-logistic distribution, which is also advocated in flood frequency analysis; and the Weibull distribution, which is part of the generalized extreme value distribution used to model annual maxima. The extended Burr distribution is flexible enough to approximate the extreme value distributions. Note that the generalized Pareto and generalized extreme value distributions are limiting results for modelling exceedances over a threshold and block extremes, respectively; from a modelling perspective, generalization may therefore be necessary to obtain a better fit. The extended three-parameter Burr XII distribution is thus a meaningful candidate distribution for frequency analysis. Maximum likelihood estimation for this distribution is investigated in the paper, and the use of the extended three-parameter Burr XII distribution is demonstrated using data from China.
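Maximum likelihood fitting of a Burr XII can be sketched with scipy's burr12, whose (c, d) parameterization may differ from the paper's extended three-parameter form; location and scale are held fixed here purely to keep the illustration simple, and the parameter values are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
c_true, d_true = 2.0, 3.0
data = stats.burr12.rvs(c_true, d_true, size=3000, random_state=rng)

# MLM fit of the two shape parameters (floc/fscale pin location and scale).
c_hat, d_hat, loc, scl = stats.burr12.fit(data, floc=0, fscale=1)

# Sanity check: the fitted parameters should not have a lower likelihood
# than the true ones.
ll_hat = np.sum(stats.burr12.logpdf(data, c_hat, d_hat))
ll_true = np.sum(stats.burr12.logpdf(data, c_true, d_true))
print(c_hat, d_hat, ll_hat - ll_true)
```

The likelihood comparison at the end is a cheap convergence check worth keeping in any MLE-based frequency analysis.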

13.
The probabilistic analysis of volcanic eruption time series is an essential step for the assessment of volcanic hazard and risk. Such series describe complex processes involving different types of eruptions over different time scales. A statistical method linking geological and historical eruption time series is proposed for calculating the probabilities of future eruptions. The first step of the analysis is to characterize the eruptions by their magnitudes. As is the case in most natural phenomena, lower-magnitude events are more frequent, and the behavior of the eruption series may be biased by such events. On the other hand, eruptive series are commonly studied using conventional statistics and treated as homogeneous Poisson processes. However, time-dependent series, or sequences including rare or extreme events represented by very few data of large eruptions, require special methods of analysis, such as the extreme-value theory applied to non-homogeneous Poisson processes. Here we propose a general methodology for analyzing such processes, attempting to obtain better estimates of the volcanic hazard. This is done in three steps. Firstly, the historical eruptive series is complemented with the available geological eruption data; the linking of these series is done assuming an inverse relationship between the eruption magnitudes and the occurrence rate of each magnitude class. Secondly, we perform a Weibull analysis of the distribution of repose times between successive eruptions. Thirdly, the linked eruption series are analyzed as a non-homogeneous Poisson process with a generalized Pareto distribution as intensity function. As an application, the method is tested on the eruption series of five active polygenetic Mexican volcanoes: Colima, Citlaltépetl, Nevado de Toluca, Popocatépetl and El Chichón, to obtain hazard estimates.
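The Weibull repose-time step can be sketched on synthetic data (the repose times and parameters below are invented, not the Mexican volcano records):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical repose times (years) between successive eruptions.
repose = stats.weibull_min.rvs(1.2, scale=35.0, size=60, random_state=rng)

# Weibull fit of the repose-time distribution (step 2 of the methodology).
shape, _, scale = stats.weibull_min.fit(repose, floc=0)

# Probability that the next repose lasts at most t years, i.e. at least one
# eruption within t years of the last one (memoryless only if shape == 1).
t = 20.0
p_within = stats.weibull_min.cdf(t, shape, scale=scale)
print(round(shape, 3), round(scale, 2), round(p_within, 3))
```

A fitted shape different from 1 is exactly what motivates going beyond the homogeneous Poisson assumption.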

14.
Abstract

Statistical analysis of extreme events is often carried out to predict large return-period events. In this paper, the use of partial L-moments (PL-moments) for estimating hydrological extremes from censored data is compared to that of simple L-moments. Expressions for parameter estimation are derived to fit the generalized logistic (GLO) distribution based on the PL-moments approach. Monte Carlo analysis is used to examine the sampling properties of PL-moments in fitting the GLO distribution to both GLO and non-GLO samples. Finally, both PL-moments and L-moments are used to fit the GLO distribution to 37 annual maximum rainfall series of the raingauge station Kampung Lui (3118102) in Selangor, Malaysia, and it is found that the analysis of censored rainfall samples with PL-moments improves the estimation of large return-period events.

Editor D. Koutsoyiannis; Associate editor K. Hamed

Citation Zakaria, Z.A., Shabri, A. and Ahmad, U.N., 2012. Estimation of the generalized logistic distribution of extreme events using partial L-moments. Hydrological Sciences Journal, 57(3), 424–432.
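For reference, the ordinary (uncensored) L-moment relations for the GLO, from Hosking, are k = −tau_3, alpha = lambda_2·sin(k·pi)/(k·pi) and xi = lambda_1 − alpha(1/k − pi/sin(k·pi)); the paper's partial (censored) PL-moment versions are not reproduced here. A deterministic round-trip check:

```python
from math import pi, sin

def glo_from_lmoments(l1, l2, t3):
    """Parameters (xi, alpha, k) of the generalized logistic (GLO)
    distribution from lambda_1, lambda_2 and tau_3 (Hosking's relations)."""
    k = -t3
    alpha = l2 * sin(k * pi) / (k * pi)
    xi = l1 - alpha * (1.0 / k - pi / sin(k * pi))
    return xi, alpha, k

# Round-trip check using the exact population L-moments of GLO(0, 1, k=0.2).
k_true = 0.2
lam2 = k_true * pi / sin(k_true * pi)          # lambda_2 for alpha = 1
lam1 = 1.0 / k_true - pi / sin(k_true * pi)    # lambda_1 for xi = 0
xi_hat, alpha_hat, k_hat = glo_from_lmoments(lam1, lam2, -k_true)
print(xi_hat, alpha_hat, k_hat)
```

In a data analysis, `l1`, `l2` and `t3` would be sample (or partial) L-moment estimates rather than population values.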

15.
Heavy tailed random variables (rvs) have proven to be an essential element in modeling a wide variety of natural and human-induced processes, and the sums of heavy tailed rvs represent a particularly important construction in such models. Oriented toward both geophysical and statistical audiences, this paper discusses the appearance of the Pareto law in seismology and addresses the problem of the statistical approximation for the sums of independent rvs with common Pareto distribution F(x) = 1 − x^(−α) for 1/2 < α < 2. Such variables have infinite second moment which prevents one from using the Central Limit Theorem to solve the problem. This paper presents five approximation techniques for the Pareto sums and discusses their respective accuracy. The main focus is on the median and the upper and lower quantiles of the sum's distribution. Two of the proposed approximations are based on the Generalized Central Limit Theorem, which establishes the general limit for the sums of independent identically distributed rvs in terms of stable distributions; these approximations work well for large numbers of summands. Another approximation, which replaces the sum with its maximal summand, has less than 10% relative error for the upper quantiles when α < 1. A more elaborate approach considers the two largest observations separately from the rest of the observations, and yields a relative error under 1% for the upper quantiles and less than 5% for the median. The last approximation is specially tailored for the lower quantiles, and involves reducing the non-Gaussian problem to its Gaussian equivalent; it too yields errors less than 1%. Approximation of the observed cumulative seismic moment in California illustrates the developed methods.
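The max-summand approximation is simple to check by simulation. For F(x) = 1 − x^(−α), the p-quantile of the maximum of n summands solves (1 − x^(−α))^n = p in closed form; the sketch below compares it with a Monte Carlo quantile of the sum for an arbitrarily chosen α < 1 and n:

```python
import numpy as np

rng = np.random.default_rng(11)
alpha, n = 0.8, 50        # tail index < 1: infinite mean, the max dominates

# Pareto rvs with F(x) = 1 - x**(-alpha) via inverse-CDF sampling.
u = rng.random((100_000, n))
sums = (u ** (-1.0 / alpha)).sum(axis=1)

def max_quantile(p, alpha, n):
    """Exact p-quantile of the maximal summand: solve (1 - x**-alpha)**n = p."""
    return (1.0 - p ** (1.0 / n)) ** (-1.0 / alpha)

p = 0.99
q_sum = np.quantile(sums, p)
q_max = max_quantile(p, alpha, n)
print(p, q_sum, q_max, abs(q_max / q_sum - 1.0))
```

Since the sum always exceeds its largest term, the approximation sits slightly below the true quantile, with a small relative error in line with the abstract's claim for α < 1.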

16.
The continuous time random walk (CTRW) has both an elegant mathematical theory and a successful record at modeling solute transport in the subsurface. However, there are some interpretation ambiguities relating to the relationship between the discrete CTRW transition distributions and the underlying continuous movement of solute that have not been addressed in existing literature. These include the exact definition of “transition”, and the extent to which transition probability distributions are unique/quantifiable from data. Here, we present some theoretical results which address these uncertainties in systems with an advective bias. Simultaneously, we present an alternative, reduced parameter CTRW formulation for general advective transport in heterogeneous porous media, which models early- and late-time transport by use of random transition times between sparse, imaginary planes normal to flow. We show that even in the context of this reduced-parameter formulation there is nonuniqueness in the definitions of both transition lengths and waiting time distributions, and that neither may be uniquely determined from experimental data. For practical use of this formulation, we suggest Pareto transition time distributions, leading to a two-degree-of-freedom modeling approach. We then demonstrate the power of this approach in fitting two sets of existing experimental data. While the primary focus is the presentation of new results, the discussion is designed to be pedagogical and to provide a good entry point into practical modeling of solute transport with the CTRW.

17.
F. Ashkar, Hydrological Sciences Journal, 2013, 58(6), 1092–1106
Abstract

The potential is investigated of the generalized regression neural network (GRNN) technique in modelling reference evapotranspiration (ET0) obtained using the FAO Penman-Monteith (PM) equation. Various combinations of daily climatic data, namely solar radiation, air temperature, relative humidity and wind speed, are used as inputs to the ANN so as to evaluate the degree of effect of each of these variables on ET0. In the first part of the study, a comparison is made between the estimates provided by the GRNN and those obtained by the Penman, Hargreaves and Ritchie methods as implemented by the California Irrigation Management System (CIMIS). The empirical models were calibrated using the standard FAO PM ET0 values. The GRNN estimates are also compared with those of the calibrated models. Mean square error, mean absolute error and determination coefficient statistics are used as comparison criteria for the evaluation of model performance. The GRNN model (GRNN 1), whose inputs are solar radiation, air temperature, relative humidity and wind speed, gave mean square errors of 0.058 and 0.032 mm² day⁻², mean absolute errors of 0.184 and 0.127 mm day⁻¹, and determination coefficients of 0.985 and 0.986 for the Pomona and Santa Monica stations (Los Angeles, USA), respectively. Based on the comparisons, it was found that the GRNN 1 model could be employed successfully in modelling the ET0 process. The second part of the study investigates the potential of the GRNN and the empirical methods in ET0 estimation using nearby station data. Among the models, the calibrated Hargreaves method was found to perform better than the others.
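A GRNN (Specht, 1991) is essentially Nadaraya-Watson kernel regression: the prediction is a Gaussian-kernel weighted average of the training targets, with the kernel width as the only tuning parameter. A generic sketch on toy one-dimensional data (not the CIMIS ET0 data; the climatic inputs would simply be extra feature columns):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN prediction: Gaussian-kernel weighted average of training targets.
    X_train: (n, d), y_train: (n,), X_query: (m, d)."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy illustration: recover a smooth function from noisy samples.
rng = np.random.default_rng(2)
X = rng.uniform(0, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(300)
Xq = np.array([[0.5], [1.5], [2.5]])
pred = grnn_predict(X, y, Xq, sigma=0.15)
print(pred)
```

One appeal of the GRNN noted in applications like this one is that there is no iterative training: all the work happens at prediction time.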

18.
Selecting ground motions based on the generalized intensity measure distribution (GIMD) approach has many appealing features, but it has not been fully verified in engineering practice. In this paper, several suites of ground motions, which have almost identical distributions of spectral acceleration (SA) ordinates but different distributions of non-SA intensity measures, are selected using the GIMD-based approach for a given earthquake scenario. The selected ground motion suites are used to compute the sliding displacements of various slopes. Comparisons of the resulting displacements demonstrate that selecting ground motions with a biased distribution of some intensity measures (e.g. Arias intensity) may yield systematic biases (up to 60% for some slope types). Therefore, compared to ground motions selected based only on the distribution of SA ordinates, the ground motion suite selected by the GIMD-based approach can better represent the various characteristics of earthquake loadings, resulting in generally unbiased estimation in specific engineering applications.

19.
The maximum entropy principle of the information theory gives rise to a general regularization strategy for ill-posed inverse problems. The methods based on this principle have become standard in various branches of engineering sciences. Of course, ill-posed problems frequently appear in Earth sciences, too. Nonetheless, the concept of maximum entropy is not very popular here. Therefore, we review the basic approaches employing the principle of maximum entropy in one way or another. We can distinguish at least three different approaches, partly yielding coincident results. One possible area of application is the determination of Earth and planetary models, although the paper cannot treat this in its practical complexity. Most of the discussion is restricted to the determination of the Earth's mass density function from various sources of data. Three sample problems are solved using the principle of maximum entropy: a spherical and an ellipsoidal problem related to the Earth and an ellipsoidal problem related to Mars. This illustrates the numerical procedure, which is non-trivial in many cases. It also shows some results, partly compared to standard solutions. The pros and cons of the approaches are discussed.

20.
Nondimensionalization of the generalized reflection-transmission coefficient algorithm
The computation of Rayleigh-wave dispersion curves in horizontally layered media has long been a topic of interest. Although the generalized reflection-transmission coefficient method proposed by Chen Xiaofei has very good accuracy and stability, the matrix E that plays the decisive role in the algorithm contains elements carrying physical dimensions, and the orders of magnitude of its elements differ greatly. By nondimensionalizing the generalized reflection-transmission coefficient algorithm, the authors obtain a concise formulation of the Rayleigh-wave dispersion function. The ratio ε of the largest-modulus to the smallest-modulus element of the matrix E is computed for both the improved and the original algorithms: in the former, the largest- and smallest-modulus elements of E differ by at most one order of magnitude, whereas in the latter the difference reaches up to 11 orders of magnitude. Although the original algorithm can reduce ε by choosing the units of the variables, the new algorithm is more accurate, thereby improving the theory and algorithm for solving Rayleigh-wave dispersion curves with the generalized reflection-transmission coefficient method.
