Similar documents
20 similar documents found (search time: 15 ms)
1.
We present a statistically robust approach based on probability weighted moments to assess the presence of simple scaling in geophysical processes. The proposed approach differs from current approaches, which rely on the estimation of high-order moments. High-order moments of simple scaling processes (distributions) may not have theoretically defined values; consequently, their empirical estimates are highly variable and do not converge with increasing sample size. They are, therefore, not an appropriate tool for inference. On the other hand, we show that the probability weighted moments of such processes (distributions) do exist and, hence, their empirical estimates are more robust. These moments therefore provide an appropriate tool for inferring the presence of scaling. We illustrate this using simulated Lévy-stable processes and then draw inference on the nature of scaling in fluctuations of a spatial rainfall process.
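The probability weighted moments the abstract relies on can be estimated directly from the order statistics. Below is a minimal numpy sketch of the standard unbiased estimator of β_r = E[X·F(X)^r]; the function name `pwm` and the toy data are ours, not the paper's:

```python
import numpy as np

def pwm(x, r):
    """Unbiased sample estimate of the probability weighted moment
    beta_r = E[X * F(X)^r], built from the order statistics of x.
    Illustrative sketch of the standard estimator."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    # weight for x_(i): (i-1)(i-2)...(i-r) / [(n-1)(n-2)...(n-r)]
    w = np.ones(n)
    for j in range(1, r + 1):
        w *= (i - j) / (n - j)
    return np.mean(w * x)
```

Because β_r is finite whenever E[X] is, these estimates remain stable for heavy-tailed samples in which high-order product moments diverge, which is the basis of the inference procedure described above.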

2.
Probability weighted moments (PWM) are widely used in hydrology for estimating parameters of statistical distributions, including the Gumbel distribution. The classical PWM approach considers the moments β_i = E[X·F^i] with i = 0, 1 for estimation of the Gumbel scale and location parameters. However, there is no reason why these probability weights (F^0 and F^1) should provide the most efficient PWM estimators of Gumbel parameters and quantiles. We explore an extended class of PWMs that does not impose arbitrary restrictions on the values of i. Estimation based on the extended class of PWMs is called the generalized method of probability weighted moments (GPWM) to distinguish it from the classical procedure. In fact, our investigation demonstrates that it may be advantageous to use weight functions that are not of the form F^i. We propose an alternative PWM estimator of the Gumbel distribution that maintains the computational simplicity of the classical PWM method, but provides slightly more accurate quantile estimates in terms of mean square error of estimation. A simple empirical formula for the standard error of the proposed quantile estimator is presented.
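For reference, the classical i = 0, 1 baseline that the GPWM generalizes has a closed form: for the Gumbel distribution, β₀ = ξ + γσ and 2β₁ − β₀ = σ·ln 2, where γ is the Euler–Mascheroni constant. A sketch of the classical estimator (not the paper's generalized one):

```python
import numpy as np

EULER = 0.5772156649015329  # Euler–Mascheroni constant

def gumbel_pwm(x):
    """Classical PWM estimates of the Gumbel location (xi) and scale (sigma):
    sigma = (2*b1 - b0)/ln(2), xi = b0 - gamma*sigma.
    Sketch of the classical method the abstract generalizes."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()                          # estimate of beta_0 = E[X]
    b1 = np.mean((i - 1) / (n - 1) * x)    # estimate of beta_1 = E[X*F]
    sigma = (2 * b1 - b0) / np.log(2)
    xi = b0 - EULER * sigma
    return xi, sigma
```

The GPWM idea is to replace the weights F^0 and F^1 with more general weight functions; the closed form above is the baseline against which the generalized estimators are compared.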

3.
The spatial scaling properties of Canadian annual average streamflow (AASF) are assessed using both the product moments (PMs) and the probability weighted moments (PWMs) of AASF across the entire country and in its sub-climatic regions. For the PMs, the log-log relationship between the kth moment of AASF and drainage area is very nearly a straight line across the entire country and in its sub-climatic regions, with regression parameters that are a linear function of the moment order. For the PWMs, the logarithm of the kth PWM is a linear function of the logarithm of drainage area for the entire country and its sub-climatic regions, and the slope (scale exponent) in a region is constant and independent of the order. These results indicate that Canadian AASF exhibits simple scaling and that drainage area alone may describe most of the variability in the moments of AASF. A third approach, based on the log-linearity between quantiles and drainage area, applied to Region 2, also demonstrates simple scaling of AASF in that region, in agreement with the PM and PWM results; all three methods are thus consistent. The simple scaling results provide a basis for using the index flood method to conduct regional frequency analysis of AASF in Canada.
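The moment-versus-area diagnostic described above is easy to illustrate on synthetic data: under simple scaling, Q(A) is distributed as A^θ·W with W independent of area, so log E[Q^k] is linear in log A with slope k·θ. In the sketch below, θ, the basin areas, and the lognormal W are illustrative assumptions, not Canadian data:

```python
import numpy as np

rng = np.random.default_rng(42)
theta = 0.8                                          # assumed scale exponent
areas = np.array([50., 200., 800., 3200., 12800.])   # hypothetical basin areas (km^2)

# Simple-scaling synthetic flows: Q(A) = A**theta * W, with W iid lognormal.
slopes = []
for k in (1, 2, 3):
    mk = [np.mean((A**theta * rng.lognormal(0.0, 0.3, 5000))**k) for A in areas]
    # Under simple scaling the slope of log m_k against log A is ~ k * theta.
    slopes.append(np.polyfit(np.log(areas), np.log(mk), 1)[0])
```

The fitted slopes grow linearly with the moment order k, which is exactly the straight-line diagnostic the abstract applies to AASF.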

4.
Ugo Moisello, Hydrological Processes, 2007, 21(10): 1265-1279
The use of partial probability weighted moments (PPWM) for estimating hydrological extremes is compared to that of probability weighted moments (PWM). Firstly, estimates from at-site data are considered. Two Monte Carlo analyses, conducted using continuous and empirical parent distributions (of peak discharge and daily rainfall annual maxima) and applying four different distributions (Gumbel, Fréchet, GEV and generalized Pareto), show that the estimates obtained from PPWMs are better than those obtained from PWMs if the parent distribution is unknown, as happens in practice. Secondly, the use of partial L-moments (obtained from PPWMs) as diagnostic tools is considered, and the theoretical partial L-diagrams are compared with the experimental data. Five different distributions (exponential, Pareto, Gumbel, GEV and generalized Pareto) and 297 samples of peak discharge annual maxima are considered. Finally, the use of PPWMs with regional data is investigated, through three different kinds of regional analyses. The first is the regression of quantile estimates on basin area; the study is conducted applying the GEV distribution to peak discharge annual maxima, and the regressions obtained with PPWMs are slightly better than those obtained with PWMs. The second is the parametric regional analysis, of which four different models are considered; the congruence between local and regional estimates is examined using peak discharge annual maxima, and the degree of congruence is sometimes higher for PPWMs, sometimes for PWMs. The third uses the index flood method; the study, conducted applying the GEV distribution to synthetic data from a lognormal joint distribution, shows that better estimates are obtained sometimes from PPWMs, sometimes from PWMs. All the results indicate that PPWMs can constitute a valid tool, provided that the influence of outliers, which is of course greater with censored samples, is kept under control.
Copyright © 2007 John Wiley & Sons, Ltd.

5.
Conventional flood frequency analysis is concerned with providing an unbiased estimate of the magnitude of the design flow exceeded with probability p, but sampling uncertainties imply that such estimates will, on average, be exceeded more frequently. An alternative approach, therefore, is to derive an estimator which gives an unbiased estimate of flow risk: the difference between the two magnitudes reflects uncertainties in parameter estimation. An empirical procedure has been developed to estimate the mean true exceedance probabilities of conventional estimates made using a GEV distribution fitted by probability weighted moments, and adjustment factors have been determined to enable the estimation of flood magnitudes exceeded with, on average, the desired probability.
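The effect the abstract describes can be reproduced by Monte Carlo: fit a distribution to small samples, compute the nominal p-exceedance quantile from the fit, and evaluate its true exceedance probability under the parent. The sketch below uses a Gumbel parent and SciPy's maximum likelihood fit as a stand-in for the paper's GEV/PWM setting (an assumption for brevity):

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
loc, scale, p = 100.0, 20.0, 0.01      # true parent; nominal design probability p
exceed = []
for _ in range(500):
    sample = gumbel_r.rvs(loc, scale, size=30, random_state=rng)
    l, s = gumbel_r.fit(sample)                 # ML fit to the small sample
    q_hat = gumbel_r.ppf(1 - p, l, s)           # nominal p-quantile estimate
    exceed.append(gumbel_r.sf(q_hat, loc, scale))  # true exceedance of estimate
mean_p = float(np.mean(exceed))
```

On average the estimated design flow is exceeded with probability greater than the nominal p, which is the bias the paper's adjustment factors are designed to correct.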

6.
Abstract

The well-established physical and mathematical principle of maximum entropy (ME) is used to explain the distributional and autocorrelation properties of hydrological processes, including the scaling behaviour both in state and in time. In this context, maximum entropy is interpreted as maximum uncertainty. The conditions used for the maximization of entropy are as simple as possible, i.e. that hydrological processes are non-negative with specified coefficients of variation (CV) and lag-one autocorrelation. In this first part of the study, the marginal distributional properties of hydrological variables and the state scaling behaviour are investigated. Application of the ME principle under these very simple conditions results in the truncated normal distribution for small values of CV and in a non-exponential (Pareto) type distribution for high values of CV. In addition, the normal and the exponential distributions appear as limiting cases of these two distributions. Testing of these theoretical results with numerous hydrological data sets on several scales validates the applicability of the ME principle, thus emphasizing the dominance of uncertainty in hydrological processes. Both theoretical and empirical results show that state scaling is only an approximation for the high return periods, which is valid only when processes have high variation on small time scales. In other cases the normal distributional behaviour, which does not have state scaling properties, is a more appropriate approximation. Interestingly, however, as discussed in the second part of the study, the normal distribution combined with positive autocorrelation of a process results in time scaling behaviour due to the ME principle.
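The variational step behind the truncated-normal result can be sketched as a standard maximum-entropy calculation (our reconstruction, not quoted from the paper). For a non-negative variable with specified mean μ and coefficient of variation CV, one maximizes

```latex
\max_{f}\; H[f] = -\int_0^{\infty} f(x)\ln f(x)\,dx
\quad\text{subject to}\quad
\int_0^{\infty} f = 1,\qquad
\int_0^{\infty} x f = \mu,\qquad
\int_0^{\infty} x^2 f = \mu^2\!\left(1+\mathrm{CV}^2\right).
```

Setting the functional derivative of the Lagrangian to zero gives f(x) = exp(−λ₀ − λ₁x − λ₂x²) for x ≥ 0, i.e. a normal density truncated at zero when λ₂ > 0, with the exponential (λ₂ = 0) and the full normal appearing as limiting cases, consistent with the abstract.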

7.
This study investigated the spatial scaling properties of Canadian flood flows, namely annual maximum mean 1-, 5- and 7-day flows, using both the product moments (PMs) and probability weighted moments (PWMs). Both approaches demonstrate that flood flows in climatic regions 1 (Pacific), 2 (South British Columbia mountains), 3 (Yukon and northern British Columbia), 6 (Northeastern forest), 7 (Great Lakes and St. Lawrence rivers), 8 (Atlantic), and 10 (Arctic tundra) exhibit simple scaling with scaling exponent θ/H close to 0.90, while flood flows in regions 4 (Prairie provinces), 5 (Northwestern forest), and 9 (Mackenzie) do not, with scaling exponent θ/H close to 0.50. Plots of the coefficient of variation (Cv) of flood flows versus drainage area indicate that Cv remains almost constant in regions 1, 2, 3, 6, 7, 8, and 10, while it decreases as drainage area increases in regions 4, 5, and 9. These results demonstrate that the index flood method is applicable in climatic regions 1, 2, 3, 6, 7, 8, and 10, but not in climatic regions 4, 5, and 9. The physical background of the simple scaling of flood flows in most Canadian climatic regions is that snowmelt or rain-on-snow runoff is a dominant flood-generating mechanism across the country. Copyright © 2008 John Wiley & Sons, Ltd.

8.
This study aims to model the joint probability distribution of drought duration, severity and inter-arrival time using a trivariate Plackett copula. The drought duration and inter-arrival time each follow the Weibull distribution and the drought severity follows the gamma distribution. Parameters of these univariate distributions are estimated using the method of moments (MOM), maximum likelihood method (MLM), probability weighted moments (PWM), and a genetic algorithm (GA); whereas parameters of the bivariate and trivariate Plackett copulas are estimated using the log-pseudolikelihood function method (LPLF) and GA. Streamflow data from three gaging stations, Zhuangtou, Taian and Tianyang, located in the Wei River basin, China, are employed to test the trivariate Plackett copula. The results show that the Plackett copula is capable of yielding bivariate and trivariate probability distributions of correlated drought variables.
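The bivariate Plackett copula that underlies the trivariate construction has a closed form. A small sketch (independence, θ = 1, handled as the limiting case; this is not the paper's fitted code):

```python
import numpy as np

def plackett_cdf(u, v, theta):
    """Bivariate Plackett copula C(u, v; theta); theta = 1 is independence,
    theta > 1 positive dependence. Sketch of the bivariate building block
    of a trivariate Plackett model."""
    if abs(theta - 1.0) < 1e-12:
        return u * v
    s = 1.0 + (theta - 1.0) * (u + v)
    return (s - np.sqrt(s * s - 4.0 * theta * (theta - 1.0) * u * v)) / (2.0 * (theta - 1.0))
```

Boundary checks such as C(u, 1; θ) = u and C(u, 0; θ) = 0 confirm the implementation; the paper couples drought duration, severity and inter-arrival time through such Plackett margins.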

9.
Hans Van de Vyver, Hydrological Processes, 2018, 32(11): 1635-1647
Rainfall intensity–duration–frequency (IDF) curves are a standard tool in urban water resources engineering and management. They express how return levels of extreme rainfall intensity vary with duration. The simple scaling property of extreme rainfall intensity with respect to duration determines the form of IDF relationships. It is supposed that the annual maximum intensity follows the generalized extreme value (GEV) distribution. As is well known, for simple scaling processes the location parameter and scale parameter of the GEV distribution obey a power law with the same exponent. Although the simple scaling hypothesis is commonly used as a suitable working assumption, the multiscaling approach provides a more general framework. We present a new IDF relationship formulated on the basis of the multiscaling property, in which the GEV location and scale parameters have different scaling exponents. Next, we apply a Bayesian framework to estimate the multiscaling GEV model and to choose the most appropriate model. It is shown that model performance increases when using the multiscaling approach. The new model for IDF curves reproduces the data very well and has a reasonable degree of complexity without overfitting the data.
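The difference between the simple scaling and multiscaling IDF forms can be sketched directly from the GEV return-level formula, with the location and scale following power laws in duration with separate exponents. All parameter names and values below are illustrative assumptions, not the paper's Bayesian estimates:

```python
import numpy as np

def gev_return_level(mu, sigma, xi, T):
    """GEV T-year return level: z_T = mu + (sigma/xi) * ((-ln(1 - 1/T))**(-xi) - 1)."""
    y = -np.log(1.0 - 1.0 / T)
    return mu + sigma / xi * (y**(-xi) - 1.0)

def idf_multiscaling(d, T, mu1, sigma1, xi, eta_mu, eta_sigma):
    """Intensity return level at duration d: GEV location and scale each
    follow their own power law in d. Sketch of the model class, not its fit."""
    mu = mu1 * d**(-eta_mu)
    sigma = sigma1 * d**(-eta_sigma)
    return gev_return_level(mu, sigma, xi, T)
```

Setting eta_mu == eta_sigma recovers the simple scaling IDF relationship as a special case, which is how the two model classes are nested.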

10.
The log-Pearson type 3 distribution is widely used in North America and Australia for fitting annual flood series. Four different versions of the method of moments used in fitting this distribution are compared using Monte Carlo simulated samples which reflect some of the characteristics of annual flood series observed at some Canadian rivers. The bias, standard error, root mean square error, and skew of the parameter estimates, and of estimates of events associated with different probabilities of occurrence, are examined, along with the correlation coefficients between the parameter estimates and between the sample moments that are used in each of the four methods of estimation. It is observed that variances, covariances and correlation coefficients calculated using the usual first-order asymptotic approximation might have considerable error and therefore should be used with caution. On the basis of the mean square error of events with return periods above the range covered by the sample, a previously proposed method which uses moments of order 1, 2 and 3 in real space performs better than the other three methods, even though those methods follow the recommendation put forward by some investigators that higher-order moments (moments of order 3 or more) be avoided in flood frequency estimation. It is argued in the present study that the use of higher-order moments should not be avoided simply because they have high variability: it is not only the variability of the moments that determines the degree of variability of the estimated design flood events, but also the correlation that exists between these moments. Some recommendations are given at the end of the study aimed at achieving better efficiency in flood frequency research, at a time when more and more distributions and methods of estimation are being proposed.

11.
Abstract

Two probability density functions (pdfs) popular in hydrological analyses, namely the log-Gumbel (LG) and log-logistic (LL), are discussed with respect to (a) their applicability to hydrological data and (b) the drawbacks resulting from their mathematical properties. This paper, the first in a two-part series, examines a classical problem in which the considered pdf is assumed to be the true distribution. The most significant drawback is that the statistical moments of LG and LL exist only for a very limited range of parameters. Within this range, a very rapid increase of the skewness coefficient as a function of the coefficient of variation is observed (especially for the log-Gumbel distribution), which is seldom observed in hydrological data. These probability distributions can be applied with confidence only to extreme situations. In other cases there is an important disagreement between empirical data and the theoretical distributions in their tails, which is very important for characterizing the distribution asymmetry. The limited range of shape parameters in both distributions makes analyses that rely on the interpretation of moments (such as the method of moments) inconvenient. It is also shown that the often-used L-moments are not sufficient for characterizing the location, scale and shape parameters of pdfs, particularly when attention is paid to the tail of the probability distribution. The maximum likelihood method guarantees asymptotic convergence of the estimators beyond the domain of existence of the first two moments (or L-moments), but it is not sufficiently sensitive to the shape of the upper tail.
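The limited moment range for the log-Gumbel case can be made explicit by a standard calculation consistent with the abstract (our derivation, not quoted from the paper): if Y follows a Gumbel distribution with location ξ and scale σ, and X = e^Y is log-Gumbel, then the rth moment of X is the Gumbel moment generating function evaluated at r,

```latex
\mathbb{E}\!\left[X^{r}\right] = \mathbb{E}\!\left[e^{rY}\right]
= e^{r\xi}\,\Gamma(1 - r\sigma),
\qquad \text{finite only for } r\sigma < 1 ,
```

so the mean exists only for σ < 1, the variance only for σ < 1/2, and the third moment needed for the skewness coefficient only for σ < 1/3, which is the "very limited range of parameters" referred to above.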

12.
The robustness of large-quantile estimates with respect to the largest elements of a small sample was evaluated and compared for the methods of moments (MOM), L-moments (LMM) and maximum likelihood (MLM). Bias (B) and mean square error (MSE) were used to measure estimation performance. Quantiles were estimated by eight two-parameter probability distributions with the coefficient of variation as the shape parameter. The effect of dropping the largest elements of the series on large quantile values was assessed for various coefficient of variation (CV)/sample size (n) combinations, with n = 30 as the basic value. To that end, both Monte Carlo sampling experiments and an asymptotic approach consisting of distribution truncation were applied. In general, both the sampling and asymptotic approaches point to MLM as the most robust of the three methods with respect to the bias of large quantiles. Comparing the two other methods, the MOM estimates were found to be more robust for small and moderate hydrological samples drawn from distributions with a zero lower bound than were the LMM estimates. Extending the evaluation to outliers, all the above findings remain valid. However, using the MSE variation as a measure of performance, LMM was found to be the best for most distribution/variation coefficient combinations, whereas MOM was the worst. Moreover, removal of the largest sample element need not result in a loss of estimation efficiency. A gain in accuracy is observed for the heavy-tailed and log-normal distributions, and is particularly distinct for LMM. In practice, when dealing with a single sample deprived of its largest element, one should choose the estimation method giving the lowest MSE of large quantiles. For n = 30 and several distribution/variation coefficient combinations, MLM outperformed the two other methods in this respect and its advantage grew with sample size, while MOM was usually the worst.
Copyright © 2006 John Wiley & Sons, Ltd.

13.
Among the schemes for earthquake forecasting, the search for semi-periodicity during large earthquakes in a given seismogenic region plays an important role. When considering earthquake forecasts based on semi-periodic sequence identification, the Bayesian formalism is a useful tool for: (1) assessing how well a given earthquake satisfies a previously made forecast; (2) re-evaluating the semi-periodic sequence probability; and (3) testing other prior estimations of the sequence probability. A comparison of Bayesian estimates with updated estimates of semi-periodic sequences that incorporate new data not used in the original estimates shows extremely good agreement, indicating that: (1) the probability that a semi-periodic sequence is not due to chance is an appropriate estimate for the prior sequence probability estimate; and (2) the Bayesian formalism does a very good job of estimating corrected semi-periodicity probabilities, using slightly less data than that used for updated estimates. The Bayesian approach is exemplified explicitly by its application to the Parkfield semi-periodic forecast, and results are given for its application to other forecasts in Japan and Venezuela.

14.
The inverted beta distributions are popular models for hydrology. However, they suffer from the fact that they do not possess finite moments of all orders. In this note, a truncated version of the inverted beta distribution is introduced, which possesses finite moments of all orders and could therefore be a better model for hydrological data with a finite upper bound. Explicit expressions for the moments of the truncated distribution and its estimation procedure are derived. An application is provided to ozone level data from New York.

15.
The log-Gumbel distribution is one of the extreme value distributions which has been widely used in flood frequency analysis. This distribution is examined in this paper regarding quantile estimation and confidence intervals of quantiles. Specific estimation algorithms based on the methods of moments (MOM), probability weighted moments (PWM) and maximum likelihood (ML) are presented. The applicability of the estimation procedures and a comparison among the methods are illustrated with an application example considering the flood data of the St. Mary's River.

16.
A peaks over threshold (POT) method of analysing daily rainfall values is developed using a Poisson process of occurrences and a generalised Pareto distribution (GPD) for the exceedances. The parameters of the GPD are estimated by the method of probability weighted moments (PWM) and a method of combining the individual estimates to define a regional curve is proposed.
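The GPD/PWM step of such a POT model has a closed form. With the Hosking–Wallis parameterization F(x) = 1 − (1 − kx/σ)^(1/k) for the excesses, the first two PWMs give σ and k directly, and the Poisson exceedance rate λ converts the fit into T-year return levels. A sketch under these stated conventions, not the paper's regional procedure:

```python
import numpy as np

def gpd_pwm(excess):
    """PWM estimates of GPD scale (sigma) and shape (k) from threshold
    excesses, with F(x) = 1 - (1 - k*x/sigma)**(1/k):
    k = b0/(b0 - 2*a1) - 2,  sigma = 2*b0*a1/(b0 - 2*a1)."""
    x = np.sort(np.asarray(excess, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.mean((i - 1) / (n - 1) * x)
    a1 = b0 - b1                       # estimate of E[X * (1 - F(X))]
    k = b0 / (b0 - 2 * a1) - 2
    sigma = 2 * b0 * a1 / (b0 - 2 * a1)
    return sigma, k

def pot_return_level(u, lam, sigma, k, T):
    """T-year level for a Poisson(lam exceedances/year) + GPD POT model:
    x_T = u + (sigma/k) * (1 - (lam*T)**(-k))."""
    return u + sigma / k * (1.0 - (lam * T)**(-k))
```

The regional curve mentioned in the abstract would then be built by combining such at-site estimates, which this sketch does not attempt.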

17.
Frequency analyses of annual extreme rainfall series from 5 min to 24 h
The parameter estimation methods of (1) moments, (2) maximum likelihood, (3) probability-weighted moments (PWM) and (4) self-determined PWM are applied to the probability distributions of Gumbel, general extreme values, three-parameter log-normal (LN3), Pearson-3 and log-Pearson-3. The special method of computing parameters so as to make the sample skewness coefficient zero is also applied to LN3; hence, altogether 21 candidate distributions result. The parameters of these distributions are computed first from the original sample series of 14 successive-duration annual extreme rainfalls recorded at a rain-gauging station. Next, the parameters are scaled by first-degree semi-log or log-log polynomial regressions versus rainfall durations from 5 to 1440 min (24 h). Those distributions satisfying the divergence criterion for frequency curves are selected as potential distributions, and the better-fitting ones among them are determined by a conjunctive evaluation of three goodness-of-fit tests. Frequency tables, frequency curves and intensity–duration–frequency curves are the outcome. Copyright © 2010 John Wiley & Sons, Ltd.

18.
The aim of the study is to present an effective and relatively simple empirical approach to the development of rainfall intensity-duration-frequency (IDF) formulas, based on Controlled Random Search (CRS) for global optimization. The approach is mainly dedicated to cases in which the commonly used IDF relationships do not provide a satisfactory fit between simulations and observations, and more complex formulas with a higher number of parameters are advisable. Precipitation data from the Gdańsk gauge station were analyzed as the example, using the peak-over-threshold method and the Chomicz scale for rainfall intensity. General forms of the IDF function were chosen and the parameters were calibrated with the CRS algorithm. The agreement of the obtained IDF formulas with the precipitation data and the efficiency of the algorithm were analyzed. The study confirmed that the proposed empirical approach may be an interesting alternative to probabilistic ones, especially when the IDF relationship has a more complex form and the precipitation data do not match "typical" hydrological distributions.
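SciPy does not ship a CRS implementation, but the calibration loop the study describes can be sketched with a related population-based global optimizer, differential evolution, on a synthetic example of the common three-parameter IDF form i = a/(d + b)^c. The data, bounds and parameter values below are illustrative assumptions, not the Gdańsk results:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical IDF points: duration d in minutes, intensity i.
d_obs = np.array([5., 15., 30., 60., 180., 720., 1440.])
i_obs = 30.0 / (d_obs + 10.0)**0.8     # synthetic "data" with a=30, b=10, c=0.8

def loss(p):
    """Mean squared error of the candidate IDF formula i = a/(d + b)**c."""
    a, b, c = p
    return float(np.mean((a / (d_obs + b)**c - i_obs)**2))

# Population-based global search over (a, b, c); differential evolution
# stands in for the paper's CRS algorithm.
res = differential_evolution(loss, bounds=[(1, 100), (0, 60), (0.1, 2)], seed=0)
```

For the more complex multi-parameter IDF forms the study advocates, only the model function inside `loss` and the bounds change; the global-search calibration step stays the same.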

19.
Being a non-linear method based on a rigorous formalism and an efficient processing of various information sources, the Bayesian maximum entropy (BME) approach has proven to be a very powerful method in the context of continuous spatial random fields, providing much more satisfactory estimates than those obtained from traditional linear geostatistics (i.e., the various kriging techniques). This paper aims at presenting an extension of the BME formalism in the context of categorical spatial random fields. In the first part of the paper, the indicator kriging and cokriging methods are briefly presented and discussed. A special emphasis is put on their inherent limitations, both from the theoretical and practical point of view. The second part aims at presenting the theoretical developments of the BME approach for the case of categorical variables. The three-stage procedure is explained and the formulations for obtaining prior joint distributions and computing posterior conditional distributions are given for various typical cases. The last part of the paper consists in a simulation study for assessing the performance of BME over the traditional indicator (co)kriging techniques. The results of these simulations highlight the theoretical limitations of the indicator approach (negative probability estimates, probability distributions that do not sum up to one, etc.) as well as the much better performance of the BME approach. Estimates are very close to the theoretical conditional probabilities, that can be computed according to the stated simulation hypotheses.

20.
Two well-known methods for estimating statistical distributions in hydrology are the method of moments (MOM) and the method of probability weighted moments (PWM). This paper is concerned with the case where a part of the sample is censored. One situation where this might occur is when systematic data (e.g. from gauges) are combined with historical data, since the latter are often only reported if they exceed a high threshold. For this problem, three previously derived estimators are the "B17B" estimator, which is a direct modification of MOM to allow for partial censoring; the "partial PWM" estimator, which similarly modifies PWM; and the "expected moments algorithm" estimator, which improves on B17B by replacing a sample adjustment of the censored-data moments with a population adjustment. The present paper proposes a similar modification to the PWM estimator, resulting in the "expected probability weighted moments" (EPWM) estimator. Simulation comparisons of these four estimators and also the maximum likelihood estimator show that the EPWM method is at least competitive with the other four and in many cases the best of the five.

