Similar Literature
20 similar documents found.
1.
We have fitted field measurements of fracture spacings (from the vicinity of Lake Strom Thurmond, Georgia, U.S.A.) to the Weibull, Schuhmann and fractal distributions. The fracture spacings follow fractal and Weibull distributions, which implies that they were formed by a repetitive fragmentation process. The limited variation of fracture density with orientation in the study area suggests that the stress distribution that generated these fractures may have been uniform.
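A minimal sketch of this kind of fit, assuming the spacings are available as a one-dimensional array; the synthetic `spacings` below stands in for the field measurements, which are not reproduced here:

```python
# Hedged sketch: fit a Weibull distribution to fracture spacings with scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
spacings = rng.weibull(1.2, size=200) * 0.5   # placeholder for field data (m)

# Fix the location at zero: spacings are non-negative by construction.
shape, loc, scale = stats.weibull_min.fit(spacings, floc=0)

# The fitted shape/scale summarize the repetitive fragmentation process;
# a shape parameter below 1 would indicate clustering of fractures.
print(f"Weibull shape k = {shape:.2f}, scale lambda = {scale:.3f} m")
```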

2.
The grain-size distributions of bedload gravels in Oak Creek, Oregon, follow the ideal Rosin distribution at flow stages which exceed that necessary to initiate breakup of the pavement in the bed material. The distributions systematically vary with flow discharge and bed stress, such that at higher flow stages the grain sizes are coarser while the spread of the distribution decreases. A differential bedload transport function for individual grain-size fractions is formulated utilizing the dependence of the two parameters in the Rosin distribution on the flow stress. The total transport rate, which is also a function of the flow stress, is apportioned within the Rosin grain-size distribution to yield the fractional transport rates. The derived bedload function has the advantage of yielding smooth, continuous frequency distributions of transport rates for the grain-size fractions, in contrast to the discrete transport functions which predict rates for specified sieve fractions. Successful reproduction of the measured fractional transport rates and bedload grain-size distributions in Oak Creek by this approach demonstrates its potential for evaluations of transport rates of size fractions in gravel-bed streams. The approach will be useful in investigations of downstream changes in bed material grain-size distributions.
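The apportioning step can be sketched as follows; the Rosin parameters and total transport rate are placeholders, since the paper's stress-dependent parameter relations are not reproduced here:

```python
# Hedged sketch: apportion a total bedload transport rate among grain-size
# fractions using a Rosin (Weibull-form) grain-size distribution.
import numpy as np

def rosin_cdf(d, k, s):
    """Rosin distribution: fraction finer than grain size d (mm)."""
    return 1.0 - np.exp(-(d / k) ** s)

k, s = 20.0, 1.5            # hypothetical Rosin parameters at a given flow stress
q_total = 0.8               # hypothetical total transport rate (kg m^-1 s^-1)
edges = np.array([2, 4, 8, 16, 32, 64])   # sieve-class boundaries (mm)

# Fraction of the load in each size class = difference of the CDF at the edges.
fractions = np.diff(rosin_cdf(edges, k, s))
q_fractional = q_total * fractions / fractions.sum()
for lo, hi, q in zip(edges[:-1], edges[1:], q_fractional):
    print(f"{lo:>2}-{hi:<2} mm: {q:.4f} kg/m/s")
```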

3.
Hydrological Sciences Journal, 2013, 58(1): 236-252
Abstract

Suspended sediments are a natural component of aquatic ecosystems, but when present in high concentrations they can threaten aquatic life and carry large amounts of pollutants. Suspended sediment concentration (SSC) is therefore an important abiotic variable used to quantify water quality and habitat availability for some species of fish and invertebrates. This study quantifies and predicts annual extreme events of SSC using frequency analysis methods. Time series of daily suspended sediment concentrations in 208 rivers in North America were analysed to provide a large-scale frequency analysis of annual maximum concentrations. Seasonality and the correlation between discharge and the annual peak of suspended sediment concentration were also analysed. Peak concentrations usually occur in spring and summer. A significant correlation between extreme SSC and the associated discharge was detected at only half of the stations. Probability distributions were fitted to the data recorded at each station to estimate the return period for a specific concentration, or the concentration for a given return period. Selection criteria such as the Akaike and Bayesian information criteria were used to select the best statistical distribution in each case, and for each selected distribution the most appropriate parameter estimation method was used. The most commonly selected distributions were the exponential, lognormal, Weibull and gamma; these four distributions were used for 90% of the stations.
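A sketch of the selection and return-period step for a single station, using AIC only (the paper also uses BIC); the data below are synthetic, not from the 208-river dataset:

```python
# Hedged sketch: fit candidate distributions to annual-maximum SSC values and
# pick the best by AIC, then read off the T-year concentration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_max_ssc = rng.lognormal(mean=5.0, sigma=0.6, size=40)  # mg/L, synthetic

candidates = {"expon": stats.expon, "lognorm": stats.lognorm,
              "weibull": stats.weibull_min, "gamma": stats.gamma}

def aic(dist, data):
    params = dist.fit(data)                      # maximum likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik, params

scores = {name: aic(dist, annual_max_ssc) for name, dist in candidates.items()}
best = min(scores, key=lambda n: scores[n][0])
params = scores[best][1]

# Concentration for a given return period T: quantile at F = 1 - 1/T.
T = 20.0
c_T = candidates[best].ppf(1 - 1 / T, *params)
print(f"best model: {best}; {T:.0f}-yr SSC = {c_T:.0f} mg/L")
```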

4.
To characterize the seasonal variation of the marginal distribution of daily precipitation, it is important to find which statistical characteristics of daily precipitation actually vary the most from month to month and which can be regarded as invariant. Relevant to the latter issue is the question of whether a single model is capable of effectively describing nonzero daily precipitation for every month worldwide. To study these questions we introduce and apply a novel test for seasonal variation (SV-Test) and explore the performance of two flexible distributions in a massive analysis of approximately 170,000 monthly daily-precipitation records at more than 14,000 stations from all over the globe. The analysis indicates that: (a) the shape characteristics of the marginal distribution of daily precipitation generally vary over the months; (b) commonly used distributions such as the Exponential, Gamma, Weibull, Lognormal and Pareto are incapable of describing daily precipitation “universally”; (c) exponential-tail distributions like the Exponential, mixed Exponentials or the Gamma can severely underestimate the magnitude of extreme events and may therefore be a poor choice; and (d) the Burr type XII and the Generalized Gamma distributions are two good models, with the latter performing exceptionally well.
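Point (c) can be illustrated with a small hedged experiment: fit an exponential and a Generalized Gamma model to the same synthetic positive-rainfall sample and compare the fitted exceedance probabilities far in the tail:

```python
# Illustrative sketch of tail underestimation by an exponential fit; the
# rainfall sample is synthetic, not one of the paper's 170,000 records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rain = stats.gengamma.rvs(a=0.8, c=0.6, scale=8.0, size=3000, random_state=rng)

p_exp = stats.expon.fit(rain, floc=0)
p_gg = stats.gengamma.fit(rain, floc=0)

x = 10 * rain.mean()   # a far-tail threshold
print("P(X > x), exponential fit :", stats.expon.sf(x, *p_exp))
print("P(X > x), gen. gamma fit  :", stats.gengamma.sf(x, *p_gg))
```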

5.
Modelling the raindrop size distribution (DSD) is fundamental to connecting remote sensing observations with reliable precipitation products for hydrological applications. To date, various standard probability distributions have been proposed to build DSD models. Relevant questions to ask are how often and how well such models fit empirical data, given that advances in both data availability and the technology used to estimate DSDs have mitigated many of the deficiencies of early analyses. We therefore present a comprehensive follow-up of a previous study on the statistical fitting of three common DSD models against 2D Video Disdrometer (2DVD) data, which are unique in that the size of individual drops is determined accurately. Using the maximum likelihood method, we fit models based on the lognormal, gamma and Weibull distributions to more than 42,000 one-minute drop-by-drop records taken from the field campaigns of the NASA Ground Validation program of the Global Precipitation Measurement (GPM) mission. To check the agreement between the models and the measured data, we investigate the goodness of fit of each distribution using the Kolmogorov–Smirnov (KS) test, and then apply a model selection technique to evaluate the relative quality of each model. Results show that the gamma distribution has the lowest KS rejection rate, while the Weibull distribution is the most frequently rejected. Ranking, for each minute, the statistical models that pass the KS test suggests that probability distributions whose tails are exponentially bounded, i.e. light-tailed distributions, are adequate for modelling the natural variability of DSDs. However, in line with our previous study, we also found that frequency distributions of empirical DSDs can be heavy-tailed in a number of cases, which may result in severe uncertainty in estimating statistical moments and bulk variables.
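The per-minute fitting and testing loop might look like the following sketch, with synthetic stand-in samples rather than 2DVD data (note that KS p-values computed with estimated parameters are only approximate):

```python
# Hedged sketch: ML fits of three DSD models per one-minute sample, with a
# Kolmogorov-Smirnov rejection count at the 5% level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
minutes = [stats.gamma.rvs(2.0, scale=0.6, size=rng.integers(50, 300),
                           random_state=rng) for _ in range(200)]

models = {"lognorm": stats.lognorm, "gamma": stats.gamma,
          "weibull": stats.weibull_min}
reject = {name: 0 for name in models}

for sample in minutes:
    for name, dist in models.items():
        params = dist.fit(sample, floc=0)       # ML fit; drop sizes are positive
        _, p = stats.kstest(sample, dist.cdf, args=params)
        if p < 0.05:
            reject[name] += 1

for name, n in reject.items():
    print(f"{name}: rejected in {n}/{len(minutes)} minutes")
```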

6.
The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution, the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) the Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to support definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one-million-year “Virtual California” simulation of San Andreas earthquakes, and find very good agreement with a Weibull distribution. We also obtain recurrence statistics for two other model studies, a modified forest-fire model and a slider-block model; in both cases good agreement with Weibull distributions is obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
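Given a fitted Weibull model of inter-event times, the risk estimate reduces to a conditional probability; a minimal sketch with illustrative (not San Andreas) parameters:

```python
# Hedged sketch: probability of an event within the next dt years, conditioned
# on t0 years having elapsed, for Weibull inter-event times. The Weibull hazard
# h(t) = (k/lam) * (t/lam)**(k-1) is scale-invariant, as the abstract notes.
import numpy as np

def weibull_survival(t, k, lam):
    return np.exp(-(t / lam) ** k)

def prob_within(t0, dt, k, lam):
    """P(T <= t0 + dt | T > t0) for Weibull-distributed recurrence times."""
    return 1.0 - weibull_survival(t0 + dt, k, lam) / weibull_survival(t0, k, lam)

k, lam = 1.8, 150.0     # hypothetical shape and scale (years)
print(prob_within(t0=100.0, dt=30.0, k=k, lam=lam))
```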

7.
8.
Abstract

This paper describes a stochastic rainfall model developed to generate synthetic sequences of hourly rainfall at a point. The model has been calibrated using data from Farnborough in Hampshire, England. The rainfall series was divided into wet and dry spells; analysis of the durations of these spells suggests that they may be represented by exponential and generalized Pareto distributions, respectively. The total volume of rainfall in wet spells was adequately fitted by a conditional gamma distribution. Random sampling from a beta distribution, defining the average shape of all rainfall profiles, is used in the model to obtain the rainfall profile for a given wet spell. Results from the model compare favourably with observed monthly and annual rainfall totals and with annual maximum frequency distributions for durations of 1, 2, 6, 12, 24 and 48 hours at Farnborough. The model has a total of 22 parameters, some of which are specific to the winter or summer season.
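The structure of such a generator can be sketched as follows; the distribution choices follow the abstract (exponential wet-spell and generalized-Pareto dry-spell durations, conditional gamma volumes, beta-shaped profiles), but all parameter values are placeholders rather than the Farnborough calibration, and the profile sampling is simplified:

```python
# Hedged sketch of an alternating wet/dry hourly rainfall generator.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def simulate_hours(n_hours):
    """Alternate dry and wet spells until n_hours of synthetic record exist."""
    series, t = [], 0
    while t < n_hours:
        # Dry-spell duration (h): generalized Pareto, per the abstract.
        dry = max(1, int(np.ceil(stats.genpareto.rvs(0.2, scale=20,
                                                     random_state=rng))))
        series.extend([0.0] * dry)
        # Wet-spell duration (h): exponential, per the abstract.
        wet = max(1, int(np.ceil(stats.expon.rvs(scale=6, random_state=rng))))
        # Wet-spell volume (mm): gamma, conditioned here on duration as a
        # placeholder for the paper's conditional gamma fit.
        volume = stats.gamma.rvs(1.5, scale=2.0 * wet, random_state=rng)
        # Simplified beta-shaped profile: apportion the volume over wet hours.
        profile = stats.beta.rvs(2.0, 2.0, size=wet, random_state=rng)
        series.extend(volume * profile / profile.sum())
        t += dry + wet
    return np.asarray(series[:n_hours])

hourly = simulate_hours(24 * 365)
print(f"synthetic annual total = {hourly.sum():.0f} mm")
```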

9.
The majority of continental arc volcanoes go through decades or centuries of inactivity; communities thus become inured to their threat. Here we demonstrate a method to quantify hazard from sporadically active volcanoes and to develop probabilistic eruption forecasts. We compiled an eruption-event record for the last c. 9,500 years at Mt Taranaki, New Zealand, through detailed radiocarbon dating of recent deposits and a sediment core from a nearby lake. This is the highest-precision record yet collected from the volcano, but it still probably underestimates the frequency of eruptions, which will only be better approximated by adding data from more sediment-core sites in different tephra-dispersal directions. A mixture of Weibull distributions provided the best fit to the inter-event period data for the 123 events. Depending on which date is accepted for the last event, the mixture-of-Weibulls model gives a probability of at least 0.37–0.48 for a new eruption from Mt Taranaki in the next 50 years. A polymodal distribution of inter-event periods indicates that a range of nested processes controls eruption recurrence at this type of arc volcano. These could possibly be related by further statistical analysis to intrinsic factors such as step-wise processes of magma rise, assembly and storage.
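Fitting a two-component Weibull mixture to inter-event periods can be done by direct likelihood maximization; a hedged sketch with a synthetic stand-in for the Taranaki record:

```python
# Hedged sketch: maximum likelihood fit of a two-component Weibull mixture to
# inter-event periods; `gaps` is synthetic, not the 123-event record.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(5)
gaps = np.concatenate([rng.weibull(1.5, 80) * 40, rng.weibull(2.5, 43) * 150])

def neg_loglik(theta):
    w = 1 / (1 + np.exp(-theta[0]))            # mixing weight mapped into (0, 1)
    k1, l1, k2, l2 = np.exp(theta[1:])         # positive shapes and scales
    pdf = (w * stats.weibull_min.pdf(gaps, k1, scale=l1)
           + (1 - w) * stats.weibull_min.pdf(gaps, k2, scale=l2))
    return -np.sum(np.log(pdf + 1e-300))       # guard against log(0)

res = minimize(neg_loglik, x0=[0.0, 0.3, 3.5, 0.9, 5.0], method="Nelder-Mead")
print("converged:", res.success, "negative log-likelihood:", res.fun)
```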

10.
11.
Empirical frequency distributions of multiplicative cascade weights, or breakdown coefficients, at small timescales are analyzed for 5-min precipitation time series from four gauges in Germany. It is shown that histograms of the weights, W, are strongly deformed by the recording precision of rainfall amounts. A randomization procedure is proposed to statistically remove the artifacts due to precision errors in the original series. The probability distributions of W are observed to evolve from beta-like at large timescales to a combined beta-normal distribution with a pronounced peak at W ≈ 0.5 at small timescales. A new 3N-B distribution, built from three separate normal (N) distributions and one beta (B) distribution, is proposed to reproduce the empirical histograms of W at small timescales. Parameters of the 3N-B distributions are fitted for all gauges and analyzed timescales. Microcanonical cascade models with a generator based on 3N-B distributions are developed, and their performance at disaggregating precipitation from 1280-min intervals down to 5-min intervals is evaluated.
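The disaggregation step itself is simple to sketch: each interval's depth is split between its two halves with a random weight, so mass is conserved exactly (microcanonical), and 1280 min reaches 5 min after eight halvings. The symmetric beta generator below is a placeholder for the paper's 3N-B distribution, whose parameters are not reproduced here:

```python
# Hedged sketch of a microcanonical cascade disaggregation step.
import numpy as np

rng = np.random.default_rng(6)

def disaggregate(depths, levels=8):
    """Split each interval into two halves per level: 1280 min / 2**8 = 5 min."""
    x = np.asarray(depths, dtype=float)
    for _ in range(levels):
        w = rng.beta(2.0, 2.0, size=x.size)    # placeholder for 3N-B weights
        x = np.column_stack([w * x, (1 - w) * x]).ravel()
    return x

coarse = np.array([12.0, 0.0, 3.5])            # 1280-min depths (mm)
fine = disaggregate(coarse)                    # 5-min depths
assert np.isclose(fine.sum(), coarse.sum())    # microcanonical: mass conserved
print(fine.size, "five-minute intervals")
```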

12.
Estimation of confidence limits and intervals for the two- and three-parameter Weibull distributions is presented based on the method of moments (MOM), probability weighted moments (PWM), and maximum likelihood (ML). The asymptotic variances of the MOM, PWM, and ML quantile estimators are derived as functions of the sample size, return period, and parameters. Such variances can be used for estimating the confidence limits and confidence intervals of the population quantiles. Except for the two-parameter Weibull model, the formulas obtained do not have simple forms but can be evaluated numerically. Simulation experiments were performed to verify the applicability of the derived confidence intervals of quantiles. The results show that, overall, the ML method for estimating the confidence limits performs better than the other two methods in terms of bias and mean square error. This is especially so for γ ≥ 0.5, even for small sample sizes (e.g. N = 10). However, the drawback of the ML method for determining the confidence limits is that it requires the shape parameter to be greater than 2. The Weibull model based on the MOM, ML, and PWM estimation methods was applied to fit the distributions of annual 7-day low flows and 6-h annual maximum rainfall data. The results showed that the differences in the estimated quantiles based on the three methods are not large, generally less than 10%. However, the differences between the confidence limits and confidence intervals obtained by the three estimation methods may be more significant. For instance, for the 7-day low flows the ratio of the estimated confidence interval to the estimated quantile based on ML is about 17% for T ≥ 2, while it is about 30% for estimation based on the MOM and PWM methods. In addition, the analysis of the rainfall data using the three-parameter Weibull showed that, while ML parameters can be estimated, the corresponding confidence limits and intervals could not be found because the shape parameter was smaller than 2.
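The paper's closed-form asymptotic variances are not reproduced here; as a rough cross-check one can instead approximate a quantile CI by parametric bootstrap, a swapped-in technique sketched below with synthetic low-flow data:

```python
# Hedged sketch: parametric-bootstrap CI for a Weibull low-flow quantile
# (an alternative to the paper's asymptotic-variance formulas).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
flows = stats.weibull_min.rvs(1.6, scale=12.0, size=30, random_state=rng)

T = 10.0                       # return period; low-flow quantile at F = 1/T
p = 1.0 / T
c, loc, scale = stats.weibull_min.fit(flows, floc=0)
q_hat = stats.weibull_min.ppf(p, c, loc, scale)

boot = []
for _ in range(2000):
    resample = stats.weibull_min.rvs(c, loc, scale, size=flows.size,
                                     random_state=rng)
    cb, lb, sb = stats.weibull_min.fit(resample, floc=0)
    boot.append(stats.weibull_min.ppf(p, cb, lb, sb))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"q({T:.0f}-yr low flow) = {q_hat:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```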

13.
Abstract

A parameter estimation method based on LL-moments is proposed for fitting probability distribution functions to low-flow observations. LL-moments are variants of L-moments, analogous to the LH-moments defined for the analysis of floods, that give higher weight to small observations. Expressions are given that relate them to the probability distribution function for the normal, Weibull and power distributions. Sampling properties of the LL-moments, and of the distribution parameters and quantiles estimated from them, are found by a Monte Carlo simulation study. An example shows that low-flow quantile estimates obtained by LL-moments may differ significantly from those obtained by L-moments.

14.
The quantile of a probability distribution, known as the return period or hydrological design value of a hydrological variable, is the value corresponding to a fixed non-exceedance probability and is an important notion in hydrology. In hydraulic engineering design and water resources management, confidence interval (CI) estimation for a population quantile is of primary interest and, among other applications, is used to assess the pollution level of a contaminant in water, air, etc. The accuracy of such estimation directly influences engineering investments and safety. The two-parameter Weibull, Pareto, Lognormal, Inverse Gaussian and Gamma are commonly used probability models in such applications. In spite of its practical importance, the problem of CI estimation for a quantile of these widely applicable distributions has received little attention in the literature. In this paper, a new method is proposed to obtain a CI for a quantile of any distribution for which generalized pivotal quantities (GPQs) exist for its parameters, or for the parameters of the distribution of any one-to-one function of the underlying random variable. The proposed method is elucidated by constructing CIs for quantiles of the Weibull, Pareto, Lognormal, Extreme value type-I (for minima), Exponential and Normal distributions, for complete as well as type II singly right-censored samples. The empirical performance evaluation showed that the proposed method has exact, well-concentrated coverage probabilities near the nominal level, even for uncensored samples as small as 5 and for censored samples with a proportion of censored observations up to 0.70. The existing methods for the Weibull distribution have poor or dispersed coverage probabilities with respect to the nominal level for complete samples. Applications of the proposed method in groundwater monitoring and in the assessment of air pollution are illustrated for practitioners.
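For the simplest entry in the paper's list, the Normal distribution, the GPQ construction is standard and can be sketched by Monte Carlo: simulate the pivots (x̄ − μ)/(σ/√n) ~ N(0,1) and (n − 1)s²/σ² ~ χ²(n−1), form GPQs for μ and σ, and combine them into a GPQ for the p-quantile μ + z_p·σ:

```python
# Hedged sketch: Monte Carlo GPQ-based CI for a Normal p-quantile.
# Data are synthetic; this illustrates the GPQ idea, not the paper's method
# for all six distributions or for censored samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
x = rng.normal(10.0, 2.0, size=15)
n, xbar, s = x.size, x.mean(), x.std(ddof=1)

p, B = 0.95, 100_000
Z = rng.standard_normal(B)                     # pivot for the mean
U = stats.chi2.rvs(n - 1, size=B, random_state=rng)   # pivot for the variance

R_sigma = s * np.sqrt((n - 1) / U)             # GPQ for sigma
R_mu = xbar - Z * R_sigma / np.sqrt(n)         # GPQ for mu
R_q = R_mu + stats.norm.ppf(p) * R_sigma       # GPQ for the p-quantile

lo, hi = np.percentile(R_q, [2.5, 97.5])
print(f"95% GPQ-based CI for the {p:.0%} quantile: [{lo:.2f}, {hi:.2f}]")
```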

15.
A statistical model for the energy scaling of the distribution of inter-event times is described. By considering seismicity (natural and induced) from diverse regions at different scale (energy/magnitude) levels, the self-similarity of the distribution has been determined. A comparison has been carried out between the distribution of inter-event times at different scale levels and the most popular distributions of reliability theory. The distribution of inter-event times at different scale levels is well approximated by the Weibull distribution, and the Weibull distribution with parameters that obey the scaling model and the Gutenberg-Richter law has been tested.

16.
Abstract

A new theoretically based distribution for frequency analysis is proposed. The extended three-parameter Burr XII distribution includes as special cases the generalized Pareto distribution, which is used to model exceedances over a threshold; the log-logistic distribution, which is also advocated in flood frequency analysis; and the Weibull distribution, which is part of the generalized extreme value distribution used to model annual maxima. The extended Burr distribution is flexible enough to approximate extreme value distributions. Note that the generalized Pareto and generalized extreme value distributions are limiting results for modelling exceedances over a threshold and block extremes, respectively; from a modelling perspective, generalization may be necessary to obtain a better fit. The extended three-parameter Burr XII distribution is therefore a meaningful candidate in frequency analysis. Maximum likelihood estimation for this distribution is investigated in the paper, and its use is demonstrated using data from China.
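Maximum likelihood fitting of a Burr XII model is available off the shelf in scipy (whose `burr12` parameterization uses shape parameters c and d); a sketch with a synthetic flood series rather than the Chinese data:

```python
# Hedged sketch: Burr XII maximum likelihood fit and a 100-year quantile.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
peaks = stats.burr12.rvs(c=2.0, d=1.5, scale=300.0, size=60, random_state=rng)

c, d, loc, scale = stats.burr12.fit(peaks, floc=0)        # ML fit, loc fixed
q100 = stats.burr12.ppf(1 - 1 / 100, c, d, loc, scale)    # 100-yr quantile
print(f"c = {c:.2f}, d = {d:.2f}, scale = {scale:.0f}; Q100 = {q100:.0f}")
```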

17.
The statistics of quantities involved in the synthesis of cloud scenes have been investigated from an original database. Frequency distributions of ice and water content (IWC), horizontal and vertical sizes (L and H), and top temperatures (T) of clouds above Europe have been derived for nine types of clouds (As, Cb, Ci, Cg, LwCg, OrCg, Cs, Ns, Sc). It appears that the cumulative frequency plots can be well fitted with log-normal or Weibull profiles, and that for IWC and T the cloud types can be split into two or three classes according to their slopes in logarithmic coordinates. Cross-correlation coefficients between IWC, L, H and T have also been derived. Implications for the physics of the cloud build-up processes are briefly outlined, and a critical analysis and comparison with other published results are presented.

18.
This study presents a framework for numerical simulations, based upon micromechanical parameters, for modeling the progressive failure of heterogeneous rock specimens under compression. In our numerical simulations, a Weibull distribution of the strength and elastic properties of the finite elements is assumed, and the associated Weibull parameters are estimated in terms of microstructural properties, such as crack size distribution and grain size, through microscopic observations of microcracks. The main uncertainty in this procedure lies in the fact that a “microcrack size distribution” can be formulated in various ways when relating it to the Weibull parameters. As one possible choice, the present study uses the number of counted cracks per unit scanned volume per grain size to formulate the crack distributions. Finally, the Rock Failure Process Analysis code (RFPA2D) is adopted as a tool for simulating the progressive failure and microseismicity of heterogeneous rocks using an elastic-damage finite-element approach. To verify our framework, compression tests on marble specimens were conducted, and the measured acoustic emissions (AE) are compared with those predicted by the numerical simulations. The mode of failure, compressive strength and AE pattern of our simulations basically agree with the experimental observations.
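The heterogeneity assumption can be sketched in a few lines: element strengths and moduli drawn from Weibull distributions, where the Weibull modulus m controls homogeneity. The parameter values below are placeholders, not estimates from the marble microcrack observations:

```python
# Hedged sketch: Weibull-distributed element properties for a heterogeneous
# specimen, in the spirit of RFPA-style simulations.
import numpy as np

rng = np.random.default_rng(10)
n_elements = 10_000
m_strength, s0 = 3.0, 120.0   # Weibull modulus and characteristic strength (MPa)
m_modulus, e0 = 5.0, 60.0     # same form for Young's modulus (GPa)

strength = s0 * rng.weibull(m_strength, n_elements)
modulus = e0 * rng.weibull(m_modulus, n_elements)

# A larger Weibull modulus m means a more homogeneous specimen.
print(f"strength coefficient of variation = {strength.std() / strength.mean():.2f}")
```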

19.
Conventional design methodology for earthquake-resistant structures is based on the concept of ensuring ‘no collapse’ during the most severe earthquake event. This methodology does not envisage the possibility of continuous damage accumulation over several not-so-severe earthquake events, as may be the case in areas of moderate to high seismicity, particularly when it is economically infeasible to carry out repairs after damaging events. As a result, the structure may collapse, or may necessitate large-scale repairs, well before its design life is over. This study considers the use of the design force ratio (DFR) spectrum for taking an informed decision on the extent to which yield strength levels should be raised to avoid such a scenario. The DFR spectrum gives the ratios by which the yield strength levels of single-degree-of-freedom oscillators of different initial periods should be increased in order to limit the total damage caused by all earthquake events during the lifetime to a specified level. The DFR spectra are compared for three seismicity models in the case of elasto-plastic oscillators: one corresponding to the exponential distribution for return periods of large events, and the other two corresponding to the lognormal and Weibull distributions. A numerical study for a hypothetical seismic region shows that the simple exponential model may be acceptable only for small values of the seismic gap length. For moderately large to large seismic gap lengths it may be conservative to use the lognormal model, while the Weibull model may be assumed for very large seismic gap lengths.

20.
Understanding the vertical as well as the horizontal behaviour of wind speed is of great importance in many applications, such as aviation, meteorology and wind energy conversion. In this work, we propose to apply principal component analysis (PCA) to extract the probable components of wind speed. The idea behind the use of PCA is to feed mixed source signals to the PCA algorithm as input and obtain separated patterns as output. Hence, values of wind speed measured at three levels above the ground are used as three separate sensors in order to extract the horizontal and vertical components of wind speed. Once the principal components of wind speed are separated, a process of recognition and identification is undertaken via inspection of the statistical as well as the cyclical behaviour of the obtained components. To examine the statistical properties, we compare the probability density of the extracted components with the Weibull distribution, which is commonly used to fit wind speed distributions. The spectral behaviour of the obtained patterns is examined using time–frequency analysis rather than traditional Fourier analysis, since time–frequency analysis serves the purpose of following the diurnal and seasonal variation of the wind speed spectrum. As a result, it has been found that the horizontal wind speed component fits the Weibull distribution and is characterized by synoptic and intra-seasonal oscillations, while the vertical component is better fitted by the extreme value distribution. It has also been found that the diurnal oscillations are the most significant oscillations in the vertical component, especially in summer.
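The separation step can be sketched with an SVD-based PCA of the centred three-level data matrix, followed by a loose Weibull comparison for the leading component. All series below are synthetic, and the shift before fitting is a simplification, since PCA components are centred:

```python
# Hedged sketch: PCA via SVD on wind speeds at three measurement heights,
# then a Weibull fit to the (shifted) leading component.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
base = rng.weibull(2.0, 5000) * 7.0            # shared horizontal signal (m/s)
X = np.column_stack([base * g + rng.normal(0, 0.4, 5000)
                     for g in (0.8, 1.0, 1.2)])  # three measurement levels

Xc = X - X.mean(axis=0)                        # centre each sensor's series
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                               # leading principal component

# Components are centred, so shift to positive support before the Weibull fit
# (a free location parameter would be an alternative).
shifted = pc1 - pc1.min() + 1e-6
k, loc, lam = stats.weibull_min.fit(shifted, floc=0)
print(f"PC1 Weibull fit: shape k = {k:.2f}, scale = {lam:.2f}")
```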
