Similar Literature

20 similar documents found.
1.
Estimation of confidence limits and intervals for the two- and three-parameter Weibull distributions is presented based on the method of moments (MOM), probability weighted moments (PWM), and maximum likelihood (ML). The asymptotic variances of the MOM, PWM, and ML quantile estimators are derived as functions of the sample size, return period, and parameters. Such variances can be used for estimating the confidence limits and confidence intervals of the population quantiles. Except for the two-parameter Weibull model, the formulas obtained do not have simple forms but can be evaluated numerically. Simulation experiments were performed to verify the applicability of the derived confidence intervals of quantiles. The results show that, overall, the ML method for estimating the confidence limits performs better than the other two methods in terms of bias and mean square error. This is especially so for γ ≥ 0.5, even for small sample sizes (e.g., N = 10). However, the drawback of the ML method for determining the confidence limits is that it requires the shape parameter to be greater than 2. The Weibull model based on the MOM, ML, and PWM estimation methods was applied to fit the distribution of annual 7-day low flows and annual maximum 6-h rainfall data. The results showed that the differences among the quantiles estimated by the three methods are not large, generally less than 10%. However, the differences among the confidence limits and confidence intervals obtained by the three estimation methods may be more significant. For instance, for the 7-day low flows the ratio of the estimated confidence interval to the estimated quantile based on ML is about 17% for T ≥ 2, while it is about 30% for estimation based on the MOM and PWM methods. In addition, the analysis of the rainfall data using the three-parameter Weibull showed that while the ML parameters could be estimated, the corresponding confidence limits and intervals could not be found because the shape parameter was smaller than 2.
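As an illustration of the quantile confidence interval problem this abstract describes, the sketch below fits a three-parameter Weibull by ML with scipy and brackets a T-year quantile. The paper's analytical asymptotic variances are not reproduced; a parametric bootstrap stands in for them, and all data are synthetic.

```python
# Hedged sketch (synthetic data): ML fit of a three-parameter Weibull and a
# confidence interval for a T-year quantile. The paper derives asymptotic
# variances analytically; a parametric bootstrap stands in for them here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = stats.weibull_min.rvs(2.5, loc=0.0, scale=100.0, size=50, random_state=rng)

T = 50                    # return period in years
p = 1.0 - 1.0 / T         # non-exceedance probability for annual extremes

c, loc, scale = stats.weibull_min.fit(data)           # ML estimates
q_hat = stats.weibull_min.ppf(p, c, loc=loc, scale=scale)

boot = []
for _ in range(1000):     # refit on samples drawn from the fitted model
    s = stats.weibull_min.rvs(c, loc=loc, scale=scale, size=len(data), random_state=rng)
    cb, lb, sb = stats.weibull_min.fit(s)
    boot.append(stats.weibull_min.ppf(p, cb, loc=lb, scale=sb))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"{T}-yr quantile {q_hat:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```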

2.
A detailed analysis of the data on the intensity of the geomagnetic dipole and the frequency of its reversals presented in the world paleointensity databases provides arguments in favor of the hypothesis of a negative correlation between the average virtual dipole moment (VDM) and the frequency of reversals on the interval from 5 to 100 Ma ago. However, the statistical confidence level of this hypothesis is only 60–70%, far below the standard 95% level required for a hypothesis to be considered statistically reliable. At a high level of confidence (above 99%), the presence of a positive correlation between the mean value and the variance of the VDM is confirmed for a number of intervals of stable polarity in the Cenozoic and Mesozoic. This finding means that the distribution of the VDM on these time intervals is certainly non-Gaussian and is better described by a gamma or lognormal law. At the same time, in contrast to the earlier intervals, the histogram of the VDM for the Brunhes epoch is closer to the normal distribution. Comparing our conclusions with published results on numerical modeling of the geodynamo, we find that they are consistent in terms of a probable negative correlation between the average VDM and the reversal frequency, as well as the lack of correlation between the average VDM and the length of the interval of stable polarity.
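The distribution comparison mentioned (gamma or lognormal versus Gaussian) can be sketched by fitting each candidate law by ML and ranking them with AIC; the snippet below does this on synthetic VDM-like values, not real paleointensity data.

```python
# Illustrative only: rank normal, gamma and lognormal fits to VDM-like values
# by AIC. Values are synthetic, in units of 1e22 A m^2, not real paleodata.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
vdm = stats.gamma.rvs(4.0, scale=2.0, size=200, random_state=rng)

for name, dist in [("normal", stats.norm), ("gamma", stats.gamma),
                   ("lognormal", stats.lognorm)]:
    params = dist.fit(vdm)                       # ML estimates
    loglik = np.sum(dist.logpdf(vdm, *params))   # log-likelihood at the fit
    aic = 2 * len(params) - 2 * loglik
    print(f"{name:9s} AIC = {aic:.1f}")
```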

3.
The estimation of the 100-year flood, or more generally the T-year flood, is a basic problem in hydrology. An important source of uncertainty in this estimate is the uncertain estimation of the parameters of the flood distribution. This uncertainty can have a significant effect on the flood design value, and its quantification is an important aspect of evaluating the risk involved in a chosen level of flood protection. In this paper, simulation is used to determine confidence intervals for the flood design value. The simulation allows verification of Stedinger's formula not only as it applies to confidence intervals but also as an approximation to percentiles.
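A minimal sketch of the simulation idea, assuming a Gumbel flood parent (the abstract does not name the distribution): sample from a known model, re-estimate its parameters each time, and take the spread of the resulting T-year design values.

```python
# Minimal sketch of the simulation idea, assuming a Gumbel parent (the
# abstract does not name a distribution): sample, re-estimate, and read off
# the spread of the estimated T-year design value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
T, n, reps = 100, 40, 2000
p = 1.0 - 1.0 / T

estimates = []
for _ in range(reps):
    sample = stats.gumbel_r.rvs(loc=1000.0, scale=300.0, size=n, random_state=rng)
    loc_hat, scale_hat = stats.gumbel_r.fit(sample)   # ML re-estimation
    estimates.append(stats.gumbel_r.ppf(p, loc=loc_hat, scale=scale_hat))

lo, hi = np.percentile(estimates, [5, 95])
print(f"90% interval for the estimated {T}-yr flood: ({lo:.0f}, {hi:.0f})")
```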

4.
Flood frequency analysis for extreme floods and methods for estimating its uncertainty are popular subjects in hydrological research. In this study, uncertainties in extreme flood estimates for the upper Yangtze River were investigated using the delta and profile likelihood function (PLF) methods, which were used to calculate confidence intervals for the key parameters of the generalized extreme value distribution and for the quantiles of extreme floods. Datasets of annual maximum daily flood discharge (AMDFD) from six hydrological stations located on the main stream and tributaries of the upper Yangtze River were selected. The results showed that the AMDFD data from the six stations follow the Weibull distribution, which has a short tail and is bounded above. Of the six stations, the narrowest confidence interval was found at the Yichang station and the widest at the Cuntan station. The results also show that record length and return period are the two key factors affecting the confidence interval: the width of the confidence intervals decreased with increasing record length, because more information was available, while the width increased with increasing return period. In addition, the confidence intervals of design floods were similar for both methods at short return periods. However, there was a comparatively large difference between the two methods at long return periods, because the asymmetry of the PLF curve increases with the return period. This asymmetry means the PLF method better reflects the uncertainty of the design flood, suggesting that it is more suitable for uncertainty analysis in extreme flood estimation for the upper Yangtze River Basin.
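A profile-likelihood interval for a GEV return level can be sketched as follows: reparametrise the location so the return level z is an explicit parameter, minimise the negative log-likelihood over the remaining parameters for each candidate z, and cut the profile at half a chi-square(1) quantile. The data and starting values below are invented, not Yangtze discharges.

```python
# Hedged sketch of a profile-likelihood CI for a GEV return level: make the
# return level z an explicit parameter via the location, profile out the rest,
# and cut at half a chi-square(1) quantile. Data and guesses are invented.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
data = stats.genextreme.rvs(0.1, loc=30000.0, scale=8000.0, size=60, random_state=rng)
T = 100
p = 1.0 - 1.0 / T

def nll_given_z(theta, z):
    c, scale = theta
    if scale <= 0 or abs(c) < 1e-6:
        return np.inf
    # scipy genextreme: ppf(p) = loc + scale*(1 - (-log p)**c)/c for c != 0
    loc = z - scale * (1.0 - (-np.log(p)) ** c) / c
    ll = stats.genextreme.logpdf(data, c, loc=loc, scale=scale)
    return -ll.sum() if np.all(np.isfinite(ll)) else np.inf

def profile_nll(z):
    res = optimize.minimize(nll_given_z, x0=[0.1, data.std()], args=(z,),
                            method="Nelder-Mead")
    return res.fun

c0, loc0, s0 = stats.genextreme.fit(data)
z_hat = stats.genextreme.ppf(p, c0, loc=loc0, scale=s0)
cut = profile_nll(z_hat) + 0.5 * stats.chi2.ppf(0.95, df=1)

zs = np.linspace(0.7 * z_hat, 1.8 * z_hat, 80)
inside = [z for z in zs if profile_nll(z) <= cut]
print(f"z_hat = {z_hat:.0f}, 95% profile CI = ({min(inside):.0f}, {max(inside):.0f})")
```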

5.
The application of nonparametric statistical methods to the estimation of some characteristics of the seismicity regime is considered. Emphasis is placed on the behavior of the distribution tail (i.e., the distribution of the strongest events). The methods described do not use any assumptions concerning the distribution function. Confidence intervals are derived for the magnitude distribution function and for the Poisson intensity of the flow of seismic events. The probability that a previously recorded maximum magnitude will be exceeded during some future time interval T, and the confidence interval of this probability, are estimated. The distribution of the time to the nearest event exceeding the last maximum (i.e., to the nearest record) is derived. The nonparametric approach is most effective if the type of parameterization of the empirical data is unknown or there are grounds for doubting its adequacy.
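In the distribution-free spirit of this abstract, a confidence band for the magnitude distribution can be built with the Dvoretzky-Kiefer-Wolfowitz inequality, and an exact chi-square interval gives the Poisson intensity; the catalog below is synthetic.

```python
# Distribution-free sketch: a DKW confidence band for the magnitude CDF and an
# exact chi-square interval for the Poisson event rate. Catalog is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mags = 4.0 + stats.expon.rvs(scale=1 / 2.3, size=150, random_state=rng)
alpha = 0.05
n = len(mags)

# Dvoretzky-Kiefer-Wolfowitz: F_n(m) +/- eps covers F with prob >= 1 - alpha.
eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
grid = np.sort(mags)
F_n = np.arange(1, n + 1) / n
lo_band, hi_band = np.clip(F_n - eps, 0, 1), np.clip(F_n + eps, 0, 1)
i = n // 2
print(f"F({grid[i]:.2f}) in [{lo_band[i]:.2f}, {hi_band[i]:.2f}] with 95% confidence")

# Exact CI for the Poisson intensity given k events observed in span_years.
k, span_years = n, 30.0
lam_lo = stats.chi2.ppf(alpha / 2, 2 * k) / (2 * span_years)
lam_hi = stats.chi2.ppf(1 - alpha / 2, 2 * k + 2) / (2 * span_years)
print(f"intensity CI: ({lam_lo:.2f}, {lam_hi:.2f}) events/yr")
```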

6.
The theory of multivariate statistical processing of the anisotropy of magnetic susceptibility, measured on a group of specimens originating from a single geological body (outcrop, locality, etc.), is described. The result of the processing is an estimate of the mean normalized tensor and the estimates of the principal susceptibilities derived from it, together with the respective confidence intervals, and the estimates of the principal directions with the respective confidence regions. An anisotropy test for a group of specimens is proposed. The function of the ANS21 computer program employed is briefly described and an example of its output plot is presented. (Dedicated to RNDr. Jan Pícha, CSc., on his 60th birthday.)

7.
The fact that the dependent variables of groundwater models are generally nonlinear functions of the model parameters is shown to be a potentially significant factor in calculating accurate confidence intervals, both for model parameters and for functions of the parameters, such as the values of dependent variables calculated by the model. The Lagrangian method of Vecchia and Cooley [Vecchia, A.V. & Cooley, R.L., Water Resources Research, 1987, 23(7), 1237–1250] was used to calculate nonlinear Scheffé-type confidence intervals for the parameters and the simulated heads of a steady-state groundwater flow model covering 450 km2 of a leaky aquifer. The nonlinear confidence intervals are compared with the corresponding linear intervals. As suggested by the significant nonlinearity of the regression model, the linear confidence intervals are often not accurate. The commonly made assumption that the widths of linear confidence intervals always underestimate the actual (nonlinear) widths proved incorrect: nonlinear effects can cause the nonlinear intervals to be asymmetric and either larger or smaller than the linear approximations. Prior information on transmissivities helps reduce the size of the confidence intervals, with the most notable effects occurring for the parameters on which there is prior information and for head values in parameter zones for which there is prior information on the parameters.

8.
Asymptotic properties of the maximum likelihood parameter and quantile estimators of the two-parameter kappa distribution are studied. Eight methods for obtaining large-sample confidence intervals for the shape parameter and for quantiles of this distribution are proposed and compared using Monte Carlo simulation. The best method is highlighted on the basis of the coverage probability of the confidence intervals it produces for sample sizes commonly found in practice. For such sample sizes, confidence intervals for quantiles and for the shape parameter are shown to be more accurate if the quantile estimators are assumed to be lognormally distributed rather than normally distributed (and likewise for the shape parameter estimator). Also, confidence intervals based on the observed Fisher information matrix perform slightly better than those based on the expected value of this matrix. A hydrological example is provided in which the theoretical results obtained are applied.
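The normal-versus-lognormal contrast for a quantile estimator reduces to a simple recipe: apply the delta method on the log of the estimate and exponentiate back. The sketch below uses placeholder numbers, not actual kappa-distribution output.

```python
# Generic recipe behind the normal vs lognormal contrast: delta method on
# log(q_hat), then exponentiate back. Numbers are placeholders, not actual
# kappa-distribution output; se would come from the Fisher information.
import numpy as np
from scipy import stats

q_hat, se, alpha = 250.0, 40.0, 0.05
z = stats.norm.ppf(1 - alpha / 2)

normal_ci = (q_hat - z * se, q_hat + z * se)
# sd(log q_hat) ~ se / q_hat by the delta method
log_ci = (q_hat * np.exp(-z * se / q_hat), q_hat * np.exp(z * se / q_hat))

print(f"normal:    ({normal_ci[0]:.1f}, {normal_ci[1]:.1f})")
print(f"lognormal: ({log_ci[0]:.1f}, {log_ci[1]:.1f})")   # asymmetric, positive
```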

9.
In many branches of science, techniques designed for use in one context are used in other contexts, often in the belief that results which hold in the former will also hold or be relevant in the latter. Practical limitations are frequently overlooked or ignored. Three techniques used in seismic data analysis are often misused or their limitations poorly understood: (1) maximum entropy spectral analysis; (2) the role of goodness-of-fit and the real meaning of a wavelet estimate; (3) the use of multiple confidence intervals. It is demonstrated that, in practice, maximum entropy spectral estimates depend on a data-dependent smoothing window with unpleasant properties, which can result in poor spectral estimates for seismic data. Secondly, it is pointed out that the level of smoothing needed to give the least error in a wavelet estimate will not give the best goodness-of-fit between the seismic trace and the wavelet estimate convolved with the broadband synthetic. Even if the smoothing used corresponds to near-minimum errors in the wavelet, the actual noise realization on the seismic data can cause important perturbations in residual wavelets following wavelet deconvolution. Finally, the computation of multiple confidence intervals (e.g., at several spatial positions) is considered. Suppose a nominal, say 90%, confidence interval is calculated at each location. The confidence attached to the simultaneous use of these intervals is not then 90%. Methods exist for working out suitable confidence levels; this is illustrated using porosity maps computed by conditional simulation.
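The multiple-interval point can be made concrete with a Bonferroni or Šidák correction: the snippet below shows how far the joint coverage of several nominal 90% intervals falls, and what per-interval level would restore a 90% joint guarantee (Šidák is exact only under independence).

```python
# Sketch of the simultaneous-coverage point: k nominal 90% intervals used
# together do not carry 90% joint confidence. Bonferroni and Sidak give the
# per-interval level needed for a joint 90% guarantee.
k = 25                                   # e.g. intervals at 25 map locations
joint = 0.90

naive_joint = joint ** k                 # joint coverage if independent
per_bonferroni = 1 - (1 - joint) / k
per_sidak = joint ** (1.0 / k)           # exact under independence

print(f"joint coverage of {k} naive 90% intervals: {naive_joint:.3f}")
print(f"per-interval level needed (Bonferroni):    {per_bonferroni:.4f}")
print(f"per-interval level needed (Sidak):         {per_sidak:.4f}")
```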

10.
Lichenometry is a dating technique whose underlying assumptions have been questioned. The development of a size-frequency approach, previously used in attempts to resolve some of these problems, is described and applied to the dating of four debris flows marginal to the San Rafael Glacier in southern Chile. This study provides examples of the approach's application, its problems, and directions for further work. The size-frequency approach, based on new assumptions, uses parameters derived from population size-frequency distributions of the lichen species Placopsis patagonica to provide relative and absolute dating for rock surfaces. Changes in the shapes of the distributions suggest the relative age of populations. Absolute dating is based on a curve (spanning a 24-year period) derived from mean diameter size/age correlations. A stratified random sampling design permits the use of inferential statistics. Standard deviations and confidence intervals show error margins, the degree of relatedness between neighbouring populations, and populations that are anomalous. One-way analysis of variance is used to indicate where populations may safely be grouped. The size-frequency approach appears to be particularly suitable for use on unstable debris flows where secondary movements are common. The approach also demonstrates that lichen growth and colonization are sensitive to differences in aspect and other variations in microhabitat.
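The two inferential steps named, t-type confidence intervals per population and one-way ANOVA for grouping, might look like the following sketch; all diameters are invented.

```python
# Invented diameters, illustrating the two inferential steps named: a t-based
# CI for the mean diameter of each lichen population, and one-way ANOVA to
# check whether populations may be grouped.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
pops = [rng.normal(mu, 3.0, size=30) for mu in (18.0, 19.0, 25.0)]  # mm

for i, d in enumerate(pops, 1):
    m, se = d.mean(), stats.sem(d)
    lo, hi = stats.t.interval(0.95, len(d) - 1, loc=m, scale=se)
    print(f"population {i}: mean {m:.1f} mm, 95% CI ({lo:.1f}, {hi:.1f})")

F, p = stats.f_oneway(*pops)
print(f"one-way ANOVA: F = {F:.1f}, p = {p:.3g}")
```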

11.
Although recent water resources management practice uses bivariate distribution functions to assess drought severity and frequency, the lack of systematic measurements is the major hindrance to achieving quantitative results. This study suggests a statistical scheme for bivariate drought frequency analysis that provides comprehensive and consistent drought severities from observed rainfalls and quantifies their uncertainty using synthesized rainfalls. First, a multivariate regression model was developed to generate synthetic monthly rainfalls from climate variables as causative variables; the causative variables were generated so as to preserve their correlations using copula functions. The study then focused on constructing bivariate drought frequency curves using bivariate kernel functions and estimating their confidence intervals from 1,000 replica sets of drought frequency curves. The confidence intervals achieved in this study may be useful for making a comprehensive drought management plan by providing feasible ranges of drought severity.
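The copula step can be sketched with a Gaussian copula: draw correlated normals, map them to uniforms, and push them through arbitrary marginals. The covariates, marginals, and correlation below are assumptions for illustration, not the paper's actual regression inputs.

```python
# Gaussian-copula sketch of the covariate-generation step: correlated normals
# -> uniforms -> arbitrary marginals. Covariates, marginals and the 0.6
# correlation are assumptions, not the paper's regression inputs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=1000)
u = stats.norm.cdf(z)
sst   = stats.norm.ppf(u[:, 0], loc=27.0, scale=0.8)   # e.g. sea-surface temp
index = stats.gamma.ppf(u[:, 1], 2.0, scale=1.5)       # e.g. circulation index

print(f"sample correlation preserved: {np.corrcoef(sst, index)[0, 1]:.2f}")
```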

12.
For some diseases it is well known that a unimodal mortality pattern exists. A clear example in developed countries is breast cancer, where mortality increased sharply until the nineties and then decreased. This clear unimodal pattern does not necessarily apply to all regions within a country. In this paper, we develop statistical tools to check whether the unimodal pattern persists within regions, using order-restricted inference. Break points as well as confidence intervals are also provided. In addition, a new test for checking monotonicity against unimodality is derived, allowing one to discriminate between a simple increasing pattern and an up-then-down response pattern. A comparison with the widely used joinpoint regression technique under unimodality is provided; we show that the joinpoint technique can fail when the underlying function is not piecewise linear. The results are illustrated using age-specific breast cancer mortality data from Spain for the period 1975–2005.

13.
Hack's law was originally derived from basin statistics for varied spatial scales and regions. The exponent value of the law has been shown to vary between 0.47 and 0.70, causing uncertainty in its application. This paper focuses on the emergence of Hack's law from debris-flow basins in China. Over 5,000 debris-flow basins in different regions of China with drainage areas less than 100 km2 are included in this study. Basins in the different regions are found to present similar distributions. Hack's law is derived from maximum probability and conditional distributions, suggesting that the law should describe some critical state of basin evolution. Results suggest the exponent value is approximately 0.5. Further analysis indicates that Hack's law is related to other scaling laws underlying the evolution of a basin and that the exponent depends not on basin shape but rather on the evolutionary stage. A case study of a well-known debris-flow basin further confirms Hack's law and its implications for basin evolution.
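A worked sketch of estimating the Hack exponent h in L = c A^h by least squares on log-transformed areas and stream lengths; the basins below are synthetic, generated with h = 0.5 in line with the abstract's result.

```python
# Worked sketch: estimate the Hack exponent h in L = c * A**h by least squares
# on log-transformed values. Basins are synthetic, generated with h = 0.5.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
area = 10 ** rng.uniform(-1, 2, size=500)                        # km^2, < 100
length = 1.8 * area ** 0.5 * rng.lognormal(0.0, 0.15, size=500)  # km, noisy

fit = stats.linregress(np.log(area), np.log(length))
print(f"h = {fit.slope:.3f} +/- {1.96 * fit.stderr:.3f}, c = {np.exp(fit.intercept):.2f}")
```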

14.
A method is described for measuring the magnetic viscosity of rocks, considered in its three forms: induced magnetization viscosity, remanent magnetization viscosity, and viscous remanent magnetization (VRM). An application is then presented in the form of an experimental verification of the E. and O. Thellier test for approximate elimination of the viscous magnetization effect in rocks. It is shown that the use of this test in the conventional manner is fully justified, although it does not allow measurement of the VRM, the values obtained being seriously underestimated.

15.
Hack's law was originally derived from basin statistics for varied spatial scales and regions. The exponent value of the law has been shown to vary between 0.47 and 0.70, causing uncertainty in its application. This paper focuses on the emergence of Hack's law from debris-flow basins in China. Over 5,000 debris-flow basins in different regions of China with drainage areas less than 100 km2 are included in this study. Basins in the different regions are found to present similar distributions. Hack's law is derived from maximum probability and conditional distributions, suggesting that the law should describe some critical state of basin evolution. Results suggest the exponent value is approximately 0.5. Further analysis indicates that Hack's law is related to other scaling laws underlying the evolution of a basin and that the exponent depends not on basin shape but rather on the evolutionary stage. A case study of a well-known debris-flow basin further confirms Hack's law and its implications for basin evolution.

16.
Spatio-temporal statistical models have been proposed for analyzing the temporal evolution of the geographical pattern of mortality (or incidence) risks in disease mapping. However, as far as we know, functional approaches based on Hilbert-valued processes have not been used in this area so far. In this paper, the autoregressive Hilbertian process framework is adopted to estimate the functional temporal evolution of mortality relative-risk maps. Specifically, penalized functional estimation of log-relative-risk maps is considered to smooth the classical standardized mortality ratio, with the reproducing kernel Hilbert space (RKHS) norm selected to define the penalty term. This RKHS-based approach is combined with the Kalman filtering algorithm for the spatio-temporal estimation of risk. Functional confidence intervals are also derived for detecting high-risk areas. The proposed methodology is illustrated by analyzing breast cancer mortality data in the provinces of Spain during the period 1975–2005. A simulation study is performed to compare the ARH(1)-based estimation with the classical spatio-temporal conditional autoregressive approach.

17.
18.
Providing an accurate estimate of the magnetic field on the Earth's surface at a location distant from an observatory has useful scientific and commercial applications, such as repeat-station data reduction, space weather nowcasting, and aeromagnetic surveying. While the correlation of measurements between nearby magnetic observatories at low and mid-latitudes is good, at high geomagnetic latitudes the external field differences between observatories increase rapidly with distance, even during relatively low magnetic activity. It is therefore of interest to describe how the differences (or errors) in extrapolating the external magnetic field from a single observatory grow with distance from its location. These differences are modulated by local-time, seasonal, and solar-cycle variations, as well as geomagnetic activity, giving a complex temporal and spatial relationship. A straightforward way to describe the differences is via confidence intervals for the extrapolated values as a function of distance. To compute the confidence intervals associated with extrapolation of the external field at varying distances from an observatory, we used 695 station-years of overlapping minute-mean data from 37 observatories and variometers at high latitudes, from which we removed the main and crustal fields to isolate unmodelled signals. From this data set, the pairwise differences were analysed to quantify the variation over a range of time epochs and separation distances. We estimate the 68.3%, 95.4%, and 99.7% confidence levels (equivalent to the 1σ, 2σ, and 3σ Gaussian error bounds) from these differences for all components. We find that there is always a small non-zero bias, which we ascribe to instrumentation and local crustal-field induction effects. The computed confidence intervals are typically twice as large in the north-south direction as in the east-west direction, and smaller during the solstice months than during the equinoxes.
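The empirical procedure described can be approximated as follows: pool pairwise differences, bin them by separation distance, and read off percentile levels matching the 1σ/2σ/3σ Gaussian bounds. Everything below (distances, spreads, the bias) is simulated, not observatory data.

```python
# Simulated stand-in for the empirical procedure: pool pairwise differences,
# bin by separation distance, and read off percentile levels matching the
# 1/2/3-sigma Gaussian bounds. Spreads, bias and distances are invented.
import numpy as np

rng = np.random.default_rng(7)
dist = rng.uniform(50, 2000, size=20000)            # km between station pairs
diff = rng.normal(2.0, 5.0 + 0.02 * dist)           # nT; spread grows with distance

edges = np.arange(0, 2001, 500)
for lo, hi in zip(edges[:-1], edges[1:]):
    d = diff[(dist >= lo) & (dist < hi)]
    b68, b95, b99 = (np.percentile(np.abs(d - np.median(d)), q)
                     for q in (68.3, 95.4, 99.7))
    print(f"{lo:4d}-{hi:4d} km: 68.3% {b68:.1f} nT, 95.4% {b95:.1f} nT, 99.7% {b99:.1f} nT")
```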

19.
One of the crucial components in seismic hazard analysis is the estimation of the maximum earthquake magnitude and its associated uncertainty. In the present study, the uncertainty related to the maximum expected magnitude μ is determined in terms of confidence intervals for an imposed level of confidence. Previous work by Salamat et al. (Pure Appl Geophys 174:763–777, 2017) shows the divergence of the confidence interval of the maximum possible magnitude m_max for high levels of confidence in six seismotectonic zones of Iran. In this work, the maximum expected earthquake magnitude μ is calculated for a predefined finite time interval and an imposed level of confidence. For this, we use a conceptual model based on a doubly truncated Gutenberg-Richter law for magnitudes with constant b-value and calculate the posterior distribution of μ for a future time interval T_f. We assume a stationary Poisson process in time and a Gutenberg-Richter relation for magnitudes. The upper bound of the magnitude confidence interval is calculated for time intervals of 30, 50, and 100 years and imposed levels α = 0.5, 0.1, 0.05, and 0.01. The posterior distribution of the waiting time T_f to the next earthquake with magnitude equal to 6.5, 7.0, or 7.5 is calculated in each zone. In order to find the influence of declustering, we use both the original and the declustered version of the catalog. The earthquake catalog of the territory of Iran and its surroundings is subdivided into six seismotectonic zones: Alborz, Azerbaijan, Central Iran, Zagros, Kopet Dagh, and Makran. We assume the maximum possible magnitude m_max = 8.5 and calculate the upper bound of the confidence interval of μ in each zone. The results indicate that for short time intervals of 30 and 50 years and imposed confidence levels 1 − α = 0.95 and 0.90, the probability distribution of μ lies around μ = 7.16–8.23 in all seismic zones.
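The probabilistic building blocks are standard enough to sketch: a doubly truncated Gutenberg-Richter CDF with fixed b, a Poisson flow of rate λ, and the induced distribution of the maximum magnitude in a future window T_f. The rate, b-value, and bounds below are invented, not the paper's zone-specific estimates.

```python
# Standard building blocks, with invented rate, b-value and bounds: a doubly
# truncated Gutenberg-Richter CDF, a Poisson flow, and the induced distribution
# of the maximum magnitude in a future window T_f.
import numpy as np

b, m_min, m_max = 1.0, 4.5, 8.5
lam, T_f = 3.0, 50.0            # events/yr above m_min; window length in years

def gr_cdf(m):
    """Doubly truncated Gutenberg-Richter CDF on [m_min, m_max]."""
    num = 1.0 - 10.0 ** (-b * (m - m_min))
    den = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return np.clip(num / den, 0.0, 1.0)

def cdf_max(m):
    """P(maximum magnitude in T_f <= m) under a Poisson flow of rate lam."""
    return np.exp(-lam * T_f * (1.0 - gr_cdf(m)))

alpha = 0.05
grid = np.linspace(m_min, m_max, 4001)
mu_upper = grid[np.searchsorted(cdf_max(grid), 1 - alpha)]
print(f"95% upper bound on the maximum magnitude in {T_f:.0f} yr: {mu_upper:.2f}")
```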

20.
To assess whether changes in the frequency of heavy rainfall events are occurring over time, annual maximum records from 21 rain gauges in Ontario are examined using frequency analysis methods. Relative RMSE and related boxplots are used to assess candidate distributions; the Gumbel distribution is verified as one of the most suitable for providing accurate quantile estimates. The records were divided into two time periods and tested using the Mann-Kendall test and lag-1 autocorrelations to ensure that the data in each period are identically distributed. The confidence intervals of design rainfalls for each return period (2, 5, 10, and 25 years) are derived using a resampling method and compared at the 90% confidence level. Changes in heavy rainfall intensities are tested at gauges across the Province of Ontario. Several significant decreases in heavy rainfall intensity are identified in central and southern Ontario, while increases are identified at the Sioux Lookout and Belleville gauges. A sensitivity analysis of the identified changes with respect to the year of splitting indicates that the changes occurred during the 1980s and 1990s.
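The resampling step might be implemented as below: bootstrap the annual maxima, refit a Gumbel model each time, and take percentile intervals of the design rainfall per return period. Kendall's tau against time stands in for the Mann-Kendall test; the record is synthetic.

```python
# Sketch of the resampling step (synthetic record): bootstrap annual maxima,
# refit a Gumbel model each time, and take percentile CIs of the design
# rainfall. Kendall's tau against time stands in for the Mann-Kendall test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
annual_max = stats.gumbel_r.rvs(loc=40.0, scale=12.0, size=45, random_state=rng)

tau, p = stats.kendalltau(np.arange(len(annual_max)), annual_max)
print(f"trend check: tau = {tau:.2f}, p = {p:.2f}")

for T in (2, 5, 10, 25):
    q = 1.0 - 1.0 / T
    boot = []
    for _ in range(1000):
        s = rng.choice(annual_max, size=len(annual_max), replace=True)
        loc, scale = stats.gumbel_r.fit(s)
        boot.append(stats.gumbel_r.ppf(q, loc=loc, scale=scale))
    lo, hi = np.percentile(boot, [5, 95])
    print(f"{T:2d}-yr design rainfall 90% CI: ({lo:.1f}, {hi:.1f}) mm")
```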
