Similar Articles
20 similar articles found (search time: 10 ms)
1.
This paper extends Nair's exact table of 95% and 99% confidence intervals for the median to data sets containing up to 300 observations.
It also provides an approximate relationship, especially useful when the number of observations is large, that allows confidence intervals to be calculated quickly, with errors that are rare and tolerable when they do occur.
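A minimal sketch (assuming Python with NumPy/SciPy; the paper itself provides tables, not code) of the exact order-statistic confidence interval for the median, together with the kind of large-sample rank approximation the abstract describes:

```python
import numpy as np
from scipy import stats

def median_ci_exact(x, conf=0.95):
    """Distribution-free CI [x_(l), x_(u)]; coverage is P(l <= B <= u-1), B ~ Bin(n, 1/2)."""
    x = np.sort(np.asarray(x))
    n = len(x)
    alpha = 1.0 - conf
    # largest order-statistic index l with P(B <= l-1) <= alpha/2
    l = max(int(stats.binom.ppf(alpha / 2, n, 0.5)), 1)
    u = n - l + 1                          # symmetric upper index
    return x[l - 1], x[u - 1]

def median_ci_approx(x, conf=0.95):
    """Large-n approximation: ranks n/2 -+ z*sqrt(n)/2, rounded outward."""
    x = np.sort(np.asarray(x))
    n = len(x)
    z = stats.norm.ppf(0.5 + conf / 2)
    l = max(int(np.floor((n - z * np.sqrt(n)) / 2)), 1)
    u = min(int(np.ceil((n + z * np.sqrt(n)) / 2)) + 1, n)
    return x[l - 1], x[u - 1]

rng = np.random.default_rng(0)
sample = rng.lognormal(size=300)           # n up to 300, as in the extended table
print(median_ci_exact(sample))
print(median_ci_approx(sample))
```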

2.
The variogram is a measure of the local variation in space of a random field. For large geostatistical data sets, the traditional empirical variogram may be hard to compute. This article examines, for processes with a fixed domain, the effect of using a subsample of the available data on the performance of the empirical variogram. The motivation of this work, apart from the savings in computation, is to study how dense the observations need to be in the bounded sampling region to obtain most of the information we would get from continuous observations in the fixed domain.
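A hedged sketch (an illustrative NumPy implementation, not the article's analysis): the classical empirical variogram computed from all points and from a random subsample, to gauge how much information subsampling loses. The toy field and bin choices are assumptions.

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Method-of-moments estimator: gamma(h) = 0.5 * mean[(Z(s) - Z(s'))^2] per lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # count each pair once
    d, g = d[iu], g[iu]
    idx = np.digitize(d, bins)
    return np.array([g[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

rng = np.random.default_rng(1)
n = 1000
coords = rng.uniform(0, 1, size=(n, 2))           # fixed, bounded sampling region
# toy stand-in for a correlated random field: smooth surface plus noise
values = np.sin(3 * coords[:, 0]) + 0.5 * coords[:, 1] + 0.1 * rng.normal(size=n)
bins = np.linspace(0, 0.5, 11)

sub = rng.choice(n, size=250, replace=False)      # a 25% subsample
print(np.round(empirical_variogram(coords, values, bins), 4))
print(np.round(empirical_variogram(coords[sub], values[sub], bins), 4))
```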

3.
Fitting probability distributions to hydrologic data samples is widely used for quantile estimation. The estimated quantile X̂_T is associated with a return period T. Until now, the confidence interval for each estimate has been computed by assuming that the quantile estimator is normally distributed. In this study, it is shown that this normality holds only in the central part of the distribution. The true confidence limits are computed analytically, by defining and integrating the probability density function underlying the confidence interval. Results for a large number of hydrologic samples show that, toward the tail of the distribution, the upper confidence limits are significantly underestimated when determined using the normality approximation for the quantile estimator.
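A minimal sketch (assumptions: Gumbel-distributed annual maxima, SciPy, and a parametric bootstrap as the reference; the paper's analytic integration is not reproduced): comparing the usual normal-approximation interval for a design quantile X̂_T with a bootstrap interval that needs no normality assumption.

```python
import numpy as np
from scipy import stats

T = 100                                    # return period
p = 1 - 1 / T                              # nonexceedance probability
rng = np.random.default_rng(2)
sample = stats.gumbel_r.rvs(loc=100, scale=30, size=60, random_state=rng)

def quantile_T(data):
    loc, scale = stats.gumbel_r.fit(data)
    return stats.gumbel_r.ppf(p, loc, scale)

xT = quantile_T(sample)

# parametric bootstrap of the estimator's sampling distribution
loc, scale = stats.gumbel_r.fit(sample)
boot = np.array([quantile_T(stats.gumbel_r.rvs(loc=loc, scale=scale,
                                               size=len(sample), random_state=rng))
                 for _ in range(2000)])

se = boot.std(ddof=1)
z = stats.norm.ppf(0.975)
print("normal approx:", (round(xT - z * se, 1), round(xT + z * se, 1)))
print("bootstrap    :", tuple(np.round(np.percentile(boot, [2.5, 97.5]), 1)))
```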

4.
Prior to interpretation and further analysis, many datasets must first be separated into regional and residual components. Traditional techniques are either subjective (e.g., graphical methods) or nonrobust (e.g., all least-squares based methods). Bathymetric data, with their broad spectrum, pose serious difficulties to these traditional methods, in particular those based on spectral decomposition. Spatial median filters offer a solution that is robust, objective, and often defines regional components similar to those produced graphically by hand. Characteristics of spatial median filters in general are discussed and a new empirical method is presented for determining the width of the robust median filter that accomplishes an optimal separation of a gridded dataset into its regional and residual components. The method involves tracing the zero-contour of the residual component and evaluating the ratio between the volume enclosed by the surface inside this contour and the contour's area. The filter width giving the highest ratio (or mean amplitude) is called the Optimal Robust Separator (ORS) and is selected as the optimal filter producing the best separation. The technique allows a unique and objective determination of the regional field and enables researchers to undertake reproducible separations of regional and residual components. The ORS method is applied to both synthetic data and bathymetry/topography of the Hawaiian Islands; ways to improve the technique using alternative diagnostic quantities are discussed.
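A hedged sketch of the ORS idea (assumed SciPy, with a simplified diagnostic: the paper traces the residual's zero-contour, whereas here the positive-residual mask serves as a cheap stand-in for the contour's interior). The grid and filter-width range are illustrative.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(3)
x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
regional = -2000 * (x**2 + y**2)                             # broad "swell"
residual_true = 800 * np.exp(-((x - 0.2)**2 + y**2) / 0.01)  # narrow "seamount"
grid = regional + residual_true + 20 * rng.normal(size=x.shape)

best_w, best_score = None, -np.inf
for w in range(3, 41, 2):                                    # odd filter widths
    resid = grid - median_filter(grid, size=w)               # residual = data - regional
    inside = resid > 0                                       # proxy for zero-contour interior
    score = resid[inside].sum() / inside.sum()               # volume / area = mean amplitude
    if score > best_score:
        best_w, best_score = w, score
print("optimal width:", best_w, "mean residual amplitude:", round(best_score, 1))
```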

5.
Mathematical Geosciences - Two important issues characterize the design of bootstrap methods to construct confidence intervals for the correlation between two time series sampled (unevenly or...
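A generic sketch only (the article's schemes for unevenly sampled, serially dependent series are not reproduced): a naive pairs bootstrap confidence interval for a correlation. For dependent or unevenly sampled data this naive scheme understates uncertainty, which is precisely the issue such bootstrap designs address.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)

boot = []
for _ in range(5000):
    idx = rng.integers(0, n, size=n)        # resample index pairs with replacement
    boot.append(np.corrcoef(x[idx], y[idx])[0, 1])

print("r =", round(np.corrcoef(x, y)[0, 1], 3),
      "95% CI:", np.round(np.percentile(boot, [2.5, 97.5]), 3))
```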

6.
The effect of random error on age reliability is investigated with respect to the external-detector method of fission-track dating. Obtaining confidence limits for age is shown to be equivalent to obtaining confidence limits for products and ratios of Poisson expectations. Attention is drawn to the relevant methodology for calculating exact confidence limits, and some simple approximations to these limits are suggested for use when the number of track counts is large.
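A minimal sketch (assumed SciPy; simplified relative to the paper): exact confidence limits for the ratio of two Poisson expectations, as needed for spontaneous/induced track-count ratios. Conditional on the total count, Ns ~ Binomial(N, rho/(1+rho)), so Clopper-Pearson limits for the binomial proportion convert directly into limits for the ratio rho.

```python
from scipy import stats

def poisson_ratio_ci(ns, ni, conf=0.95):
    """Exact CI for rho = lambda_s / lambda_i given Poisson counts ns, ni."""
    a = 1.0 - conf
    p_lo = stats.beta.ppf(a / 2, ns, ni + 1) if ns > 0 else 0.0
    p_hi = stats.beta.ppf(1 - a / 2, ns + 1, ni) if ni > 0 else 1.0
    lo = p_lo / (1 - p_lo)
    hi = p_hi / (1 - p_hi) if p_hi < 1 else float("inf")
    return lo, hi

# e.g. 45 spontaneous and 120 induced tracks (illustrative counts)
print(poisson_ratio_ci(ns=45, ni=120))
```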

7.
Seismic inverse modeling, which transforms appropriately processed geophysical data into the physical properties of the Earth, is an essential process for reservoir characterization. This paper proposes a work flow based on a Markov chain Monte Carlo method consistent with geology, well-logs, seismic data, and rock-physics information. It uses direct sampling as a multiple-point geostatistical method for generating realizations from the prior distribution, and Metropolis sampling with adaptive spatial resampling to perform an approximate sampling from the posterior distribution, conditioned to the geophysical data. Because it can assess important uncertainties, sampling is a more general approach than just finding the most likely model. However, since rejection sampling requires a large number of evaluations for generating the posterior distribution, it is inefficient and not suitable for reservoir modeling. Metropolis sampling is able to perform an equivalent sampling by forming a Markov chain. The iterative spatial resampling algorithm perturbs realizations of a spatially dependent variable, while preserving its spatial structure by conditioning to subset points. However, in most practical applications, when the subset conditioning points are selected at random, it can get stuck for a very long time in a non-optimal local minimum. In this paper it is demonstrated that adaptive subset sampling improves the efficiency of iterative spatial resampling. Depending on the acceptance/rejection criteria, it is possible to obtain a chain of geostatistical realizations aimed at characterizing the posterior distribution with Metropolis sampling. The validity and applicability of the proposed method are illustrated by results for seismic lithofacies inversion on the Stanford VI synthetic test sets.
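A toy sketch of the accept/reject machinery only (assumptions: iid Gaussian prior pixels and a made-up likelihood; the paper uses multiple-point geostatistical priors and seismic forward modeling). The proposal keeps a random subset of cells and redraws the rest from the prior, so the Metropolis acceptance ratio reduces to a likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (20, 20)
d_obs = 0.5                                    # "observed" datum: the field mean

def log_likelihood(m):
    return -0.5 * ((m.mean() - d_obs) / 0.05) ** 2

m = rng.normal(size=shape)                     # initial prior realization
keep_frac = 0.7                                # adaptive in the paper; fixed here
accepted = 0
for it in range(2000):
    keep = rng.random(shape) < keep_frac       # subset of conditioning cells
    prop = np.where(keep, m, rng.normal(size=shape))   # redraw the rest from the prior
    if np.log(rng.random()) < log_likelihood(prop) - log_likelihood(m):
        m, accepted = prop, accepted + 1

print("acceptance rate:", accepted / 2000, "posterior-ish mean:", round(m.mean(), 3))
```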

8.
Fabric shape is often quantified using the three eigenvalues obtained by applying the 'orientation tensor' method to a sample of directions. Several studies have used eigenvalues plotted on fabric shape diagrams to distinguish sedimentary facies or strain histories. However, such studies seldom consider how well the sample eigenvalues represent the true fabric shape. In this paper, we use 'bootstrapping' techniques to define confidence regions for sample eigenvalues and show that sample and population eigenvalues may differ substantially. Confidence regions are often very large for small sample sizes, and we recommend sample sizes of at least 50.
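A minimal sketch (assumed NumPy; illustrative directional data): bootstrap confidence intervals for the eigenvalues of the orientation tensor of a small directional sample, echoing the point that small samples give wide regions.

```python
import numpy as np

def eigenvalues(vecs):
    """Normalized eigenvalues of the orientation tensor T = (1/n) sum v v^T."""
    T = vecs.T @ vecs / len(vecs)
    return np.sort(np.linalg.eigvalsh(T))[::-1]     # S1 >= S2 >= S3, summing to 1

rng = np.random.default_rng(6)
n = 30                                              # deliberately small sample
v = rng.normal(size=(n, 3)) + np.array([0, 0, 2.0])  # crude clustered fabric
v /= np.linalg.norm(v, axis=1, keepdims=True)

boot = np.array([eigenvalues(v[rng.integers(0, n, size=n)]) for _ in range(3000)])
S = eigenvalues(v)
for i, name in enumerate(["S1", "S2", "S3"]):
    lo, hi = np.percentile(boot[:, i], [2.5, 97.5])
    print(f"{name}: {S[i]:.3f}  95% CI [{lo:.3f}, {hi:.3f}]")
```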

9.
Although the Gastwirth median is fairly robust (resistant to the effects of contamination), no appropriate confidence limits appear to have been established for it. It was suspected that its confidence limits would be similar to those of the median. This was borne out in this investigation, which was confined to symmetrical distributions. It is concluded that, for practical purposes, in approximately symmetrical distributions the confidence limits for the median can be taken to approximate those for the Gastwirth median.
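A minimal sketch (assumed NumPy; the bootstrap comparison is illustrative, not the paper's procedure): the Gastwirth estimator, a weighted blend of the 1/3, 1/2, and 2/3 quantiles, with its bootstrap confidence limits placed next to those of the plain median for a symmetric sample.

```python
import numpy as np

def gastwirth(x):
    """Gastwirth estimator: 0.3*Q(1/3) + 0.4*median + 0.3*Q(2/3)."""
    q1, q2, q3 = np.quantile(x, [1/3, 0.5, 2/3])
    return 0.3 * q1 + 0.4 * q2 + 0.3 * q3

rng = np.random.default_rng(7)
x = rng.standard_t(df=3, size=200)          # symmetric, heavy-tailed sample

def boot_ci(stat, data, B=4000):
    reps = [stat(data[rng.integers(0, len(data), len(data))]) for _ in range(B)]
    return np.percentile(reps, [2.5, 97.5])

print("median    CI:", np.round(boot_ci(np.median, x), 3))
print("Gastwirth CI:", np.round(boot_ci(gastwirth, x), 3))
```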

10.
We examine thermoelastic equations of state for silicate perovskite based on data from recent static compression studies. We analyze trade-offs among the fitting parameters and examine the data sets for possible effects of metastability and of Mg-Fe solid solution. Significant differences are found among equations of state based on low-pressure measurements obtained for perovskite outside its stability field. Increasingly consistent results are obtained when higher-pressure data are used, despite differences in the zero-pressure parameters used to describe the equations of state. The results highlight the importance of measuring the thermoelastic properties of perovskite at high pressure, specifically within its stability field, and the potential problems associated with large extrapolations of equations of state in the analysis of seismic data.
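A hedged sketch (the EOS form is a common choice in such studies and all numbers are illustrative, not the paper's fits): fitting a third-order Birch-Murnaghan equation of state to synthetic P-V data exposes the familiar trade-off between K0 and K0' when only a limited pressure range is available.

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan3(V, V0, K0, K0p):
    """Third-order Birch-Murnaghan: P(V) with f = (V0/V)^(2/3)."""
    f = (V0 / V) ** (2.0 / 3.0)
    return 1.5 * K0 * (f ** 3.5 - f ** 2.5) * (1 + 0.75 * (K0p - 4) * (f - 1))

true = (162.3, 260.0, 4.0)                 # V0 [A^3], K0 [GPa], K0' (illustrative)
rng = np.random.default_rng(8)
V = np.linspace(135, 162, 25)              # synthetic compression data
P = birch_murnaghan3(V, *true) + rng.normal(0, 0.5, V.size)

popt, pcov = curve_fit(birch_murnaghan3, V, P, p0=(160, 250, 4))
print("V0, K0, K0' =", np.round(popt, 2))
print("correlation(K0, K0') =",
      round(pcov[1, 2] / np.sqrt(pcov[1, 1] * pcov[2, 2]), 2))
```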

11.
The estimation of risk confidence bounds is an important element of a comprehensive probabilistic risk assessment for a radioactive waste repository. Normal distribution bounds may be used in the asymptotic limit of a very large number of Monte Carlo simulations, but sharp skewness of the risk distribution may severely retard the convergence process. The Tchebycheff bounds are parameter-free and may be applied regardless of distribution, save for the finiteness of variance. These bounds may be generally applicable, but they are invariably very broad. Better parameter-free bounds for mean risk are presented here, based on an inequality originally derived by Guttman.
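A minimal sketch (assumed NumPy/SciPy; the Guttman-type bounds of the paper are not reproduced): normal-theory versus Tchebycheff confidence bounds for a mean risk estimated from skewed Monte Carlo output, showing how broad the parameter-free Tchebycheff bound is by comparison.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
risk = rng.lognormal(mean=-2.0, sigma=1.5, size=500)    # sharply skewed "risk" sample

m, s, n = risk.mean(), risk.std(ddof=1), len(risk)
alpha = 0.05
half_normal = stats.norm.ppf(1 - alpha / 2) * s / np.sqrt(n)
# Tchebycheff: P(|Xbar - mu| >= k*s/sqrt(n)) <= 1/k^2, so k = 1/sqrt(alpha)
half_cheb = s / np.sqrt(n * alpha)
print(f"normal     : {m:.4f} +- {half_normal:.4f}")
print(f"Tchebycheff: {m:.4f} +- {half_cheb:.4f}")
```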

12.
We consider a non-linear extension of Biot’s model for poromechanics, wherein both the fluid flow and the mechanical deformation are allowed to be non-linear. Specifically, we study the case where the volumetric stress and the fluid density are non-linear functions satisfying certain assumptions. We perform an implicit discretization in time (backward Euler) and propose two iterative schemes for solving the non-linear problems arising within each time step: a splitting algorithm extending the undrained-split and fixed-stress methods to non-linear problems, and a monolithic L-scheme. The convergence of both schemes is shown rigorously. Illustrative numerical examples are presented to confirm the applicability of the schemes and validate the theoretical results.
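A toy sketch of the L-scheme idea only (a scalar nonlinearity per node; the paper treats the full nonlinear Biot system). The linearization b(u_new) ≈ b(u) + L(u_new - u) turns each iteration into a linear solve, and for monotone b with b' <= L the iteration is a contraction regardless of the starting guess.

```python
import numpy as np

def l_scheme(b, L, a, f, u0, tol=1e-10, maxit=200):
    """Solve b(u) + a*u = f nodewise via (L + a) u_new = f - b(u) + L*u."""
    u = u0
    for k in range(maxit):
        u_new = (f - b(u) + L * u) / (L + a)
        if np.max(np.abs(u_new - u)) < tol:
            return u_new, k + 1
        u = u_new
    return u, maxit

b = np.arctan                        # monotone nonlinearity with sup b' = 1
f = np.linspace(-3, 3, 11)           # nodal right-hand sides of a toy discretization
u, iters = l_scheme(b, L=1.0, a=1.0, f=f, u0=np.zeros_like(f))
print("iterations:", iters, "max residual:", np.max(np.abs(b(u) + u - f)))
```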

13.
The cumulative distribution function (CDF) of the magnitude of seismic events is one of the most important probabilistic characteristics in Probabilistic Seismic Hazard Analysis (PSHA). The magnitude distribution of mining-induced seismicity is complex and is therefore estimated using kernel nonparametric estimators. Because of its model-free character, however, the nonparametric approach cannot provide confidence interval estimates for the CDF using the classical methods of mathematical statistics. To assess errors in the estimation of seismic event magnitudes, and thereby in the evaluation of seismic hazard parameters in the nonparametric approach, we propose the use of resampling methods. Resampling techniques applied to a dataset provide many replicas of the sample that preserve its probabilistic properties. To estimate the confidence intervals for the CDF of magnitude, we have developed an algorithm based on the bias-corrected and accelerated (BCa) method. This procedure uses the smoothed bootstrap and second-order bootstrap samples; we refer to it as the iterated BCa method. The algorithm's performance is illustrated through the analysis of Monte Carlo-simulated seismic event catalogues and actual data from an underground copper mine in the Legnica–Głogów Copper District in Poland. The studies show that the iterated BCa technique provides satisfactory results regardless of the sample size and the actual shape of the magnitude distribution.
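A minimal sketch (assumed SciPy >= 1.7; plain BCa, not the paper's iterated, smoothed variant): a BCa confidence interval for a Gaussian-kernel estimate of the magnitude CDF at a chosen magnitude m0. The catalogue, bandwidth, and evaluation point are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
mags = rng.exponential(scale=0.4, size=400) + 1.0    # toy event catalogue, Mmin = 1
m0, h = 2.0, 0.1                                     # evaluation point, kernel bandwidth

def kernel_cdf_at_m0(sample, axis=-1):
    """Gaussian-kernel estimate of F(m0)."""
    return stats.norm.cdf((m0 - sample) / h).mean(axis=axis)

res = stats.bootstrap((mags,), kernel_cdf_at_m0, n_resamples=2000,
                      method="BCa", random_state=rng)
print("F(m0) =", round(float(kernel_cdf_at_m0(mags)), 4))
print("95% BCa CI:", res.confidence_interval)
```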

14.
Empirical Relationships for Debris Flows
The assessment of debris-flow hazard potential has to rely on semi-quantitative methods. Owing to the complexity of the debris-flow process, numerical simulation models of debris flows are still of limited use in practical applications. This paper therefore gives an overview of empirical relationships that can be used to estimate the most important parameters of debris-flow behavior. In a possible procedure, an assessment of the maximum debris-flow volume may be followed by estimates of the peak discharge, the mean flow velocity, the total travel distance, and the runout distance on the fan. The applicability of several empirical equations is compared with available field and laboratory data, and scaling considerations are used to discuss the variability of the parameters over a large range of values. Some recommendations are made regarding the application of the presented relationships by practicing engineers, apart from advocating field reconnaissance and a search for historic events wherever possible.
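A hedged sketch (coefficients taken from commonly cited empirical fits of this kind, e.g. Rickenmann 1999; treat the exact numbers as assumptions rather than the paper's recommendations): order-of-magnitude estimates of peak discharge and total travel distance from an assessed debris-flow volume V and elevation drop H.

```python
def peak_discharge(V):
    """Qp [m^3/s] from volume V [m^3]; Qp ~ 0.1 * V^(5/6)."""
    return 0.1 * V ** (5.0 / 6.0)

def travel_distance(V, H):
    """Total travel distance L [m] from V [m^3] and drop H [m]; L ~ 1.9 * V^0.16 * H^0.83."""
    return 1.9 * V ** 0.16 * H ** 0.83

V, H = 50_000.0, 800.0       # assessed event volume and elevation drop (illustrative)
print(f"Qp ~ {peak_discharge(V):,.0f} m^3/s, L ~ {travel_distance(V, H):,.0f} m")
```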

15.
Fault-creep events measured on the San Andreas and related faults near Hollister, California, can be described by a rheological model consisting of a spring, a power-law dashpot, and a sliding block connected in series. An empirical creep-event law, derived from many creep-event records analyzed within the constraints of the model, provides a remarkably simple and accurate representation of creep-event behavior. The empirical creep law is expressed by the equation D(t) = D_f [1 − (C t (n − 1) D_f^{n−1} + 1)^{−1/(n−1)}], where D is the value of displacement at time t following the onset of an event, D_f is the final equilibrium value of the event displacement, and C is a proportionality constant. This discovery should help determine whether the time-displacement character of creep events is controlled by the material properties of fault gouge, or by other parameters.
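A minimal sketch (assumed SciPy; the record is synthetic and the parameter values are illustrative): evaluating the empirical creep-event law above and recovering (D_f, C, n) from a noisy displacement-time record by nonlinear least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

def creep_law(t, Df, C, n):
    """D(t) = Df * [1 - (C*t*(n-1)*Df**(n-1) + 1)**(-1/(n-1))]."""
    return Df * (1 - (C * t * (n - 1) * Df ** (n - 1) + 1) ** (-1 / (n - 1)))

rng = np.random.default_rng(11)
t = np.linspace(0, 48, 200)                      # hours since event onset
true = (5.0, 0.8, 2.5)                           # Df [mm], C, n (illustrative)
D = creep_law(t, *true) + 0.05 * rng.normal(size=t.size)

popt, _ = curve_fit(creep_law, t, D, p0=(4.0, 1.0, 2.0),
                    bounds=([0.1, 1e-3, 1.1], [50, 100, 10]))
print("Df, C, n =", np.round(popt, 3))
```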

16.
The main objective of this paper is to construct a robust and reliable metamodel for mechanized tunnel simulation in computationally expensive applications. To this end, four metamodeling approaches have been implemented and their performance systematically evaluated in a comparative study using purely mathematical test functions. The metamodels are quadratic polynomial regression, moving least squares, proper orthogonal decomposition with radial basis functions, and an extended version of the latter approach, proposed by the authors and named proper orthogonal decomposition with extended radial basis functions. A system identification study for mechanized tunneling was then conducted through the back analysis of synthetic measurements. In this study, the best-performing metamodel, namely the one proposed by the authors, was employed to surrogate a complex and computationally expensive 3D finite element simulation of the mechanized tunnel. The results demonstrate that the proposed metamodel can reliably replace the finite element simulation model and drastically reduce the expensive computation time of the back-analysis subroutine.
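A minimal sketch (assumed SciPy; a cheap analytic function stands in for the expensive 3D FE tunnel model, and the paper's POD-extended-RBF variant is not reproduced): building a plain RBF metamodel from a small design of experiments and checking it on unseen points.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import qmc

def expensive_model(X):
    """Stand-in for a costly simulation: a smooth response of two inputs."""
    return np.sin(3 * X[:, 0]) * np.exp(-X[:, 1]) + 0.5 * X[:, 1] ** 2

sampler = qmc.LatinHypercube(d=2, seed=12)
X_train = sampler.random(40)                      # 40 "simulation runs"
y_train = expensive_model(X_train)

surrogate = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")

X_test = qmc.LatinHypercube(d=2, seed=13).random(200)
err = surrogate(X_test) - expensive_model(X_test)
print("RMSE on unseen points:", round(float(np.sqrt((err ** 2).mean())), 4))
```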

17.
Outlier Detection for Compositional Data Using Robust Methods
Outlier detection based on the Mahalanobis distance (MD) requires an appropriate transformation in the case of compositional data. For the family of logratio transformations (additive, centered, and isometric logratio transformations) it is shown that MDs based on classical estimates are invariant to these transformations, and that MDs based on affine equivariant estimators of location and covariance are the same for the additive and isometric logratio transformations. Moreover, for 3-dimensional compositions the data structure can be visualized by contour lines. In higher dimensions, the MDs of closed and opened data give an impression of the multivariate data behavior.
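A minimal sketch (assumed scikit-learn/SciPy; illustrative 3-part compositions, with the MCD as the robust location/covariance estimator): isometric logratio coordinates followed by robust Mahalanobis distances and a chi-square cutoff to flag outliers.

```python
import numpy as np
from scipy import stats
from sklearn.covariance import MinCovDet

def ilr(x):
    """ilr coordinates of 3-part compositions (rows sum to 1) -> R^2."""
    z = np.log(x) - np.log(x).mean(axis=1, keepdims=True)   # clr coordinates
    # orthonormal (Helmert-type) basis orthogonal to (1,1,1)
    V = np.array([[1 / np.sqrt(2), -1 / np.sqrt(2), 0],
                  [1 / np.sqrt(6), 1 / np.sqrt(6), -2 / np.sqrt(6)]])
    return z @ V.T

rng = np.random.default_rng(14)
comp = rng.dirichlet([8, 5, 3], size=200)
comp[:5] = rng.dirichlet([1, 1, 20], size=5)      # planted outliers
Z = ilr(comp)

mcd = MinCovDet(random_state=0).fit(Z)
md2 = mcd.mahalanobis(Z)                          # squared robust distances
cut = stats.chi2.ppf(0.975, df=Z.shape[1])
print("flagged outliers:", np.flatnonzero(md2 > cut))
```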

18.
The multiquadric method (MQ), with its high interpolation accuracy, has been widely used for interpolating spatial data. However, MQ is an exact interpolation method, which makes it unsuitable for noisy sampling data. Although least squares MQ (LSMQ) can smooth out sampling errors, it is inherently not robust to outliers because of the least squares criterion used to estimate the weights of the sampling knots. To reduce the impact of outliers on the accuracy of digital elevation models (DEMs), a robust version of MQ (MQ-R) has been developed. MQ-R comprises two independent procedures: knot selection and the solution of the system of linear equations, achieved respectively by a space-filling design and by least absolute deviation, both of which are very robust to outliers. A Gaussian synthetic surface, subjected to a series of errors with different distributions, was employed to compare the performance of MQ-R with that of LSMQ. Results indicate that LSMQ is seriously affected by outliers, whereas MQ-R resists them well and can construct satisfactory surfaces even when the data are contaminated by severe outliers. A real-world example of DEM construction was used to evaluate the robustness of MQ-R, LSMQ, and classical interpolation methods including inverse distance weighting, thin plate splines, and ANUDEM. Compared with the classical methods, MQ-R achieved the highest accuracy in terms of root mean square error. In conclusion, when sampling data are subject to outliers, MQ-R can be considered an alternative method for DEM construction.
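A hedged sketch (an IRLS approximation to the least-absolute-deviation fit, with the space-filling knot design replaced by a simple random subset; names and parameters are illustrative): a multiquadric surface fitted robustly to noisy samples containing gross outliers.

```python
import numpy as np

def mq_basis(X, knots, c=0.1):
    """Multiquadric kernel: sqrt(||x - knot||^2 + c^2)."""
    d2 = ((X[:, None, :] - knots[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2 + c ** 2)

def fit_mq_lad(X, z, knots, iters=50, eps=1e-6):
    """Knot weights by iteratively reweighted least squares approximating L1."""
    A = mq_basis(X, knots)
    w = np.linalg.lstsq(A, z, rcond=None)[0]
    for _ in range(iters):
        r = np.abs(A @ w - z)
        sw = 1.0 / np.sqrt(np.maximum(r, eps))        # row scaling for L1 reweighting
        w = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)[0]
    return w

rng = np.random.default_rng(15)
X = rng.uniform(0, 1, size=(300, 2))
z = np.sin(4 * X[:, 0]) + X[:, 1] + 0.02 * rng.normal(size=300)
z[:10] += rng.choice([-5.0, 5.0], size=10)            # gross outliers
knots = X[rng.choice(300, size=60, replace=False)]    # stand-in for space-filling design

w = fit_mq_lad(X, z, knots)
resid = np.abs(mq_basis(X, knots) @ w - z)
print("median |residual|:", round(float(np.median(resid)), 4))
```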

19.
20.
The parameter m_i is an important rock property parameter required for use of the Hoek–Brown failure criterion. The conventional method for determining m_i is to fit a series of triaxial compression test data. In the absence of laboratory test data, guideline charts provided by Hoek can be used to estimate the m_i value. In the conventional Hoek–Brown failure criterion, m_i is a constant for a given rock, but a constant m_i may not fit the triaxial compression test data well for some rocks. In this paper, a negative-exponent empirical model is proposed to express m_i as a function of confinement, which leads to a new empirical failure criterion for intact rocks. Triaxial compression test data for various rocks are used to fit the parameters of this model. The new empirical failure criterion fits the test data better than the conventional Hoek–Brown criterion for intact rocks: the conventional criterion fits well in the high-confinement region but fails to match the data in the low-confinement and tension regions, and in particular it overestimates the uniaxial compressive strength (UCS) and the uniaxial tensile strength of rocks. Curves fitted by the proposed criterion, on the other hand, match the test data very well, and the estimated UCS and tensile strength agree well with the test data.
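A minimal sketch (assumed SciPy; synthetic triaxial data): fitting the conventional Hoek-Brown criterion for intact rock, sigma1 = sigma3 + sigma_ci * sqrt(m_i * sigma3 / sigma_ci + 1), with a constant m_i. The paper's confinement-dependent, negative-exponent m_i(sigma3), whose exact form is not given in the abstract, would replace the constant m_i inside this same fitting loop.

```python
import numpy as np
from scipy.optimize import curve_fit

def hoek_brown(s3, sci, mi):
    """Conventional Hoek-Brown criterion for intact rock (s = 1, a = 0.5)."""
    return s3 + sci * np.sqrt(mi * s3 / sci + 1.0)

rng = np.random.default_rng(16)
s3 = np.linspace(0, 40, 12)                                    # confinements [MPa]
s1 = hoek_brown(s3, 120.0, 18.0) + rng.normal(0, 4, s3.size)   # synthetic "test data"

(sci, mi), _ = curve_fit(hoek_brown, s3, s1, p0=(100.0, 10.0))
print(f"sigma_ci = {sci:.1f} MPa, mi = {mi:.1f}")
print("predicted UCS:", round(float(hoek_brown(0.0, sci, mi)), 1), "MPa")
```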
