Similar Documents
20 similar documents found
1.
The cumulative distribution function (CDF) of the magnitude of seismic events is one of the most important probabilistic characteristics in Probabilistic Seismic Hazard Analysis (PSHA). The magnitude distribution of mining-induced seismicity is complex and is therefore estimated using kernel nonparametric estimators. Because of its model-free character, however, the nonparametric approach cannot provide confidence interval estimates for the CDF using the classical methods of mathematical statistics. To assess errors in the estimation of seismic event magnitude, and thereby in the evaluation of seismic hazard parameters in the nonparametric approach, we propose the use of resampling methods. Resampling techniques applied to a single dataset provide many replicas of that sample which preserve its probabilistic properties. To estimate confidence intervals for the CDF of magnitude, we have developed an algorithm based on the bias-corrected and accelerated method (BCa method). The procedure uses the smoothed bootstrap and second-order bootstrap samples; we refer to it as the iterated BCa method. The algorithm's performance is illustrated through the analysis of Monte Carlo-simulated seismic event catalogues and actual data from an underground copper mine in the Legnica–Głogów Copper District in Poland. The studies show that the iterated BCa technique provides satisfactory results regardless of the sample size and the actual shape of the magnitude distribution.
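A minimal sketch of the confidence-interval construction described above, assuming a Gaussian kernel CDF estimator and only a first-order (non-iterated) BCa correction; the bandwidth, the synthetic catalogue and all helper names are illustrative rather than the paper's implementation.

```python
import numpy as np
from scipy.stats import norm

def kernel_cdf(mags, m, h):
    """Gaussian-kernel estimate of the magnitude CDF at magnitude m."""
    return norm.cdf((m - mags) / h).mean()

def bca_ci(mags, m, h=0.1, n_boot=2000, alpha=0.05, seed=0):
    """First-order BCa confidence interval for the kernel CDF at m,
    using a smoothed bootstrap (resampling plus Gaussian jitter of width h)."""
    rng = np.random.default_rng(seed)
    n = mags.size
    theta_hat = kernel_cdf(mags, m, h)

    # smoothed bootstrap replicates of the CDF estimate
    idx = rng.integers(0, n, size=(n_boot, n))
    boot = mags[idx] + rng.normal(0.0, h, size=(n_boot, n))
    theta_boot = norm.cdf((m - boot) / h).mean(axis=1)

    # bias-correction constant z0 and jackknife acceleration a
    z0 = norm.ppf((theta_boot < theta_hat).mean())
    theta_jack = np.array([kernel_cdf(np.delete(mags, i), m, h) for i in range(n)])
    d = theta_jack.mean() - theta_jack
    a = (d ** 3).sum() / (6.0 * (d ** 2).sum() ** 1.5)

    # BCa-adjusted percentiles of the bootstrap distribution
    z = norm.ppf([alpha / 2.0, 1.0 - alpha / 2.0])
    adj = norm.cdf(z0 + (z0 + z) / (1.0 - a * (z0 + z)))
    return np.quantile(theta_boot, adj)

rng = np.random.default_rng(1)
mags = 0.6 + rng.exponential(0.4, size=300)   # synthetic magnitude catalogue
print(kernel_cdf(mags, 1.5, 0.1), bca_ci(mags, 1.5))
```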

2.
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed to quantify and represent these uncertainties. Nevertheless, the majority of the procedures applied in seismic hazard assessment take neither these uncertainties nor the variance of the results into account. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale). Bayesian techniques provide a mathematical model for estimating the distribution of random variables in the presence of uncertainties. The developed method estimates the probability distribution of the number of occurrences in a Poisson process described by its rate parameter. The input data are the historical occurrences of intensities at a particular site, represented by a discrete probability distribution for each earthquake. The calculation of these historical occurrences requires a careful preparation of all input parameters, i.e. a modelling of their uncertainties. The results show that the variance of the recurrence rate is smaller in regions with higher seismic activity than in less active regions. It can also be demonstrated that long return periods cannot be estimated with confidence, because the period of observation is too short. This indicates that the long return periods obtained by seismic source methods only reflect the delineated seismic sources and the chosen earthquake size distribution law.
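The step from a posterior on the Poisson rate to return periods and occurrence probabilities can be sketched with a conjugate Gamma-Poisson update; the prior parameters and the count of historical exceedances below are assumed values, whereas the paper works with a discrete probability distribution per historical earthquake rather than a single count.

```python
import numpy as np
from scipy import stats

# Gamma(a0, b0) prior on the Poisson rate (events per year above a chosen
# intensity); with k observed exceedances in t years the posterior is
# Gamma(a0 + k, b0 + t). All numbers are assumed for illustration.
a0, b0 = 1.0, 10.0
k, t = 4, 200.0
post = stats.gamma(a=a0 + k, scale=1.0 / (b0 + t))

lam_mean = post.mean()
lam_lo, lam_hi = post.ppf([0.05, 0.95])
print(f"rate {lam_mean:.4f}/yr, 90% interval [{lam_lo:.4f}, {lam_hi:.4f}]")
print(f"return period {1 / lam_mean:.0f} yr, interval [{1 / lam_hi:.0f}, {1 / lam_lo:.0f}] yr")

# probability of at least one exceedance in the next 50 years,
# marginalised over the posterior of the rate (Monte Carlo)
lam = post.rvs(size=100_000, random_state=0)
print("P(>=1 event in 50 yr):", np.mean(1.0 - np.exp(-lam * 50.0)).round(3))
```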

3.
Some Bayesian methods for dealing with inaccurate or vague data are introduced in the framework of seismic hazard assessment. Inaccurate data affected by heterogeneous errors are modeled by a probability distribution instead of the usual value-plus-random-error representation; such data are generically called imprecise. The earthquake size and the number of events in a given time are modeled as imprecise data. Imprecise data allow the estimation procedures to incorporate the uncertainty inherent in the inaccuracy and heterogeneity of the measuring systems from which the data were obtained. The problem of estimating the parameter of a Poisson process is shown to be tractable with Bayesian techniques and imprecise data. This background technique can be applied to the general problem of seismic hazard estimation. Initially, data in a regional earthquake catalog are assumed imprecise both in size and in location (i.e. errors in the epicenter or spreading over a given source). By means of scattered attenuation laws, the regional catalog can be translated into a so-called site catalog of imprecise events. The site catalog is then used to estimate return periods or occurrence probabilities, taking into account all sources of uncertainty. Special attention is paid to the priors in the Bayesian estimation; they can be used to introduce additional information as well as scattered frequency-size laws for local events. A simple example is presented to illustrate the capabilities of this methodology.
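One way to picture a Bayesian update with an imprecise observation is to marginalise the conjugate posterior over a discrete distribution on the event count, as in the grid-based sketch below; the numbers are illustrative and this is not the paper's full site-catalogue machinery.

```python
import numpy as np
from scipy import stats

# The number of site exceedances in t years is itself imprecise and is
# described by a discrete distribution p(k) rather than a single integer
# (values below are illustrative assumptions, not from the paper).
counts = np.array([3, 4, 5])
p_counts = np.array([0.2, 0.5, 0.3])
t = 150.0
a0, b0 = 1.0, 10.0                       # Gamma prior on the Poisson rate

# The posterior is then a mixture of Gamma(a0 + k, b0 + t) components.
lams = np.linspace(1e-4, 0.15, 2000)
dx = lams[1] - lams[0]
post = sum(w * stats.gamma.pdf(lams, a=a0 + k, scale=1.0 / (b0 + t))
           for k, w in zip(counts, p_counts))
post /= post.sum() * dx                  # renormalise on the grid

mean_rate = (lams * post).sum() * dx
print(f"posterior mean rate {mean_rate:.4f}/yr  ->  return period {1 / mean_rate:.0f} yr")
```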

4.
A method based on Bayesian techniques has been applied to evaluate the seismic hazard in the two test areas selected by the participants in the ESC/SC8-TERESA project: Sannio-Matese in Italy and the northern Rhine region (BGN). A prior site occurrence model (prior SOM) is obtained from a seismicity distribution modelled over wide seismic sources. The posterior occurrence model (posterior SOM) is calculated after a Bayesian correction which, essentially, recovers the spatial information of the epicenter distribution and accounts for attenuation and location errors without using source zones. The uncertainties of the occurrence probabilities are evaluated in both models. The results are displayed as probability and coefficient-of-variation contour maps for a chosen intensity level, and as plots of mean return period versus intensity at selected test sites, including the 90% probability intervals. It turns out that the posterior SOM gives a better resolution in the probability estimate, decreasing its uncertainty, especially in regions of low seismic activity.

5.
A. Golara, Natural Hazards, 2014, 73(2): 567-577
Seismic hazard maps are widely used for engineering design, land-use planning, and disaster mitigation. The development of the new seismic hazard map of Iran with regard to the specifications of the Iranian high-pressure gas network is based on probabilistic seismic hazard analysis using historical and recent earthquake data, geology, tectonics, fault activity, and seismic zone models for Iran. The map displays probabilistic estimates of peak ground acceleration for a return period of 2,475 years (2% probability of exceedance in 50 years). The results presented in this study provide a basis for preparing risk maps, estimating insurance premiums, finding the best paths for future pipelines, and planning and relocating lifeline facilities, especially for interconnected infrastructures.
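The equivalence quoted above between a 2,475-year return period and a 2% probability in 50 years follows from the Poisson occurrence assumption, T = -t / ln(1 - p); a one-line check:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by a probability p_exceed of at least one
    exceedance in t_years, under Poissonian occurrence."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.02, 50)))   # -> 2475
print(round(return_period(0.10, 50)))   # -> 475
```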

6.
Zhao Haijun, Ma Fengshan, Li Zhiqing, Guo Jie, Zhang Jiaxiang, Earth Science, 2022, 47(12): 4401-4416
Applying a probabilistic seismic hazard model to the zonation of earthquake-induced landslide hazard is an effective way to handle both the uncertainty of the seismic source and the spatio-temporal uncertainty of the triggered landslides when assessing potential earthquake-induced landslide hazard. Through theoretical analysis, and in light of the actual conditions of the Ludian earthquake area, the uncertainties of the parameters in the mechanics-based Newmark sliding-block displacement model and in probabilistic seismic landslide hazard analysis are examined; the strength degradation of slope rock and soil masses under seismic loading, the topographic amplification of seismic acceleration, and the effect of fault fracture zones are incorporated into the model for cumulative slope displacement, and the model's input parameters are optimised. The improved model better reflects the controlling role of high, steep slope topography and of fault fracture zones in the development of earthquake-induced landslides. When applied to regional landslides in the Ludian earthquake area, the zones of very high landslide-failure risk in the optimised model show good agreement with the actual distribution of coseismic landslides; in the landslide-failure probability distribution at a 2% exceedance probability, the area of high to very high seismic landslide risk along the Baogunao-Xiaohe fault, the Ludian-Zhaotong fault zone, and the Niulan River valley increases markedly. Therefore, incorporating the dynamic response of ground-motion and geotechnical parameters, and the quantitative relationships among these variables, into the Newmark sliding-block model is of great significance for improving the reliability of regional slope stability analysis.
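The Newmark sliding-block model mentioned above integrates the portion of the ground acceleration exceeding a critical (yield) acceleration to obtain a cumulative displacement. The rigid-block sketch below uses a synthetic acceleration history and omits the paper's refinements (strength degradation, topographic amplification, fault fracture-zone effects), which would modify the critical acceleration and the input motion.

```python
import numpy as np

def newmark_displacement(acc, dt, a_crit):
    """Rigid-block Newmark displacement (m): double-integrate the part of the
    ground acceleration exceeding the critical (yield) acceleration, with the
    block decelerating at a_crit once it is sliding."""
    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0.0 or a > a_crit:      # block is sliding
            vel += (a - a_crit) * dt
            vel = max(vel, 0.0)          # sliding stops when velocity reaches zero
            disp += vel * dt
    return disp

# Synthetic pulse-like acceleration history (m/s^2) as a stand-in for a record.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
acc = 3.0 * np.exp(-0.5 * t) * np.sin(2 * np.pi * 1.5 * t)

print(f"Newmark displacement: {newmark_displacement(acc, dt, a_crit=0.5):.3f} m")
```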

7.
Probabilistic seismic hazard maps for the sultanate of Oman
This study presents the results of the first probabilistic seismic hazard assessment (PSHA) for Oman in a logic-tree framework. The earthquake catalogue was homogenized, declustered, and used to define a seismotectonic source model that characterizes the seismicity of Oman. Two seismic source models were used: the first consists of 26 seismic source zones, while the second expresses the alternative view that seismicity is uniform along the entire Makran and Zagros zones. The recurrence parameters for all seismogenic zones were determined using the doubly bounded exponential distribution, except for the Makran zones, which were modelled using the characteristic distribution. Maximum earthquakes were determined, and the horizontal ground accelerations, expressed as geometric means, were calculated using ground-motion prediction relationships developed from seismic data obtained in active tectonic environments similar to those surrounding Oman. The alternative seismotectonic source models, maximum magnitudes, and ground-motion prediction relationships were weighted to account for epistemic uncertainty. Hazard maps at rock sites were produced for 5% damped spectral acceleration (SA) at spectral periods of 0.1, 0.2, 0.3, 1.0 and 2.0 s, as well as peak ground acceleration (PGA), for return periods of 475 and 2,475 years. The highest hazard is found in Khasab City, with maximum SA at the 0.2 s spectral period reaching 243 and 397 cm/s² for return periods of 475 and 2,475 years, respectively. The sensitivity analysis reveals that the choice of seismic source model and ground-motion prediction equation influences the results most.
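A minimal sketch of the doubly bounded exponential (truncated Gutenberg-Richter) recurrence model named above, turning an activity rate above a minimum magnitude into rates and return periods for larger magnitudes; all numerical values are assumptions, not those of the Oman model.

```python
import numpy as np

def truncated_gr_cdf(m, m_min, m_max, beta):
    """CDF of the doubly bounded exponential (truncated Gutenberg-Richter)
    magnitude distribution, with beta = b * ln(10)."""
    num = 1.0 - np.exp(-beta * (m - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return np.clip(num / den, 0.0, 1.0)

# Annual rate of events of magnitude >= m in a single source zone, given the
# rate of events above m_min (all values are illustrative, not from the study).
rate_m_min, b, m_min, m_max = 0.5, 0.9, 4.0, 7.5
beta = b * np.log(10.0)
for m in (5.0, 5.5, 6.0, 6.5, 7.0):
    rate = rate_m_min * (1.0 - truncated_gr_cdf(m, m_min, m_max, beta))
    print(f"M>={m:.1f}: {rate:.5f}/yr  (return period {1.0 / rate:,.0f} yr)")
```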

8.
In Canada, Montreal is the city with the second-highest seismic risk, owing to its relatively high seismic hazard, aging infrastructure, and high population density. The region is characterised by moderate seismic activity with no recent record of a major earthquake. The lack of historical strong ground motion records for the region contributes to large uncertainties in the estimation of hazard. Among the sources of uncertainty, the attenuation function is the main contributor, and its effect on risk estimates is investigated. Epistemic uncertainty was considered by obtaining damage estimates for three attenuation functions developed for Eastern North America. The results indicate that loss estimates are highly sensitive to the choice of attenuation function and suggest that epistemic uncertainty should be considered both in the definition of the hazard function and in loss estimation methodologies. Seismic loss estimates are performed for a seismic threat with a 2% probability of exceedance in 50 years, which corresponds to the design-level earthquake in the national building code of Canada, using HAZUS-MH4 for the Montreal region over 522 census tracts. For the average scenario, the study estimates that roughly 5% of the building stock would be damaged, with direct economic losses of 1.4 billion dollars. The casualty estimate peaks at approximately 500 people injured or dead, for a calculated time of occurrence of 2 pm.

9.
The continuous ranked probability score (CRPS) is a widely used measure of performance for probabilistic forecasts of a scalar observation. It is a quadratic measure of the difference between the forecast cumulative distribution function (CDF) and the empirical CDF of the observation. Analytic formulations of the CRPS can be derived for most classical parametric distributions and used to assess the efficiency of different CRPS estimators. When the true forecast CDF is not fully known, but represented as an ensemble of values, the CRPS is estimated with some error. Thus, using the CRPS to compare parametric probabilistic forecasts with ensemble forecasts may be misleading because of the unknown error of the estimated CRPS for the ensemble. With simulated data, the impact of the type of the verified ensemble (a random sample or a set of quantiles) on the CRPS estimation is studied. Based on these simulations, recommendations are issued on choosing the most accurate CRPS estimator according to the type of ensemble. The value of these recommendations is illustrated with real ensemble weather forecasts. Relationships between several estimators of the CRPS are also demonstrated and used to explain the differences in accuracy between the estimators.
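Two common ensemble CRPS estimators can be written in a few lines, which illustrates why the estimator choice matters for small ensembles; the "energy" and "fair" labels and the formulas below are standard forms assumed for illustration and should be checked against the paper's exact definitions.

```python
import numpy as np

def crps_nrg(ens, y):
    """'Energy' form of the ensemble CRPS: treats the M members as the full forecast CDF."""
    ens = np.asarray(ens, dtype=float)
    m = ens.size
    term1 = np.abs(ens - y).mean()
    term2 = np.abs(ens[:, None] - ens[None, :]).sum() / (2.0 * m * m)
    return term1 - term2

def crps_fair(ens, y):
    """'Fair' (unbiased) ensemble CRPS: corrects for the finite ensemble size."""
    ens = np.asarray(ens, dtype=float)
    m = ens.size
    term1 = np.abs(ens - y).mean()
    term2 = np.abs(ens[:, None] - ens[None, :]).sum() / (2.0 * m * (m - 1))
    return term1 - term2

# With a small random ensemble drawn from the true N(0, 1) forecast, the two
# estimators differ noticeably; the gap shrinks as the ensemble grows.
rng = np.random.default_rng(0)
y, ens = 0.3, rng.normal(0.0, 1.0, size=10)
print(crps_nrg(ens, y), crps_fair(ens, y))
```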

10.
A probabilistic assessment of the seismic hazard in Turkey

11.
The development of the new seismic hazard map of metropolitan Tehran is based on probabilistic seismic hazard computation using a non-Poisson recurrence-time model. For this model, two maps have been prepared to indicate the earthquake hazard of the region in the form of iso-acceleration contour lines. They display the non-Poisson probabilistic estimates of peak ground acceleration over bedrock for 10 and 63% probability of exceedance in 50 years. To carry out the non-Poisson seismic hazard analysis, appropriate distributions of earthquake interoccurrence times were used for the seismotectonic provinces in which the study region is located, and the renewal process was then applied. To calculate the seismic hazard for different return periods in the probabilistic procedure, the study area, encompassed by longitudes 49.5–54.5°E and latitudes 34–37°N, was divided at 0.1° intervals, generating 1,350 grid points. PGA values for this region are estimated to be 0.30–0.32 and 0.16–0.17 g for 10 and 63% probability of exceedance, respectively, in 50 years under bedrock conditions.
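A renewal (non-Poisson) model makes the occurrence probability depend on the time elapsed since the last event; the sketch below uses a lognormal interoccurrence distribution, which is only one common choice and is not necessarily the distribution adopted in the Tehran study.

```python
from scipy import stats

def conditional_prob(dist, t_elapsed, dt):
    """Renewal-model probability of an event in the next dt years, given that
    t_elapsed years have already passed without one since the last event."""
    return (dist.sf(t_elapsed) - dist.sf(t_elapsed + dt)) / dist.sf(t_elapsed)

# Lognormal interoccurrence times with a median recurrence of 150 yr
# (illustrative parameters, not those of the Tehran study).
dist = stats.lognorm(s=0.5, scale=150.0)

for elapsed in (10.0, 100.0, 200.0):
    print(f"{elapsed:>5.0f} yr elapsed -> P(event in next 50 yr) = "
          f"{conditional_prob(dist, elapsed, 50.0):.3f}")
```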

12.
Analysis of the spatial distribution of ore bodies is generally useful in the estimation of mineral resources and the management of exploration. This study is directed at the undiscovered potential of well-known areas of mining-district size, referred to as metallogenic units (MUs). The analysis employs an effort-adjusted and truncated probability model for the number of occurrences within a subdivision (quadrat) of an MU, together with Monte Carlo sampling to approximate the probability distribution of the number of occurrences and the number of mines within an MU once it is totally explored. The exploration potentials for the Monitor, Bodie, Aurora, and Camp Douglas MUs (Walker Lake quadrangle of Nevada and California) are estimated to be 9, 4, 7, and 4 mines, respectively.

13.
As the theoretical basis of seismic hazard assessment, earthquake dynamics mainly studies the faulting mechanisms, rupture processes, and source radiation associated with seismic activity, as well as the resulting seismic wave propagation and ground motion. Classical theoretical problems such as earthquake mechanics, source radiation, and energy release are systematically reviewed. On this basis, using the latest quantitative seismological methods and combining seismological, geological, and geodetic data in a logic-tree framework, examples of probabilistic and deterministic seismic hazard assessment are provided for different tectonic settings and faulting mechanisms. Typical tectonic source types considered include intraplate crustal seismogenic layers, active crustal faults and their slip rates, subduction plate interfaces, and subducting slabs. Because of uncertainties in the input models, such as the randomness of input parameters and the uncertainty inherent in the analysis methods themselves, the uncertainty of the results must be treated with caution. Weighted averaging over alternative models or parameters, including ground-motion attenuation models, usually reduces the bias of the results in a reasonable way; combining probabilistic and deterministic analyses is also an effective approach.

14.
This study provides a procedure for assessing seismic hazard and its uncertainties in regions that are characterised by a large non-instrumental earthquake database and a seismic and tectonic behaviour that does not allow an evident seismic zonation. The procedure is a synthesis, via a logic tree, of the non-zoning or non-parametric methodology (using extreme-value distribution functions as proposed by Epstein and Lomnitz, 1966) and the zoning or parametric methodology (using the theorem of total probability as proposed by Cornell, 1968), taking into consideration the advantages offered by each. An application was made to the east coast of Spain and surrounding inland areas, and a specific logic tree was developed to address the problems and uncertainties related to evaluating the seismic hazard with both methodologies. The use of the logic tree allowed the large number of solutions obtained to be systematised. A number of relevant results show that in some cases there are great differences between the seismic hazard results provided by the non-zoning and the zoning methodologies. In these cases, the mean value and standard deviation of the results provide an intermediate solution between the over-conservative estimate of the non-zoning methodology and the lower results of the zoning methodology. In other cases the results provided by the two methodologies are significantly closer. In any case, the synthesis of both methodologies gives a wider knowledge of the uncertainties associated with the seismic hazard results. Finally, uncertainties increase as the annual probability of exceedance decreases and at sites whose seismic history contains large earthquakes.
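A minimal sketch of the logic-tree combination described above: the exceedance rates from the non-zoning and zoning branches are weighted, and the weighted mean and standard deviation summarise the spread between methodologies; the rates and the equal weights are assumptions for illustration.

```python
import numpy as np

# Annual exceedance rates for one site and intensity level from the two
# branches of a minimal logic tree, with assumed weights (illustrative values).
branches = {"non-zoning (extreme values)": (0.012, 0.5),
            "zoning (Cornell)":            (0.004, 0.5)}

rates = np.array([rate for rate, _ in branches.values()])
weights = np.array([w for _, w in branches.values()])

mean_rate = float(np.sum(weights * rates))
std_rate = float(np.sqrt(np.sum(weights * (rates - mean_rate) ** 2)))
print(f"weighted mean rate {mean_rate:.4f}/yr "
      f"(return period {1.0 / mean_rate:.0f} yr), std {std_rate:.4f}/yr")
```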

15.
We performed large-scale earthquake economic loss estimations for France and cost-benefit analyses for several French cities by developing a semiempirical, intensity-based approach. The proposed methodology is inexpensive and easily applicable when detailed information on the specific regional seismic hazard and the structural characteristics of the building stock is scarce, which is of particular importance in moderate-to-low seismic hazard regions. The exposure model is derived from census datasets, and the seismic vulnerability distribution of buildings is calculated using data mining techniques. Several hypothetical, large-scale retrofit scenarios are proposed, with increasing levels of investment. These cities, in their respective reinforced states, are then subjected to a series of hazard scenarios. Seismic hazard data for different return periods are calculated from the regulatory accelerations of the French seismic zoning. Loss estimations for the original (non-reinforced) configuration show high levels of expected building repair and replacement costs for all time spans. Finally, the benefits in terms of avoided damage are compared with the costs of each retrofit measure. Relatively limited strengthening investments reduce the probability of building collapse, which is the main cause of human casualties. However, the results of this study suggest that retrofitting is, on average, cost-effective only in the parts of France with the highest seismicity and over the longest time horizons.
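The cost-benefit comparison can be reduced to a discounted benefit-cost ratio, as in the sketch below; the avoided-loss figure, retrofit cost and discount rate are placeholders, not values from the study.

```python
def benefit_cost_ratio(annual_avoided_loss, retrofit_cost, horizon_yr, discount=0.03):
    """Discounted benefit-cost ratio of a retrofit: avoided expected annual loss
    summed over the time horizon, divided by the upfront retrofit cost."""
    benefit = sum(annual_avoided_loss / (1 + discount) ** t
                  for t in range(1, horizon_yr + 1))
    return benefit / retrofit_cost

# Illustrative numbers (not from the study): 0.4 M euro/yr of avoided losses,
# a 30 M euro retrofit programme, evaluated over 50- and 100-year horizons.
for horizon in (50, 100):
    print(horizon, round(benefit_cost_ratio(0.4e6, 30e6, horizon), 2))
```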

16.
In conventional seismic hazard analysis, a uniform distribution over area and magnitude range is assumed when evaluating source seismicity, which cannot capture the peculiar characteristics of near-fault ground motion well. For near-field hazard analysis, two important factors need to be considered: (1) rupture directivity effects and (2) the occurrence of scenario characteristic ruptures on nearby sources. This study proposes a simple framework for considering these two effects by modifying the predictions of the conventional ground motion model according to the pulse occurrence probability and by adjusting the magnitude-frequency distribution to account for the characteristic rupture behaviour of the fault. The results of the proposed approach are compared with those of deterministic and probabilistic seismic hazard analyses. They indicate that consideration of characteristic earthquakes and of directivity both have significant effects on seismic hazard estimates. The implemented approach leads to results close to deterministic seismic hazard analysis in the short-period range (T < 1.0 s) and follows probabilistic seismic hazard analysis results in the long-period range (T > 1.0 s). Finally, seismic hazard maps based on the proposed method can be developed and compared with those of other methods.
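Mixing the pulse-like and ordinary branches of a ground-motion model by the pulse occurrence probability can be sketched as follows; the lognormal median, dispersion and amplification factor are assumed values, not those calibrated in the study.

```python
import numpy as np
from scipy import stats

def p_exceed(x_g, p_pulse, median_g=0.25, sigma_ln=0.6, pulse_amp=1.6):
    """Probability that spectral acceleration exceeds x_g, mixing the pulse-like
    and ordinary branches of a lognormal ground-motion model by the pulse
    occurrence probability. All numerical values are illustrative assumptions."""
    ordinary = stats.norm.sf(np.log(x_g), loc=np.log(median_g), scale=sigma_ln)
    pulse = stats.norm.sf(np.log(x_g), loc=np.log(median_g * pulse_amp), scale=sigma_ln)
    return p_pulse * pulse + (1.0 - p_pulse) * ordinary

for x in (0.2, 0.4, 0.8):
    print(f"SA > {x:.1f} g: with pulses {p_exceed(x, 0.3):.3f}, "
          f"ordinary only {p_exceed(x, 0.0):.3f}")
```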

17.
Rigorous and objective testing of seismic hazard assessments against real seismic activity must become a necessary precondition for any responsible seismic risk estimation. Because seismic hazard maps seek to predict the shaking that would actually occur, the reference hazard maps for the Italian seismic code, obtained by probabilistic seismic hazard assessment (PSHA), and the alternative ground shaking maps based on the neo-deterministic approach (NDSHA) are cross-compared and tested against the real seismicity of the territory of Italy. The comparison between predicted intensities and those reported for past earthquakes shows that the models generally provide rather conservative estimates, except for PGA with a 10% probability of being exceeded in 50 years, which underestimates the largest earthquakes. In terms of efficiency in predicting ground shaking, measured by the rate of underestimated events and by the territorial extent of areas characterized by high seismic hazard, the NDSHA maps appear to outscore the PSHA ones.
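The "rate of underestimated events" used as a test statistic above reduces to counting sites where the observed intensity exceeded the mapped value; a toy sketch with made-up numbers:

```python
import numpy as np

# Fraction of past earthquakes whose observed site intensity exceeded the value
# predicted by a hazard map (illustrative arrays; real testing would use the
# Italian macroseismic database and the map value at each site).
observed = np.array([7, 8, 9, 6, 10, 8, 7, 9])
map_value = np.array([8, 8, 8, 7, 9, 9, 8, 8])

underestimated = observed > map_value
print(f"underestimation rate: {underestimated.mean():.2f} "
      f"({underestimated.sum()} of {observed.size} events)")
```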

18.
19.
The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one to several unusually large events to appearing depleted relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum-likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has a different catalog length and measurement threshold relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined, and the estimate itself is often unstable in time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits on the hazard source size and on attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historical data.
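A minimal sketch of joint maximum-likelihood estimation of the two parameters of a tapered Pareto distribution (power-law exponent and corner size), fitted here to a synthetic pure-Pareto catalogue; with such data the fitted corner size is typically very large and unstable, which is the behaviour the abstract describes. The sampling scheme and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x, x_t):
    """Negative log-likelihood of the tapered Pareto distribution with survival
    function S(x) = (x_t / x)**beta * exp((x_t - x) / x_c) for x >= x_t."""
    beta, x_c = params
    if beta <= 0 or x_c <= 0:
        return np.inf
    return -np.sum(np.log(beta / x + 1.0 / x_c)
                   + beta * np.log(x_t / x)
                   + (x_t - x) / x_c)

rng = np.random.default_rng(0)
x_t, beta_true = 1.0, 1.0
# Synthetic catalogue from a pure Pareto (no taper) above the threshold x_t,
# standing in for an undersampled natural-hazard size catalogue.
x = x_t * (1.0 - rng.uniform(size=500)) ** (-1.0 / beta_true)

fit = minimize(neg_loglik, x0=[1.0, 10.0 * x.max()], args=(x, x_t),
               method="Nelder-Mead")
beta_hat, xc_hat = fit.x
print(f"beta = {beta_hat:.2f}, corner size = {xc_hat:.1f} "
      f"(largest observed event {x.max():.1f})")
```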

20.
Probabilistic seismic hazard maps in terms of Modified Mercalli (MM) intensity are derived by applying the Cornell-McGuire method to four earthquake source zones in Panama and adjacent areas. The maps contain estimates of the maximum MM intensity for return periods of 5, 25 and 100 yr. Earthquake occurrence is represented with a point-source model. The probabilistic iso-intensity map for a return period of 50 yr indicates that the Panama Suture Zone (PSZ) could experience a maximum MM intensity of IX and the Panama Fracture Zone (PFZ) an MM intensity of VIII; for the rest of the area this varies from IV up to VIII. The present study is intended to serve as a reference for more advanced approaches, to stimulate discussion of the database, assumptions and inputs, and to provide a path toward risk-based assessment of seismic hazard in site selection and in the design of ordinary buildings and engineering works.
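The Cornell-McGuire point-source calculation amounts to summing, over sources and magnitudes, the rate of events times the probability that the attenuated intensity exceeds the target level; the sketch below uses an assumed intensity attenuation law and illustrative source parameters, not those of the Panama study.

```python
import numpy as np
from scipy import stats

def exceedance_rate(i_level, sources, b=1.0, m_min=4.0, m_max=7.5, sigma=0.6):
    """Cornell-McGuire-style annual rate of exceeding MM intensity i_level at a
    site, summed over point sources given as (annual rate of M >= m_min,
    epicentral distance in km). The intensity attenuation law and its scatter
    are illustrative assumptions, not those of the Panama study."""
    beta = b * np.log(10.0)
    m = np.linspace(m_min, m_max, 400)
    dm = m[1] - m[0]
    # doubly truncated exponential magnitude density
    pdf = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    total = 0.0
    for nu, r_km in sources:
        mean_i = 1.5 * m - 3.0 * np.log10(r_km + 10.0) + 3.0   # assumed attenuation
        p_ex = stats.norm.sf(i_level, loc=mean_i, scale=sigma)
        total += nu * np.sum(p_ex * pdf) * dm
    return total

sources = [(0.8, 30.0), (0.2, 80.0)]      # (rate per yr, distance in km) -- illustrative
for i_level in (6, 7, 8, 9):
    lam = exceedance_rate(i_level, sources)
    print(f"I >= {i_level}: {lam:.4f}/yr, return period {1.0 / lam:,.0f} yr")
```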

