Similar Articles
 Found 20 similar articles (search time: 242 ms)
1.
The magnitudes (MS, mbP, mbS) of the largest historical earthquakes of the first half of the 20th century, calculated from records of Wiechert horizontal seismographs in Göttingen (Germany) and Zagreb (Croatia), are compared with one another, as well as with the magnitudes reported in worldwide catalogues. Systematic trends are observed in the data regarding the temporal stability of magnitude estimations in Göttingen, as well as the apparent non-linearity of the instrument response of the Wiechert seismograph in Zagreb. We were unable to clearly identify their causes – possible explanations include effects caused by the interaction of the seismometer's frame and mass, as well as local soil conditions, but non-homogeneity of the reference catalogues cannot be ruled out. The results indicate that a careful re-examination and cross-checking of the reported magnitude figures for earthquakes from the first half of the 20th century is required.

2.
Seismic hazard analysis requires knowledge of the recurrence rates of large magnitude earthquakes that drive the hazard at low probabilities of interest for seismic design. Earthquake recurrence is usually determined through studies of the historic earthquake catalogue for a given region. Reliable historic catalogues generally span time periods of 100–200 years in North America, while large magnitude events (M ≥ 7) have recurrence rates on the order of hundreds or thousands of years in many areas, resulting in large uncertainty in recurrence rates for large events. Using Monte Carlo techniques and assuming typical recurrence parameters, we simulate earthquake catalogues that span long periods of time. We then split these catalogues into smaller catalogues spanning 100–200 years that mimic the length of historic catalogues. For each of these simulated "historic" catalogues, a recurrence rate for large magnitude events is determined. By comparing recurrence rates from one historic-length catalogue to another, we quantify the uncertainty associated with determining recurrence rates from short historic catalogues. The use of simulations to explore the uncertainty (rather than analytical solutions) allows us flexibility to consider issues such as the relative contributions of aleatory versus epistemic uncertainty, and the influence of fitting method, as well as lending insight into extreme-event statistics. The uncertainty in recurrence rates of large (M > 7) events is about a factor of two in regions of high seismicity, due to the shortness of historic catalogues. This uncertainty increases greatly with decreasing seismic activity. Uncertainty is dependent on the length of the catalogue as well as the fitting method used (least squares vs. maximum likelihood). Examination of 90th percentile recurrence rates reveals that epistemic uncertainty in the true parameters may cause recurrence rates determined from historic catalogues to be uncertain by a factor greater than 50.
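A minimal sketch of the general approach (not the authors' code): simulate one long Gutenberg-Richter/Poisson catalogue, cut it into historic-length windows, and compare the M ≥ 7 rate estimated in each window against the true rate. All parameter values (a, b, catalogue length, window length) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
a, b = 4.5, 1.0                     # assumed Gutenberg-Richter parameters (per year)
m_min, m_big = 4.0, 7.0
years_total, window = 100_000, 200  # long synthetic catalogue, "historic" window

# Annual rate of events above m_min from log10(N) = a - b*m
rate = 10 ** (a - b * m_min)
n_events = rng.poisson(rate * years_total)
times = rng.uniform(0, years_total, n_events)
# G-R magnitudes above m_min follow an exponential distribution
mags = m_min + rng.exponential(1 / (b * np.log(10)), n_events)

# Estimate the M >= m_big rate independently in each historic-length window
rates = []
for start in range(0, years_total, window):
    in_win = (times >= start) & (times < start + window)
    rates.append(np.sum(mags[in_win] >= m_big) / window)
rates = np.array(rates)

true_rate = 10 ** (a - b * m_big)
print(f"true M>={m_big} rate: {true_rate:.5f}/yr")
print(f"windowed estimates: median={np.median(rates):.5f}, "
      f"90th pct={np.percentile(rates, 90):.5f}")
```

The spread of the windowed estimates around the true rate is the uncertainty the abstract quantifies.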

3.
Using aftershock catalogues of the Jiuzhaigou MS7.0 earthquake covering different time spans after the mainshock, we estimate the actual maximum aftershock magnitude from the inferred maximum aftershock magnitude. The results show that the inferred maximum aftershock magnitude stabilizes as the post-mainshock time window lengthens, and that it agrees with the magnitude of the largest aftershock actually observed. Notably, for the Jiuzhaigou sequence, once the aftershock data are reasonably complete, the aftershock catalogue from a short interval after the mainshock (1–2 days) is sufficient to estimate the maximum aftershock magnitude that may occur in the mainshock region; in fact, the aftershock data from the first 12 h (0.5 day) already yield an effective lower bound on the maximum aftershock magnitude. In addition, we performed the calculation with both local-magnitude (ML) and surface-wave-magnitude (MS) aftershock catalogues, and the two gave identical estimates, indicating that estimating the actual maximum aftershock magnitude from the inferred maximum aftershock magnitude is insensitive to magnitude type. The rapid-assessment method can therefore be extended to the analysis and assessment of strong-aftershock hazard after moderate and strong earthquakes in mainland China. The fitting results also suggest that, as seismic monitoring and aftershock detection capabilities continue to improve, this rapid-assessment method can accurately predict the largest expected aftershock magnitude within a short time (<1 day) after the mainshock.
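The inferred-maximum-aftershock idea can be illustrated with a common b-value extrapolation (a hedged stand-in, not necessarily the paper's exact formula): fit the Aki maximum-likelihood b-value to the early aftershocks and take the magnitude at which the expected cumulative count falls to one.

```python
import numpy as np

def inferred_max_aftershock(mags, m_c):
    """mags: aftershock magnitudes; m_c: completeness magnitude."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    b = np.log10(np.e) / (m.mean() - m_c + 0.05)  # Aki MLE, +0.05 for 0.1 binning
    a = np.log10(len(m)) + b * m_c                # so that N(>=m_c) matches
    return a / b                                  # magnitude where N(>=M) = 1

# toy demonstration on a synthetic ML >= 2.0 aftershock sample
rng = np.random.default_rng(1)
demo = 2.0 + rng.exponential(1 / np.log(10), 500)
print(f"inferred maximum aftershock magnitude: {inferred_max_aftershock(demo, 2.0):.1f}")
```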

4.
Based on matched-filter technology, this paper uses SEPD (Seismic Events and Phase Detection) to scan the sequence of the 25 November 2018 Bole, Xinjiang MS4.9 earthquake, detecting 32 previously missed events: 84.4% of them fall in the range ML0.0–1.0 and 9.4% are smaller than ML0.0. This is 213% more than the 15 events in the original catalogue, so the detected events make the earthquake catalogue substantially more complete. The minimum magnitude of completeness decreases from ML1.6 before detection to ML0.8 afterwards. A lower completeness magnitude allows seismologists to draw more accurate and comprehensive conclusions about regional seismicity and makes seismic hazard analysis more reliable.
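A minimal sketch of the matched-filter core on which detectors of this kind are built (illustrative only; the actual SEPD implementation is not reproduced here): slide a template over continuous data, compute the normalized cross-correlation, and declare detections above a threshold expressed in multiples of the MAD of the correlation trace.

```python
import numpy as np

def matched_filter(data, template, threshold_mad=8.0):
    """Return sample indices where the normalized CC exceeds the MAD threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):                       # plain loop for clarity
        w = data[i:i + n]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s) / n
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc > threshold_mad * mad), cc

# toy continuous record with two hidden copies of the template
rng = np.random.default_rng(0)
tmpl = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)
noise = rng.normal(0, 0.5, 20_000)
noise[5_000:5_200] += tmpl
noise[12_345:12_545] += 0.6 * tmpl
picks, _ = matched_filter(noise, tmpl)
print("detections near samples:", picks[:5], "...")
```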

5.
This article applies a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, with Iran as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanisms, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the main challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning bands simulation (TBSIM), was applied to generate synthetic data that improve incomplete earthquake catalogues. The synthetic data were then added to the traditional information to study seismicity homogeneity and to classify areas according to tectonic and seismic properties. In this paper, (i) the different magnitude types in the studied catalogues were homogenized to moment magnitude (Mw), and the earthquakes were then declustered to remove aftershocks and foreshocks; (ii) a time-normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography was carried out in each subregion to study spatial correlation (e.g., the west-southwestern area showed ranges from 0.4 to 1.4 decimal degrees, with the maximum range in the azimuth of 135 ± 10); (iv) the TBSIM algorithm was then applied, producing 68,800 synthetic events according to the spatial correlation found in several directions; (v) the simulated events (i.e., magnitudes) were classified by intensity in ArcGIS, and homogeneous seismic zones were determined. Finally, according to the synthetic data, tectonic features, and the actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes: very high, high, moderate, and low seismic potential. Seismotectonic properties of the very high seismic potential provinces are also presented.
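A hedged sketch of the variography step (iii) above: an isotropic experimental variogram of event magnitudes over epicentral distance, the quantity whose range (here in decimal degrees) informs the simulation. This is not the TBSIM code, and the coordinates and magnitudes are synthetic stand-ins.

```python
import numpy as np

def experimental_variogram(x, y, z, lags):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs at distance ~h."""
    d = np.hypot(x[:, None] - x[None, :], y[:, None] - y[None, :])
    dz2 = (z[:, None] - z[None, :]) ** 2
    gam = []
    for h0, h1 in zip(lags[:-1], lags[1:]):
        mask = (d > h0) & (d <= h1)
        gam.append(0.5 * dz2[mask].mean() if mask.any() else np.nan)
    return np.array(gam)

rng = np.random.default_rng(8)
x, y = rng.uniform(44, 64, 300), rng.uniform(25, 40, 300)  # lon/lat in degrees
z = 4.0 + rng.exponential(0.4, 300)                        # event magnitudes
print(experimental_variogram(x, y, z, np.linspace(0, 2, 9)).round(3))
```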

6.
We examine the nature of the seismogenetic system along the San Andreas Fault (SAF), California, USA, by searching for evidence of complexity and non-extensivity in the earthquake record. We use accurate, complete and homogeneous earthquake catalogues in which aftershocks are included (raw catalogues) or have been removed by a stochastic declustering procedure (declustered catalogues). On the basis of Non-Extensive Statistical Physics (NESP), which generalizes the Boltzmann–Gibbs formalism to non-equilibrating (complex) systems, we investigate whether earthquakes are generated by an extensive self-excited Poisson process or by a non-extensive complex process. We examine bivariate cumulative frequency distributions of earthquake magnitudes and interevent times and determine the size and time dependence of the respective magnitude and temporal entropic indices, which indicate the level of non-equilibrium (correlation). It is shown that the magnitude entropic index is very stable and corresponds to proxy b-values that are remarkably consistent with the b-values computed by conventional means. The temporal entropic index computed from the raw catalogues indicates moderately to highly correlated states during the aftershock sequences of large earthquakes, progressing to quasi-uncorrelated states as these die out and before the next large event. Conversely, the analysis of the declustered catalogues shows that background seismicity exhibits moderate to high correlation that varies significantly, albeit smoothly, with time. This indicates a persistent sub-extensive seismogenetic system. The degree of correlation is generally higher in the southern SAF segment, which is consistent with the observation of shorter return periods for large earthquakes. A plausible explanation is that because aftershock sequences are localized in space and time, their efficient removal unveils long-range background interactions which are obscured by their presence. Our results indicate complexity in the expression of background seismicity along the San Andreas Fault, with criticality being a very likely mechanism as a consequence of the persistent non-equilibrium inferred from the temporal entropic index. However, definite conclusions cannot be drawn until the earthquake record is exhaustively studied in all its forms.
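A hedged sketch of the NESP-style temporal analysis: fit a Tsallis q-exponential to the survival distribution of interevent times and read off the temporal entropic index q (q → 1 recovers the ordinary exponential/Poisson case). This illustrates the general technique only; the stand-in data are synthetic and the fitting choices are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exp_survival(t, q, t0):
    # expq(-t/t0) = [1 - (1-q) t/t0]^(1/(1-q)), clipped where the bracket <= 0
    return np.maximum(1.0 - (1.0 - q) * t / t0, 0.0) ** (1.0 / (1.0 - q))

rng = np.random.default_rng(6)
tau = np.sort(rng.exponential(10.0, 2000))        # stand-in interevent times
surv = 1.0 - np.arange(1, len(tau) + 1) / (len(tau) + 1)

(q, t0), _ = curve_fit(q_exp_survival, tau, surv, p0=(1.1, 10.0),
                       bounds=([1.0001, 0.1], [2.0, 1000.0]))
print(f"temporal entropic index q = {q:.3f} (Poisson data -> q near 1)")
```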

7.
Removal of aftershocks in seismicity analysis
陈凌  刘杰  陈颙  陈龙生 《地球物理学报》1998,41(Z1):244-252
Several methods for removing aftershocks are introduced, and a new magnitude-dependent space-time window method, based on the geometry of earthquake faults, is proposed. These methods were applied to remove aftershocks from four earthquake catalogues with different space-time scales, and frequency statistics and R/S analysis were performed on both the original and declustered catalogues. The results show that after aftershock removal the stationarity of the earthquake time series improves markedly and the independence of events increases, although some non-random component remains, chiefly in that the R/S analysis of the earthquake occurrence times gives a Hurst exponent H > 0.5. On this basis, the declustering methods and tests of their effectiveness are further discussed.
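A minimal sketch of magnitude-dependent space-time window declustering in the spirit described above. The paper proposes its own fault-based windows; the Gardner-Knopoff (1974) window sizes used here are a standard stand-in, and the flat-earth distance ignores the cos(latitude) term (small-region approximation).

```python
import numpy as np

def decluster(t_days, lon, lat, mags):
    """Flag events falling inside the space-time window of a larger, earlier event."""
    order = np.argsort(-np.asarray(mags))          # largest mainshocks first
    keep = np.ones(len(mags), dtype=bool)
    for i in order:
        if not keep[i]:
            continue
        dist_km = 111.2 * np.hypot(np.asarray(lon) - lon[i],
                                   np.asarray(lat) - lat[i])
        t_win = 10 ** (0.5409 * mags[i] - 0.547)   # days (G-K approximation)
        d_win = 10 ** (0.1238 * mags[i] + 0.983)   # km
        after = (np.asarray(t_days) > t_days[i]) & \
                (np.asarray(t_days) <= t_days[i] + t_win) & (dist_km <= d_win)
        keep[after & (np.asarray(mags) < mags[i])] = False
    return keep

# toy usage: M6 mainshock at t=0; two smaller nearby events fall in its window
print(decluster([0, 1, 400], [100, 100.05, 100], [35, 35.02, 35], [6.0, 4.0, 5.0]))
```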

8.
Several catalogues of global earthquakes reported for the time period from 1900 to 2000 have been compiled to examine lateral variations of the modal (a/b) values derived from the Gutenberg–Richter empirical law. For this purpose, the world was divided into 27 different seismic regions in terms of tectonic environments. The parameters a and b were calculated using the least-squares method. The modal values computed for each region were used to produce a global map of the modal values with a grid spacing of 3°. The results show that a and b values alone do not always supply much information about the tectonic environments of the different regions. It is observed that the modal values estimated for different tectonic regions are consistent with the seismicity of the world and represent global seismic sources better than a or b values. The highest modal values are found in the oceanic subduction zones, and the lowest values in the oceanic ridges. The lowest b values are observed in trenches. These observations suggest a correlation between apparent stresses and b values. Mapping of the modal values provides detailed images of zones of low and high seismic activity and may be used as a measure of potential seismic sources and relative hazard levels.
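A hedged sketch of the least-squares fit and the modal value used above: regress log10 of the cumulative counts on magnitude to get a and b, then form a/b, the magnitude at which the expected cumulative count is one. The synthetic sample is illustrative only.

```python
import numpy as np

def gr_least_squares(mags, m_c, dm=0.1):
    """Fit log10(N>=m) = a - b*m by least squares and return (a, b, a/b)."""
    m = np.asarray(mags)
    bins = np.arange(m_c, m.max() + dm, dm)
    n_cum = np.array([(m >= x).sum() for x in bins])
    ok = n_cum > 0
    slope, intercept = np.polyfit(bins[ok], np.log10(n_cum[ok]), 1)
    a, b = intercept, -slope
    return a, b, a / b            # a/b: magnitude where N(>=m) = 1

rng = np.random.default_rng(3)
sample = 4.0 + rng.exponential(1 / np.log(10), 2000)  # b ~ 1 synthetic sample
a, b, modal = gr_least_squares(sample, 4.0)
print(f"a={a:.2f}  b={b:.2f}  modal value a/b={modal:.2f}")
```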

9.
Simulation of artificial earthquake catalogues is an effective way to compensate for the incompleteness of existing catalogues and the scarcity of large-earthquake records, and thereby to advance seismological research. Based on a Poisson model of seismic activity and the Gutenberg-Richter magnitude-frequency relationship, we use the Monte Carlo method, which captures both the stochastic nature of seismicity and the physical experimental process, to simulate future earthquake catalogues of different durations for the Fenhe-Weihe seismic belt and to test them statistically. The analysis shows that the simulated catalogues satisfy the prescribed seismic activity parameters and the Poisson distribution hypothesis, i.e., the simulation yields catalogues consistent with the characteristics of regional seismic activity. Based on the simulated catalogues, future earthquake trends in the region are analyzed to provide a reference for seismic hazard analysis.
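One statistical test of the kind mentioned above can be sketched as follows (an illustration, not the paper's exact test): check whether yearly event counts of a simulated catalogue are consistent with a Poisson process using the dispersion index, whose conditional distribution under the Poisson hypothesis is chi-square.

```python
import numpy as np

rng = np.random.default_rng(7)
rate = 3.2                              # assumed events/year above threshold
counts = rng.poisson(rate, 500)         # stand-in for yearly counts of a simulated catalogue

dispersion = counts.var(ddof=1) / counts.mean()
# Under H0, (n-1)*dispersion ~ chi-square with n-1 degrees of freedom
stat = (len(counts) - 1) * dispersion
print(f"dispersion index = {dispersion:.3f} (1.0 expected for Poisson)")
print(f"chi-square statistic = {stat:.1f} with {len(counts) - 1} dof")
```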

10.
Large data sets covering large areas and time spans and composed of many different independent sources raise the question of the degree of harmonization obtained. The present study analyses the harmonization with respect to moment magnitude Mw within the earthquake catalogue for central, northern, and northwestern Europe (CENEC). The CENEC earthquake catalogue (Grünthal et al., J Seismol, 2009) contains parameters for over 8,000 events in the time period 1000–2004 with magnitude Mw ≥ 3.5. Only about 2% of the data used for CENEC have original Mw magnitudes derived directly from digital data. Some of the local catalogues and data files provide Mw, but calculated by the respective agency from other magnitude measures or intensity. About 60% of the local data give strength measures other than Mw, and these had to be transformed by us using available formulae or new regressions based on original Mw data. Although all events are thus unified to Mw, inhomogeneity in the Mw obtained from over 40 local catalogues and data files and 50 special studies is inevitable. Two different approaches were followed to investigate the compatibility of the different Mw sets throughout CENEC. The first harmonization check uses Mw from moment tensor solutions from SMTS and Pondrelli et al. (Phys Earth Planet Inter 130:71–101, 2002; Phys Earth Planet Inter 164:90–112, 2007). The method to derive the SMTS is described, e.g., by Braunmiller et al. (Tectonophysics 356:5–22, 2002) and Bernardi et al. (Geophys J Int 157:703–716, 2004), and the data are available in greater quantity since 1997. One check is made against the Mw given in national catalogues and another against the Mw derived by applying different empirical relations developed for CENEC. The second harmonization check concerns the vast majority of data in CENEC, related to earthquakes prior to 1997 or for which no moment-tensor-based Mw exists. In this case, an empirical relation for the dependence of Mw on epicentral intensity (I0) and focal depth (h) was derived from 41 master events, i.e., earthquakes located all over central Europe with high-quality data. To include also the data lacking h, the corresponding depth-independent relation for these 41 events was also derived. These equations are compared with the different sets of data from which CENEC was composed, and the goodness of fit is demonstrated for each set. The vast majority of the events are very well or reasonably consistent with the respective relation, so the data can be said to be harmonized with respect to Mw, but there are exceptions, which are discussed in detail.
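A hedged sketch of the kind of empirical regression described above, fitting Mw = c0 + c1·I0 + c2·log10(h) on master events. The functional form is an assumption for illustration, and the coefficients below are fitted to fake data; they are not the CENEC coefficients.

```python
import numpy as np

rng = np.random.default_rng(11)
i0 = rng.uniform(4, 9, 41)                     # intensities of 41 "master" events
h = rng.uniform(5, 25, 41)                     # focal depths, km
mw_true = 0.6 * i0 + 1.0 * np.log10(h) + 0.5   # assumed ground truth
mw_obs = mw_true + rng.normal(0, 0.15, 41)     # observed Mw with scatter

# least-squares fit of Mw on [1, I0, log10(h)]
A = np.column_stack([np.ones_like(i0), i0, np.log10(h)])
coef, *_ = np.linalg.lstsq(A, mw_obs, rcond=None)
print("fitted [c0, c1, c2]:", np.round(coef, 3))
print("Mw for I0=7, h=10 km:", round(coef @ [1, 7, 1], 2))
```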

11.
Comparison of surface and borehole locations of induced seismicity
Monitoring of induced microseismic events has become an important tool in hydraulic fracture diagnostics and in understanding fractured reservoirs in general. We compare microseismic event locations and their uncertainties using data sets obtained with surface and downhole arrays of receivers. We first model the uncertainties to understand the effect of different acquisition geometries on location accuracy. For a vertical array of receivers in a single monitoring borehole, we find that the largest part of the final location uncertainty is related to estimation of the backazimuth. This is followed by uncertainty in the vertical position and radial distance from the receivers. For surface monitoring, the largest uncertainty lies in the vertical position due to the use of only a single phase (usually the P-wave) in the estimation of the event location. In surface monitoring results, lateral positions are estimated robustly and are not sensitive to the velocity model. In this case study, we compare event location solutions from two catalogues of microseismic events: one from a downhole array and the second from a surface array of 1C geophones. Our results show that origin time can be reliably used to find matching events between the downhole and surface catalogues. The locations of the corresponding events display a systematic shift consistent with a poorly calibrated velocity model for the downhole dataset. For this case study, locations derived from surface monitoring have less scatter in both vertical and horizontal directions.
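A minimal sketch of the catalogue-matching step described above (illustrative, with an assumed 0.5 s tolerance): pair surface and downhole events whose origin times agree within the tolerance, after which the paired locations can be compared for systematic offsets.

```python
import numpy as np

def match_by_origin_time(t_surface, t_downhole, tol_s=0.5):
    """Return index pairs (i, j) with |t_surface[i] - t_downhole[j]| <= tol_s."""
    pairs = []
    for i, ts in enumerate(t_surface):
        j = int(np.argmin(np.abs(np.asarray(t_downhole) - ts)))
        if abs(t_downhole[j] - ts) <= tol_s:
            pairs.append((i, j))
    return pairs

# toy origin times (seconds): the third surface event has no downhole match
print(match_by_origin_time([10.0, 55.2, 99.9], [10.1, 42.0, 55.0, 100.6]))
```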

12.
A discussion of the correlations among seismicity parameters used in short- and medium-term earthquake prediction
陆远忠  阎利军  郭若眉 《地震》1999,19(1):11-18
In research on predicting earthquakes from seismicity patterns, many parameters have been developed to quantitatively describe pattern characteristics; when they are used jointly for short-term prediction, the correlations among them are of great importance. Using five computer-generated random earthquake catalogues, in which occurrence times, locations, and magnitudes follow uniform, Poisson, negative-exponential, and Weibull distributions, respectively, together with a natural catalogue of North China covering a period without strong earthquakes, we computed the temporal variations of the b, C, D, Mf, and N values and statistically analysed their inter-correlations. The results indicate that the D and N values are significantly positively correlated, while the b and Mf values are significantly negatively correlated.
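A hedged sketch of the windowed-correlation methodology (not the paper's code, and only two of the five parameters): compute a b-value and an event count N in consecutive time windows of a random catalogue and examine their sample correlation.

```python
import numpy as np

rng = np.random.default_rng(5)
n_ev = 5000
times = np.sort(rng.uniform(0, 1000, n_ev))          # days; Poisson-like occurrence
mags = 3.0 + rng.exponential(1 / np.log(10), n_ev)   # G-R magnitudes, b ~ 1

b_vals, n_vals = [], []
for t0 in range(0, 1000, 20):                        # 50 consecutive time windows
    w = mags[(times >= t0) & (times < t0 + 20)]
    if len(w) >= 30:                                 # skip poorly populated windows
        b_vals.append(np.log10(np.e) / (w.mean() - 3.0 + 0.05))  # Aki MLE b
        n_vals.append(len(w))

r = np.corrcoef(b_vals, n_vals)[0, 1]
print(f"corr(b, N) across windows: {r:.3f}")
```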

13.
We present a cellular automaton model which simulates the process of seismogenesis using evolution rules derived from the field of fracture mechanics, including an interplay of positive and negative feedbacks. We describe the implementation of this model, and its analysis, in a massively parallel environment using the Connection Machine. Starting from a lattice with a fractal distribution of fracture toughnesses, the b value evolves in a way which closely mimics the evolutions of b value observed in the laboratory and derived from earthquake catalogues, reaching a broad and irregular maximum in the period preceding a major event and declining rapidly during catastrophic failure. We conclude that the processes modelled are a reasonable representation of those occurring in Nature, and that the cellular automaton paradigm is a valuable way of simulating these processes on a large scale in an economical manner.

14.
The algorithm CN makes use of normalized functions. Therefore the original algorithm, developed for the California-Nevada region, can be applied directly, without adjustment of the parameters, to the determination of the Time of Increased Probability (TIP) of strong earthquakes for Central Italy. The prediction is applied to events with magnitude M ≥ M0 = 5.6, which in Central Italy have a return period of about six years. The routinely available digital earthquake bulletins of the Istituto Nazionale di Geofisica (ING), Rome, permit continuous monitoring. Here we extend to November 1994 the first study made by Keilis-Borok et al. (1990b). On the basis of the combined analysis of seismicity and seismotectonics, we formulate a new regionalization, which reduces the total alarm time and the failures to predict, and narrows the spatial uncertainty of the prediction with respect to the results of Keilis-Borok et al. (1990b). The premonitory pattern is stable when the key parameters of the CN algorithm and the duration of the learning period are changed, and when different earthquake catalogues are used. The analysis of the period 1904–1940, for which M0 = 6, allows us to identify self-similar properties between the two periods, in spite of the considerably higher seismicity level of the earlier time interval compared with the recent one.

15.
Analysis of the completeness of the historical earthquake catalogue for Qinghai Province and neighbouring regions
Taking b-value studies as a basis and applying several methods, the completeness of the earthquake catalogue for Qinghai and neighbouring regions is analysed, and the approximate starting years of complete reporting are determined for different lower magnitude thresholds. The results are as follows: the catalogue is complete for M ≥ 4.7 events from about 1960, for M ≥ 5.0 events from about 1950, for M ≥ 5.5 events from about 1925, and for M ≥ 6 events from about 1917.
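A hedged sketch of one simple completeness heuristic in the spirit of such studies (the paper's own multi-method procedure is not reproduced): for a given magnitude class, the event rate computed from a candidate start year onward should approach the rate of the well-recorded recent era once the start year passes the incomplete period.

```python
import numpy as np

def completeness_start(years, recent_span=30, tol=0.9):
    """Earliest start year whose rate is within tol of the recent-era rate."""
    years = np.sort(np.asarray(years, dtype=float))
    end = years.max()
    recent_rate = (years >= end - recent_span).sum() / recent_span
    for y0 in np.arange(years.min(), end - recent_span):
        rate = (years >= y0).sum() / (end - y0)
        if rate >= tol * recent_rate:
            return int(y0)
    return int(end - recent_span)

# toy catalogue: reporting starts ~1900 but is only complete after ~1950
rng = np.random.default_rng(2)
full = rng.uniform(1950, 2020, 140)      # ~2 events/yr once complete
sparse = rng.uniform(1900, 1950, 15)     # under-reported early era
print("estimated completeness start:",
      completeness_start(np.concatenate([full, sparse])))
```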

16.
This paper provides a generic equation for the evaluation of the maximum earthquake magnitude mmax for a given seismogenic zone or entire region. The equation is capable of generating solutions in different forms, depending on the assumptions of the statistical distribution model and/or the available information regarding past seismicity. It includes the cases (i) when earthquake magnitudes are distributed according to the doubly-truncated Gutenberg-Richter relation, (ii) when the empirical magnitude distribution deviates moderately from the Gutenberg-Richter relation, and (iii) when no specific type of magnitude distribution is assumed. Both synthetic, Monte-Carlo simulated seismic event catalogues and actual data from Southern California are used to demonstrate the procedures for the evaluation of mmax. The three estimates of mmax for Southern California, obtained by the three procedures above, are 8.32 ± 0.43, 8.31 ± 0.42, and 8.34 ± 0.45, respectively. All three estimates are nearly identical, although higher than the value 7.99 obtained by Field et al. (1999). In general, since the third procedure is non-parametric and does not require specification of the functional form of the magnitude distribution, its estimate of mmax is considered more reliable than the other two, which are based on the Gutenberg-Richter relation.
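Two quick mmax estimators of the general types discussed above can be sketched as follows; neither is necessarily the paper's exact formulation. The Gutenberg-Richter case uses the Tate/Pisarenko-style correction mmax ≈ m_obs_max + 1/(n·β) with β = b·ln(10); the non-parametric case uses the Robson-Whitlock estimator built from the two largest observed magnitudes.

```python
import numpy as np

def mmax_gr(mags, b):
    """G-R-based estimate: observed maximum plus a 1/(n*beta) bias correction."""
    m = np.sort(mags)
    beta = b * np.log(10)
    return m[-1] + 1.0 / (len(m) * beta)

def mmax_robson_whitlock(mags):
    """Non-parametric estimate: add the gap to the second-largest magnitude."""
    m = np.sort(mags)
    return 2 * m[-1] - m[-2]

rng = np.random.default_rng(9)
sample = 4.0 + rng.exponential(1 / np.log(10), 1000)   # synthetic b ~ 1 catalogue
print(f"G-R based  : {mmax_gr(sample, 1.0):.2f}")
print(f"non-param. : {mmax_robson_whitlock(sample):.2f}")
```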

17.
The earthquake recurrence time distribution in a given space-time window is studied using earthquake catalogues from different seismic regions (Southern California, Canada, and Central Asia). The quality of the available catalogues is examined with respect to magnitude completeness. Based on the analysis of the catalogues, it is found that the probability densities of earthquake recurrence times can be described by a universal gamma distribution in which time is normalized by the mean rate of occurrence. The results show a deviation from the gamma distribution at short interevent times, suggesting the existence of clustering. This holds from worldwide to local scales and for quite different tectonic environments.
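A hedged sketch of the universal-scaling check described above: normalize interevent times by their mean and fit a gamma distribution to the dimensionless recurrence times (scipy's MLE fitter is used here for convenience; the stand-in data are synthetic).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# stand-in origin times; a real test would use catalogue origin times
times = np.sort(rng.uniform(0, 1e6, 5000))
tau = np.diff(times)
theta = tau / tau.mean()                 # dimensionless recurrence times

shape, loc, scale = stats.gamma.fit(theta, floc=0)
print(f"gamma shape={shape:.2f}, scale={scale:.2f} (shape=1 -> pure Poisson)")
```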

18.
The routine location of regional seismic events using data from the Czech National Seismological Network (CNSN) is based on Pn, Pg, Sn, and Sg phases. A simple velocity model derived from Kárník's (1953) interpretation of an earthquake in Northern Hungary in 1951 has hitherto been used. At present, numerous local seismic networks record and locate local events, which are occasionally recorded at regional distances as well. Due to the relatively small dimensions of local networks, hypocenters (and origin times) determined by a local network may be considered nearly exact from the point of view of the regional-scale CNSN. The comparison of common locations performed by the CNSN and by a local network enables us to estimate the accuracy of CNSN locations, as well as to optimize a simple velocity model. The joint interpretation of the CNSN bulletin and the catalogues of four local seismic networks, WEBNET, OSTRAVA, KLADNO and LUBIN, produced a new 1D velocity model. The most frequent epicentral error in this model is less than 5 km, and most foci lie within 15 km of the true position. The performed analysis indicates a bimodal distribution of Sn residuals.

19.
The parametric catalogues of historical earthquakes in East Siberia contain large data gaps. Among these is a 15-year period in the late nineteenth century (1886–1901). This period was not covered by any of the known macroseismic catalogues; neither acquisition nor systematization of macroseismic data was ever performed for that purpose. However, 15 years is a rather long period during which large seismic events may have occurred. The present paper deals with a previously unknown earthquake that occurred on November 13, 1898. The primary macroseismic data were taken from regional periodicals. On the strength of all the evidence obtained, the earthquake epicenter is localized in Western Transbaikalia, near the western end of the Malkhansky Range; the magnitude is estimated at M = 5.9. The information about the large earthquake of November 13, 1898 fills significant gaps in the knowledge of seismicity in Western Transbaikalia and improves understanding of the seismic potential of faults therein. The obtained results show that periods of seismic quiescence in catalogues may be related to insufficient information on the seismicity of Eastern Siberia in the historical past rather than to the absence of large earthquakes.

20.
Accurate and timely access to earthquake catalogues is an important basis for short-term and imminent earthquake prediction and for scientific research. Building on a detailed analysis of the principles of Oracle Advanced Queuing and on existing methods of earthquake catalogue transfer, this paper presents a data transmission model and a worked example, implemented in PL/SQL, of transferring earthquake catalogue data with Oracle Advanced Queuing. The results show that Advanced Queuing is fully feasible for catalogue transfer and well suited for use in seismological data processing systems, providing a new means of transmitting earthquake catalogues.

