Similar documents
20 similar documents found (search time: 31 ms)
1.
Douglas, John; Azarbakht, Alireza. Natural Hazards (2021) 105(1): 293–311

In the past couple of decades, Operational Earthquake Forecasting (OEF) has been proposed as a way of mitigating earthquake risk. In particular, it has the potential to reduce human losses (injuries and deaths) by triggering actions such as reinforcing earthquake drills and preventing access to vulnerable structures during a period of increased seismic hazard. Despite the dramatic increases in seismic hazard in the immediate period before a mainshock (increases of up to 1000 times have been observed), the probability of a potentially damaging earthquake occurring in the coming days or weeks remains small (generally less than 5%). Therefore, it is necessary to balance the definite cost of taking an action against the uncertain chance that it will mitigate earthquake losses. In this article, parametric cost-benefit analyses using a recent seismic hazard model for Europe and a wide range of inputs are conducted to assess when potential actions for short-term OEF are cost-beneficial prior to a severe mainshock. Ninety-six maps for various combinations of input parameters are presented. These maps show that low-cost actions (costing less than 1% of the mitigated losses) are cost-beneficial within the context of OEF for areas of moderate to high seismicity in the Mediterranean region. The actions triggered by OEF in northern areas of the continent are, however, unlikely to be cost-beneficial unless very large increases in seismicity are observed or very low-cost actions are possible.
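The cost-benefit criterion in this line of work reduces to a cost-loss decision rule: act when the certain cost of the action is below the expected loss it would avert. A minimal sketch with invented numbers (none taken from the paper):

```python
# Cost-loss decision rule behind OEF cost-benefit analysis: a protective
# action pays off when the expected avoided loss exceeds its certain cost.
# All numbers below are illustrative assumptions, not values from the study.

def action_is_cost_beneficial(action_cost, p_event, mitigable_loss):
    """True when p_event * mitigable_loss (expected avoided loss) > cost."""
    return p_event * mitigable_loss > action_cost

mitigable_loss = 1_000_000.0           # monetary units
p_event = 0.02                         # chance of a damaging event in the window
cheap_action = 0.005 * mitigable_loss  # costs 0.5% of the mitigable losses
costly_action = 0.05 * mitigable_loss  # costs 5% of the mitigable losses

print(action_is_cost_beneficial(cheap_action, p_event, mitigable_loss))   # True
print(action_is_cost_beneficial(costly_action, p_event, mitigable_loss))  # False
```

Equivalently, an action is worthwhile when its cost/loss ratio is below the event probability, which is why only very low-cost actions clear the bar while probabilities stay under 5%.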


2.

In this paper, seismic risk scenarios for Bucharest, the capital city of Romania, are proposed and assessed. Bucharest has one of the highest seismic risk levels in Europe, due to a combination of relatively high seismic hazard and a building stock built mainly before the devastating Vrancea earthquake of 1977. In this study, the seismic risk of Bucharest is assessed using the most recent information regarding the characteristics of the residential building stock. The ground motion amplitudes are evaluated starting from random fields obtained by coupling a ground motion model derived for the Vrancea intermediate-depth seismic source with a spatial correlation model. The seismic risk evaluation method applied in this study is based on the well-known macroseismic method. For several structural typologies, the vulnerability parameters are evaluated based on a damage survey performed on 18,000 buildings in Bucharest after the March 1977 earthquake. Subsequently, the risk metrics are compared with those from other studies in the literature that apply different risk assessment methodologies, in order to gain a better view of the uncertainties associated with a seismic risk study at the city level. Finally, the impact of several Vrancea intermediate-depth earthquake scenarios is evaluated; the results show that the scenario whose epicenter is closest to Bucharest appears to be the most damaging.


3.
Shoubiao Zhu. Natural Hazards (2013) 69(2): 1261–1279
The sudden and unexpected Wenchuan earthquake (Ms = 8.0) occurred on the Longmen Shan Fault, causing a large number of casualties and huge property losses. Almost no definite precursors were reported prior to this event by Chinese scientists, who had made the first successful earthquake prediction with the 1975 Haicheng earthquake (M = 7.3) in China. Does the unsuccessful prediction of the Wenchuan earthquake mean that earthquake prediction is inherently impossible? To answer this question, this paper simulated inter- and co-seismic deformation, and the recurrence of strong earthquakes, associated with the Longmen Shan listric thrust fault by means of a viscoelastic finite element method. The modeling results show that the computed interseismic strain accumulation in the lower crust beneath eastern Tibet is much faster than in other regions. In particular, the elastic strain energy density rate accumulates very rapidly in and around the Longmen Shan fault at depths above ~25 km, which may explain why the great Wenchuan earthquake occurred in a region with such a slow surface deformation rate. The modeled coseismic displacements around the fault are consistent with the surface rupture, the aftershock distribution, and GPS measurements. The model also reproduces the slip history on the Longmen Shan fault, implying that the average earthquake recurrence interval on the fault is very long, about 3,300 years, in good agreement with values observed in paleoseismological investigations and estimated by other methods. Moreover, the model results indicate that future earthquakes could be evaluated on the basis of numerical computation, rather than precursors or statistics. Numerical earthquake prediction (NEP) seems to be a promising avenue to successful prediction and could play an important part in natural hazard mitigation. NEP is difficult but possible, and it deserves strong support.
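The recurrence-interval figure can be illustrated with back-of-the-envelope slip-budget arithmetic (assumed numbers chosen only to reproduce the order of magnitude; the paper itself derives the interval from a viscoelastic finite element model, not from this formula):

```python
# Back-of-the-envelope earthquake recurrence from a slip budget:
# interval = coseismic slip released / long-term fault loading rate.
# Both numbers are assumptions for illustration, not the paper's FE results.
coseismic_slip_m = 3.3          # slip released in one large event (m), assumed
slip_rate_mm_per_yr = 1.0       # slow loading rate on the fault, assumed

recurrence_yr = coseismic_slip_m / (slip_rate_mm_per_yr / 1000.0)
print(f"recurrence interval ~ {recurrence_yr:.0f} years")  # ~ 3300 years
```

A millimeter-per-year loading rate against meters of slip per event is exactly why a fault with almost no visible surface deformation can still host a great earthquake every few thousand years.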

4.
The work presented in this paper is an outgrowth of a multi-year study at the Wharton School of the University of Pennsylvania on Managing Catastrophic Risks. We focus on the role of homeowners and insurance companies in managing earthquake risk. Specifically, we consider alternative earthquake disaster management strategies for a typical homeowner and a small insurance company in the Oakland, California region. These strategies involve the adoption of mitigation measures and the purchase of earthquake insurance by the homeowner and the purchase of an indemnity contract (e.g., excess-of-loss reinsurance) by the insurer.

We focus on how uncertainty impacts these disaster management strategies. Specifically, we illustrate the impact of structural mitigation and risk-transfer mechanisms on the insurer's performance when there is uncertainty in the company's risk profile. This risk profile is captured through a loss exceedance probability (EP) curve, representing the probability that a certain level of monetary loss will be exceeded on an annual basis. Parameters considered in the sensitivity analysis that shift the loss EP curve include: earthquake recurrence, ground motion attenuation, soil mapping schemes, and the exposure and vulnerability of the residential structures. The paper demonstrates how uncertainty in these parameters impacts the cost effectiveness of mitigation and reinsurance on the insurer's profitability and chances of insolvency, as well as the number of policies the insurer is willing to issue.
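An EP curve of the kind described can be estimated empirically from simulated annual losses; a minimal sketch with a synthetic loss sample (the lognormal parameters are arbitrary, not calibrated to any real portfolio):

```python
import numpy as np

# Empirical loss exceedance probability (EP) curve: the annual probability
# that losses exceed each threshold, estimated from simulated annual losses.
# The loss sample below is synthetic, for illustration only.
rng = np.random.default_rng(0)
annual_losses = rng.lognormal(mean=12.0, sigma=1.5, size=10_000)

def exceedance_probability(losses, threshold):
    """Fraction of simulated years in which the loss exceeds the threshold."""
    return float(np.mean(losses > threshold))

for threshold in (1e5, 1e6, 1e7):
    p = exceedance_probability(annual_losses, threshold)
    print(f"P(annual loss > {threshold:,.0f}) ~ {p:.3f}")
```

Shifting any of the parameters listed above (recurrence, attenuation, soils, exposure, vulnerability) amounts to shifting this curve, which is why the sensitivity analysis is framed in terms of the EP curve.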

5.

Eight caves near Saint-Paul-de-Fenouillet were investigated after the magnitude 5.2 earthquake of February 1996, which occurred in the eastern Pyrenees (France) and caused moderate damage at the ground surface. The earthquake has been associated with the movement of an E-W fault. The caves had not been visited since the earthquake. Some damage, mainly collapses of soda straws and small rocks, could be attributed to this earthquake. The most interesting cave in the epicentral area is the Paradet cave, which is situated on a recently activated fault plane. In this cave, soda straw falls could be attributed to the earthquake, but other, more ancient damage was also observed. Analysis of the azimuths of fallen speleothems, which behave as natural pendulums, may indicate the directions of ground motion, and an estimation of their mechanical properties gives the threshold of seismic ground motion amplitude responsible for their collapse, thus supplying information with which to calibrate damage due to past earthquakes. A statistical study indicates that the main direction of the collapsed soda straws is E-W. Numerical simulations confirm that soda straws are relatively strong objects that may nevertheless break under certain conditions during earthquakes. © Elsevier, Paris
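The mechanical-threshold idea can be sketched as a simple cantilever-beam calculation: treat the soda straw as a vertical tube fixed at the ceiling and ask what horizontal acceleration produces a base bending stress equal to the material strength. All geometric and material values below are assumptions for illustration; the paper's actual models are more detailed.

```python
import math

# Soda straw as a hollow cantilever tube under uniform horizontal
# acceleration a: base moment M = m*a*L/2, failure when M*r_out/I equals
# the tensile strength. Every numeric value here is an assumption.
length = 0.5            # m, straw length
r_out = 0.0025          # m, outer radius
r_in = 0.0020           # m, inner radius (hollow tube)
density = 2700.0        # kg/m^3, calcite
strength = 1.0e6        # Pa, assumed tensile strength

area = math.pi * (r_out**2 - r_in**2)
mass = density * area * length
I = math.pi / 4.0 * (r_out**4 - r_in**4)   # second moment of area

# Solve M*r_out/I = strength with M = m*a*L/2 for the acceleration a.
a_fail = 2.0 * strength * I / (mass * length * r_out)
print(f"failure acceleration ~ {a_fail:.2f} m/s^2")
```

With these assumed values the threshold lands at a few m/s² (a few tenths of g), consistent with the idea that soda straws survive weak shaking but snap in moderate to strong ground motion.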

6.

Complexity in the earthquake mechanism is manifested in different forms, such as fractal distributions and clustering of seismicity, and is characterized as a critical phenomenon. Occurrences of earthquakes generally represent a state of metastable equilibrium. The Andaman-Sumatra subduction zone is one of the most seismically active corridors (possibly in a metastable state) in the world. Recently, the region faced three major earthquakes of magnitude greater than 8.5 (M ~ 9.1 on December 26, 2004; M ~ 8.6 on March 28, 2005; M ~ 8.6 on April 11, 2012). Researchers have suggested multiple causes of earthquake generation in this region, including a possible correlation of tidal stresses with earthquake occurrences. The latter issue, however, has been hotly debated, since the small stress generated by tidal forcing cannot by itself cause earthquakes of such large magnitude. We study here the impact of tidal forcing on critically generated earthquake phenomena. We examined the statistical behavior of the recurrence time interval of earthquakes using the available data for a period of about 40 years, from 1973 to 2013. We constrain a simple empirical toy model using the concepts of catastrophe theory to evaluate the impact of small tidal forcing on the critical state of earthquake occurrences. In addition to the major role of Helmholtz free energy during plate motion, our analysis suggests that the stability and critical behavior of earthquakes in the Sumatra region could be associated with tidal forcing, although only as a trigger of some of the "catastrophic-chaotic" earthquake phenomena.


7.
According to the latest UNFPA report on the state of the world population (2007), "Unleashing the Potential of Urban Growth", by 2030 the urban population will rise to 5 billion, or 60% of the world population. Liquefaction in urban areas is a dangerous phenomenon that causes severe damage to buildings and loss of human life. Chennai, the capital city of the State of Tamil Nadu in India, is one of the most densely populated cities in the country. The city has experienced moderate-magnitude earthquakes in the past and is categorized under moderate seismic hazard as per the Bureau of Indian Standards (BIS, Criteria for Earthquake Resistant Design of Structures, IS 1893, 2001). A study has been carried out to evaluate the liquefaction potential of Chennai city using geological and geomorphological characteristics. The subsurface lithology and geomorphological maps were combined on a GIS platform for assessing the liquefaction potential. The liquefaction hazard was broadly classified into three categories: areas where liquefaction is likely, possible, and not likely. The areas where liquefaction is likely spread mainly along the coast and around the river beds; the rest of the areas fall into the possible and not-likely categories. The present map can be used as first-hand information on the regional liquefaction potential of the city and will be helpful to the scientists, engineers, and planners who are working on future site-specific studies of the city.

8.
Wyss, Max. Natural Hazards (2016) 80(1): 141–152

The number of fatalities in the Gorkha M7.8 earthquake of April 25, 2015, was estimated at four different times, as follows. In March 2005, the fatality estimate in this journal was 21,000–42,000, with an assumed magnitude of 8.1 (Wyss in Nat Hazard 34:305–314, 2005). Within hours after the earthquake, the number of fatalities estimated by QLARM was 2,000–10,000, using a point source model and M7.9. Four hours later, the estimate was 20,000–100,000, based on a first-approximation line source model and the assumption that children were in school. Because this was a weekend day, children were out of school, which reduced the fatalities by approximately a factor of two, but this was not taken into account in the calculation. The final line source estimates, based on M7.8 and M7.9, yield 800–9,300 and 1,100–11,200 fatalities, respectively. The official count is about 10,000 fatalities. These estimates were performed using QLARM, a computer tool with a worldwide data set on the distribution of people in settlements, containing a model of the buildings present. It is argued here that a loss estimate made 10 years before the event that falls within a factor of 2.1 of the eventual loss count is useful for mitigation planning. With varying quality of information on the source and the attenuation, the fatality estimates made shortly after the earthquake are accurate enough to be useful for first responders. With full knowledge of the rupture properties and the regional attenuation of seismic waves, the numbers of human losses are estimated correctly.


9.
With the enormous improvements in the ability to acquire real-time seismic observation data and in computer performance, it has become possible to rapidly generate the spatial distribution of instrumentally observed ground motion and shaking intensity. Combining this capability with research results on seismic vulnerability and with earthquake disaster assessment models makes it possible to rapidly estimate the losses caused by destructive earthquakes, providing more reliable information support for earthquake emergency response and command decision-making. This paper reviews the latest progress, in China and abroad, in rapid earthquake damage assessment methods based on strong ground motion acceleration observations, and proposes a basic approach for rapid earthquake damage assessment using observations from our province's strong-motion network.

10.
The goal of this study was to determine whether principal component analysis (PCA) can be used to process GPS ionospheric total electron content (TEC) data on a monthly basis to identify early earthquake-associated TEC anomalies. PCA is applied to GPS ionospheric TEC records (monthly mean values) collected from the Japan GEONET system to detect TEC anomalies associated with 10 earthquakes in Japan (M ≥ 6.0) from 2006 to 2007. According to the results, PCA was able to discriminate clear TEC anomalies in the months in which all 10 earthquakes occurred. After reviewing the months in which no M ≥ 6.0 earthquake occurred but geomagnetic storm activity was present, it is possible that the maximal principal eigenvalues PCA returned for these 10 earthquakes indicate earthquake-associated TEC anomalies. Previously, PCA has been used to discriminate earthquake-associated TEC anomalies recognized by other researchers, who found that a statistical association between large earthquakes and TEC anomalies could be established in the 5 days before earthquake nucleation and in the 24 h before an earthquake; however, since PCA uses the characteristics of principal eigenvalues to determine earthquake-related TEC anomalies, it is possible to show that such anomalies existed earlier than this 5-day statistical window. In this paper, this is shown through the application of PCA to one-dimensional TEC data relating to the earthquake of 17 February 2007 (M = 6.0). The analysis is applied to daily TEC and reveals a large principal eigenvalue (representative of an earthquake-associated anomaly) for 02 February, 15 days before the 17 February earthquake.
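The core signal the study relies on, a large leading PCA eigenvalue when stations vary coherently, can be sketched with synthetic data (the matrix below stands in for days × stations of TEC; real GEONET processing is far more involved):

```python
import numpy as np

# Leading PCA eigenvalue as an anomaly indicator: a disturbance shared
# across stations concentrates variance in the first principal component.
# The "TEC" data here are synthetic stand-ins, not GEONET records.
rng = np.random.default_rng(1)
normal_days = rng.normal(20.0, 1.0, size=(30, 8))   # quiet background TEC
anomalous = normal_days.copy()
# Add a common-mode disturbance felt by all 8 stations at once.
anomalous += np.outer(rng.normal(0.0, 3.0, 30), np.ones(8))

def leading_eigenvalue(data):
    """Largest eigenvalue of the covariance matrix (top PC's variance)."""
    centered = data - data.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    return float(np.linalg.eigvalsh(cov)[-1])        # eigvalsh sorts ascending

print(leading_eigenvalue(normal_days))   # small: no common mode
print(leading_eigenvalue(anomalous))     # large: shared anomaly across stations
```

Independent station noise spreads variance across all components, so the leading eigenvalue stays near the per-station variance; a geophysically coherent disturbance inflates it sharply, which is the discriminant the study exploits.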

11.
Li, Fan; Wang, Lin; Jin, Zhigang; Huang, Lifang; Xia, Bo. Natural Hazards (2019) 104(1): 101–121

Most disaster research has focused on business recovery at a point in time or over a short period of time, with the goal of summarizing the experience to reduce business vulnerability in future disasters. However, studies on long-term business recovery processes may be more useful for providing lessons that support sustained business operations after a disaster. This study considers the changes in businesses' operating statuses following the initial survival of a large earthquake and examines how different factors influence sustained business operations during the long-term recovery after a disaster. The study uses logistic regression techniques along with field investigations and questionnaire data collected from 256 New Beichuan businesses that remained open following the 2008 Wenchuan earthquake in China. The study results showed that some of the original surviving businesses closed during the subsequent post-disaster operation process. As such, identifying businesses reopening after the disaster cannot be equated with long-term recovery. Factors significantly influencing the sustained operation of a business after the Wenchuan earthquake included: pre-disaster financial conditions, post-disaster monthly average profit, borrowing of money from family or friends, the business owner's gender, and government subsidies. Study findings have important theoretical implications for research on the long-term recovery of businesses after an earthquake. Findings also have practical value for business owners selecting post-disaster sustainable operation strategies.
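The statistical setup, logistic regression of a binary "still operating" outcome on explanatory factors, can be sketched as follows (synthetic data; the factor names echo the abstract but every value and coefficient is invented):

```python
import numpy as np

# Logistic regression of a binary survival outcome on two standardized
# explanatory factors, fitted by gradient ascent on the log-likelihood.
# Data are synthetic stand-ins for the questionnaire variables.
rng = np.random.default_rng(2)
n = 256
pre_disaster_finances = rng.normal(0, 1, n)   # standardized factor, invented
monthly_profit = rng.normal(0, 1, n)          # standardized factor, invented
X = np.column_stack([np.ones(n), pre_disaster_finances, monthly_profit])

true_beta = np.array([0.3, 1.2, 0.8])         # assumed "true" effects
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
stays_open = (rng.random(n) < p).astype(float)

beta = np.zeros(3)
for _ in range(5000):
    pred = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (stays_open - pred) / n   # log-likelihood gradient

print(beta)  # estimated coefficients should share the signs of true_beta
```

Positive fitted coefficients on the finance and profit factors would correspond to the abstract's finding that stronger pre-disaster finances and higher post-disaster profit raise the odds of sustained operation.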


12.
The author, using the theory of physical similarity as developed in the U. S. S. R. and equations describing the development of folds and faults in rocks, theoretically proves the possibility of using scale models in tectonophysics.

New instruments necessary for investigation of equivalent materials (which are necessary for conditions of similarity) have been created in the U. S. S. R. Some substances having properties meeting model-material requirements have been known for a long time. New materials with the required properties have also been created. As a result, scale models can be practically used to study tectonic deformation and fractures.

The fundamental principles of the optical method of investigation of stress state of elastic and plastic transparent models are described, indicating that the scale-model method may be used for the investigation of the tectonic-stress fields in the earth's crust.

Three examples demonstrate the ability of the scale-model method to help solve different geological and geophysical problems. The hypothetical physical conditions of two types of folds, longitudinal bending and longitudinal thickening, were checked.

The notions about the distribution of tectonic faults formed during the growth of transversal bending anticlines were made more precise with the aid of transparent models.

Transparent plastic models are used to study the ratio of the magnitude of tangential stresses in the region of earthquake foci to the velocity of movement of the earth's crust. In elastic transparent models it is possible to see changes in the character of earthquake foci over time due to the development of tectonic faults. In such models, the influence of the type of tectonic deformation and the fault magnitude on the amount of seismic energy radiating from the earthquake focus can be studied. These data cannot be obtained by field investigations alone. Therefore, even experimental information obtained from scale models facilitates the development of geological criteria of seismicity.

13.
In 2014, China will pilot its first earthquake insurance program, and the inhabitants' perception of earthquake risk in the pilot area is significant for the implementation of this plan. In this study, the authors conducted a field survey in four districts in the insurance pilot area to investigate the factors affecting residents' earthquake risk perception. The survey concentrates on the factors of hazard experience and residents' house type and shows that people who have experienced more earthquakes tend to have a lower risk perception, while people who have suffered serious earthquake losses tend to have a higher risk perception. For the house type factor, the authors find that house type is correlated with the perceived risk from an earthquake: the perceived risk is significantly reduced when people reinforce their houses with brick walls, concrete beams, and columns. Furthermore, gender, income, and education level also have direct effects on how residents perceive the risk from an earthquake.

14.

A magnitude 6.8 earthquake that occurred on January 24, 2020, affected Turkey's eastern regions. The earthquake, with a recorded peak ground acceleration (PGA) of 0.292 g, caused the destruction of, or heavy damage to, buildings, especially in the city center of Elazığ province. The purpose of this paper was to share the results of a detailed investigation in the earthquake-stricken area. Additionally, the causes of the damage and failures observed in the buildings were compared to those that had occurred in previous earthquakes in Turkey. In this study, the damage observed especially in reinforced-concrete (RC) buildings, as well as in masonry and rural buildings, was summarized, the lessons learned were evaluated, and the results were interpreted with reference to the Turkish earthquake codes. It was particularly emphasized why the building stock underwent such damage even though the buildings were exposed to earthquake accelerations well below the design acceleration values.


15.
Quantitative estimates of earthquake losses are needed as soon as possible after an event. A majority of earthquake-prone countries lack the necessary dense seismograph networks, modern communication, and in some places the experts to assess losses immediately, so the earliest possible warnings must come from global information and international experts. The earthquakes of interest to us are, in most areas of the world, those of M ≥ 6. In this article, we have analyzed the response time for distributing source parameter estimates from: the National Earthquake Information Center (NEIC) of the US Geological Survey (USGS), the European Mediterranean Seismological Center (EMSC), and the Geophysical Institute of the Russian Academy of Science, Obninsk (RAS). In terms of earthquake consequences, the Pacific Tsunami Warning Center (TWC) issues assessments of the likelihood of tsunamis, the Joint Research Laboratory in Ispra, Italy (JRC) issues alerts listing sociological aspects of the affected region, and we distribute loss estimates; recently, the USGS has started posting impact assessment information on its PAGER web page. Two years ago, the USGS reduced its median delay in distributing earthquake source parameters by a factor of 2, to the currently observed 26 min, and it distributes information for 99% of the events of interest to us. The median delay of EMSC is 41 min, with 30% of our target events reported. RAS reports after 81 min and covers 30% of the target events. The first tsunami assessments by TWC reach us 18 min (median) after large earthquakes in the Pacific area. The median delay of alerts by the JRC is 44 min (36 min recently). The World Agency for Planetary Monitoring and Earthquake Risk Reduction (WAPMERR) distributes detailed loss estimates in 41 min (median). Moment tensor solutions of the USGS, which can be helpful for refining loss estimates, reach us in 78 min (median) for 58% of the earthquakes of interest.

16.

Analysing pre-earthquake signals using satellite technology is gaining importance among the scientific community, since round-the-clock surveillance of a wide region is possible, in contrast to ground-based monitoring techniques. Since the 1980s, scientists have used various satellite- and ground-based technologies to decode the complex physical mechanism of the earthquake process. These efforts involve measuring anomalous variations using space-based methodologies such as EM signals, SAR interferometry, GPS ionospheric sounding, satellite gravimetry, atmospheric sounding, Outgoing Longwave Radiation (OLR), radon gas, and seismo-tectonic clouds. In this paper, the authors analyse surface latent heat flux (SLHF) and OLR satellite data for the earthquakes that took place during 2014 in the Sumatra and Nicobar Islands regions. At the surface-atmosphere interface, anomalous variations in SLHF were observed prior to the occurrence of the earthquakes. Similarly, anomalous variations in OLR, which is measured above cloud level, were observed 3–30 days prior to the big earthquakes. From the analysis, the authors find that variations in the SLHF and OLR flux can be used as efficient tools to identify impending big earthquakes. The SLHF and OLR variation levels can give a clue about the probable magnitude of earthquakes and about earthquake preparation zones. Hence, by correlating the above-mentioned parameters, it is possible to identify impending earthquakes with reasonable accuracy.
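Anomaly screening of this kind is commonly done by flagging excursions beyond a quiet-period mean plus two standard deviations; a hedged sketch on a synthetic series (the two-sigma threshold is a conventional choice for illustration, not necessarily the authors' exact method):

```python
import numpy as np

# Flag days on which a flux time series exceeds the background mean plus
# two standard deviations. The "OLR-like" series below is synthetic.
rng = np.random.default_rng(4)
series = rng.normal(250.0, 5.0, 60)      # e.g. OLR in W/m^2, synthetic
series[45] += 40.0                       # injected pre-event anomaly

background = series[:30]                 # assumed quiet window
mean, std = background.mean(), background.std()
anomalous_days = np.where(series > mean + 2.0 * std)[0]
print(anomalous_days)                    # should include day 45
```

A two-sigma rule keeps the false-alarm rate per day around a few percent for roughly Gaussian background noise, which is why such studies also cross-check candidate anomalies against geomagnetic activity before attributing them to earthquake preparation.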


17.
Iceland has been subjected to destructive earthquakes and volcanic eruptions throughout history. Such events are often preceded by changes in earthquake activity over varying timescales. Although most seismicity is confined to micro-earthquakes, large earthquakes have occurred within populated regions. Following the most recent hazardous earthquakes in 2000, the Icelandic Meteorological Office (IMO) developed an early warning and information system (EWIS) Web-site for viewing near-real-time seismicity in Iceland. Here we assess Web-site usage data in relation to earthquake activity, as recorded by the South Iceland Lowland (SIL) seismic network. Between March 2005 and May 2006 the SIL seismic network recorded 12,583 earthquakes. During this period, the EWIS Web-site logged a daily median of 91 visits. The largest onshore event (ML 4.2) struck 20 km from Reykjavík on 06 March 2006 and was followed by an immediate upsurge in usage, resulting in a total of 1,173 unique visits to the Web-site. The greatest cluster of large (≥ ML 3) events occurred 300 km offshore from Reykjavík in May 2005. Within this swarm, 9 earthquakes of ML ≥ 3 were detected on 11 May 2005, resulting in the release of a media bulletin by IMO. During the swarm, and following the media bulletin, the EWIS Web-site logged 1,234 unique visits, accumulated gradually throughout the day. In summary, the data reveal a spatial and temporal relationship between Web-site usage and earthquake activity. The EWIS Web-site is accessed immediately after the occurrence of a local earthquake, whereas distant, unfelt earthquakes generate gradual interest prompted by media bulletins and, possibly, other contributing factors. We conclude that the Internet is a useful tool for displaying seismic information in near-real-time, which has the capacity to help increase public awareness of natural hazards.

18.

In this research, a deep learning (DL) model is proposed to classify soil susceptibility to liquefaction. The applicability of the DL model is tested against an emotional backpropagation neural network (EmBP). The database comprises cone penetration test records from the Chi-Chi earthquake. The study uses cone resistance (qc) and peak ground acceleration as inputs for predicting the liquefaction susceptibility of soil. The performance of the developed models has been assessed using various parameters (receiver operating characteristic, sensitivity, specificity, Phi correlation coefficient, and Precision-Recall F measure). The performance of the DL model is excellent. The consistent results obtained from the proposed deep learning model, compared to the EmBP, indicate the robustness of the methodology used in this study. In addition, both developed models were also tested on global earthquake data; during this validation, both models show good results based on the fitness parameters. The developed classification models provide a simple but efficient decision-making tool for quantitatively assessing liquefaction potential in engineering design. The findings of this paper can be further used to capture the relationship between soil and earthquake parameters.
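A minimal neural-network classifier on the two named inputs, qc and PGA, can be sketched as follows (synthetic data and an invented ground-truth rule; the paper's actual DL architecture and the CPT database are not reproduced here):

```python
import numpy as np

# Tiny one-hidden-layer network classifying liquefaction from cone
# resistance (qc) and PGA. Data and the labeling rule are synthetic:
# low resistance plus strong shaking is labeled "liquefies".
rng = np.random.default_rng(3)
n = 400
qc = rng.uniform(1.0, 20.0, n)           # MPa, synthetic
pga = rng.uniform(0.05, 0.6, n)          # g, synthetic
y = ((pga / qc) > 0.04).astype(float)    # invented ground truth

X = np.column_stack([qc, pga])
X = (X - X.mean(0)) / X.std(0)           # standardize inputs

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0
for _ in range(3000):                    # plain gradient descent
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    g = (p - y) / n                      # cross-entropy gradient at output
    W2 -= 0.5 * h.T @ g; b2 -= 0.5 * g.sum()
    gh = np.outer(g, W2) * (1 - h**2)    # backpropagate through tanh
    W1 -= 0.5 * X.T @ gh; b1 -= 0.5 * gh.sum(0)

accuracy = float(np.mean((p > 0.5) == (y == 1)))
print(f"training accuracy ~ {accuracy:.2f}")
```

Because the invented rule is linear in (qc, PGA), even this toy network separates the classes well; the value of a deeper model on real CPT data lies in capturing the nonlinear boundaries that field records exhibit.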


19.

While many approaches for assessing earthquake risk exist within the literature and practice, it is the dynamic interrelationships between earthquake hazard, physical risk, and the social conditions of populations that are the focal point for disaster risk reduction. Here, the measurement of vulnerability to earthquakes (i.e., characteristics that create the potential for harm or loss) has become a major focus area. However, metrics aimed at measuring vulnerability to earthquakes suffer from several key limitations. For instance, hazard and community context are often ignored, and attempts to validate metrics are largely non-existent. The purpose of this paper is to produce composite indices of the vulnerability of countries to earthquakes within three topical areas: social vulnerability, economic vulnerability, and recovery potential. To improve upon the status quo in indicator development for measuring vulnerability to seismic events, our starting point was to: (1) define a set of indicators that are context specific to earthquakes as defined by the literature; (2) delineate indicators within categorical areas of vulnerability that are easy to understand and could be put into practical use by DRR practitioners; and (3) propose indicators that are validated using historical earthquake impacts. When mapped, the geographic variations in the differential susceptibility of populations and economies to the adverse effects of damaging earthquake impacts become evident, as does the differential ability of countries to recover from them. Drivers of this geographic variation include average country debt, the type and density of infrastructure, poverty, governance, and educational attainment, to name just a few.
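Composite-index construction of the kind described typically min-max normalizes each indicator, orients every column so that higher means more vulnerable, and aggregates within a topical area. A sketch with invented countries and values (equal weights are an assumption; real indices often weight or validate against historical impacts):

```python
import numpy as np

# Equal-weight composite vulnerability index from three indicators.
# Countries and all indicator values are invented for illustration.
indicators = {
    # country: (debt_pct_gdp, poverty_rate, educational_attainment)
    "A": (30.0, 10.0, 0.9),
    "B": (80.0, 40.0, 0.5),
    "C": (55.0, 25.0, 0.7),
}
names = list(indicators)
data = np.array([indicators[c] for c in names])

# Min-max normalize each column to [0, 1].
norm = (data - data.min(0)) / (data.max(0) - data.min(0))
# Higher educational attainment lowers vulnerability, so invert that column.
norm[:, 2] = 1.0 - norm[:, 2]

social_vulnerability = norm.mean(axis=1)   # equal-weight composite
for country, score in zip(names, social_vulnerability):
    print(country, round(float(score), 2))
```

Here country B scores highest on every oriented indicator and so tops the composite; validation against historical earthquake impacts, as the paper proposes, is what distinguishes a defensible index from an arbitrary weighting.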


20.
In the gross domestic product (GDP)-based earthquake loss assessment method, ground motion intensity was represented by Modified Mercalli Intensity, and the uncertainties of the variables and the variation of GDP with time were neglected. In this study, the hazard-loss/GDP relation was determined based on peak ground acceleration (PGA), the uncertainties of PGA and of the hazard-loss/GDP relation were modeled with the Envelope Bound Convex Model, and curves of GDP over time were also derived. Finally, the earthquake loss of Ningbo city was estimated and disaggregated, and the sensitivity of the uncertain variables was investigated. The results show that the loss computed with Chen's method lies in the lower half of the intervals given by the convex analysis, and that most of the loss comes from frequent earthquakes. The hazard-loss/GDP relation contributes the most to the loss uncertainty, followed, in order of sensitivity, by the annual occurrence rate v, the b value, and the upper-bound magnitude Mu.
