Similar Articles
1.
The results of forecasting the Chilean tsunami of April 1, 2014, in the Pacific Ocean are presented. For the first time in Russia, the forecast was prepared in near real-time mode, 9.5-10.5 hours before the tsunami reached the Russian coast. Good agreement was obtained with the tsunami waveforms registered by DART stations along the US West Coast and the Aleutian and Kuril Islands. The information about the expected tsunami meets the requirements for short-term tsunami forecasts formulated by the UNESCO Intergovernmental Oceanographic Commission.

2.
A set of representative South China Sea tsunami sources was selected and simulated numerically with the COMCOT tsunami model. The impacts that different local tsunami sources would have on China's South China Sea coastal areas and islands are analyzed in terms of tsunami travel time, wave height, and energy distribution. Sensitivity experiments confirm that tsunami wave intensity is strongly affected by changes in earthquake magnitude; therefore, if a strong earthquake in the South China Sea were to generate a local tsunami, different sources would cause severe damage to different regions along China's South China Sea coast and its islands.

3.
Engineering Design and Uncertainties Related to Climate Change
To explore how uncertain climate events might affect investment decisions that need to be made in the near future, this paper examines (1) the relative magnitude of the uncertainties arising from climate change on engineering design in water resources planning and (2) a restricted set of water resource planning techniques that deal with the repeated choice of investment decisions over time. The classical capacity-expansion model of operations research is exploited to show the relative impacts upon engineering design choices of variations in future demand attributable to changes in the climate or other factors and of the possible shortfall of supply due to climate change. The engineering decisions considered in the paper are sequential, enabling adjustments to revealed uncertainty in subsequent decisions. The range of possible impacts analyzed in the paper leads to similar engineering design decisions. This result means that engineers must be on their guard with respect to under-design or over-design of systems with and without the threat of climate change, but that the sequential nature of the decision-making does not call for drastic action in the early time periods.
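The capacity-expansion logic described above can be illustrated with a minimal just-in-time expansion rule. The demand paths, increment size, and initial capacity below are invented for the sketch and are not taken from the paper:

```python
# Minimal sketch of sequential capacity expansion (all numbers are
# illustrative assumptions, not values from the paper).

def expansion_plan(demand_path, increment=10.0, initial_capacity=50.0):
    """Expand capacity just-in-time whenever demand would exceed it."""
    capacity = initial_capacity
    build_periods = []          # periods in which an increment is built
    for t, demand in enumerate(demand_path):
        while demand > capacity:
            capacity += increment
            build_periods.append(t)
    return capacity, build_periods

# Baseline demand growth vs. a climate-amplified trajectory.
baseline = [50 + 1.0 * t for t in range(30)]
amplified = [50 + 1.25 * t for t in range(30)]

cap_base, plan_base = expansion_plan(baseline)
cap_amp, plan_amp = expansion_plan(amplified)
print(cap_base, plan_base)      # 80.0 [1, 11, 21]
print(cap_amp, plan_amp)        # 90.0 [1, 9, 17, 25]
```

Both trajectories trigger their first expansion in the same early period; the climate-amplified demand only shifts the later build dates, echoing the point that sequential decision-making need not be drastic in the early time periods.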

4.
Atmosphere-Ocean, 2013, 51(4): 415-427
Abstract

An Mw = 7.2 earthquake occurred on 15 June 2005 (UTC) seaward of northern California off the west coast of North America. Based on the earthquake location and source parameters, the West Coast and Alaska Tsunami Warning Center issued a tsunami warning for the region extending from the California-Mexico border to northern Vancouver Island, British Columbia (the first tsunami warning for this region since the 1994 Mw = 8.2 Shikotan earthquake). Six tide gauges on the west coast recorded tsunami waves from this event, with a maximum trough-to-crest wave height of 27.7 cm observed at Crescent City, California. Waves of 2.5 to 6.5 cm were measured at the five other sites: Port Orford (Oregon), North Spit and Arena Cove (California), and Tofino and Bamfield (British Columbia). The open-ocean Deep-ocean Assessment and Reporting of Tsunami (DART) buoys 46404 and 46405 recorded tsunami waves of 0.5 and 1.5 cm, respectively, closely matching wave heights derived from numerical models. Incoming tsunami wave energy was mainly at periods of 10 to 40 min. The observed tsunami wave field is interpreted in terms of edge (trapped) and leaky (non-trapped) waves, and a "trapping coefficient" is introduced to estimate the relative contribution of these two wave types. Due to the large (3000 m) water depth in the source area, approximately two-thirds of the total tsunami energy went to leaky wave modes and only one-third to edge wave modes. The improved response to and preparedness for the 2005 California tsunami compared to the 1994 Shikotan tsunami is attributable, in part, to the operational capability provided by the open-ocean bottom-pressure recorder (DART) system, higher quality coastal tide gauges, and the effective use of numerical models to simulate tsunamis in real time.

5.
The 2011 Japanese earthquake and tsunami, and the consequent accident at the Fukushima nuclear power plant, have had consequences far beyond Japan itself. Reactions to the accident in three major economies (Japan, the UK, and Germany), all of which were committed to relatively ambitious climate change targets prior to the accident, are examined. In Japan and Germany, the accident precipitated a major change of policy direction. In the UK, debate has been muted and there has been essentially no change in energy or climate change policies. The status of the energy and climate change policies in each country prior to the accident is assessed, the responses to the accident are described, and the possible impacts on their positions in the international climate negotiations are analysed. Finally, the three countries' responses are compared and some differences between them observed. Some reasons for their different policy responses are suggested and some themes, common across all countries, are identified.

Policy relevance

The attraction of nuclear power has rested on the promise of low-cost electricity, low-carbon energy supply, and enhanced energy independence. The Fukushima accident, which followed the Japanese tsunami of March 2011, has prompted a critical re-appraisal of nuclear power. The responses to Fukushima are assessed for the UK, Germany, and Japan. Before the accident, all three countries considered nuclear as playing a significant part in climate mitigation strategies. Although the UK Government has continued to support nuclear new build following a prompt review of safety arrangements, Japan and Germany have decided to phase out nuclear power, albeit according to different timescales. The factors that explain the different decisions are examined, including patterns of energy demand and supply, the wider political context, institutional arrangements, and public attitudes to risk. The implications for the international climate negotiations are also assessed.

6.
Abstract

The First Nations (Da'naxda'xw) village of Kwalate, Knight Inlet, British Columbia was located along the shore of a funnel-shaped bay. Archaeological investigations show that this was a major village that stretched 90 m along the shoreline and was home to possibly 100 or more inhabitants. Oral stories indicate that the village was completely swept away by a tsunami that formed when an 840-m high rock avalanche descended into the water on the opposite side of the fjord. Shipboard geological mapping, combined with empirical tsunami modelling, indicates that the tsunami was likely 2 to 6 m high prior to run-up into the village. Radiocarbon dates reveal that the village was occupied from the late 1300s CE until the late 1500s CE, when it was destroyed by the tsunami.

7.
Fragments of deep-ocean tidal records up to 3 days long belong to the same functional sub-space, regardless of the record's origin. The tidal sub-space basis can be derived via Empirical Orthogonal Function (EOF) analysis of the tidal record of a single buoy. Decomposition of a tsunami buoy record in a functional space of tidal EOFs presents an efficient tool for short-term tidal forecasting, as well as for the accurate tidal removal needed for early tsunami detection and quantification [Tolkova, E., 2009. Principal component analysis of tsunami buoy record: tide prediction and removal. Dyn. Atmos. Oceans 46 (1-4), 62-82]. EOF analysis of a time series, however, assumes that the time series represents a stationary (in the weak sense) process. In the present work, a modification of the one-dimensional EOF formalism not restricted to stationary processes is introduced. With this modification, the EOF-based de-tiding/forecasting technique can be interpreted in terms of a signal's passage through a filter bank, which is unique for the sub-space spanned by the EOFs. This interpretation helps to identify the harmonic content of a continuous process whose fragments are decomposed by given EOFs. In particular, seven EOFs and a constant function are proved to decompose 1-day-long tidal fragments at any location. Filtering by projection onto a reduced sub-space of the above EOFs is capable of isolating a tsunami wave to within a few millimeters from the first minutes of the tsunami's appearance on a tsunami buoy record, and is reliable in the presence of data gaps. EOFs of ∼3-day duration (the reciprocal of either tidal band's width) allow short-term (24.75 h in advance) tidal predictions using the inherent structure of the tidal signal. The predictions do not require any a priori knowledge of tidal processes at a particular location beyond the most recent 49.5 h of recordings there.
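The EOF de-tiding idea can be sketched on a synthetic two-constituent tide. Everything below (frequencies, sampling, the Gaussian "tsunami" pulse) is an invented toy setup; note that two constituents span only a four-dimensional sub-space, whereas the seven-EOFs-plus-constant result quoted above refers to the full tidal spectrum:

```python
import numpy as np

fs = 144                                  # samples per day (10-minute sampling)
t = np.arange(30 * fs) / fs               # 30 days of synthetic record, in days
M2, K1 = 1.9322736, 1.0027379             # constituent frequencies, cycles/day
tide = 1.0 * np.cos(2 * np.pi * M2 * t + 0.3) \
     + 0.4 * np.cos(2 * np.pi * K1 * t - 1.1)

# Stack day-long fragments as rows; the SVD right vectors are the EOFs.
X = tide[:20 * fs].reshape(20, fs)
_, _, V = np.linalg.svd(X, full_matrices=False)
eofs = V[:4]                              # tidal sub-space of the toy tide

# A pulse-free fragment is reconstructed essentially exactly by projection.
clean = tide[26 * fs:27 * fs]
detide_err = clean - eofs.T @ (eofs @ clean)

# A fragment containing a small "tsunami" pulse: the projection estimates
# the tide, and the residual isolates the pulse.
frag = tide[25 * fs:26 * fs].copy()
frag += 0.05 * np.exp(-0.5 * ((np.arange(fs) - 70) / 3.0) ** 2)
residual = frag - eofs.T @ (eofs @ frag)
print(abs(detide_err).max())              # ~machine precision
print(abs(residual).max())                # the pulse dominates the residual
```

The projection never needs harmonic constants or constituent frequencies; the fragments used to build `X` are the only input, which is the appeal of the method for buoys with short or poorly characterized records.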

8.
The Influence of Tides and Earthquakes on Global Climate Change
After the Indonesian earthquake and tsunami of December 26, 2004, low-temperature freezing disasters and severe snowstorms occurred frequently around the globe. The "tidal temperature regulation" and "deep-sea great-earthquake cooling" hypotheses offer a plausible explanation. According to these two hypotheses, global temperature in 2005 should have dropped because of the Indonesian earthquake and tsunami and the strong north-south tidal oscillation. NASA scientists, by contrast, predicted that a weak El Niño and anthropogenic greenhouse gas emissions would make 2005 the hottest year on record. In fact, 2005 turned out cooler than 1998. Western scientists have since acknowledged that natural climate variability in 2005-2007 offset the global warming effect.

9.
Russian Meteorology and Hydrology - Possible extreme fluctuations of sea level in the area of the Akkuyu nuclear power plant induced by tsunami, wind, waves, and tides are discussed. The...

10.
Principal component or Empirical Orthogonal Function (EOF) analysis is applied to tsunameter records by treating them as two-dimensional signals: a single time series is broken into cycles, with the cycle number serving as the second dimension. Under certain conditions, principal components calculated from different records are shown to determine the same functional space. Signal decomposition into pre-calculated principal components is used to predict or extract the tidal component of a record. This work shows that EOF processing allows for short-term tidal predictions at tsunami buoy locations with the precision of more advanced methods and with minimal a priori knowledge about tidal dynamics. It is also shown that filtering in the EOF domain is sensitive to the non-tidal component of a record and therefore presents a tool for early tsunami detection and quantification.

11.
Learning from natural disasters is predominantly regarded as beneficial: individuals and governments learn to cope and thereby reduce damage and loss of life in future disasters. We argue against this standard narrative and point to two principal ways in which learning from past disasters can have detrimental consequences. First, investment in protective infrastructure may not only stimulate settlement in hazard-prone areas but also foster a false impression of security, which can prevent individuals from fleeing to safe places when a hazard strikes. Second, if disaster events in the past did not have catastrophic consequences, affected individuals do not take future events sufficiently seriously. As a consequence, learning from disasters is a double-edged sword that can prevent large-scale damage and human loss most of the time but produces the worst-case outcome when a disaster occurs at an unexpected scale and public preparedness measures fail. We demonstrate the devastating impact of misplaced trust in public preparedness measures and of misleading lessons drawn from past experience for the case of the 2011 Tohoku tsunami. Our paper contributes to the literatures on 'negative learning' and 'hazard maladaptation' by demonstrating that a lack of past experience with tsunami mortality in a municipality substantially increases mortality in the Tohoku tsunami.

12.
Previous studies have revealed that political ideology can influence motivations for individual preparedness to mitigate the effects of climate change. Few studies have examined its role in individual preparedness behaviors to reduce the impacts of other natural hazards, such as earthquakes and tsunamis. The purpose of this study is to explore the influence of political ideology on current individual earthquake and tsunami preparedness behaviors among inhabitants of Chile's coastal areas. A statistically representative sample of the Valparaíso Region (N = 500) participated in this study. Participants were part of a more extensive study conducted between 2018 and 2019 in cities along the Chilean coastline, designed to study preparedness for multiple natural hazards. The survey evaluated trust in government authorities regarding emergency management, current earthquake/tsunami preparedness behaviors, and political ideology. The results reveal that political ideology is a relevant factor in predicting emergency preparedness behaviors and is significantly related to trust in government authorities. Individuals located at the right extreme of both dimensions of political ideology (those self-identified as right-wing and/or pro-market) report a higher level of current earthquake/tsunami preparedness compared to their respective groups. Thus, in the future design and implementation of natural disaster preparedness strategies and programs, the agencies in charge should recognize the role of political ideology.

13.
An estimate of the release of radioactive substances (¹³³Xe, ¹³¹I, and ¹³⁷Cs) into the atmosphere from the Fukushima Daiichi nuclear disaster is presented. It was obtained using the FLEXPART Lagrangian dispersion model and the data of local ground-based measurements of radiation dose rate. The computation period covers the active phase of the nuclear disaster, which lasted 20 days after the tsunami. To obtain the quantitative characteristics of emissions of radioactive substances, inverse modeling based on the Bayesian approach is used. The emissions were estimated for three altitudes. The total emissions are equal to 2.1 ± 0.4 kg (14 000 ± 2700 PBq) for ¹³³Xe, (3.8 ± 0.4) × 10⁻² kg (174 ± 18 PBq) for ¹³¹I, and 5.7 ± 1.2 kg (18 ± 4 PBq) for ¹³⁷Cs, which is consistent with the results of other studies. The retrieved emissions were used to drive forward modeling for mapping the areas of radionuclide deposition. The developed method of retrieving the emission of radioactive substances provides a useful instrument that operationally estimates and localizes the areas of potential pollution in case of nuclear accidents and could be used for making decisions on population evacuation.
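The Bayesian inverse step can be illustrated with a generic linear source-term inversion. The source-receptor matrix, station count, error levels, and "true" emissions below are placeholders for what the dispersion model and dose-rate network would supply; this is a sketch of the approach, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical source-receptor sensitivities: dose-rate response at 6
# stations to unit emissions at 3 release altitudes.
A = rng.uniform(0.1, 1.0, size=(6, 3))
x_true = np.array([2.0, 0.5, 1.0])              # "true" emissions per altitude
y = A @ x_true + rng.normal(0.0, 0.05, size=6)  # noisy station measurements

# Gaussian prior N(x0, B) on emissions, observation-error covariance R.
x0 = np.ones(3)
B = np.eye(3)
R = np.eye(6) * 0.05**2
K = B @ A.T @ np.linalg.inv(A @ B @ A.T + R)    # gain matrix
x_post = x0 + K @ (y - A @ x0)                  # posterior-mean emissions
print(x_post.round(2))
```

The posterior mean blends the prior toward the observations; with small observation error it recovers the emission vector to within the noise level, which is the essence of retrieving altitude-resolved source terms from ground measurements.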

14.
Estimates of the true economic cost that might be attributed to greenhouse-induced sea-level rise on the developed coastline of the United States are offered for the range of trajectories that is now thought to be most likely. Along a 50-cm sea-level rise trajectory (through 2100), for example, transient costs in 2065 (a year frequently anticipated for a doubling of greenhouse-gas concentrations) are estimated to be roughly $70 million (undiscounted, but measured in constant 1990 dollars). More generally, and carefully cast in the appropriate context of protection decisions for developed property, the results reported here are nearly an order of magnitude lower than estimates published prior to 1994. They are based upon a calculus that reflects rising values for coastal property as the future unfolds, but also includes the cost-reducing potential of natural, market-based adaptation in anticipation of the threat of rising seas and/or the efficiency of discrete decisions to protect or not to protect small tracts of property that will be made when necessary and on the (then current) basis of their individual economic merit. This research was funded by the Electric Power Research Institute as part of its impacts assessment program. Notwithstanding that support, the opinions expressed here and responsibility for any errors reside with the authors. The authors express their appreciation for comments offered on earlier drafts by Rick Freeman, Rob Mendelsohn, Joel Smith, Tom Wilson, Jim Titus, Robert Chen and the Snowmass Workshop on the Impacts of Global Change. If we may, we would also like to dedicate this paper to the memory of Dr. James Broadus from the Woods Hole Oceanographic Institution. His untimely death was, indeed, tragic; and we miss both his company and his flawless contribution to this and other work.

15.
Globalization, Pacific Islands, and the paradox of resilience
On April 2nd, 2007, a 12 m tsunami struck Simbo, a relatively remote island in Western Province, Solomon Islands. Although Simbo's population continues to depend on its own food production and small-scale governance regimes regulate access to resources, the island's way of life over the last century has increasingly been affected by processes associated with globalization. In this context of a rapidly globalizing world, this article examines the island's resilience and vulnerability to the tsunami and the adaptive capacities that enabled the response and recovery. The tsunami completely destroyed two villages and damaged fringing coral reefs, but casualties were low and social–ecological rebound relatively brisk. By combining social science methods (household surveys, focus group and ethnographic interviews) with underwater reef surveys, we identify a number of countervailing challenges and opportunities presented by globalization that both nurture and suppress the island's resilience to high-amplitude, low-frequency disturbances like tsunamis. The analysis suggests that certain adaptive capacities that sustain general system resilience come at the cost of greater vulnerability to low-probability hazards. We discuss how communities undergoing increasingly complex processes of change must negotiate these kinds of trade-offs as they manage resilience at multiple spatial and temporal scales. Understanding the shifting dynamics of resilience may be critical for Pacific Island communities who seek to leverage globalization in their favor as they adapt to current social–ecological change and prepare for future large-scale ecological disturbances.

16.
Spatiotemporal parameters of barotropic seiche oscillations in Nakhodka Gulf in the Sea of Japan are considered using a spectral finite-difference model. The finite-difference approximation of the shallow-water equations is carried out on an irregular triangular spatial grid. The sets of periods and spatial structures of seiche oscillations corresponding to the pronounced maxima in the energy spectrum of sea-level data from the Nakhodka station of the Russian tsunami warning service are computed and presented.

17.
The traditional threat score based on fixed thresholds for precipitation verification is sensitive to intensity forecast bias. In this study, the neighborhood precipitation threat score is modified by defining the thresholds in terms of percentiles of the overall precipitation instead of fixed threshold values, which reduces the impact of intensity forecast bias on the calculated score. The method is tested with forecasts of a tropical storm that re-intensified after making landfall and caused heavy flooding. The forecasts are produced with and without radar data assimilation. The forecast assimilating both radial velocity and reflectivity produces precipitation patterns that better match observations but have a large positive intensity bias. When fixed thresholds are used, the neighborhood threat scores fail to yield high scores for forecasts that match the observed pattern well, due to the large intensity bias. In contrast, the percentile-based neighborhood method yields the highest score for the forecast with the best pattern match and the smallest position error. The percentile-based method also yields scores that are more consistent with object-based verification, which is less sensitive to intensity bias, demonstrating the potential value of percentile-based verification.
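The difference between fixed and percentile-based thresholds can be sketched on synthetic data. The gamma-distributed "precipitation", the 2× intensity bias, and the 90th-percentile choice are illustrative assumptions, and the sketch omits the neighborhood spatial component:

```python
import numpy as np

def threat_score(fcst, obs, f_thr, o_thr):
    """Critical success index: hits / (hits + misses + false alarms)."""
    f, o = fcst >= f_thr, obs >= o_thr
    hits = np.sum(f & o)
    return hits / (hits + np.sum(~f & o) + np.sum(f & ~o))

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, size=10_000)  # synthetic precipitation amounts (mm)
fcst = 2.0 * obs                        # perfect pattern, 2x intensity bias

# A fixed threshold penalizes the biased forecast despite the perfect pattern;
# thresholding each field at its own 90th percentile removes that penalty.
ts_fixed = threat_score(fcst, obs, 25.0, 25.0)
ts_pct = threat_score(fcst, obs, np.percentile(fcst, 90), np.percentile(obs, 90))
print(ts_fixed, ts_pct)                 # ts_pct is 1.0 for a bias-only error
```

Because the forecast here differs from the observations only by a multiplicative bias, the percentile thresholds select exactly the same points in both fields and the score returns to 1, while the fixed threshold generates false alarms.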

18.
19.
The significance of multiple scales within processes of global environmental change has attracted increasing attention. Yet the fundamental tasks of linking the multiple levels at which regulatory decisions are required to multiple scales of impacts have only recently been identified. This paper addresses the importance of attention to multiple scales in regulatory decisions, how those decisions should link together across scales of governance or decision-making, and how mismatches between scales of impact and scales of regulation can lead to regulatory gaps and breakdowns. The paper begins by presenting a definition of a cross-scale regulatory problem, building on the concept of an externality. It argues that virtually all significant environmental regulatory problems involve multiple scales at which decisions are required, and that coordination of these decisions is one of the major issues in regulatory design. The paper provides a generalization of what is needed for effective cross-scale regulation, and then discusses the example of salmon aquaculture in British Columbia to illustrate these points. In our view, gaps and mismatches in the regulatory framework across institutional scales appear to contribute to social controversy over salmon aquaculture. These gaps include (i) the site-by-site orientation of the current regulatory process, even though the major impacts are cumulative and regional in significance, and (ii) the degree to which limitations on the extent of salmon aquaculture are implemented by local governments, even though provincial and federal governments have the mandate and expertise to address these questions.

20.
Accurate sea-level rise (SLR) vulnerability assessments are essential for developing effective management strategies for coastal systems at risk. In this study, we evaluate the effect of combining vertical uncertainties in Light Detection and Ranging (LiDAR) elevation data, datum transformation, and future SLR estimates on estimates of potential land area and land cover loss, and whether including uncertainty in future SLR estimates has implications for adaptation decisions in Kahului, Maui. Monte Carlo simulation is used to propagate probability distributions through our inundation model, and the output probability surfaces are generalized as areas of high and low probability of inundation. Our results show that considering uncertainty in only the LiDAR and transformation errors overestimates vulnerable land area by about 3 % for the high-probability threshold, resulting in conservative adaptation decisions, and underestimates vulnerable land area by about 14 % for the low-probability threshold, resulting in less reliable adaptation decisions for Kahului. Not considering uncertainty in future SLR estimates in addition to LiDAR and transformation has a variable effect on SLR adaptation decisions, depending on the land cover category and how the high- and low-probability thresholds are defined. Monte Carlo simulation is a valuable approach for SLR vulnerability assessments because errors are not required to follow a Gaussian distribution.
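The Monte Carlo propagation step can be sketched as follows; the cell elevations, error standard deviations, lognormal SLR distribution, and probability thresholds are all invented placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 5_000
elev = np.array([0.5, 1.0, 1.5, 2.0, 3.0])  # hypothetical DEM cell elevations (m)

lidar_sd, datum_sd = 0.10, 0.05             # assumed vertical error std devs (m)
# Skewed (non-Gaussian) sea-level-rise draws, median 1 m:
slr = rng.lognormal(mean=0.0, sigma=0.3, size=n_sims)

hits = np.zeros_like(elev)
for k in range(n_sims):
    # Per-cell LiDAR noise plus a common datum-transformation shift.
    z = elev + rng.normal(0.0, lidar_sd, elev.size) + rng.normal(0.0, datum_sd)
    hits += z <= slr[k]                     # cells inundated in this realization
prob = hits / n_sims                        # per-cell inundation probability

high_zone = prob >= 0.8                     # conservative high-probability extent
low_zone = prob >= 0.2                      # precautionary low-probability extent
print(prob.round(2))
```

Each uncertainty source keeps its own distribution (Gaussian for the elevation errors, skewed for SLR), which is the point made above: Monte Carlo propagation does not force everything into a Gaussian mold.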
