Univariate and multivariate stress release models are fitted to historical earthquake data from North China. It is shown that a better fit is obtained by treating separately the Eastern part of the region, including the North China Plain and Bohai Sea, and the Western part of the region, including the Ordos Plateau and its Eastern boundary. Further improvement is obtained by fitting the large events (M ≥ 7.6) and the smaller events in the Western region with different stress release models. The comparisons are made by computing the likelihoods of the fitted models and discounting the number of parameters used via Akaike's AIC criterion. The models are used to develop long-term risk scenarios for the East and West regions.
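The AIC comparison described above can be sketched in a few lines; the log-likelihoods and parameter counts below are invented placeholders for illustration, not values from the study:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike's AIC: 2k - 2*ln(L); the model with the lower value is preferred."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits: one region-wide model (3 parameters) versus
# separate East/West models (6 parameters in total).
aic_single = aic(log_likelihood=-512.3, n_params=3)
aic_split = aic(log_likelihood=-498.1, n_params=6)
print(f"single-region AIC: {aic_single:.1f}, split AIC: {aic_split:.1f}")
```

The extra parameters of the split model are "discounted" automatically: the split model wins only if its likelihood gain exceeds the penalty of 2 per added parameter.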
The Cu–Co–Ni Texeo mine has been the most important source of Cu in NW Spain since Roman times, and now approximately 40,000 m³ of wastes from mine and metallurgical operations, containing average concentrations of 9,263 mg kg⁻¹ Cu, 1,100 mg kg⁻¹ As, 549 mg kg⁻¹ Co, and 840 mg kg⁻¹ Ni, remain on-site. Since the cessation of activity, the abandoned works, facilities and waste piles have been posing
a threat to the environment, derived from the release of toxic elements. In order to assess the potential environmental pollution
caused by the mining operations, a sequential sampling strategy was undertaken in wastes, soil, surface and groundwater, and
sediments. First, screening field tools were used to identify hotspots before defining formal sampling strategies; in the areas where anomalies were detected in this first stage, a second, more detailed sampling campaign was undertaken. Metal
concentrations in the soils are well above the local background, reaching up to 9,921 mg kg⁻¹ Cu, 1,373 mg kg⁻¹ As, 685 mg kg⁻¹ Co, and 1,040 mg kg⁻¹ Ni, among others. Copper concentrations downstream of the mine works reach values of up to 1,869 μg l⁻¹ in surface water and 240 mg kg⁻¹ in stream sediments. Computer-based risk assessment for the site indicates a carcinogenic risk associated with the presence of As in surface waters and soils, and a health risk for long-term exposure; concentrations of these elements are sufficiently above trigger levels to warrant further investigation.
The aim of this paper is to discuss a number of issues related to the use of spatial information for landslide susceptibility, hazard, and vulnerability assessment. The paper centers on the types of spatial data needed for each of these components and the methods for obtaining them. A number of concepts are illustrated using an extensive spatial data set for the city of Tegucigalpa in Honduras. The paper is intended to supplement the information given in the “Guidelines for Landslide Susceptibility, Hazard and Risk Zoning for Land Use Planning” by the Joint ISSMGE, ISRM and IAEG Technical Committee on Landslides and Engineered Slopes (JTC-1). The last few decades have seen very fast development in the application of digital tools such as Geographic Information Systems, Digital Image Processing, Digital Photogrammetry and Global Positioning Systems. Landslide inventory databases are becoming available for more countries, and several are now also accessible through the Internet. A comprehensive landslide inventory is essential in order to quantify both landslide hazard and risk. With respect to the environmental factors used in landslide hazard assessment, there is a tendency to utilize those data layers that are easily obtainable from Digital Elevation Models and satellite imagery, whereas less emphasis is placed on those data layers that require detailed field investigations. A review is given of the trends in collecting spatial information on environmental factors, with a focus on Digital Elevation Models, geology and soils, geomorphology, land use and elements at risk.
Many different runout prediction methods can be applied to estimate the mobility of future debris flows during hazard assessment. The present article reviews the empirical, analytical, simple flow-routing and numerical techniques. All of these techniques were applied to back-calculate a debris flow that occurred in 1982 in the La Guingueta catchment, in the Eastern Pyrenees. A sensitivity analysis of input parameters was carried out, with special attention paid to the influence of rheological parameters. We used the Voellmy fluid rheology for our analytical and numerical modelling, since this flow resistance law coincided best with field observations. The simulation results indicated that the “basal” friction coefficient mainly affects the runout distance, while the “turbulence” term mainly influences flow velocity. A comparison of the velocities computed on the fan showed that the analytical model calculated values similar to the numerical ones. The values of our rheological parameters calibrated at La Guingueta agree with data back-calculated for other debris flows. Empirical relationships represent another method to estimate total runout distance. The results confirmed that they contain important uncertainty and are strictly valid only for the conditions that formed the basis of their development. As for the simple flow-routing algorithm, this method could satisfactorily simulate the total area affected by the 1982 debris flow, but it was not able to directly calculate total runout distance and velocity. Finally, a suggestion on how different runout prediction methods can be applied to generate debris-flow hazard maps is presented. Taking into account the definitions of hazard and intensity, the best choice would be to divide the resulting hazard maps into two types: “final hazard maps” and “preliminary hazard maps”.
Only the use of numerical models provided final hazard maps, because they could incorporate different event magnitudes and supply output values for intensity calculation. In contrast, empirical relationships and flow-routing algorithms, or a combination of both, could be applied to create preliminary hazard maps. The present study focused only on runout prediction methods. Other tasks necessary to complete the hazard assessment can be looked up in the “Guidelines for landslide susceptibility, hazard and risk zoning” included in this Special Issue.
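As a rough illustration of the Voellmy rheology mentioned above, the basal flow resistance can be written as a Coulomb friction term plus a velocity-squared "turbulence" term. All numerical values below are illustrative assumptions, not the parameters calibrated at La Guingueta:

```python
import math

def voellmy_resistance(v, h, mu, xi, rho=2000.0, g=9.81, slope_deg=20.0):
    """Basal flow resistance per unit area (Pa) under a Voellmy rheology:
    a Coulomb friction term plus a velocity-squared 'turbulence' term.
    All default parameter values are illustrative, not calibrated."""
    sigma = rho * g * h * math.cos(math.radians(slope_deg))  # normal stress on the bed
    friction = mu * sigma               # velocity-independent: governs runout distance
    turbulence = rho * g * v ** 2 / xi  # grows with v**2: limits flow velocity
    return friction + turbulence

# The friction term dominates for a slow-moving front, the turbulence
# term for a fast one, mirroring the sensitivity result in the abstract.
slow = voellmy_resistance(v=0.5, h=1.5, mu=0.1, xi=500.0)
fast = voellmy_resistance(v=10.0, h=1.5, mu=0.1, xi=500.0)
print(f"slow front: {slow:.0f} Pa, fast front: {fast:.0f} Pa")
```

This split explains the sensitivity result: changing μ shifts the resistance felt at all speeds (hence runout), while ξ only matters once the flow is fast.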
The paper is dedicated to a review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructure and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to correctly treat dependencies between physical parameters and, ultimately, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. Attempts to compensate for these problems through the systematic use of expert elicitation have, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of both technical and financial risks associated with potential earthquake damage requires a risk analysis, the current method remains based on a probabilistic approach, with its unresolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in safety factors, which are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis: they are related to seismic sources discovered by geological, geomorphological, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better-informed risk model that is suitable for risk-informed decision making.
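For background on the probabilistic framing criticized above, a minimal sketch of the standard PSHA step that converts an annual exceedance rate into a design-life probability, under the Poisson assumption whose adequacy the paper questions (the numbers are the conventional textbook example, not values from this study):

```python
import math

def exceedance_probability(annual_rate: float, years: float) -> float:
    """Probability of at least one exceedance in `years` of a ground-motion
    level with the given annual rate, assuming a Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative: a 475-year return period ground motion over a 50-year
# design life gives the familiar figure of roughly 10%.
p = exceedance_probability(1.0 / 475.0, 50.0)
print(f"{p:.3f}")
```

The Poisson independence baked into this conversion is one of the dependency-related assumptions that the review flags as potentially inadequate.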
We describe empirical results from a multi-disciplinary project that support modeling complex processes of land-use and land-cover change in exurban parts of Southeastern Michigan. Based on two different conceptual models, one describing the evolution of urban form as a consequence of residential preferences and the other describing land-cover changes in an exurban township as a consequence of residential preferences, local policies, and a diversity of development types, we describe a variety of empirical data collected to support the mechanisms that we encoded in computational agent-based models. We used multiple methods, including social surveys, remote sensing, and statistical analysis of spatial data, to collect data that could be used to validate the structure of our models, calibrate their specific parameters, and evaluate their output. The data were used to investigate this system in the context of several themes from complexity science, including (a) macro-level patterns; (b) autonomous decision-making entities (i.e., agents); (c) heterogeneity among those entities; (d) social and spatial interactions that operate across multiple scales; and (e) nonlinear feedback mechanisms. The results point to the importance of collecting data on agents and their interactions when producing agent-based models, the general validity of our conceptual models, and some changes that we needed to make to these models following data analysis. The calibrated models have been and are being used to evaluate landscape dynamics and the effects of various policy interventions on urban land-cover patterns.
By definition, a crisis is a situation that requires assistance to be managed. Hence, response to a crisis involves the merging
of local and non-local emergency response personnel. In this situation, it is critical that each participant: (1) know the
roles and responsibilities of each of the other participants; (2) know the capabilities of each of the participants; and (3)
have a common basis for action. For many types of natural disasters, this entails having a common operational picture of the unfolding events, including detailed information on the weather, both current and forecast, that may affect either the emergency itself or the response activities. The Consequences Assessment Tool Set (CATS) is a comprehensive package of
hazard prediction models and casualty and damage assessment tools that provides a linkage between a modeled or observed effect
and the attendant consequences for populations, infrastructure, and resources, and, hence, provides the common operational
picture for emergency response. The Operational Multiscale Environment model with Grid Adaptivity (OMEGA) is an atmospheric
simulation system that links the latest methods in computational fluid dynamics and high-resolution gridding technologies
with numerical weather prediction to provide specific weather analysis and forecast capability that can be merged into the
geographic information system framework of CATS. This paper documents the problem of emergency response as an end-to-end system
and presents the integrated CATS–OMEGA system as a prototype of such a system that has been used successfully in a number
of different situations.
Quantitative sinkhole hazard assessments in karst areas allow calculation of the potential sinkhole risk and the performance
of cost-benefit analyses. These estimations are of practical interest for planning, engineering, and insurance purposes. The
sinkhole hazard assessments should include two components: the probability of occurrence of sinkholes (sinkholes/km² per year) and the severity of the sinkholes, which mainly refers to the subsidence mechanisms (progressive passive bending or catastrophic collapse) and the size of the sinkholes at the time of formation, a critical engineering design parameter. This
requires the compilation of an exhaustive database of recent sinkholes, including information on: (1) location, (2) chronology
(precise date or age range), (3) size, and (4) subsidence mechanisms and rate. This work presents a hazard assessment from
an alluvial evaporite karst area (0.81 km²) located on the periphery of the city of Zaragoza (Ebro River valley, NE Spain). Five sinkholes and four locations with features attributable to karstic subsidence were identified in an initial investigation phase, providing a preliminary probability of occurrence of 0.14 sinkholes/km² per year (11.34% annual probability). A trenching program conducted in a subsequent investigation phase allowed us to rule out the four probable sinkholes, reducing the probability of occurrence to 0.079 sinkholes/km² per year (6.4% annual probability). The information on severity indicates that collapse sinkholes 10–15 m in diameter
may occur in the area. A detailed study of the deposits and deformational structures exposed by trenching in one of the sinkholes
allowed us to infer a modern collapse sinkhole approximately 12 m in diameter and with a vertical throw of 8 m. This collapse
structure is superimposed on a subsidence sinkhole around 80 m across that records at least 1.7 m of synsedimentary subsidence.
Trenching, in combination with dating techniques, is proposed as a useful methodology to elucidate the origin of depressions
with uncertain diagnosis and to gather practical information with predictive utility about particular sinkholes in alluvial
karst settings: precise location, subsidence mechanisms and magnitude, and timing and rate of the subsidence episodes.
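The annual-probability figures quoted in the abstract follow from a simple rate-times-area product, which can be checked directly (the function name is ours; the rates and the 0.81 km² area are taken from the text):

```python
AREA_KM2 = 0.81  # study-area size given in the abstract

def expected_sinkholes_per_year(rate_per_km2_year: float,
                                area_km2: float = AREA_KM2) -> float:
    """Expected number of sinkholes per year over the whole area; for small
    values this matches the annual probability quoted in the text."""
    return rate_per_km2_year * area_km2

print(f"initial phase:   {expected_sinkholes_per_year(0.14):.4f}")   # ~0.1134 (11.34%)
print(f"after trenching: {expected_sinkholes_per_year(0.079):.4f}")  # ~0.0640 (6.4%)
```

Ruling out the four doubtful depressions by trenching lowers the rate, and hence the annual probability, roughly in proportion to the reduced sinkhole count.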