Similar Literature
1.
In many parts of the world, earthquakes threaten regional infrastructure systems. For modeling risk using stochastic earthquake catalogs, random variables include rupture location and the damage state of different components. Thus, there is an infinite set of possible damage maps that a risk modeler could evaluate in an event‐based probabilistic loss model. Even a finite but large number of damage maps may not be practical, because many network performance measures are computationally expensive. Here, we show a computationally efficient method for selecting a subset of damage maps, corresponding ground‐motion intensity maps, and associated occurrence rates that reasonably estimates the full distribution of the ground‐motion intensity and a target performance measure using optimization. The method chooses a subset of maps and associated annual rates of occurrence that minimizes the error in estimating the distribution of a network performance measure as well as the marginal distributions of ground‐motion intensity exceedance. The joint distribution of the ground‐motion intensity is implicitly included in the objective function of the optimization problem via the network performance measure. We then show how to tune the optimization parameters based on consistency checks related to the network performance measure and the ground‐motion hazard. We illustrate the proposed method with a case study of the San Francisco Bay Area road network to estimate the exceedance curve of the average percentage change in morning commute trip time. This work facilitates expanded and risk‐consistent studies of the impacts of infrastructure networks on regional seismic risk and resiliency. Copyright © 2014 John Wiley & Sons, Ltd.
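The core subset-selection idea can be sketched as a small nonnegative least-squares problem: choose adjusted annual rates for a candidate subset of maps so that the subset reproduces the full catalog's exceedance curve. Everything below (the scalar "loss" per event map, the uniform rates, the random candidate subset) is illustrative, not the paper's data, and the paper's objective additionally constrains ground-motion intensity marginals, omitted here.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Hypothetical catalog: one scalar network-performance value per event map.
n_events = 200
losses = rng.lognormal(mean=0.0, sigma=1.0, size=n_events)
rates = np.full(n_events, 1e-3)              # annual occurrence rate per event

# Target exceedance curve lambda(loss > t) from the full catalog.
thresholds = np.quantile(losses, np.linspace(0.05, 0.95, 19))
exceed = (losses[None, :] > thresholds[:, None]).astype(float)
target = exceed @ rates

# Candidate subset of maps; solve for nonnegative adjusted rates that
# minimize the error in reproducing the target exceedance curve.
subset = rng.choice(n_events, size=20, replace=False)
w, resid = nnls(exceed[:, subset], target)
approx = exceed[:, subset] @ w               # subset-based exceedance curve
```

`nnls` stands in for the paper's larger optimization; the nonnegativity of the adjusted rates is the one constraint both share.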

2.
This paper reports the main results of the EC Project SERGISAI. The project developed a computer prototype in which a methodology for seismic risk assessment has been implemented. Standard procedural codes, Geographic Information Systems and Artificial Intelligence techniques compose the prototype, which permits a seismic risk assessment to be carried out through the necessary steps. Risk is expressed in terms of expected damage, given by the combination of hazard and vulnerability. Two parallel paths have been followed with respect to the hazard factor: the probabilistic and the deterministic approach. The first provides the hazard analysis based on historical data, propagation models, and known seismic sources. The deterministic approach provides the input for scenarios, by selecting a specific ground motion. With respect to the vulnerability factor, several systems have been taken into account apart from buildings, which are usually considered in this type of analysis. Defining vulnerability as a measure of how prone a system is to be damaged in the event of an earthquake, an attempt has been made to move from the assessment of individual objects to the evaluation of the performance of urban and regional areas. Another step towards an approach which can better serve civil protection and land-use planning agencies has been made by adapting the analysis to the following geographical levels: local, sub-regional and regional. Both the hazard and the vulnerability factors have been treated in the most suitable way for each level, in terms of level of detail, kind of parameters and units of measure. This paper shows some results obtained in three test areas: Toscana in Italy, for the regional level; the Garfagnana sub-area of Toscana, for the sub-regional level; and a part of the city of Barcelona, Spain, for the local level.

3.
Operative seismic aftershock risk forecasting can be particularly useful for rapid decision‐making in the presence of an ongoing sequence. In such a context, limit state first‐excursion probabilities (risk) for the forecasting interval (a day) can represent the potential for progressive state of damage in a structure. This work lays out a performance‐based framework for adaptive aftershock risk assessment in the immediate post‐mainshock environment. A time‐dependent structural performance variable is adopted in order to measure the cumulative damage in a structure. A set of event‐dependent fragility curves as a function of the first‐mode spectral acceleration for a prescribed limit state is calculated by employing back‐to‐back nonlinear dynamic analyses. An epidemic‐type aftershock sequence model is employed for estimating the spatio‐temporal evolution of aftershocks. The event‐dependent fragility curves for a given limit state are then integrated together with the probability distribution of aftershock spectral acceleration based on the epidemic‐type aftershock sequence aftershock hazard. The daily probability of limit state first‐excursion is finally calculated as a weighted combination of the sequence of limit state probabilities conditioned on the number of aftershocks. As a numerical example, daily aftershock risk is calculated for the L'Aquila 2009 aftershock sequence (central Italy). A representative three‐story reinforced concrete frame with infill panels, which has cyclic strength and stiffness degradation, is used in order to evaluate the progressive damage. It is observed that the proposed framework leads to a sound forecasting of limit state first‐excursion in the structure for two limit states of significant damage and near collapse. Copyright © 2014 John Wiley & Sons, Ltd.
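The final weighting step can be sketched as follows, with a Poisson count model standing in for the ETAS-based distribution of the number of daily aftershocks, and a single per-event limit-state probability standing in for the event-dependent fragilities (both are simplifications; the function and parameter names are mine):

```python
import math

def daily_exceedance_prob(expected_aftershocks, p_ls_per_event, n_max=200):
    """First-excursion probability over one day: the limit-state probability
    conditioned on n aftershocks, weighted by a Poisson count model (a
    simplification of the ETAS-based count distribution in the paper)."""
    p_n = math.exp(-expected_aftershocks)   # P(N = 0); contributes nothing
    total = 0.0
    for n in range(1, n_max + 1):
        p_n *= expected_aftershocks / n     # Poisson recursion gives P(N = n)
        total += p_n * (1.0 - (1.0 - p_ls_per_event) ** n)
    return total
```

Under the independence assumed here the sum telescopes to 1 − exp(−λp), which makes the sketch easy to check; the paper's event-dependent fragilities break that independence, which is why the explicit weighted sum is needed.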

4.
A framework to estimate sediment loads based on the statistical distribution of sediment concentrations and various functional forms relating distribution characteristics (e.g. mean and variance) to covariates is developed. The covariates are used as surrogates to represent the main processes involved in sediment generation and transport. Statistical models of increasing complexity are built and compared to assess their relative performance using available sediment concentration and covariate data. Application to the Beaurivage River watershed (Québec, Canada) is conducted using data for the 1989–2004 period. The covariates considered in this application are streamflow and calendar day. A comparison of different statistical models shows that, in this case, the log‐normal distribution with a mean value depending on streamflow (power law with an additive term) and calendar day (sinusoidal), a constant coefficient of variation for streamflow dependence and a constant standard deviation for calendar day dependence provide the best result. Model parameters are estimated using the maximum likelihood estimation technique. The selected model is then used to estimate the distribution of annual sediment loads for the Beaurivage River watershed for a selected period. A bootstrap parametric method is implemented to account for uncertainties in parameter values and to build the distributions of annual loads. Comparison of model results with estimates obtained using the empirical ratio estimator shows that the latter were rarely within the 0.1–0.9 quantile interval of the distributions obtained with the proposed approach. Copyright © 2008 John Wiley & Sons, Ltd.
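The best-performing model family can be sketched on synthetic data: a log-normal concentration whose mean follows a power law in streamflow (with an additive term) plus a sinusoid in calendar day, fitted by maximum likelihood. The parameter values, the fixed constant coefficient of variation, and the collapse into a single mean function are my simplifications, not the paper's fitted model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 400
flow = rng.gamma(2.0, 10.0, size=n)              # synthetic streamflow
day = rng.integers(1, 366, size=n)               # calendar day

def mean_conc(theta, flow, day):
    a, b, c, d = theta
    # Power law in streamflow with an additive term, plus a day-of-year sinusoid.
    return a * flow**b + c + d * np.sin(2 * np.pi * day / 365.25)

cv = 0.3                                         # assumed constant coefficient of variation
sigma2 = np.log(1 + cv**2)                       # log-space variance giving that CV
true_theta = np.array([0.5, 1.2, 5.0, 2.0])      # invented "truth" for the demo
m = mean_conc(true_theta, flow, day)
conc = rng.lognormal(np.log(m) - sigma2 / 2, np.sqrt(sigma2))

def nll(theta):
    """Negative log-likelihood (constants dropped; sigma2 held fixed)."""
    m = mean_conc(theta, flow, day)
    if np.any(m <= 0):
        return 1e12
    mu = np.log(m) - sigma2 / 2
    return 0.5 * np.sum((np.log(conc) - mu) ** 2) / sigma2

fit = minimize(nll, x0=np.ones(4), method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000})
```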

5.
Drought is a recurring feature of the climate, responsible for social and economic losses in India. In the present work, attempts were made to estimate drought hazard and risk using spatial and temporal datasets from the Tropical Rainfall Measuring Mission (TRMM) and the Moderate Resolution Imaging Spectroradiometer (MODIS), integrated with socio-economic vulnerability. The TRMM rainfall was used for trend analysis and Standardized Precipitation Index (SPI) estimation, with the aim of investigating changes in rainfall and deducing its pattern over the area. The SPI and average rainfall data derived from TRMM were interpolated to obtain the spatial and temporal pattern over the whole of South Bihar, India, while the MODIS datasets were used to derive the Normalized Difference Vegetation Index (NDVI) deviation in the area. A Geographical Information System (GIS) was used to integrate drought vulnerability and hazard in order to estimate the drought risk over South Bihar. The results indicate that approximately 36.90% of the area, in the north-eastern and western parts of South Bihar, faces high to very high drought risk and needs conservation measures to combat this disaster.
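The SPI step used above can be sketched as: fit a gamma distribution to accumulated precipitation, map each value through the fitted CDF, then through the standard normal quantile function. The zero-rain handling below (a mixed distribution with the zero mass placed at its midpoint) is one common convention, not necessarily the study's.

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for a sample of accumulated
    precipitation: gamma CDF mapped through the standard normal quantile.
    Zeros are handled with a mixed (zero-inflated) distribution."""
    precip = np.asarray(precip, float)
    nonzero = precip[precip > 0]
    q0 = 1.0 - nonzero.size / precip.size            # probability of zero rain
    a, loc, scale = stats.gamma.fit(nonzero, floc=0) # MLE gamma fit, loc pinned at 0
    cdf = np.where(precip > 0,
                   q0 + (1 - q0) * stats.gamma.cdf(precip, a, 0, scale),
                   q0 / 2.0)                         # midpoint convention for zeros
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))
```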

6.
A scheme for meteorological drought analysis at various temporal and spatial scales, based on spatial Bayesian interpolation of drought severity derived from Standardized Precipitation Index (SPI) values at observed stations, is presented and applied to the Huai River basin of China, using monthly precipitation records from 1961 to 2006 at 30 meteorological stations across the basin. After dividing the study area into regular grids, drought conditions at gauged sites are classified into extreme, severe, moderate and non-drought according to SPIs at monthly, seasonal and annual time scales, while those in ungauged grids are expressed as risks of the various drought severities, rather than a single state, by Bayesian interpolation. Subsequently, temporal and spatial patterns of drought risks are investigated statistically. The main conclusions are as follows: (1) drought at the seasonal scale was more threatening than at the other two time scales, with a larger number of observed drought events and more notable variation; (2) results of the Mann–Kendall test revealed an upward trend of drought risk in April and September; (3) there were larger risks of extreme and severe drought in the southern and northwestern parts of the basin, while the northeastern areas tended to face larger risks of moderate drought. The case study in the Huai River basin suggests that the proposed approach is a viable and flexible tool for monitoring meteorological drought at multiple scales, with a more specific insight into drought characteristics at each severity level.

7.
Megathrust earthquake sequences, comprising mainshocks and triggered aftershocks along the subduction interface and in the overriding crust, can impact multiple buildings and infrastructure in a city. The time between the mainshocks and aftershocks usually is too short to retrofit the structures; therefore, moderate‐size aftershocks can cause additional damage. To have a better understanding of the impact of aftershocks on city‐wide seismic risk assessment, a new simulation framework of spatiotemporal seismic hazard and risk assessment of future M9.0 sequences in the Cascadia subduction zone is developed. The simulation framework consists of an epidemic‐type aftershock sequence (ETAS) model, ground‐motion model, and state‐dependent seismic fragility model. The spatiotemporal ETAS model is modified to characterise aftershocks of large and anisotropic M9.0 mainshock ruptures. To account for damage accumulation of wood‐frame houses due to aftershocks in Victoria, British Columbia, Canada, state‐dependent fragility curves are implemented. The new simulation framework can be used for quasi‐real‐time aftershock hazard and risk assessments and city‐wide post‐event risk management.

8.
Near real-time monitoring of hydrological drought requires the implementation of an index capable of capturing the dynamic nature of the phenomenon. Starting from a dataset of modelled daily streamflow data, a low-flow index was developed based on the total water deficit of the discharge values below a certain threshold. In order to account for a range of hydrological regimes, a daily 95th percentile threshold was adopted, which was computed by means of a 31-day moving window. The observed historical total water deficits were statistically fitted by means of the exponential distribution and the corresponding probability values were used as a measure of hydrological drought severity. This approach has the advantage that it directly exploits daily streamflow values, as well as allowing a near real-time update of the index at regular time steps (i.e. 10 days, or dekad). The proposed approach was implemented on discharge data simulated by the LISFLOOD model over Europe during the period 1995–2015; its reliability was tested on four case studies found within the European drought reference database, as well as against the most recent summer drought observed in Central Europe in 2015. These validations, even if only qualitative, highlighted the ability of the index to capture the timing (starting date and duration) of the main historical hydrological drought events, and its good performance in comparison with the commonly used standardized runoff index (SRI). Additionally, the spatial evolution of the most recent event was captured well in a simulated near real-time test case, suggesting the suitability of the index for operational implementation within the European Drought Observatory.
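A minimal sketch of the variable-threshold deficit index: the day-of-year threshold pools flows over a 31-day moving window across all years (taking the flow exceeded 95% of the time, i.e. the 5th percentile of discharge), the annual deficit sums the shortfall below that threshold, and severity is the exponential CDF fitted to historical deficits. The array shapes, the 365-day year, and the circular window are simplifications.

```python
import numpy as np

def daily_thresholds(flows):
    """flows: (n_years, 365) daily discharge. Day-of-year low-flow
    threshold: the 5th percentile (flow exceeded 95% of the time),
    pooled over a 31-day circular window across all years."""
    n_years, n_days = flows.shape
    thr = np.empty(n_days)
    for d in range(n_days):
        window = [(d + k) % n_days for k in range(-15, 16)]
        thr[d] = np.percentile(flows[:, window], 5)
    return thr

def total_deficit(flow_year, thr):
    """Total water deficit of one year below the variable threshold."""
    return float(np.sum(np.clip(thr - flow_year, 0.0, None)))

def severity(deficit, historical_deficits):
    """Exponential-CDF severity: probability of observing a smaller deficit."""
    rate = 1.0 / np.mean(historical_deficits)   # exponential MLE
    return 1.0 - np.exp(-rate * deficit)
```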

9.
BET_VH: a probabilistic tool for long-term volcanic hazard assessment
In this paper, we illustrate a Bayesian Event Tree to estimate Volcanic Hazard (BET_VH). The procedure enables us to calculate the probability of any kind of long-term hazardous event in which we are interested, accounting for the intrinsic stochastic nature of volcanic eruptions and our limited knowledge regarding related processes. For the input, the code incorporates results from numerical models simulating the impact of hazardous volcanic phenomena on an area and data from the eruptive history. For the output, the code provides a wide and exhaustive set of spatiotemporal probabilities of different events; these probabilities are estimated by means of a Bayesian approach that allows all uncertainties to be properly accounted for. The code is able to deal with many eruptive settings simultaneously, weighting each with its own probability of occurrence. In a companion paper, we give a detailed example of application of this tool to the Campi Flegrei caldera, in order to estimate the hazard from tephra fall.
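The event-tree core can be sketched as conjugate Beta updating at each node, with the long-term probability of a hazardous outcome given by the product of the conditional node probabilities along a branch. The priors, equivalent-data weights, and counts below are invented; BET_VH's actual nodes, averaging over eruptive settings, and full uncertainty propagation are not reproduced.

```python
def node_probability(prior_mean, equiv_data, successes, trials):
    """Posterior mean of one node's conditional probability: a Beta prior
    with the given mean and weight (equivalent number of data), updated
    with observed counts from the eruptive history."""
    alpha = prior_mean * equiv_data
    beta = (1.0 - prior_mean) * equiv_data
    return (alpha + successes) / (alpha + beta + trials)

def branch_probability(nodes):
    """Probability of a full branch (e.g. unrest -> magmatic -> eruption ->
    vent location -> phenomenon reaches the area): product of the
    conditional node probabilities."""
    p = 1.0
    for prior_mean, equiv_data, successes, trials in nodes:
        p *= node_probability(prior_mean, equiv_data, successes, trials)
    return p
```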

10.
This paper discusses some aspects of flood frequency analysis using the peaks-over-threshold model with Poisson arrivals and generalized Pareto (GP) distributed peak magnitudes under nonstationarity, using climate covariates. The discussion topics were motivated by a case study on the influence of El Niño–Southern Oscillation on the flood regime in the Itajaí river basin, in Southern Brazil. The Niño3.4 (DJF) index is used as a covariate in nonstationary estimates of the Poisson and GP distributions scale parameters. Prior to the positing of parametric dependence functions, a preliminary data-driven analysis was carried out using nonparametric regression models to estimate the dependence of the parameters on the covariate. Model fits were evaluated using asymptotic likelihood ratio tests, AIC, and Q–Q plots. Results show statistically significant and complex dependence relationships with the covariate on both nonstationary parameters. The nonstationary flood hazard measure design life level (DLL) was used to compare the relative performances of stationary and nonstationary models in quantifying flood hazard over the period of records. Uncertainty analyses were carried out in every step of the application using the delta method.
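The nonstationary GP fit can be sketched with a log-linear scale dependence on the covariate, scale = exp(a + b·Niño3.4), estimated by maximum likelihood on synthetic excesses. The covariate values, "true" parameters, and the log-linear link are illustrative assumptions; the paper's parametric forms were chosen after its nonparametric exploratory analysis.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 500
nino = rng.normal(0.0, 1.0, size=n)              # stand-in for Nino3.4 (DJF)

# Simulate GP-distributed peak excesses with covariate-dependent scale.
xi_true, a_true, b_true = 0.1, 1.0, 0.5
scale_true = np.exp(a_true + b_true * nino)
u = rng.uniform(size=n)
y = scale_true / xi_true * ((1.0 - u) ** (-xi_true) - 1.0)   # GP inverse CDF

def nll(params):
    """Negative log-likelihood of the GP with scale = exp(a + b * nino)."""
    a, b, xi = params
    scale = np.exp(a + b * nino)
    z = 1.0 + xi * y / scale
    if np.any(z <= 0) or abs(xi) < 1e-6:
        return 1e12
    return np.sum(np.log(scale)) + (1.0 + 1.0 / xi) * np.sum(np.log(z))

fit = minimize(nll, x0=[0.0, 0.0, 0.2], method="Nelder-Mead",
               options={"maxiter": 10000, "maxfev": 10000})
```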

11.
Agricultural drought disaster risk assessment and risk zoning for northwestern Liaoning
Taking the 29 agricultural counties (cities, districts) of northwestern Liaoning as the study region and maize, the region's principal crop, as the study object, an agricultural drought disaster risk index (ADRI) was established to characterize the degree of agricultural drought disaster risk, starting from the four aspects that produce agricultural drought disasters: the hazard of the drought-causing factors, the exposure of the hazard-bearing bodies, vulnerability, and drought-resistance and disaster-mitigation capacity, using the natural disaster risk index method, the weighted comprehensive evaluation method and the analytic hierarchy process. With GIS techniques, an agricultural drought disaster risk zoning map of northwestern Liaoning was drawn; comparing this risk zoning map with the 2006 zoning map of drought-induced grain yield reduction coefficients for northwestern Liaoning shows that the two match well. The results can provide a theoretical basis and guidance for local agricultural drought early warning and insurance, as well as for drought management and mitigation decision-making by the relevant departments.
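The study's risk index is a weighted combination of four factors: hazard, exposure, vulnerability, and mitigation capacity. A minimal sketch, with invented AHP-style weights, factor scores normalized to [0, 1], and an assumed inverse treatment of mitigation capacity (the study's actual weights and factor definitions are not reproduced):

```python
def adri(hazard, exposure, vulnerability, mitigation, weights):
    """Agricultural Drought Risk Index sketch: weighted combination of four
    normalized factors, with mitigation capacity entering inversely
    (higher capacity, lower risk)."""
    w_h, w_e, w_v, w_m = weights
    return (w_h * hazard + w_e * exposure
            + w_v * vulnerability + w_m * (1.0 - mitigation))
```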

12.
Seismic Hazard Assessment: Issues and Alternatives
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications.
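For context, the probabilistic approach (PSHA) that the paper critiques computes, for each intensity level, an annual exceedance rate by integrating a ground-motion model over the source's magnitude distribution. A minimal single-source numerical version; the truncated Gutenberg–Richter density, the toy attenuation law, and all parameter values are illustrative, not drawn from the paper:

```python
import math

def psha_rate(x, nu=0.05, b=1.0, m_min=5.0, m_max=8.0,
              dist_km=30.0, sigma_ln=0.6, n_steps=200):
    """Annual rate of IM > x from one source at a fixed distance:
    nu * integral of P(IM > x | m) f_M(m) dm, with a truncated exponential
    (Gutenberg-Richter) magnitude density and a lognormal ground-motion
    model ln IM = -1 + 1.2 m - 1.5 ln(dist) + eps (a made-up GMPE)."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    dm = (m_max - m_min) / n_steps
    rate = 0.0
    for i in range(n_steps):                  # midpoint rule over magnitude
        m = m_min + (i + 0.5) * dm
        f_m = beta * math.exp(-beta * (m - m_min)) / norm
        mu_ln = -1.0 + 1.2 * m - 1.5 * math.log(dist_km)
        z = (math.log(x) - mu_ln) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(IM > x | m)
        rate += f_m * p_exceed * dm
    return nu * rate
```

DSHA, by contrast, would fix a scenario (m, dist) and report a ground-motion level directly, without the rate integral.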

13.
This paper is intended to compare the hazard rate from the Bayesian approach with the hazard rate from the maximum likelihood estimate (MLE) method. The MLE of a parameter is appropriate as long as there are sufficient data. For various reasons, however, sufficient data may not be available, which may make the result of the MLE method unreliable. In order to resolve the problem, it is necessary to rely on judgment about unknown parameters. This is done by adopting the Bayesian approach. The hazard rate of a mixture model can be inferred from a method called Bayesian estimation. For eliciting a prior distribution which can be used in deriving a Bayesian estimate, a computerized-simulation method is introduced. Finally, a numerical example is given to illustrate the potential benefits of the Bayesian approach.
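For the simplest constant-hazard (exponential lifetime) case, the contrast the paper draws can be shown with a conjugate Gamma prior: the MLE uses the data alone, while the posterior mean pulls toward the prior when data are scarce. The paper's mixture model and elicitation procedure are not reproduced here; this is a textbook stand-in.

```python
def mle_hazard_rate(times):
    """MLE of a constant hazard rate from fully observed exponential
    lifetimes: number of failures over total time on test."""
    return len(times) / sum(times)

def bayes_hazard_rate(times, alpha, beta):
    """Posterior-mean hazard rate under a Gamma(alpha, beta) prior, the
    conjugate prior for the exponential model: (alpha + n) / (beta + sum t).
    With no data this falls back to the prior mean alpha / beta."""
    return (alpha + len(times)) / (beta + sum(times))
```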

14.
A pattern recognition approach to liquefaction evaluation is proposed. The state of any soil layer at a level-ground site subject to seismic loads is represented by a pattern in a seven-dimensional feature space and can be classified into one of three classes: liquefiable cohesionless soil, non-liquefiable cohesionless soil, and cohesive soil. The liquefaction potential of the soil layer can be assessed according to the probabilities of the pattern belonging to the three classes. Training patterns derived from field data (piezocone (CPTU) data and maximum ground acceleration) from sites which liquefied or did not liquefy during earthquakes in New Zealand are randomly chosen to design a pattern recognition system that provides an optimal estimate of the liquefaction potential of any soil stratum of interest. Two recognition systems have been set up to estimate the state-conditional probability density function: one is based on a Parzen window approach, in which no knowledge of the probabilistic structure of the training patterns is assumed; the other is based on a parameter estimation approach assuming a multivariate normal distribution. The error rate of recognition by the Parzen window approach is 6.9% when the window size is taken as 1.5, and the error rate of the parameter estimation approach, which can be easily implemented without reference to the training patterns, is 7.7%.
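The Parzen-window branch can be sketched with a Gaussian kernel: estimate each class-conditional density from its training patterns and assign a new pattern to the class with the largest density. The seven features, the window size h = 1.5 quoted above, equal class priors, and the toy two-class data are illustrative.

```python
import numpy as np

def parzen_density(x, samples, h):
    """Parzen-window estimate of the class-conditional density at x,
    using an isotropic Gaussian kernel of width h."""
    d = samples.shape[1]
    z = (x - samples) / h
    kernels = np.exp(-0.5 * np.sum(z**2, axis=1)) / ((2 * np.pi) ** (d / 2) * h**d)
    return float(kernels.mean())

def classify(x, class_samples, h=1.5):
    """Assign x to the class with the largest estimated density
    (equal priors assumed)."""
    return max(class_samples, key=lambda c: parzen_density(x, class_samples[c], h))
```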

15.
Two new closed‐form expressions representing the mean rate of exceedance of a given limit state are presented herein. These proposals overcome limitations that were identified with the original formulation of the well‐known SAC/FEMA approach. The new expressions involve new parametric functions for the modeling of the seismic hazard data and for the demand evolution for increasing values of the earthquake intensity measure. Given the carefully selected parametric form of these functions, mathematical tractability is able to be maintained to establish two new closed‐form solutions representing the mean rate of exceedance of a given limit state. The function proposed for the hazard exhibits nonlinear behavior in log‐log space and is able to represent the actual hazard data over a wider range of earthquake intensity levels. The function proposed for the demand evolution addresses issues related to the inadequate performance of the SAC/FEMA approach when force‐based demand parameters such as the shear force are considered. To illustrate the applicability of the new closed‐form solutions, the probability of occurrence of several limit states is determined for a reinforced concrete structure. Copyright © 2013 John Wiley & Sons, Ltd.
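For reference, the original SAC/FEMA closed form that the two new expressions generalize assumes a power-law hazard H(sa) = k0·sa^(−k), a power-law median demand D = a·sa^b, and lognormal demand and capacity dispersions β_D, β_C; the paper's point is precisely that these assumptions break down over wide intensity ranges and for force-based demands. A sketch of the baseline formula (parameter values in the test are invented):

```python
import math

def sac_fema_rate(k0, k, a, b, d_cap, beta_d, beta_c):
    """Baseline SAC/FEMA mean annual rate of limit-state exceedance:
    hazard H(sa) = k0 * sa**-k, median demand D = a * sa**b, median
    capacity d_cap, lognormal dispersions beta_d (demand) and beta_c
    (capacity). The paper replaces the power laws with richer forms."""
    sa_cap = (d_cap / a) ** (1.0 / b)      # intensity at which demand = capacity
    return k0 * sa_cap ** (-k) * math.exp(0.5 * (k / b) ** 2
                                          * (beta_d**2 + beta_c**2))
```

With both dispersions set to zero the expression collapses to the hazard evaluated at the capacity-equivalent intensity, which is a quick sanity check.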

16.
Extreme environmental events have considerable impacts on society. Preparing to mitigate these events, or to forecast them accurately, is a growing concern for governments. In this regard, policy and decision makers require accurate tools for risk estimation in order to make informed decisions. This work proposes a Bayesian framework for a unified treatment and statistical modeling of the main components of risk: hazard, vulnerability and exposure. Risk is defined as the expected economic loss or population affected as a consequence of a hazard event. Vulnerability is interpreted as the loss experienced by an exposed population due to hazard events. The framework combines data of different spatial and temporal supports. It produces a sequence of temporal risk maps for the domain of interest, including a measure of uncertainty for the hazard and vulnerability. In particular, the considered hazard (rainfall) is interpolated from point-based measured rainfall data using a hierarchical spatio-temporal Kriging model, whose parameters are estimated using the Bayesian paradigm. Vulnerability is modeled using zero-inflated distributions with parameters dependent on climatic variables at local and large scales. Exposure is defined as the total population settled in the spatial domain and is interpolated using census data. The proposed methodology was applied to the Vargas state of Venezuela to map the spatio-temporal risk for the period 1970–2006. The framework highlights both high- and low-risk areas given extreme rainfall events.
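The risk definition above, for a single cell and period, reduces to: hazard-event probability × mean of a zero-inflated loss fraction × exposed population. A sketch of that combination (the spatio-temporal Kriging layer and the covariate-dependent vulnerability parameters are omitted; all values in the test are invented):

```python
def expected_loss(p_hazard, p_zero_loss, mean_loss_fraction, exposure):
    """Risk as expected loss for one cell and one period: probability of a
    hazard event, times the mean of a zero-inflated loss fraction
    ((1 - p_zero_loss) * mean_loss_fraction), times the exposed population."""
    mean_vulnerability = (1.0 - p_zero_loss) * mean_loss_fraction
    return p_hazard * mean_vulnerability * exposure
```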

17.
This study seeks to construct a hazard function for earthquake probabilities based on potential foreshocks. Earthquakes of magnitude 6.5 and larger that occurred between 1976 and 2000 in an offshore area of the Tohoku region of northeast Japan were selected as the events for estimating probabilities. Later occurrences within multiple-event sequences, as well as aftershocks, were omitted from the targets. As a result, a total of 14 earthquakes were employed in the assessment of models. The study volume spans 300 km (east–west) × 660 km (north–south) × 60 km in depth. The probability of a target earthquake occurring at a certain point in time–space depends on the number of small earthquakes that occurred per unit volume in that vicinity. In this study, we assume that the hazard function increases geometrically with the number of potential foreshocks within a constrained space–time window. The parameters defining potential foreshocks are magnitude, spatial extent and lead time to the point of assessment. The time parameter is studied over 1 to 5 days (in 1-day steps), and the spatial parameter over 20 to 100 km (in 20-km steps). The model parameters of the hazard function are determined by the maximum likelihood method. The most effective hazard function examined was the following: when an earthquake of magnitude 4.5 to 6.5 occurs, the hazard for a large event is increased significantly for one day within a 20 km radius surrounding the earthquake. If two or more such earthquakes are observed, the model expects a 20,000 times greater probability of an earthquake of magnitude 6.5 or greater than in the absence of such events.
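The conditioning step can be sketched as: count candidate foreshocks (magnitude 4.5–6.5) inside the best-performing space–time window (20 km, 1 day), then raise a per-event geometric factor to that count. The haversine distance, the event tuple format, and the factor value in the test are my assumptions; the study fits the factor by maximum likelihood.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    h = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def count_foreshocks(events, t0, lat0, lon0,
                     lead_days=1.0, radius_km=20.0, m_lo=4.5, m_hi=6.5):
    """events: iterable of (t_days, lat, lon, mag). Count events in the
    magnitude band inside the space-time window preceding assessment
    time t0 at location (lat0, lon0)."""
    return sum(1 for t, la, lo, m in events
               if 0.0 <= t0 - t <= lead_days and m_lo <= m <= m_hi
               and haversine_km(la, lo, lat0, lon0) <= radius_km)

def hazard_multiplier(n_foreshocks, per_event_factor):
    """Geometric increase of the hazard with the foreshock count."""
    return per_event_factor ** n_foreshocks
```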

18.
Effects of magnitude accuracy and completeness of data on seismic hazard parameters. HUI-CHENG SHAO (邵辉成), JIA-SHU XIE (谢家树), PING WANG (王平) and YA-X...

19.
Regional bivariate modeling of droughts using L-comoments and copulas
The regional bivariate modeling of drought characteristics using copulas provides valuable information for water resources management and drought risk assessment. Regional frequency analysis (RFA) can identify similar sites within a region using the L-comoments approach. One of the important steps in RFA is estimating the regional parameters of the copula function. In the present study, an optimization-based method, together with the adjusted charged system search, is introduced and applied to estimate the regional parameters of the copula models. The capability of the proposed methodology is illustrated with copula functions applied to drought events. Three commonly used copulas, Clayton, Frank and Gumbel, are employed to derive the joint distribution of drought severity and duration. The results of the new method are compared to the method of moments and, after applying several goodness-of-fit tests, the results indicate that the new method provides higher accuracy than the classic one. Furthermore, the upper tail dependence coefficients indicate that the Gumbel copula is the best-fitting copula for modeling drought characteristics.
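For the Gumbel family singled out by the tail-dependence result, the copula and its upper-tail dependence coefficient have standard closed forms (θ ≥ 1; θ = 1 reduces to independence):

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) = exp(-(((-ln u)^theta + (-ln v)^theta))^(1/theta)),
    theta >= 1. Joint exceedance P(U > u, V > v) = 1 - u - v + C(u, v)."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))

def gumbel_upper_tail(theta):
    """Upper tail dependence coefficient of the Gumbel copula: 2 - 2^(1/theta)."""
    return 2.0 - 2.0 ** (1.0 / theta)
```

With drought severity and duration mapped to uniform margins by their fitted marginal CDFs, `gumbel_copula` gives the joint non-exceedance probability used in bivariate return-period calculations.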

20.
This paper describes the application of a methodology for the evaluation of debris-flow risk in alluvial fans, combining numerical simulations with Geographical Information Systems to identify potential debris-flow hazard areas. The methodology was applied to a small catchment located in the north-eastern part of Sicily, Italy, where an extreme debris-flow event occurred in October 2007. The adopted approach integrates a slope stability model, which identifies the areas of potential shallow landslides under different meteorological conditions, with a two-dimensional finite-element model based on the De Saint-Venant equations for debris-flow propagation. The mechanical properties of the debris were defined using both laboratory and in situ test results. The risk classification of the study area was derived using the total hydrodynamic force per unit width (impact pressure) as an indicator of event intensity. Based on the simulation results, a potential risk zone was identified and mapped.
