Similar Literature
20 similar documents found (search time: 31 ms)
1.
This paper examines the potential development of a probabilistic design methodology, considering hysteretic energy demand, within the framework of performance‐based seismic design of buildings. This article does not propose specific energy‐based criteria for design guidelines, but explores how such criteria can be treated from a probabilistic design perspective. Uniform hazard spectra for normalized hysteretic energy are constructed to characterize seismic demand at a specific site. These spectra, in combination with an equivalent systems methodology, are used to estimate hysteretic energy demand on real building structures. A design checking equation for a (hypothetical) probabilistic energy‐based performance criterion is developed by accounting for the randomness of the earthquake phenomenon, the uncertainties associated with the equivalent system analysis technique, and the site soil factor. The developed design checking equation itself is deterministic, and requires no probabilistic analysis for use. The application of the proposed equation is demonstrated through a trial design of a three‐storey steel moment frame. The design checking equation represents a first step toward the development of a performance‐based seismic design procedure based on an energy criterion, and the additional work needed to fully implement it is discussed briefly at the end of the paper. Copyright © 2006 John Wiley & Sons, Ltd.
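For illustration only, a minimal sketch of what a deterministic design-checking equation in the factored demand-capacity format could look like for a hysteretic-energy criterion; the function name, factor names, and all numerical values are hypothetical and are not those derived in the paper:

```python
# Hedged sketch: a factored demand-capacity check of the generic form used in
# probabilistic performance-based design. The factor names/values below are
# hypothetical illustrations, not those derived in the paper.

def energy_design_check(Eh_demand_median, Eh_capacity,
                        gamma_record=1.35,   # covers record-to-record randomness (assumed)
                        gamma_analysis=1.15, # covers equivalent-system analysis uncertainty (assumed)
                        gamma_soil=1.10,     # covers site soil-factor uncertainty (assumed)
                        phi=0.85):           # capacity reduction factor (assumed)
    """Return True if the factored hysteretic-energy demand is within capacity."""
    factored_demand = gamma_record * gamma_analysis * gamma_soil * Eh_demand_median
    return factored_demand <= phi * Eh_capacity

# Example: median normalized hysteretic-energy demand taken from uniform hazard spectra
print(energy_design_check(Eh_demand_median=0.42, Eh_capacity=0.95))
```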

2.
The use of nonlinear static procedures for performance‐based seismic design (PBSD) and assessment is a well‐established practice, which has found its way into modern codes for quite some time. On the other hand, near‐source (NS) ground motions are receiving increasing attention, because they can carry seismic demand systematically different from, and larger than, that of so‐called ordinary records. This is due to phenomena such as rupture forward directivity (FD), which can lead to distinct pulses appearing in the velocity time‐history of the ground motion. The framework necessary for taking FD into account in probabilistic seismic hazard analysis (PSHA) has recently been established. The objective of the present study is to discuss the extension of nonlinear static procedures, specifically the displacement coefficient method (DCM), with respect to the inelastic demand associated with FD. In this context, a methodology is presented for implementing the DCM to estimate NS seismic demand, making use of the results of NS‐PSHA and a semi‐empirical equation for the NS‐FD inelastic displacement ratio. An illustrative application of the DCM, with explicit inclusion of NS pulse‐like effects, is given for a set of typical plane R/C frames designed under Eurocode provisions. Different scenarios are considered in the application, and nonlinear dynamic analysis results are obtained and discussed with respect to the static procedure estimates. Conclusions drawn from the results may help to assess the importance of incorporating NS effects in PBSD. Copyright © 2014 John Wiley & Sons, Ltd.
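As background, a minimal sketch of the displacement coefficient method in its familiar FEMA-style form, which an extension to near-source, pulse-like demand would build on; the coefficient values below are placeholders, not the calibrated near-source quantities of the paper:

```python
import math

# Hedged sketch of the displacement coefficient method (DCM): the target roof
# displacement is the elastic spectral displacement Sa*(Te/2*pi)**2 scaled by
# modification coefficients. A near-source extension would use spectral ordinates
# from NS-PSHA and pulse-consistent coefficients; the numbers here are placeholders.

def dcm_target_displacement(Sa_g, Te, C0=1.3, C1=1.1, C2=1.0, g=9.81):
    """Target displacement (m) for effective period Te (s) and spectral acceleration Sa (in g)."""
    Sd_elastic = Sa_g * g * (Te / (2.0 * math.pi)) ** 2   # elastic spectral displacement
    return C0 * C1 * C2 * Sd_elastic

print(f"delta_t = {dcm_target_displacement(Sa_g=0.6, Te=1.2):.3f} m")
```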

3.
Previous comparison studies on seismic isolation have demonstrated its beneficial and detrimental effects on the structural performance of high‐speed rail bridges during earthquakes. Striking a balance between these 2 competing effects requires proper tuning of the controlling design parameters in the design of the seismic isolation system. This results in a challenging problem for practical design in performance‐based engineering, particularly when the uncertainty in seismic loading needs to be explicitly accounted for. This problem can be tackled using a novel probabilistic performance‐based optimum seismic design (PPBOSD) framework, which has been previously proposed as an extension of the performance‐based earthquake engineering methodology. For this purpose, a parametric probabilistic demand hazard analysis is performed over a grid in the seismic isolator parameter space, using high‐throughput cloud‐computing resources, for a California high‐speed rail (CHSR) prototype bridge. The derived probabilistic structural demand hazard results conditional on a seismic hazard level and unconditional, i.e., accounting for all seismic hazard levels, are used to define 2 families of risk features, respectively. Various risk features are explored as functions of the key isolator parameters and are used to construct probabilistic objective and constraint functions in defining well‐posed optimization problems. These optimization problems are solved using a grid‐based, brute‐force approach as an application of the PPBOSD framework, seeking optimum seismic isolator parameters for the CHSR prototype bridge. This research shows the promising use of seismic isolation for CHSR bridges, as well as the potential of the versatile PPBOSD framework in solving probabilistic performance‐based real‐world design problems.
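To make the grid-based, brute-force step concrete, a minimal sketch of a constrained search over two isolator parameters; the risk functions, parameter ranges, and constraint threshold are synthetic placeholders standing in for the probabilistic demand hazard results of the PPBOSD framework:

```python
import itertools
import numpy as np

# Hedged sketch of a grid-based ("brute-force") search over two isolator design
# parameters, minimizing one risk feature subject to a constraint on another.
# The risk functions below are synthetic placeholders, not simulation results.

def deck_drift_risk(kd, qd):            # placeholder objective risk feature (assumed form)
    return 0.02 / kd + 0.5 * qd ** 2

def bearing_displacement_risk(kd, qd):  # placeholder constraint risk feature (assumed form)
    return 0.8 / (kd * (1.0 + 5.0 * qd))

kd_grid = np.linspace(0.5, 5.0, 20)     # isolator post-yield stiffness (normalized, assumed range)
qd_grid = np.linspace(0.02, 0.20, 20)   # isolator strength ratio (assumed range)

best = None
for kd, qd in itertools.product(kd_grid, qd_grid):
    if bearing_displacement_risk(kd, qd) > 0.4:   # constraint: limit isolator displacement risk
        continue
    obj = deck_drift_risk(kd, qd)
    if best is None or obj < best[0]:
        best = (obj, kd, qd)

print("optimum (objective, kd, qd):", best)
```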

4.
The lack of direct correspondence between control objectives and hazard risks over the lifetime of systems is a key shortcoming of current control techniques. This, along with the inability to objectively analyze the benefits and costs of control solutions compared with conventional methods, has hindered widespread application of control systems in seismic regions. To address these gaps, this paper offers 2 new contributions. First, it introduces risk‐based life‐cycle cost (LCC) optimal control algorithms, where LCC is incorporated as the performance objective in the control design. Two strategies called risk‐based linear quadratic regulator and unconstrained risk‐based regulator are subsequently proposed. The considered costs include the initial cost of the structure and control system, LCC of maintenance, and probabilistically derived estimates of seismic‐induced repair costs and losses associated with downtime, injuries, and casualties throughout the life of the structure. This risk‐based framework accounts for uncertainties in both system properties and hazard excitations and uses outcrossing rate theory to estimate fragilities for various damage states. The second contribution of this work is a risk‐based probabilistic framework for LCC analysis of existing and proposed control strategies. The proposed control designs are applied to the nonlinear model of a 4‐story building subjected to seismic excitations. Results show that these control methods reduce the LCC of the structure significantly compared with the status quo option (benefits of up to $1 351 000). The advancements offered in this paper enhance the cost‐effectiveness of control systems and objectively showcase their benefits for risk‐informed decision making.
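For orientation, a minimal sketch of the standard LQR backbone that a risk-based, LCC-weighted variant could start from; the two-degree-of-freedom model and the Q/R weights are illustrative assumptions, not the paper's four-story model or its LCC-derived objective:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hedged sketch of the standard LQR problem on which a risk-based variant could
# weight Q/R by life-cycle-cost considerations. The 2-DOF shear-building matrices
# and cost weights below are illustrative placeholders.

m = np.diag([1.0, 1.0])                                # floor masses (assumed)
k = np.array([[2000.0, -1000.0], [-1000.0, 1000.0]])   # stiffness matrix, kN/m (assumed)
c = 0.02 * k                                           # light damping (assumed)

A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(m, k), -np.linalg.solve(m, c)]])
B = np.vstack([np.zeros((2, 1)), np.linalg.solve(m, np.array([[0.0], [1.0]]))])

Q = np.diag([1e4, 1e4, 1.0, 1.0])   # state weights (could be tied to repair-cost fragilities)
R = np.array([[1e-3]])              # control-effort weight (could reflect device/operation cost)

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)     # optimal feedback gain, u = -K x
print("LQR gain:\n", K)
```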

5.
This discussion examines the conclusion reached in the paper that in a single‐story asymmetric‐plan building the maximum displacement demand in the different resisting elements is reached for the same deformation configuration of the system and that the resultant of the seismic forces producing such demand is located at the center of resistance. It is shown that this conclusion is valid only for the particular model studied and cannot be generalized. Copyright © 2009 John Wiley & Sons, Ltd.

6.
A versatile, simulation‐based framework for risk assessment and probabilistic sensitivity analysis of base‐isolated structures is discussed in this work. A probabilistic foundation is used to address the various sources of uncertainty, whether in the excitation or the structure, and to characterize seismic risk. In this stochastic setting, the risk is given by some statistics of the system response over the adopted probability models, and stochastic simulation is implemented for its evaluation. An efficient, sampling‐based approach is also introduced for establishing a probabilistic sensitivity analysis to identify the importance of each of the uncertain model parameters in affecting the overall risk. This framework facilitates the use of complex models for the structural system and the excitation. The adopted structural model explicitly addresses nonlinear characteristics of the isolators and of any supplemental dampers, and the effect of seismic pounding of the base against the surrounding retaining walls. An efficient stochastic ground motion model is also discussed for characterizing future near‐fault ground motions and relating them to the seismic hazard for the structural site. An illustrative example is presented that emphasizes the results from the novel probabilistic sensitivity analysis and their dependence on seismic pounding occurrences and on the addition of supplemental dampers. Copyright © 2011 John Wiley & Sons, Ltd.
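A minimal sketch of the simulation-based risk estimate and a crude sampling-based sensitivity indicator of the kind described; the toy performance function and parameter distributions are assumptions, not the base-isolated model of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged sketch of simulation-based risk quantification: sample the uncertain
# parameters, evaluate a performance measure for each sample, and estimate risk
# as its expectation. A crude sensitivity indicator compares performance-weighted
# and unweighted sample statistics of each parameter. The "model" is a toy placeholder.

N = 20_000
theta = {
    "isolator_period": rng.normal(2.5, 0.25, N),    # s (assumed distribution)
    "gap_to_wall":     rng.normal(0.40, 0.05, N),   # m (assumed distribution)
    "damper_coeff":    rng.lognormal(0.0, 0.3, N),  # normalized (assumed distribution)
}

def performance(t):  # toy loss measure standing in for the simulated response
    pounding = np.maximum(0.0, 0.5 / t["isolator_period"] - t["gap_to_wall"])
    return pounding * 10.0 + 0.1 / t["damper_coeff"]

h = performance(theta)
print(f"estimated risk (expected loss): {h.mean():.3f}")

w = h / h.sum()   # performance-based weights
for name, x in theta.items():
    shift = abs(np.sum(w * x) - x.mean()) / x.std()   # weighted vs. unweighted mean shift
    print(f"sensitivity indicator for {name}: {shift:.3f}")
```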

7.
A software prototype of a simulation service software environment, called DOSE (distributed object‐based software environment), is developed to realize the integrated simulation of an urban system under the risk of urban‐scale hazards such as earthquakes. The DOSE infrastructure is built on three basic building blocks, namely modularity, scalability, and interoperability. In this paper, the application of DOSE to real‐world urban systems is described in order to provide evidence of DOSE modularity and scalability. An overview of DOSE is presented, followed by its application to simulating an earthquake hazard in an urban system. Urban systems are developed for the city of Kobe (Kobe district), with dimensions of 700 m × 500 m, and for Bunkyo ward (Tokyo district), with dimensions of 800 m × 600 m, and DOSE simulation participants are identified for each district. The effectiveness of data exchange among different participants through a distributed service exchange network is described as evidence of the DOSE modularity that facilitates the integration process. On the other hand, the effectiveness of processing time when applying the simulation to different urban system sizes and/or using different third‐party applications is described as evidence of DOSE scalability. The details of the underlying infrastructure of DOSE are beyond the scope of this paper and are presented in an accompanying paper. Copyright © 2007 John Wiley & Sons, Ltd.

8.
The performance and serviceability of structural systems during their lifetime can be significantly affected by the occurrence of extreme events. Despite their low probability, there is a potential for multiple occurrences of such hazards during the relatively long service life of systems. This paper introduces a comprehensive framework for the assessment of the lifecycle cost of infrastructures subject to multiple hazard events throughout their decision‐making time horizon. The framework entails the lifecycle costs of maintenance and repair, as well as the salvage value of the structure at the end of the decision‐making time horizon. The primary features of the proposed framework include accounting for the possibility of multiple hazard occurrences, incorporating the effects of incomplete repair actions on the accumulated damage through damage state‐dependent repair times, and requiring limited resources in terms of input data and computational cost. A dynamic programming procedure is proposed to calculate the expected damage condition of the structure for each possible number of hazard incidents based on state‐dependent fragility curves. The proposed framework is applied to a moment‐frame building located in a region of high seismicity, and lifecycle costs are evaluated for six retrofit plans. The results show that the ranking of the retrofit actions varies with the decision‐making time horizon. Furthermore, the sensitivity analyses demonstrate that disregarding repair time in the lifecycle cost analysis can result in unsafe retrofit actions being falsely identified as optimal and reliable strategies. Copyright © 2016 John Wiley & Sons, Ltd.
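A minimal sketch of the bookkeeping behind such a multi-event lifecycle cost estimate, assuming Poisson hazard occurrences and a state-dependent damage transition matrix; all probabilities, costs, and rates are invented for illustration and greatly simplify the paper's procedure (e.g., no repair between events):

```python
import numpy as np
from math import exp, factorial

# Hedged sketch: damage accumulates through a state-dependent transition matrix
# applied once per hazard occurrence, the number of occurrences over the horizon
# is Poisson, and each occurrence incurs the repair cost of the state it leaves
# the structure in. All numbers are illustrative placeholders.

states = ["none", "slight", "moderate", "severe"]
# P[i, j] = probability of ending an event in state j given the structure entered it in state i
P = np.array([[0.80, 0.12, 0.06, 0.02],
              [0.00, 0.75, 0.18, 0.07],
              [0.00, 0.00, 0.70, 0.30],
              [0.00, 0.00, 0.00, 1.00]])
repair_cost = np.array([0.0, 50.0, 250.0, 1200.0])  # repair cost (k$) per post-event state (assumed)

rate, horizon = 0.20, 50.0            # events/yr and decision horizon (yr), assumed
lam = rate * horizon
pi0 = np.array([1.0, 0.0, 0.0, 0.0])  # structure starts undamaged

expected_cost = 0.0
for n in range(0, 80):                # sum over possible numbers of events
    p_n = exp(-lam) * lam ** n / factorial(n)
    cost_given_n = sum((pi0 @ np.linalg.matrix_power(P, j)) @ repair_cost
                       for j in range(1, n + 1))
    expected_cost += p_n * cost_given_n

print(f"expected lifecycle repair cost: {expected_cost:.1f} k$")
```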

9.
Two existing, contemporary ground motion selection and modification procedures – (i) exact conditional spectrum (CS‐exact) and (ii) generalized conditional intensity measure (GCIM) – are evaluated for their ability to accurately estimate seismic demand hazard curves (SDHCs) of a given structure at a specified site. The amount of effort involved in implementing these procedures to compute a single SDHC is studied, and a case study is chosen where rigorous benchmark SDHCs can be determined for evaluation purposes. By comparing estimates from ground motion selection and modification procedures with the benchmark, we conclude that estimates from CS‐exact are unbiased in many of the cases considered. The estimates from GCIM are even more accurate, as they are unbiased for most – but not all – of the cases where estimates from CS‐exact are biased. We find that it is possible to obtain biased SDHCs from GCIM, even after employing a very diverse collection of intensity measures to select ground motions and implementing its bias‐checking feature, because it is usually difficult to identify intensity measures that are truly ‘sufficient’ for the response of a complex, multi‐degree‐of‐freedom system. Copyright © 2015 John Wiley & Sons, Ltd.
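For context, a minimal sketch of how a seismic demand hazard curve is assembled from response statistics at several intensity levels and a site hazard curve; the hazard and response numbers are illustrative, not the benchmark case study:

```python
import numpy as np
from scipy.stats import lognorm

# Hedged sketch of an SDHC: the exceedance probability of the demand at each
# intensity level is weighted by the rate of that intensity level taken from the
# hazard curve. All numbers below are illustrative placeholders.

im = np.array([0.1, 0.2, 0.4, 0.8, 1.6])           # Sa levels (g)
lam_im = np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5])  # mean annual rate of exceeding each Sa
d_lam = np.abs(np.diff(np.append(lam_im, 0.0)))    # rate attributed to each Sa "bin"

median_edp = 0.004 * im ** 1.1   # median drift given Sa (assumed power-law response)
beta = 0.4                       # record-to-record dispersion (assumed)

def sdhc(z):
    """Mean annual frequency of the drift demand exceeding z."""
    p_exceed = 1.0 - lognorm.cdf(z, s=beta, scale=median_edp)
    return float(np.sum(p_exceed * d_lam))

for z in (0.002, 0.005, 0.01):
    print(f"lambda(drift > {z}): {sdhc(z):.2e} per year")
```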

10.
In the design and assessment of structures, aspects regarding future performance are gaining increased attention. The term ‘sustainability’ covers a wide range of performance measures that reflect these aspects, and well‐established methods are needed for quantifying its metrics. In this paper, a framework for assessing the time‐variant sustainability of bridges associated with multiple hazards, considering the effects of structural deterioration, is presented. The approach accounts for the effects of flood‐induced scour on seismic fragility. Sustainability is quantified in terms of its social, environmental, and economic metrics. These include the expected downtime and number of fatalities, expected energy waste and carbon dioxide emissions, and the expected loss. The proposed approach is illustrated on a reinforced concrete bridge. The effects of corrosion of the reinforcement bars and of concrete cover spalling are accounted for. The seismic fragility curves at different points in time are obtained through nonlinear finite element analyses. The variation of the sustainability metrics over time is presented. The effects of flood‐induced scour on both the seismic fragility and the metrics are also investigated. Copyright © 2013 John Wiley & Sons, Ltd.

11.
This study evaluates the effect of considering ground motion duration when selecting hazard‐consistent ground motions for structural collapse risk assessment. A procedure to compute source‐specific probability distributions of the durations of ground motions anticipated at a site, based on the generalized conditional intensity measure framework, is developed. Targets are computed for three sites in the western USA, located in distinct tectonic settings: Seattle, Eugene, and San Francisco. The effect of considering duration when estimating the collapse risk of a ductile reinforced concrete moment frame building, designed for a site in Seattle, is quantified by conducting multiple stripe analyses using groups of ground motions selected using different procedures. The mean annual frequency of collapse (λ_collapse) in Seattle is found to be underestimated by 29% when using typical‐duration ground motions from the PEER NGA‐West2 database. The effect of duration is even more important at sites like Eugene (λ_collapse underestimated by 59%), where the seismic hazard is dominated by large magnitude interface earthquakes, and less important at sites like San Francisco (λ_collapse underestimated by 7%), where the seismic hazard is dominated by crustal earthquakes. Ground motion selection procedures that employ causal parameters like magnitude, distance, and Vs30 as surrogates for ground motion duration are also evaluated. These procedures are found to produce poor fits to the duration and response spectrum targets because of the limited number of records that satisfy typical constraints imposed on the ranges of the causal parameters. As a consequence, ground motions selected based on causal parameters are found to overestimate λ_collapse by 53%. Copyright © 2016 John Wiley & Sons, Ltd.
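A minimal sketch of fitting a lognormal collapse fragility to multiple stripe analysis results by maximum likelihood and integrating it with a hazard curve to obtain λ_collapse; the collapse counts and hazard values are invented, not the Seattle case-study data:

```python
import numpy as np
from scipy.stats import norm, binom
from scipy.optimize import minimize

# Hedged sketch: fit a lognormal collapse fragility to multiple-stripe counts by
# maximum likelihood, then combine it with a hazard curve to estimate the mean
# annual frequency of collapse. All numbers are invented for illustration.

im = np.array([0.2, 0.4, 0.6, 0.9, 1.3])   # Sa levels (g)
n_gm = np.array([40, 40, 40, 40, 40])      # ground motions analyzed per stripe
n_col = np.array([0, 2, 9, 21, 34])        # collapses observed per stripe (invented)

def neg_log_like(params):
    theta, beta = np.exp(params)           # work in log space to keep both positive
    p = norm.cdf(np.log(im / theta) / beta)  # lognormal fragility
    return -np.sum(binom.logpmf(n_col, n_gm, p))

res = minimize(neg_log_like, x0=np.log([0.8, 0.4]), method="Nelder-Mead")
theta, beta = np.exp(res.x)
print(f"fitted fragility: median = {theta:.2f} g, dispersion = {beta:.2f}")

# Combine with an illustrative hazard curve to estimate lambda_collapse
im_h = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
lam_h = np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5])
d_lam = np.abs(np.diff(np.append(lam_h, 0.0)))
p_col = norm.cdf(np.log(im_h / theta) / beta)
print(f"lambda_collapse ~ {np.sum(p_col * d_lam):.2e} per year")
```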

12.
In this short communication, we respond to the comments made by Dr Brendon A. Bradley and provide additional context to our paper under discussion. Copyright © 2015 John Wiley & Sons, Ltd.

13.
In this paper, a distributed object‐based software environment (DOSE) is developed to facilitate the integrated simulation of an urban system under the risk of urban‐scale hazards such as earthquakes. Individual simulation participants perform their simulation services in separate environments, bartering service‐exchange relationships to obtain what they need to resolve their part of the problem. This creates a communication gap between the scientists on one side and, on the other, the end users who need to understand and employ that knowledge. The authors envision a distributed simulation service software environment running in parallel with the activities of the simulation participants. DOSE integrates interdisciplinary participants through an infrastructure that has three basic building blocks, namely modularity, scalability, and interoperability. The modular, object‐based design of the DOSE architecture is described in terms of the key functionalities of four distinct layers, namely the resource, core, domain, and interface layers. DOSE scalability, in terms of urban system size and the complexity of participants' third‐party applications, is enabled through the interface layer. A message passing model is developed using the Message Passing Interface standard, and a control room is provided to schedule the interaction/communication among model processes. DOSE interoperability with vulnerability analysis third‐party applications is enabled through the Industry Foundation Classes (IFC) standard. An analogy between DOSE and the construction industry is employed to provide an interpretation and implementation for DOSE interoperability. While interfacing the IFC object model to address DOSE interoperability questions, an extension model for the structural view of IFC is proposed and has been accepted by the International Alliance for Interoperability. The application of DOSE to real‐world urban systems is beyond the scope of this paper and is presented in an accompanying paper. Copyright © 2007 John Wiley & Sons, Ltd.

14.
The establishment of a database is the foundation on which an information system is built, while in‐depth analysis and research on hazards depend on the support of such an information system. This paper briefly describes the general methods of volcanic hazard research, puts forward a basic plan for building a volcanic hazard database for the Changbaishan volcano, and discusses its future directions of application.

15.
The paper under discussion presents a series of quasi‐static tests used to examine the behavior of steel reinforced concrete (SRC) walls subjected to high axial force and lateral cyclic loading. A total of six wall specimens were designed, including five SRC walls and one reinforced concrete (RC) wall. In the ‘Summary’ section of the discussed paper, the authors state that ‘The use of SRC walls has gained popularity in the construction of high‐rise buildings because of their superior performance over conventional RC walls’. The authors also proposed that the SRC wall specimens showed increased flexural strength and deformation capacity relative to their RC wall counterpart. This discussion is intended to rectify some statements and conclusions of the paper under discussion. Copyright © 2015 John Wiley & Sons, Ltd.

16.
Construction of an earthquake emergency information system for China's earthquake disaster prevention and mitigation work   (cited by: 2; self‐citations: 0; citations by others: 2)
This paper describes the construction of China's earthquake emergency rapid‐response information system, including the system's objectives, overall framework, content, and workflow. The system uses the national earthquake computer network as its supporting environment and GIS software as its application development platform. It can track and process short‐term and imminent prediction information for destructive earthquakes, respond rapidly to quick reports of major earthquakes, and also perform rapid assessment of disaster losses, judgment of post‐earthquake seismicity trends, information services for earthquake disaster reduction emergency countermeasures, and comprehensive information display for emergency command. The system is under construction as a key project of the China Earthquake Administration during the Ninth Five‐Year Plan period.

17.
Rupture directivity effects in ground motion have been known for many years to both seismologists and earthquake engineers: at sites in a particular geometrical configuration with respect to the rupture, the fault‐normal velocity signals may show a large pulse which occurs at the beginning of the record and contains most of the energy. The result is waveforms different from ordinary ground motions recorded in the far field or in geometrical conditions not favorable with respect to directivity. Current attenuation laws are not able to capture such effects well, if at all, and current probabilistic seismic hazard analysis is not able to predict the resulting peculiar spectral shape. Moreover, it is believed that structures whose dynamic behavior falls in a range of periods related to the pulse period may be subjected to underestimated seismic demand. In the paper this is investigated, and increments in both elastic and inelastic seismic actions are quantified using a large dataset of records from the Next Generation Attenuation (NGA) project, a fraction of which comprises velocity pulses identified in other studies. These analyses employ recently developed tools and procedures to assess directivity effects and to quantify the associated threat in terms of seismic action on structures. Subsequently, the same tools are used in one of the first attempts to identify near‐source effects in the data recorded during a normal faulting earthquake, the mainshock of the recent Abruzzo (central Italy) sequence, leading to the conclusion that pulse‐like effects are likely to have occurred in the event; that is, (1) the observation of pulse‐like records at some near‐source stations is in fair agreement with existing predictive models, (2) the increment in seismic demand shown by pulse‐like ground motion components complies with the results of the analysis of the NGA data, and (3) the seismic demand in non‐impulsive recordings is generally similar to that expected for ordinary records. The results may be useful as a benchmark for the inclusion of near‐source effects in design values of seismic action and in structural risk analysis. Copyright © 2010 John Wiley & Sons, Ltd.
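As a loose illustration of pulse screening, a deliberately crude heuristic that flags a velocity record whose peak region carries most of the squared-velocity energy; this is only a stand-in for the wavelet-based pulse identification procedures used in the directivity literature, not the method applied in the paper:

```python
import numpy as np

# Hedged, deliberately crude heuristic: flag a record as potentially pulse-like
# if a short window centred on the peak ground velocity carries a large share of
# the cumulative squared velocity. Window length and threshold are assumptions.

def crude_pulse_flag(vel, dt, window_s=4.0, energy_frac=0.5):
    vel = np.asarray(vel, dtype=float)
    i_pgv = int(np.argmax(np.abs(vel)))
    half = int(0.5 * window_s / dt)
    lo, hi = max(0, i_pgv - half), min(len(vel), i_pgv + half)
    return np.sum(vel[lo:hi] ** 2) / np.sum(vel ** 2) >= energy_frac

# Example with a synthetic pulse riding on low-amplitude noise
dt = 0.01
t = np.arange(0, 40, dt)
vel = 0.05 * np.random.default_rng(1).standard_normal(t.size)
vel += 0.8 * np.exp(-((t - 8.0) / 1.0) ** 2) * np.sin(2 * np.pi * (t - 8.0) / 2.0)
print("pulse-like?", crude_pulse_flag(vel, dt))
```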

18.
This paper outlines the application of a new data‐based mechanistic (DBM) modelling methodology to the characterization of the sediment transmission dynamics in a small upland reservoir, Wyresdale Park, Lancashire. The DBM modelling strategy exploits advanced statistical procedures to infer the dynamic model structure and its associated parameters directly from the instrumented data, producing a parametrically efficient, continuous‐time transfer function model which relates suspended sediment load at the reservoir inflow to the outflow at the event scale. The associated differential equation model parameters have physical attributes which can be interpreted in terms of sediment transmission processes and associated reservoir trap efficiency. Sedigraph analysis suggests that wind‐induced resuspension episodically supplies an additional load to the reservoir outlet. The stochastic nature of the DBM model makes it ideal for evaluating the effects of uncertainty through Monte Carlo simulations (MCS) for discharge and sediment transmission. Copyright © 2000 John Wiley & Sons, Ltd.
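A minimal sketch of the kind of low-order continuous-time transfer function a DBM analysis might identify between inflowing and outflowing sediment load, with a Monte Carlo spread on its gain and residence time; the parameter values are assumptions, not the Wyresdale Park estimates:

```python
import numpy as np
from scipy import signal

# Hedged sketch: a first-order lag G/(T s + 1) relating inflowing to outflowing
# suspended-sediment load, with steady-state gain G (trap efficiency ~ 1 - G) and
# residence time T. Parameter values and their spread are illustrative only.

dt = 0.25                                               # hours
t = np.arange(0, 72, dt)
inflow = 50.0 * np.exp(-0.5 * ((t - 12) / 3.0) ** 2)    # synthetic event sedigraph (kg/h)

rng = np.random.default_rng(2)
outflows = []
for _ in range(200):                        # Monte Carlo over uncertain (G, T)
    G = rng.normal(0.35, 0.05)              # steady-state gain (assumed)
    T = rng.lognormal(np.log(8.0), 0.2)     # residence time in hours (assumed)
    sys = signal.lti([G], [T, 1.0])         # transfer function G / (T s + 1)
    _, y, _ = signal.lsim(sys, U=inflow, T=t)
    outflows.append(y)

peak = np.array(outflows).max(axis=1)
print(f"simulated outflow peak: {peak.mean():.1f} +/- {peak.std():.1f} kg/h")
print(f"implied trap efficiency at the mean gain: {1 - 0.35:.2f}")
```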

19.
Existing design procedures for determining the separation distance between adjacent buildings subjected to seismic pounding risk are based on approximations of the buildings' peak relative displacement. These procedures are characterized by unknown safety levels and thus are not suitable for use within a performance‐based earthquake engineering framework. This paper introduces an innovative reliability‐based methodology for the design of the separation distance between adjacent buildings. The proposed methodology, which is naturally integrated into modern performance‐based design procedures, provides the value of the separation distance corresponding to a target probability of pounding during the design life of the buildings. It recasts the inverse reliability problem of determining the design separation distance as a zero‐finding problem and involves the use of analytical techniques to evaluate the statistics of the dynamic response of the buildings. Both uncertainty in the seismic intensity and record‐to‐record variability are taken into account. The proposed methodology is applied to several different buildings modeled as linear elastic single‐degree‐of‐freedom (SDOF) and multi‐degree‐of‐freedom (MDOF) systems, as well as SDOF nonlinear hysteretic systems. The design separation distances obtained are compared with the corresponding estimates based on several response combination rules suggested in the seismic design codes and in the literature. In contrast to current seismic code design procedures, the newly proposed methodology provides consistent safety levels for different building properties and different seismic hazard conditions. Copyright © 2012 John Wiley & Sons, Ltd.
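A minimal sketch of the zero-finding formulation: choose the separation distance so that the lifetime pounding probability equals a target; the relative-displacement model and occurrence rate are invented placeholders, not the paper's analytical response statistics:

```python
import numpy as np
from scipy.stats import lognorm
from scipy.optimize import brentq

# Hedged sketch: pose the design separation distance d as the root of
# P_pounding(d) - p_target = 0 over the design life. All numbers are assumptions.

life_years = 50
lam_eq = 0.02                          # mean annual rate of design-level events (assumed)
median_rel_disp, beta = 0.08, 0.5      # lognormal peak relative displacement, m (assumed)

def p_pound_life(d):
    """Probability of at least one pounding event over the design life."""
    p_event = 1.0 - lognorm.cdf(d, s=beta, scale=median_rel_disp)  # P(rel. disp. > d | event)
    return 1.0 - np.exp(-lam_eq * life_years * p_event)            # Poisson occurrences

p_target = 0.10
d_design = brentq(lambda d: p_pound_life(d) - p_target, 1e-4, 1.0)
print(f"design separation distance: {d_design * 100:.1f} cm "
      f"(P_pounding over {life_years} yr = {p_pound_life(d_design):.2f})")
```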

20.
This paper provides a generic equation for the evaluation of the maximum earthquake magnitude m_max for a given seismogenic zone or entire region. The equation is capable of generating solutions in different forms, depending on the assumptions of the statistical distribution model and/or the available information regarding past seismicity. It includes the cases (i) when earthquake magnitudes are distributed according to the doubly-truncated Gutenberg-Richter relation, (ii) when the empirical magnitude distribution deviates moderately from the Gutenberg-Richter relation, and (iii) when no specific type of magnitude distribution is assumed. Both synthetic, Monte Carlo-simulated seismic event catalogues and actual data from Southern California are used to demonstrate the procedures given for the evaluation of m_max. The three estimates of m_max for Southern California, obtained by the three procedures mentioned above, are 8.32 ± 0.43, 8.31 ± 0.42 and 8.34 ± 0.45, respectively. All three estimates are nearly identical, although higher than the value 7.99 obtained by Field et al. (1999). In general, since the third procedure is non-parametric and does not require specification of the functional form of the magnitude distribution, its estimate of the maximum earthquake magnitude m_max is considered more reliable than the other two, which are based on the Gutenberg-Richter relation.
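For orientation, a hedged sketch of the order-statistics identity from which estimators of this type are commonly derived; the notation and the specific forms used in the paper may differ:

```latex
% Order-statistics identity for the largest of n magnitudes on [m_min, m_max]:
\mathrm{E}\!\left[m_{\max}^{\mathrm{obs}}\right]
  = m_{\max} - \int_{m_{\min}}^{m_{\max}} \bigl[F_M(m \mid m_{\max})\bigr]^{n}\, dm ,
% so equating the observed maximum with its expectation suggests an estimator of the form
\hat{m}_{\max}
  = m_{\max}^{\mathrm{obs}} + \int_{m_{\min}}^{m_{\max}^{\mathrm{obs}}} \bigl[F_M(m \mid \hat{m}_{\max})\bigr]^{n}\, dm ,
% solved iteratively because F_M itself depends on the unknown m_max
% (e.g., via the doubly truncated Gutenberg-Richter distribution in case (i)).
```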
