Similar Articles
1.
A Bayesian procedure for Probabilistic Tsunami Hazard Assessment   (Total citations: 1; self-citations: 1; by others: 0)
In this paper, a Bayesian procedure is implemented for Probabilistic Tsunami Hazard Assessment (PTHA). The approach is general and modular, incorporating all significant information relevant to the hazard assessment, such as theoretical and empirical background, analytical or numerical models, and instrumental and historical data. The procedure provides the posterior probability distribution, which integrates the prior probability distribution, based on physical knowledge of the process, with the likelihood, based on the historical data. The method also deals with aleatory and epistemic uncertainties, incorporating in a formal way all relevant sources of uncertainty, from the tsunami generation process to wave propagation and impact on the coasts. The modular structure of the procedure is flexible and easy to modify and/or update as new models and/or information become available. Finally, the procedure is applied to a hypothetical region, Neverland, to illustrate the PTHA evaluation in a realistic case.
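A minimal sketch of the kind of prior-times-likelihood update the procedure formalizes, assuming a lognormal prior on the tsunami occurrence rate and a Poisson likelihood for a hypothetical historical record; the specific distributions and numbers are illustrative, not the paper's models:

```python
import numpy as np
from scipy import stats

rates = np.linspace(0.001, 0.5, 2000)                # candidate rates (events/yr)
prior = stats.lognorm.pdf(rates, s=0.8, scale=0.05)  # assumed "physical" prior
prior /= np.trapz(prior, rates)

n_events, years = 3, 400                             # hypothetical historical record
likelihood = stats.poisson.pmf(n_events, rates * years)

posterior = prior * likelihood
posterior /= np.trapz(posterior, rates)

# Probability of at least one tsunami in the next 50 yr, integrating over
# rate uncertainty (epistemic) and Poisson occurrence chance (aleatory).
p_50yr = np.trapz((1 - np.exp(-rates * 50)) * posterior, rates)
print(f"P(at least one event in 50 yr) = {p_50yr:.3f}")
```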

2.
Typically, when uncertainty in subsurface parameters is addressed, it is done using probability theory. Probability theory can handle only one of the two types of uncertainty (aleatory), so epistemic uncertainty is neglected. Dempster–Shafer evidence theory (DST) is an approach that allows analysis of both epistemic and aleatory uncertainty. In this paper, DST combination rules are used to combine measured field data on permeability with the expert opinions of hydrogeologists (subjective information) to examine uncertainty. Dempster's rule of combination is chosen primarily because of its well-established theoretical development and the simplicity of the data. Since Dempster's rule has drawn some criticism, two other combination rules (Yager's rule and the Hau–Kashyap method), which attempt to correct the problems that can be encountered with Dempster's rule, were also examined. With the particular data sets used here, no combination rule was clearly superior; Dempster's rule appears to suffice when the conflict among the evidence is low.
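A compact illustration of Dempster's rule of combination; the permeability classes and mass assignments below are invented for illustration, not the paper's field data or expert elicitations:

```python
from itertools import product

def dempster(m1, m2):
    """Combine two basic probability assignments (dicts: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    norm = 1.0 - conflict                # normalization (fails under total conflict)
    return {s: w / norm for s, w in combined.items()}, conflict

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
EITHER = LOW | HIGH                      # ignorance: mass on the whole frame
field  = {LOW: 0.6, HIGH: 0.1, EITHER: 0.3}   # evidence from measured data
expert = {LOW: 0.4, HIGH: 0.2, EITHER: 0.4}   # evidence from expert opinion

fused, k = dempster(field, expert)
for s, w in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(set(s), round(w, 3))
print("conflict K =", round(k, 3))       # low K: Dempster's rule is adequate
```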

3.
Bay of Bengal cyclone extreme water level estimate uncertainty   (Total citations: 4; self-citations: 3; by others: 1)

4.
The character and importance of uncertainty in dam safety risk analysis drive how risk assessments are used in practice. The current interpretation is that uncertainty comprises, in addition to the aleatory risk arising from presumed randomness in the world, the epistemic aspects of irresolution in a model or forecast, specifically model and parameter uncertainty. This is true in part, but it is not all there is to uncertainty in risk analysis. The physics of hazards and of failure may be poorly understood, which goes beyond uncertainty in its conventional sense. There may be alternative scenarios of future conditions, for example non-stationarity in the environment, which cannot easily be forecast. There may also be deep uncertainties of the type associated with climate change: situations in which analysts do not know, or do not agree on, the system characterisation relating actions to consequences or the probability distributions for key parameters. All of these facets are part of the uncertainty in risk analysis with which we must deal.

5.
As the theoretical basis of seismic hazard assessment, earthquake dynamics mainly studies the faulting mechanisms, rupture processes, and source radiation associated with seismic activity, together with the resulting seismic wave propagation and ground motion. Classical theoretical problems such as earthquake mechanics, source radiation, and energy release are systematically examined. On this basis, the latest quantitative seismological methods are applied: seismic, geological, and geodetic data are combined in the form of logic trees to provide worked examples of probabilistic and deterministic seismic hazard assessment under different tectonic settings and faulting mechanisms. The typical tectonic settings considered in the source analysis include intraplate crustal seismogenic layers, active crustal faults and their slip rates, plate subduction interfaces, and subducting slabs. Because of uncertainties in the input model, such as the randomness of input parameters and the uncertainty inherent in the scientific analysis methods themselves, the uncertainty of the results must be treated with caution. Weighted averaging over different models or parameters, including ground-motion attenuation models, can usually reduce the bias of the results in a reasonable way; combining probabilistic and deterministic analysis methods is another effective approach.

6.
Hurricane surge events have caused devastating damage in active-hurricane areas all over the world. The ability to predict surge elevations and to use this information for damage estimation is fundamental for saving lives and protecting property. In this study, we developed a framework for evaluating hurricane flood risk and identifying areas that are more prone to flooding. The approach is based on the joint probability method with optimal sampling (JPM-OS) using surge response functions (SRFs), i.e., JPM-OS-SRF. Derived from a discrete set of high-fidelity storm surge simulations, SRFs are non-dimensional, physics-based empirical equations with an algebraic form, used to rapidly estimate surge as a function of hurricane parameters (i.e., central pressure, radius, forward speed, approach angle, and landfall location). The advantage of an SRF-based approach is that a continuum of storm scenarios can be efficiently evaluated and used to estimate continuous probability density functions for surge extremes, producing more statistically stable surge hazard assessments without adding measurably to epistemic uncertainty. SRFs were developed along the coastline and then used to estimate maximum surge elevations with respect to a set of hurricane parameters. Integrating information such as ground elevation, property value, and population with the JPM-OS-SRF allows quantification of storm surge-induced hazard impacts over the continuum of storm possibilities, yielding a framework for the following risk-based products, which can assist in hurricane hazard management and decision making: (1) expected annual loss maps; (2) flood damage versus return period relationships; and (3) affected business (e.g., number of businesses, number of employees) versus return period relationships. By employing several simplifying assumptions, the framework is demonstrated at three northern Gulf of Mexico study sites exhibiting similar surge hazard exposure. The results reveal that Gulfport, MS, USA faces relatively greater risk of economic loss than Corpus Christi, TX, USA, and Panama City, FL, USA. Note that economic processes are complex and highly interrelated with most other human activities. Our intention here is to present a methodology to quantify flood damage (i.e., infrastructure economic loss, number of businesses affected, number of employees in those businesses, and their sales volume), not to discuss the complex interactions of these damages with other economic activities and recovery plans.
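To make the JPM-with-SRF idea concrete, here is a hedged sketch in which a toy algebraic surrogate stands in for a fitted SRF and the hazard integral is evaluated by sampling the storm-parameter continuum; the surrogate coefficients, parameter ranges, and annual landfall rate are assumptions, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
cp  = rng.uniform(900, 980, n)      # central pressure (mb)
rad = rng.uniform(20, 60, n)        # storm radius (km)
vf  = rng.uniform(3, 10, n)         # forward speed (m/s)

def srf(cp, rad, vf):
    """Toy SRF: peak surge (m) grows with pressure deficit, size, and speed."""
    return 0.04 * (1013 - cp) + 0.015 * rad + 0.05 * vf

surge = srf(cp, rad, vf)            # one cheap evaluation per sampled storm
storms_per_year = 0.2               # assumed annual landfall rate near the site

for eta in (2.0, 3.0, 4.0):
    # JPM integral: annual rate of storms whose surge exceeds the threshold.
    aep = storms_per_year * np.mean(surge > eta)
    print(f"eta = {eta} m: AEP = {aep:.4f}, return period = {1/aep:.0f} yr")
```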

7.
A probabilistic discharge forecast model for the Xixian station on the Huaihe River   (Total citations: 11; self-citations: 0; by others: 11)
Using the Bayesian statistical theory developed by Roman Krzysztofowicz and adopted by the U.S. National Weather Service, a theoretical framework for probabilistic hydrological forecasting is established, in which forecast uncertainty is described quantitatively in the form of a distribution function, and a probabilistic discharge forecast model for the Xixian station on the Huaihe River is developed. Theory and experience show that a probabilistic forecast is at least as valuable as a deterministic one; in particular, when forecast uncertainty is large, probabilistic forecasts have higher economic value than current deterministic forecasts.
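As a hedged illustration of a Krzysztofowicz-style Bayesian processor, the sketch below uses the standard normal-linear case: a climatological prior on discharge is revised by a deterministic model forecast with known error statistics. All numbers, and the normal-linear simplification itself, are assumptions for illustration:

```python
from scipy import stats

mu0, s0 = 500.0, 200.0          # prior: climatological discharge (m^3/s)
a, b, se = 0.9, 40.0, 80.0      # forecast model: s = a*h + b + N(0, se^2)

s_obs = 620.0                   # today's deterministic forecast

# Conjugate normal update: posterior of the actual flow h given forecast s.
prec = 1 / s0**2 + a**2 / se**2
mu_post = (mu0 / s0**2 + a * (s_obs - b) / se**2) / prec
sd_post = prec ** -0.5

post = stats.norm(mu_post, sd_post)   # the probabilistic forecast itself
print(f"posterior mean {mu_post:.0f} m^3/s, sd {sd_post:.0f} m^3/s")
print("P(flow > 800 m^3/s) =", round(1 - post.cdf(800.0), 3))
```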

8.
China is prone to frequent earthquakes because of its geographical location, and these can cause significant losses to society and the economy. The task of seismic hazard analysis is to estimate the potential level of ground motion parameters that would be produced by future earthquakes. In this paper, a novel method based on fuzzy logic techniques and a probabilistic approach is proposed for seismic hazard analysis (FPSHA). In FPSHA, fuzzy sets are employed to quantify earthquake magnitude and source-to-site distance, and fuzzy inference rules represent the ground motion attenuation relationships. The membership functions for earthquake magnitude and source-to-site distance, as well as the fuzzy rules for the peak ground acceleration relationships, are constructed from expert judgment. This methodology makes it possible to include both aleatory and epistemic uncertainty in the seismic hazard analysis. The advantages of the proposed method are its efficiency, reliability, practicability, and precision. A case study of seismic hazard analysis for Kunming city in Yunnan Province, People's Republic of China, is investigated. The results of the proposed fuzzy logic-based model are compared with those of other models, confirming its accuracy in predicting the probability of exceeding a given level of peak ground acceleration. Furthermore, the results can provide a sound basis for disaster reduction and prevention decision making in Yunnan Province.
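A minimal sketch of the fuzzy machinery described: triangular membership functions for magnitude and distance combined by a min-type inference rule. The breakpoints and the single rule are invented stand-ins for the paper's expert judgments:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mu_large_magnitude(m):
    return tri(m, 5.5, 7.0, 8.5)       # assumed breakpoints

def mu_near_distance(r_km):
    return tri(r_km, 0.0, 10.0, 60.0)  # assumed breakpoints

# Rule: IF magnitude is large AND distance is near THEN PGA is high.
def mu_high_pga(m, r_km):
    return min(mu_large_magnitude(m), mu_near_distance(r_km))

for m, r in [(6.8, 15.0), (7.5, 40.0), (5.8, 80.0)]:
    print(f"M={m}, R={r} km -> degree of 'high PGA' = {mu_high_pga(m, r):.2f}")
```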

9.
Uncertainty in surfactant–polymer flooding is an important challenge to the wide-scale implementation of this process. Any successful design of this enhanced oil recovery process requires a good understanding of that uncertainty, so the ability to quantify it efficiently is essential. Monte Carlo simulation is the traditional approach for quantifying parametric uncertainty. However, Monte Carlo simulation converges relatively slowly, requiring a large number of realizations. This study proposes the use of the probabilistic collocation method for parametric uncertainty quantification in surfactant–polymer flooding using four synthetic reservoir models. Four sources of uncertainty were considered: the chemical-flood residual oil saturation, surfactant adsorption, polymer adsorption, and the polymer viscosity multiplier. The approximated output parameter is the recovery factor. The output metrics were the input–output model response relationship, the probability density function, and the first two moments; these were compared with the results obtained from Monte Carlo simulation over a large number of realizations. Two methods for solving for the coefficients of the output parameter's polynomial chaos expansion are compared: Gaussian quadrature and linear regression. The linear regression approach used two types of sampling: full-tensor-product nodes and Chebyshev-derived nodes. In general, the probabilistic collocation method was applied successfully to quantify the uncertainty in the recovery factor. Applying the method with Gaussian quadrature produced more accurate results than linear regression with full-tensor-product nodes, while linear regression with Chebyshev-derived sampling also performed relatively well. Possible enhancements to improve the performance of the probabilistic collocation method are discussed, including improved sparse sampling, approximation-order-independent sampling, and the use of arbitrary random input distributions that could be more representative of reality.
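A small self-contained sketch of the probabilistic collocation idea with Gaussian quadrature: a toy one-input "simulator" stands in for the reservoir model, its polynomial chaos coefficients are computed from a handful of Gauss–Hermite nodes, and the resulting moments are checked against brute-force Monte Carlo. The toy model and its coefficients are assumptions:

```python
import numpy as np
from math import factorial

def simulator(xi):
    """Toy recovery factor as a function of one standardized uncertain input."""
    return 0.45 + 0.06 * xi - 0.015 * xi**2

# Probabilists' Gauss-Hermite nodes/weights for E[f(Z)], Z ~ N(0, 1).
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / np.sqrt(2 * np.pi)

# Hermite polynomials He_n are orthogonal under N(0,1): E[He_m He_n] = n! delta_mn.
def He(n, x):
    return np.polynomial.hermite_e.hermeval(x, [0] * n + [1])

# PCE coefficients c_n = E[f(Z) He_n(Z)] / n!, from only 5 model evaluations.
coeffs = [sum(w * simulator(x) * He(n, x) for x, w in zip(nodes, weights))
          / factorial(n) for n in range(3)]

mean = coeffs[0]
var = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, 3))
print(f"PCE: mean={mean:.4f}, var={var:.6f}")

# Monte Carlo reference: needs far more model runs for comparable accuracy.
z = np.random.default_rng(1).standard_normal(200_000)
y = simulator(z)
print(f"MC : mean={y.mean():.4f}, var={y.var():.6f}")
```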

10.
Although risk analysis today is considered to include three separate aspects, namely (1) identifying sources of risk, (2) estimating probabilities quantitatively, and (3) evaluating consequences of risk, here only the estimation of probabilities for natural geologic events, processes, and phenomena is addressed. Ideally, evaluation of potential future hazards includes an objective determination of probabilities derived from past occurrences of identical events, or of components contributing to complex processes or phenomena. In practice, however, data that would permit objective estimation of the probabilities of interest may not be adequate, or may not even exist. Another problem that normally arises, regardless of the extent of the data, is that risk assessments involve estimating extreme values. Probabilities are required for events that are the greatest or rarest because they commonly have the greatest consequences; the largest, or rarest, events always fall in the tails of frequency distributions. Extreme values are rarely predictable with accuracy even when an empirical frequency distribution is well established by data. In the absence of objective methods for estimating probabilities of natural events or processes, subjective probabilities for the hazard must be established through Bayesian methods, expert opinion, or Delphi methods. Alternative solutions may involve consequence analysis, which may demonstrate that, although an event may occur, its consequences are small enough that it may safely be ignored, or the establishment of bounds, which may demonstrate that, although the probabilities are not known, they cannot exceed a maximum value small enough that the associated risk may be considered negligible. The uncertainty of every probability determination must be stated for each component of an event, process, or phenomenon. These uncertainties must also be propagated through the quantitative analysis so that a realistic estimate of total uncertainty can be associated with each final probability estimate for a geologic hazard. This paper was presented (by title) at Emerging Concepts, MGUS-87 Conference, Redwood City, California, 13–15 April 1987.

12.
Some Bayesian methods for dealing with inaccurate or vague data are introduced in the framework of seismic hazard assessment. Inaccurate data affected by heterogeneous errors are modeled by a probability distribution instead of the usual value-plus-random-error representation; such data are generically called imprecise. The earthquake size and the number of events in a given time are modeled as imprecise data. Imprecise data allow the estimation procedures to incorporate the uncertainty inherent in the inaccuracy and heterogeneity of the measuring systems from which the data were obtained. The problem of estimating the parameter of a Poisson process is shown to be feasible through the use of Bayesian techniques and imprecise data. This background technique can be applied to the general problem of seismic hazard estimation. Initially, the data in a regional earthquake catalog are assumed imprecise in both size and location (i.e., errors in the epicenter or spreading over a given source). By means of attenuation laws with scatter, the regional catalog can be translated into a so-called site catalog of imprecise events. The site catalog is then used to estimate return periods or occurrence probabilities, taking into account all sources of uncertainty. Special attention is paid to the priors in the Bayesian estimation; they can be used to introduce additional information as well as frequency–size laws with scatter for local events. A simple example is presented to illustrate the capabilities of this methodology.
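One ingredient of the approach can be sketched as follows: a Gamma prior on the Poisson rate is updated with an event count that is itself imprecise, represented as a small probability distribution over possible counts, giving a simple mixture of conjugate posteriors. The prior, the count distribution, and the catalog length are illustrative assumptions:

```python
import numpy as np
from scipy import stats

alpha0, beta0 = 2.0, 100.0                 # Gamma prior on the rate (events/yr)
obs_years = 300.0

# Imprecise catalog: the number of qualifying events is 4, 5, or 6 with
# these probabilities (e.g., uncertain sizes near the magnitude cut-off).
count_pmf = {4: 0.3, 5: 0.5, 6: 0.2}

lam = np.linspace(1e-4, 0.1, 4000)
posterior = np.zeros_like(lam)
for n, p in count_pmf.items():
    # Conjugate Gamma-Poisson update for each candidate count, mixed by p.
    posterior += p * stats.gamma.pdf(lam, alpha0 + n, scale=1 / (beta0 + obs_years))
posterior /= np.trapz(posterior, lam)

mean_rate = np.trapz(lam * posterior, lam)
print(f"posterior mean rate = {mean_rate:.4f} events/yr "
      f"(return period ~ {1/mean_rate:.0f} yr)")
```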

13.
A method based on Bayesian techniques has been applied to evaluate the seismic hazard in the two test areas selected by the participants in the ESC/SC8-TERESA project: Sannio-Matese in Italy and the northern Rhine region (BGN). A prior site occurrence model (prior SOM) is obtained from a seismicity distribution modeled over wide seismic sources. The posterior occurrence model (posterior SOM) is calculated after a Bayesian correction which, essentially, recovers the spatial information of the epicenter distribution and accounts for attenuation and location errors, without using source zones. The uncertainties of the occurrence probabilities are evaluated in both models. The results are displayed as probability and coefficient-of-variation contour maps for a chosen intensity level, and as plots of mean return period versus intensity at selected test sites, including the 90% probability intervals. It turns out that the posterior SOM gives better resolution in the probability estimate, decreasing its uncertainty, especially in regions of low seismic activity.

14.
Sea Level Rise and Its Risk Management   (Total citations: 2; self-citations: 0; by others: 2)
Sea level rise is among the most severe societal consequences of anthropogenic climate change. Significant advances have been achieved in recent years in the study of future sea level rise and in its risk management practice. (1) Sea level rise is treated as a hazard: its plausible future scenarios and their probabilities need to be predicted and estimated, with emphasis on the low-probability, high-consequence upper limit. For this purpose, a complete probability-distribution framework has been developed in recent years to predict the scenarios and probabilities of future sea level rise under the Representative Concentration Pathways (RCPs) and the Shared Socioeconomic Pathways (SSPs). (2) For a high-emissions scenario, it was found that the Antarctic Ice Sheet might contribute as much as 78–150 cm (mean value 114 cm) to Global Mean Sea Level (GMSL) rise by 2100; for the same scenario, the IPCC Fifth Assessment Report gave an Antarctic contribution of only −8 to +14 cm (mean value 4 cm). (3) Recent studies recommend revising the worst-case (extreme) GMSL rise scenario by 2100 from the previous 2.0 m to 2.5 m. It is recognized that GMSL rise will not stop at 2100; rather, it will continue for centuries afterwards, although the degree of uncertainty related to sea level rise will increase. (4) Approaches combining an upper-bound scenario with a central estimate or mid-range scenario, adaptation pathways, and robust decision making have been developed to provide a long-term planning envelope; these decision-making methods are widely used in coastal risk management related to future sea level rise. Research on sea level rise and its risk management needs to enhance monitoring, analysis, and simulation in order to predict global, regional, and local sea level rise scenarios and their probabilities on different time scales, reduce estimation uncertainty, assess the upper limits, and improve decision methods and their application under deep uncertainty, so as to meet the needs of climate change adaptation planning, decision making, and long-term risk management in coastal regions.

15.
Stability analysis generally relies on the estimate of the failure probability P. When information is scarce, incomplete, imprecise, or vague, this estimate is imprecise. To represent epistemic uncertainty, possibility distributions have been shown to be a more flexible tool than probability distributions. The joint propagation of possibilistic and probabilistic information can rely on techniques such as classical random sampling of the cumulative probability distribution F and of the intervals derived from the possibility distributions π. The imprecise probability P is then associated with a random interval, which can be summarized by a pair of indicators bounding it. In the present paper, we propose a graphical tool to explore the sensitivity of these indicators. This is conducted by means of the contribution-to-sample-probability-of-failure plot, based on the ordering of the randomly generated levels of confidence associated with the quantiles of F and with the α-cuts of π. This presents several advantages: (1) the contributions of both types of uncertainty, aleatory and epistemic, can be compared in a single setting; (2) the analysis is conducted in a post-processing step, i.e., at no extra computational cost; and (3) it highlights the regions of the quantiles and of the nested intervals that contribute most to the bounds on P. The method is applied to two case studies (a mine pillar and a steep-slope stability analysis) to investigate the need for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities because of the scarcity of the available information (the extraction ratio and the cliff geometry, respectively).
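A minimal sketch of the joint propagation this work builds on: the probabilistic input is sampled through its quantile function, the imprecise input through random α-cuts of a triangular possibility distribution, and each draw yields an interval for the safety margin and hence lower/upper bounds on P. The margin function and all numbers are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 50_000

# Aleatory input: load L, lognormal, sampled via its quantile function.
load = stats.lognorm.ppf(rng.uniform(size=n), s=0.25, scale=10.0)

# Epistemic input: resistance R with a triangular possibility (a, m, b).
a, m, b = 12.0, 15.0, 19.0
alpha = rng.uniform(size=n)
r_lo = a + alpha * (m - a)           # alpha-cut lower bound
r_hi = b - alpha * (b - m)           # alpha-cut upper bound

# Failure when the margin R - L < 0; intervals give bounding indicators.
fail_certain = r_hi < load           # fails even with the best resistance
fail_possible = r_lo < load          # may fail with the worst resistance

print(f"P(failure) lies in [{fail_certain.mean():.4f}, {fail_possible.mean():.4f}]")
```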

16.
This study investigated contributory factors to flood hazard around Scotland. Preliminary assessments of areas potentially vulnerable to flooding are needed for compliance with the European Union Directive on the Assessment and Management of Flood Risks (2007/60/EC). Historical accounts of coastal flood events in Scotland, notably a storm in January 2005, had shown that estimates of risk based on still water levels require further information to identify sites at which waves and surges can combine. Additionally, it was important to add the effect of future sea-level rise and other drivers from published sources. Analysis of multiple years of tidal data at seven sites, including estuaries, compared recorded water levels at high return periods with those derived from a spatially interpolated numerical model contained within a publicly available flood risk map. For the gauges with the longest records, increases were seen over time that reflected rises in mean sea level. Exposure to wave energy was computed from prevailing wind strength and direction at 36 stations, related to wave fetch and incident wind direction. Although the highest wave exposure was at open-coast locations exposed to the long Atlantic fetch, GIS analysis of coastal rasters identified other areas in or close to estuaries that also had high exposure. Projected sea-level change, when added to the surge and wave analyses, gives a spatially extensive, structured assessment of variable flood risk for future coastal flood hazard, complementing the public flood risk map. Such tools can help fulfil the requirements of the EC Directive and may be a useful approach in other regions with high spatial variability in coastal flood risk related to exposure to waves and wind.
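A hedged sketch of a fetch-based exposure index of the kind described: directional wind statistics weighted by the open-water fetch in each direction. The wind rose, fetch table, and weighting below are invented for illustration and are not the study's calibrated method:

```python
import numpy as np

# Assumed fetch (km) by compass direction for one hypothetical site.
fetch_km = {0: 5, 45: 2, 90: 1, 135: 30, 180: 200, 225: 3000, 270: 500, 315: 20}

rng = np.random.default_rng(7)
dirs = np.array(list(fetch_km))
p = np.array([0.05, 0.05, 0.05, 0.10, 0.15, 0.30, 0.20, 0.10])  # assumed wind rose
wind_dir = rng.choice(dirs, size=10_000, p=p)
wind_speed = rng.weibull(2.0, size=10_000) * 8.0                # speeds (m/s)

# Wave energy scales roughly with wind speed squared; weight by fetch.
exposure = sum(
    (wind_speed[wind_dir == d] ** 2).sum() * np.log10(1 + f)
    for d, f in fetch_km.items()
)
print(f"relative wave exposure index = {exposure:.3e}")
```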

17.
Flood hazard evaluation is an important input to external-events safety studies for nuclear power plants. In the present study, the flood hazard due to rainfall at various nuclear sites in India has been evaluated. Hazard estimation is a statistical procedure by which rainfall intensity versus occurrence frequency is estimated from historical rainfall records and extrapolated with an asymptotic extreme value distribution. The rainfall data needed for flood hazard assessment are the annual maximum daily rainfall (24 h data). The observed data points have been fitted using Gumbel, power-law, and exponential distributions, and the return period has been estimated. To study the stationarity of the rainfall data, a moving-window estimate of the parameters has been performed. The rainfall pattern is stationary in both coastal and inland regions over the period of observation. The coastal regions show more intense rainfall and higher variability than the inland regions. Based on the plant layout, catchment area, and drainage capacity, the prototype fast breeder reactor (PFBR) site is unlikely to be flooded.
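The Gumbel step can be sketched as follows, with a synthetic annual-maximum record standing in for site data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic 60-yr record of annual maximum 24 h rainfall (mm), assumed values.
annual_max_mm = stats.gumbel_r.rvs(loc=120, scale=35, size=60, random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max_mm)   # maximum-likelihood fit

for T in (50, 100, 1000):
    # Return level: rainfall exceeded on average once every T years.
    x_T = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T}-yr 24 h rainfall ~ {x_T:.0f} mm")
```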

18.
Uncertainties in observed data and in processing field and laboratory tests are major concerns. Assigning reasonable coefficients of variation to the parameters in conventional analyses indicates that a site with a deterministic factor of safety of 1.5 can actually have a liquefaction triggering probability above 20%. About a third of the variance comes from uncertainty in the load, which is independent of the resistance. Researchers have traditionally presented the results of case studies as charts showing instances in which liquefaction did and did not occur, and have developed relations to separate the two. Although the original researchers developed these separations informally, recent work has applied statistical methods. These give the sampling distributions of the observed data rather than the probability of triggering given the data. Researchers have addressed this issue using Bayesian methods, adopting non-informative priors. Published curves of liquefaction probabilities can be interpreted as likelihood ratios. Other independent work demonstrates that geological, meteorological, and historical data can be used to develop prior probabilities, so it may not be necessary to assume a non-informative prior. The actual prior can then be combined with the likelihood ratios to provide rational probabilities of liquefaction. We recommend that researchers publish their likelihood ratios and allow engineers faced with particular sites to use them to update their own priors.
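The recommended update is ordinary Bayes in odds form; the sketch below uses an invented prior and likelihood ratio (the reference to reading the ratio off a published curve at a given loading level is hypothetical):

```python
def update_with_likelihood_ratio(prior_p, lr):
    """Bayes in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_p / (1.0 - prior_p)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)

prior_p = 0.10   # geologically informed prior, rather than a non-informative 0.5
lr = 4.0         # likelihood ratio read from a published curve (assumed value)

print(f"posterior P(liquefaction) = {update_with_likelihood_ratio(prior_p, lr):.2f}")
```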

19.
Time independent seismic hazard analysis in Alborz and surrounding area   (Total citations: 1; self-citations: 0; by others: 1)
Bayesian probability estimation has features that make it suitable for calculating various parameters of seismicity. In general, this method can combine prior information on seismicity while including both the statistical uncertainty associated with estimating the parameters used to quantify seismicity and the probabilistic uncertainty associated with the inherent randomness of earthquake occurrence. In this article, a time-independent Bayesian approach, which yields the probability that a certain cut-off magnitude will be exceeded in certain time intervals, is examined for the Alborz region of Iran, in order to assess the consequences for the city of Tehran. This area lies within the Alpine-Himalayan active mountain belt. Many active faults affect the Alborz, most of which are parallel to the range and accommodate the present-day oblique convergence across it. Tehran, the capital of Iran, with millions of inhabitants, is located near the foothills of the southern Central Alborz. The region has been affected several times by historical and recent earthquakes, which confirms the importance of seismic hazard assessment there. As a first step, an updated earthquake catalog is compiled for the Alborz. Then, by assuming a Poisson distribution for the number of earthquakes occurring in a given time interval, the probabilistic earthquake occurrence is computed with the Bayesian approach. The highest probabilities are found for zone AA and the lowest for zones KD and CA, while the overall probability is high.
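With a Gamma posterior on the rate of events above the cut-off magnitude, the time-independent exceedance probability has a closed form: the posterior predictive of a Poisson count gives P(N ≥ 1 in t) = 1 − (β/(β+t))^α. The parameter values below are illustrative, not the Alborz results:

```python
def prob_exceed(alpha, beta, t):
    """P(N >= 1 in t years), integrating Poisson chance over rate uncertainty."""
    return 1.0 - (beta / (beta + t)) ** alpha

# Assumed Gamma posterior, e.g., a weak prior updated with 5 events in ~175 yr.
alpha, beta = 6.0, 180.0
for t in (10, 50, 100):
    print(f"P(M >= Mc event within {t} yr) = {prob_exceed(alpha, beta, t):.2f}")
```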

20.
Thanks to modelling advances and the increase in computational resources in recent years, it is now feasible to perform 2-D urban flood simulations at very high spatial resolutions and to conduct flood risk assessments at the scale of single buildings. In this study, we explore the sensitivity of the flood loss estimates obtained in such micro-scale analyses to the spatial representation of buildings in the 2-D flood inundation model and to the hazard attribution methods in the flood loss model. The results show that building representation has a limited effect on the exposure values (i.e., the number of elements at risk) but can have a significant impact on the hazard values attributed to the buildings. On the other hand, the two methods for hazard attribution tested in this work result in remarkably different flood loss estimates; the sensitivity of the predicted flood losses to the attribution method is comparable to that associated with the vulnerability curve. These findings highlight the need to incorporate these sources of uncertainty into micro-scale flood risk prediction methodologies.
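A minimal sketch of the sensitivity examined: the same depth field and a toy vulnerability (depth-damage) curve yield different building-level losses depending on how hazard is attributed to a footprint (here, maximum versus mean cell depth). All depths, values, and the curve are assumptions:

```python
import numpy as np

def depth_damage(depth_m):
    """Toy vulnerability curve: damage fraction, saturating at 1.0 by 3 m."""
    return np.clip(depth_m / 3.0, 0.0, 1.0)

buildings = {   # simulated water depths (m) in cells touching each footprint
    "A": np.array([0.2, 0.4, 0.9]),
    "B": np.array([0.0, 0.1, 0.1]),
    "C": np.array([1.5, 2.2, 1.8]),
}
value = {"A": 250_000, "B": 180_000, "C": 320_000}   # assumed building values

# Two hazard attribution methods applied to identical inputs.
for method, attribute in (("max depth", np.max), ("mean depth", np.mean)):
    loss = sum(value[b] * depth_damage(attribute(d)) for b, d in buildings.items())
    print(f"{method:10s}: total loss = {loss:,.0f}")
```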
