Similar Documents
20 similar documents retrieved.
1.
Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis
The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell–McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang–Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.
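
To make the logic-tree mechanics at the heart of this debate concrete, here is a minimal sketch, assuming toy hazard curves and expert weights (not taken from PEGASOS or any real study), of how weighted branches are combined into a mean hazard curve and weighted fractiles; divergent branches with spread-out weights are what widen the fractile band discussed above.

```python
import numpy as np

# Toy logic tree: three branch hazard curves (annual exceedance frequency
# vs. PGA), each with an expert-assigned weight; weights sum to 1.
pga = np.logspace(-2, 0, 50)                      # PGA grid in g
branches = np.array([0.01 * pga**-1.5,
                     0.02 * pga**-1.8,
                     0.005 * pga**-1.2])          # hypothetical hazard curves
weights = np.array([0.5, 0.3, 0.2])

mean_hazard = weights @ branches                  # weighted mean hazard curve

# Weighted fractile at one PGA level: sort branch values, accumulate weights.
def weighted_fractile(values, w, q):
    order = np.argsort(values)
    cum = np.cumsum(w[order])
    return values[order][np.searchsorted(cum, q)]

h16 = [weighted_fractile(branches[:, i], weights, 0.16) for i in range(len(pga))]
h84 = [weighted_fractile(branches[:, i], weights, 0.84) for i in range(len(pga))]
print(mean_hazard[0], h16[0], h84[0])
```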

2.
In risk assessment studies it is important to determine how uncertain and imprecise knowledge should be included in the simulation and assessment models. Proper evaluation of uncertainties has therefore become a major concern in environmental and health risk assessment studies. Researchers have traditionally used probability theory, most commonly Monte Carlo analysis, to incorporate uncertainty analysis in health risk assessment studies. However, in conducting probabilistic health risk assessment, risk analysts often suffer from a lack of data or the presence of imperfect or incomplete knowledge about the modeled process and its parameters. Fuzzy set theory is a tool that has been used to propagate imperfect and incomplete information in health risk assessment studies. Such analyses result in fuzzy risks that are associated with membership functions. Since possibilistic health risk assessment studies are relatively new, standard procedures for deciding on the acceptability of the resulting fuzzy risk with respect to a crisp standard set by a regulatory agency are not fully established. In this paper, we provide a review of several available approaches which may be used in decision-making. These approaches involve defuzzification techniques, the possibility measure, and the necessity measure. We also propose a new measure, the risk tolerance measure, which can be used in decision-making: it provides an effective metric for evaluating the acceptability of a fuzzy risk with respect to a crisp compliance criterion. Fuzzy risks with different membership functions are evaluated with respect to a crisp compliance criterion using the possibility, necessity, and risk tolerance measures, and the results are discussed comparatively.
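
As a minimal illustration of the possibility and necessity measures discussed above, the sketch below evaluates a triangular fuzzy risk against a crisp compliance criterion; the membership-function shape and all numbers are hypothetical, and the paper's risk tolerance measure is not reproduced here.

```python
import numpy as np

# Triangular fuzzy risk (a, m, b): membership rises linearly a->m, falls m->b.
def mu(x, a, m, b):
    return np.clip(np.minimum((x - a) / (m - a), (b - x) / (b - m)), 0.0, 1.0)

def possibility_exceed(a, m, b, s, n=10001):
    # Pos(risk > s) = sup of membership over x > s
    x = np.linspace(s, b, n)
    return mu(x, a, m, b).max() if s < b else 0.0

def necessity_exceed(a, m, b, s, n=10001):
    # Nec(risk > s) = 1 - Pos(risk <= s)
    x = np.linspace(a, s, n)
    return 1.0 - (mu(x, a, m, b).max() if s > a else 0.0)

# Fuzzy excess-lifetime-risk vs. a crisp 1e-6 compliance criterion.
a, m, b, s = 2e-7, 8e-7, 3e-6, 1e-6
print("Pos(risk > s) =", possibility_exceed(a, m, b, s))
print("Nec(risk > s) =", necessity_exceed(a, m, b, s))
```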

3.
Natural hazards have the potential to trigger complex chains of events in technological installations, leading to disastrous effects for the surrounding population and environment. The threat that climate change will worsen extreme weather events exacerbates the need for new models and novel methodologies able to capture the complexity of the natural-technological interaction in intuitive frameworks suitable for an interdisciplinary field such as risk analysis. This study proposes a novel approach for quantifying the risk exposure of nuclear facilities subject to extreme natural events. A Bayesian Network model, initially developed for quantifying the risk of exposure from spent nuclear material stored in facilities subject to flooding hazards, is adapted and enhanced to include in the analysis the quantification of the uncertainty affecting the output, due both to the imprecision of the available data and to the aleatory nature of the variables involved. The model is applied to the analysis of the Sizewell B nuclear power station in East Anglia (UK) through the use of a novel computational tool. The proposed network models the direct effect of extreme weather conditions on the facility along several time scenarios, considering climate change predictions, as well as the indirect effects of external hazards on the internal subsystems and the occurrence of human error. The main novelty of the study is the fully computational integration of Bayesian Networks with advanced Structural Reliability Methods, which makes it possible to adequately represent both aleatory and epistemic aspects of the uncertainty affecting the input through the use of probabilistic models, intervals, imprecise random variables, and probability bounds. The uncertainty affecting the output is quantified in order to attest to the significance of the results and to provide a complete and effective tool for risk-informed decision making.
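
The paper's full integration of Bayesian Networks with Structural Reliability Methods is beyond a snippet, but the imprecise-probability ingredient can be sketched: for a monotone chain of events, interval probabilities propagate to probability bounds simply by evaluating at the interval endpoints. All numbers below are hypothetical, not Sizewell B data; non-monotone networks would require vertex enumeration or optimization instead.

```python
# Two-layer event chain with interval (imprecise) probabilities:
# P(flood), P(barrier fails | flood), P(release | barrier fails).
# Because P(release) = pF * pB * pR is monotone increasing in each factor,
# probability bounds follow from the interval endpoints.
p_flood   = (1e-3, 5e-3)   # annual probability interval (hypothetical)
p_barrier = (0.05, 0.20)   # expert interval for conditional failure
p_release = (0.10, 0.30)   # conditional release given barrier failure

lower = p_flood[0] * p_barrier[0] * p_release[0]
upper = p_flood[1] * p_barrier[1] * p_release[1]
print(f"P(release) in [{lower:.2e}, {upper:.2e}]")
```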

4.
Assessing uncertainty in estimation of seismic response for PBEE
State-of-the-art approaches to probabilistic assessment of seismic structural reliability are based on simulation of structural behavior via nonlinear dynamic analysis of computer models. Simulations are carried out considering samples of ground motions supposedly drawn from specific populations of signals virtually recorded at the site of interest. This serves to produce samples of structural response used to evaluate the failure rate, which in turn allows computation of the failure risk (probability) in a time interval of interest. This procedure alone implies that estimation uncertainty affects the probabilistic results; it is seldom quantified in risk analyses, although it may be relevant. This short paper discusses some basic issues and some simple statistical tools which can aid the analyst in assessing the impact of sample variability on fragility functions and the resulting seismic structural risk. On the statistical inference side, the addressed strategies are based on consolidated results such as the well-known delta method and on some resampling plans belonging to the bootstrap family. On the structural side, they rely on assumptions and methods typical of performance-based earthquake engineering applications. Copyright © 2017 John Wiley & Sons, Ltd.
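
A minimal sketch of the bootstrap side of this idea (the delta method is not shown), assuming synthetic intensity/failure data and a lognormal fragility fitted by maximum likelihood, a common choice in PBEE though not necessarily the authors' exact estimator:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Hypothetical data: ground-motion intensities and binary failure outcomes
# simulated from a "true" lognormal fragility (theta=0.5, beta=0.4).
im = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0] * 5)
fail = rng.random(im.size) < stats.norm.cdf(np.log(im / 0.5) / 0.4)

def fit_fragility(im, fail):
    # MLE of lognormal fragility parameters (median theta, dispersion beta)
    def nll(p):
        theta, beta = abs(p[0]), abs(p[1])
        pf = np.clip(stats.norm.cdf(np.log(im / theta) / beta), 1e-12, 1 - 1e-12)
        return -np.sum(fail * np.log(pf) + (1 - fail) * np.log(1 - pf))
    return optimize.minimize(nll, x0=[0.5, 0.4], method="Nelder-Mead").x

theta_hat, beta_hat = fit_fragility(im, fail)

# Nonparametric bootstrap: resample (im, fail) pairs, refit, collect estimates.
boot = np.array([fit_fragility(im[idx], fail[idx])
                 for idx in rng.integers(0, im.size, (200, im.size))])
lo, hi = np.percentile(boot[:, 0], [2.5, 97.5])
print(f"theta = {theta_hat:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```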

5.
The ability to describe variables in a health risk model through probability theory enables us to estimate human health risk; this type of risk assessment is known as probabilistic risk assessment (PRA). Generally, PRA requires specific estimates of the parameters of the probability densities of the input variables. Such parameter estimates may not always be available because of a lack of knowledge or information; variables of this kind are treated as uncertain variables, and the associated uncertainty is interpreted through fuzzy theory. Describing uncertainty through fuzzy set theory enables us to process both random variables and fuzzy variables in a single framework. A method that processes aleatory and epistemic uncertainties within the same framework is termed a hybrid method, and in this paper we present such a hybrid methodology for human health risk assessment. Risk assessment of human health through different exposure pathways has been attempted many times by combining Monte Carlo analysis with the extension principle of fuzzy set theory. The emergence of credibility theory enables the transformation of a fuzzy variable into a credibility distribution function, which can be used in these hybrid analyses. Hence an attempt has been made, for the first time, to combine probability theory and credibility theory to estimate risk from human health exposure. This method of risk assessment combining credibility theory and probability theory is called the probabilistic-credibility method (PCM). The results obtained are interpreted through probability theory, unlike other hybrid methodologies whose results are interpreted in terms of possibility theory, and they are compared with the probability-fuzzy risk assessment (PFRA) method. Generally, decisions under a hybrid methodology are made on an index of optimism: an optimistic decision maker estimates from the \(\alpha\)-cut at 1, whereas a pessimistic decision maker estimates from the \(\alpha\)-cut at 0. The PCM is an optimistic approach, as the decision is always made at \(\alpha = 1\).
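
A hedged sketch of the PCM idea: a triangular fuzzy variable is converted to its credibility distribution, whose inverse lets it be sampled alongside a genuinely random variable in one Monte Carlo loop. The dose equation and all parameter values are illustrative assumptions, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Credibility distribution of a triangular fuzzy variable (a, m, b):
# Cr{xi <= x} = 0.5 * (Pos{xi <= x} + Nec{xi <= x}); its inverse lets us
# sample the fuzzy variable like a random one in a single framework.
def credibility_inverse(u, a, m, b):
    u = np.asarray(u, dtype=float)
    return np.where(u < 0.5, a + 2 * u * (m - a), 2 * m - b + 2 * u * (b - m))

# Hypothetical dose equation: risk = C * IR / BW (units illustrative).
n = 100_000
conc = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # random variable
intake = credibility_inverse(rng.random(n), 1.0, 1.5, 2.5)  # fuzzy variable
bw = 70.0
risk = conc * intake / bw
print("mean risk:", risk.mean(), " 95th pct:", np.percentile(risk, 95))
```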

6.
In this study, a multistage scenario-based interval-stochastic programming (MSISP) method is developed for water-resources allocation under uncertainty. MSISP improves upon existing multistage optimization methods with advantages in uncertainty reflection, dynamics facilitation, and risk analysis. It can directly handle uncertainties presented as both interval numbers and probability distributions, and can support the assessment of the reliability of satisfying (or the risk of violating) system constraints within a multistage context. It can also reflect the dynamics of system uncertainties and decision processes under a representative set of scenarios. The developed MSISP method is then applied to a case of water resources management planning within a multi-reservoir system associated with joint probabilities. A range of violation levels for capacity and environment constraints are analyzed under uncertainty, and solutions associated with different risk levels of constraint violation are obtained. They can be used for generating decision alternatives and can thus help water managers identify desired policies under various economic, environmental, and system-reliability conditions. In addition, sensitivity analyses demonstrate that violation of the environmental constraint has a significant effect on the system benefit.
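
A heavily simplified, single-stage sketch of the interval-programming ingredient, with hypothetical coefficients (the full MSISP adds scenario trees, recourse decisions, and joint probabilities): solving the sub-model under favorable and unfavorable interval bounds brackets the system benefit.

```python
from scipy.optimize import linprog

# Maximize benefit 30*x1 + 50*x2 subject to interval water availability
# [80, 120]; linprog minimizes, so benefits are negated. All numbers
# are hypothetical.
b = [-30.0, -50.0]              # negated unit benefits
A = [[1.0, 1.0]]                # total allocation <= available water
for q, label in [(120.0, "upper benefit bound (wet)"),
                 (80.0, "lower benefit bound (dry)")]:
    res = linprog(c=b, A_ub=A, b_ub=[q], bounds=[(0, 100), (0, 60)])
    print(label, "=", -res.fun, "allocation =", res.x)
```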

7.
Environmental risk management is an integral part of risk analysis. The selection among different mitigating or preventive alternatives often involves competing and conflicting criteria, which requires sophisticated multi-criteria decision-making (MCDM) methods. The analytic hierarchy process (AHP) is one of the most commonly used MCDM methods, and it integrates subjective and personal preferences into the analysis. AHP works on the premise that complex decision problems can be handled by structuring them into a simple and comprehensible hierarchy. However, AHP involves human subjectivity, which introduces vagueness-type uncertainty and necessitates decision-making under uncertainty. In this paper, vagueness-type uncertainty is handled using fuzzy-based techniques: the traditional AHP is modified to a fuzzy AHP using fuzzy arithmetic operations. The concepts of risk attitude and the associated confidence of a decision maker in the estimates of pairwise comparisons are also discussed. The methodology of the proposed technique is developed on a hypothetical example, and its efficacy is demonstrated through an application dealing with the selection of drilling fluid/mud for offshore oil and gas operations.
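
A minimal fuzzy-AHP sketch using Buckley's geometric-mean method with triangular fuzzy numbers; the criteria and pairwise judgments are hypothetical, the defuzzification is a simple centroid-style average, and the paper's treatment of risk attitude and confidence is not reproduced.

```python
import numpy as np

# Pairwise comparison matrix of triangular fuzzy numbers (l, m, u) for three
# hypothetical mud-selection criteria (cost, environment, performance);
# reciprocal of (l, m, u) is (1/u, 1/m, 1/l).
F = np.array([
    [[1, 1, 1],       [2, 3, 4],       [1/2, 1, 3/2]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1/3, 1/2, 1]],
    [[2/3, 1, 2],     [1, 2, 3],       [1, 1, 1]],
])

g = F.prod(axis=1) ** (1 / F.shape[0])     # fuzzy geometric mean of each row
total = g.sum(axis=0)                      # fuzzy sum over all criteria
w = g / total[::-1]                        # fuzzy division uses reversed (u, m, l)
crisp = w.mean(axis=1)                     # centroid-style defuzzification
print("fuzzy weights:\n", np.round(w, 3))
print("normalized crisp weights:", np.round(crisp / crisp.sum(), 3))
```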

8.
Artificial neural networks (ANNs) have been demonstrated to be promising modelling tools for the improved prediction/forecasting of hydrological variables. However, the quantification of uncertainty in ANNs is a major issue, as high uncertainty would hinder the reliable application of these models. While several sources of uncertainty have been identified, the quantification of input uncertainty in ANNs has received little attention, because each measured input quantity is likely to vary uniquely, which prevents quantification of a reliable prediction uncertainty. In this paper, an optimization method that integrates probabilistic and ensemble simulation approaches is proposed for the quantification of input uncertainty in ANN models. The proposed approach is demonstrated through rainfall-runoff modelling for the Leaf River watershed, USA. The results suggest that ignoring explicit quantification of input uncertainty leads to under- or over-estimation of model prediction uncertainty. The approach also facilitates identification of appropriate model parameters for better characterizing the hydrological processes.
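
A hedged sketch of the ensemble idea on synthetic data: train an ANN, then propagate an ensemble of perturbed inputs, drawn from an assumed measurement-error model, through it to obtain an input-uncertainty band. The data, network size, and error magnitudes are illustrative, not the Leaf River setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in for rainfall-runoff data: runoff as a noisy nonlinear
# function of rainfall and antecedent moisture (all values hypothetical).
X = rng.uniform(0, 50, size=(500, 2))
y = 0.6 * X[:, 0] + 0.02 * X[:, 0] * X[:, 1] + rng.normal(0, 1.0, 500)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                     random_state=0).fit(X, y)

# Input-uncertainty ensemble: perturb the measured input with its assumed
# error model and propagate the ensemble through the trained ANN.
x_obs = np.array([[30.0, 20.0]])
sigma_meas = np.array([2.0, 3.0])            # assumed input standard errors
ensemble = x_obs + rng.normal(0, sigma_meas, size=(1000, 2))
preds = model.predict(ensemble)
lo, hi = np.percentile(preds, [5, 95])
print(f"prediction {model.predict(x_obs)[0]:.2f}, 90% band [{lo:.2f}, {hi:.2f}]")
```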

9.
10.
This paper introduces a risk-based decision process, integrated into a drought early warning system (DEWS) for reservoir operation, to support policy making under uncertainty in drought management. Aspects of posterior risk, namely the chances that options will occur and the options corresponding to given chances, are provided to help decision makers make better decisions. A new risk index is also defined to characterize decision makers' attitudes toward risk. Through accuracy assessment, decision makers can understand the attitude associated with any specific probability and learn to adjust their attitudes in the decision-making process. As a pioneering experiment, the method was tested on the Shihmen reservoir in northern Taiwan. Over the simulation period (1964–2005), the expected overall accuracy was approximately 77%. The results show that the proposed approach is very practical and should find good use in reservoir operations.

11.
J.J. Yu, Hydrological Sciences Journal, 2013, 58(12): 2117–2131
A generalized likelihood uncertainty estimation (GLUE) framework coupled with artificial neural network (ANN) models in two surrogate schemes (GAE-S1 and GAE-S2) is proposed to improve the efficiency of uncertainty assessment in flood inundation modelling. The GAE-S1 scheme constructs an ANN to approximate the relationship between model likelihoods and uncertain parameters, facilitating sample acceptance/rejection without running the numerical model directly and thus speeding up the Monte Carlo simulation in stochastic sampling. The GAE-S2 scheme establishes independent ANN models for water depth predictions to emulate the numerical models, facilitating efficient uncertainty analysis without additional model runs for the locations of concern under various scenarios. The results from a case study showed that both GAE-S1 and GAE-S2 had performance comparable to GLUE in terms of estimation of posterior parameters, prediction intervals of water depth, and probabilistic inundation maps, but with reduced computational requirements. The results also revealed that GAE-S1 possessed slightly better accuracy (referenced to GLUE) than GAE-S2, but lower flexibility in application. This study sheds some light on how to apply different surrogate schemes when using numerical models for uncertainty assessment, and could help decision makers choose cost-effective ways of conducting flood risk analysis.
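
A minimal sketch of the GAE-S1 idea, with a cheap analytic stand-in for the numerical inundation model and an informal behavioural threshold (both assumptions): the ANN surrogate pre-screens Monte Carlo samples so only promising ones would be re-run with the full model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)

def expensive_model_likelihood(theta):
    # Stand-in for a numerical inundation model run + GLUE likelihood score.
    return np.exp(-((theta[:, 0] - 0.03) ** 2 / 1e-4 +
                    (theta[:, 1] - 1.5) ** 2 / 0.25))

# GAE-S1-style surrogate: train an ANN on a modest number of true model runs,
# then use it to pre-screen samples (behavioural if surrogate score > 0.1).
theta_train = rng.uniform([0.01, 0.5], [0.06, 3.0], size=(200, 2))
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(
    theta_train, expensive_model_likelihood(theta_train))

theta_mc = rng.uniform([0.01, 0.5], [0.06, 3.0], size=(50_000, 2))
accepted = theta_mc[surrogate.predict(theta_mc) > 0.1]
print(f"{len(accepted)} behavioural samples kept; "
      f"only these would be re-run with the full numerical model")
```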

12.
Multi-criteria decision making under uncertainty for flood mitigation
The design of flood mitigation infrastructure involves decisions that are often made under various uncertainties and with respect to multiple criteria. Under uncertainty, any chosen design alternative may perform worse than the unselected designs in terms of the adopted performance indicators. This paper introduces a quantitative risk measure based on the concept of expected opportunity loss (EOL) for evaluating the consequence of making the wrong decision. The EOL can be used to assess the relative performance of multiple decision alternatives and is extended to deal with decision problems involving multiple criteria. In particular, the probabilistic features of the consequences associated with a design alternative are considered and used in the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) MCDM technique. The integration of PROMETHEE and decision making under uncertainty is demonstrated through an example of flood damage mitigation planning.
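
The EOL computation itself is simple; here is a minimal sketch with a hypothetical payoff matrix over three flood scenarios (the PROMETHEE integration is not reproduced).

```python
import numpy as np

# payoff[i, j]: net benefit of alternative i under flood scenario j
# (hypothetical values); p[j]: scenario probabilities.
payoff = np.array([[100.0,  40.0, -20.0],   # levee
                   [ 80.0,  60.0,  10.0],   # detention basin
                   [ 60.0,  55.0,  30.0]])  # floodproofing
p = np.array([0.5, 0.3, 0.2])

opportunity_loss = payoff.max(axis=0) - payoff   # regret per scenario
eol = opportunity_loss @ p                       # expected opportunity loss
print("EOL per alternative:", eol)               # lower = less risk of regret
print("min-EOL choice:", ["levee", "basin", "floodproofing"][eol.argmin()])
```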

13.
A Probabilistic Characterization Method for Seismic Lithofacies Identification
Reservoir lithofacies distribution is an important parameter in reservoir characterization, and lithofacies identification from seismic data is usually subject to strong uncertainty. Traditional methods yield only a single deterministic lithofacies distribution and cannot resolve the uncertainty of the inversion results, which increases the risk in reservoir evaluation. This paper introduces a multi-step, probability-based inversion method for seismic lithofacies identification: statistical relationships between input and output quantities are established at each step, and the probabilistic information from all steps is then fused to construct the conditional probability relationship between seismic data and reservoir lithofacies, from which the probability of the lithofacies distribution is inverted. Compared with traditional methods, the proposed method characterizes the uncertainty of the geophysical response at each step of seismic lithofacies identification through statistical relationships, and numerically simulates uncertainty propagation by fusing the probabilistic information of all steps. The final inverted lithofacies probabilities objectively and accurately reflect the uncertainty of the seismic lithofacies identification results, providing important reference information for reservoir evaluation and reservoir modelling. Applications to synthetic and field data verify the effectiveness of the method.
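
A hedged sketch of one step of such a probabilistic identification: facies probabilities obtained from a seismic attribute via Bayes' rule, assuming Gaussian attribute likelihoods per facies. All statistics are hypothetical, and the paper fuses several such steps rather than one.

```python
import numpy as np
from scipy import stats

facies = ["shale", "brine sand", "gas sand"]
prior  = np.array([0.6, 0.3, 0.1])     # hypothetical facies proportions
means  = np.array([2.9, 2.6, 2.3])     # e.g. acoustic impedance per facies
sigmas = np.array([0.15, 0.15, 0.20])

def facies_probabilities(z):
    # Bayes' rule: P(facies | attribute z) from priors and Gaussian likelihoods.
    like = stats.norm.pdf(z, means, sigmas)
    post = prior * like
    return post / post.sum()

for name, p in zip(facies, facies_probabilities(2.45)):
    print(f"P({name} | z=2.45) = {p:.3f}")
```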

14.
The quantification of uncertainty in simulations from complex, physically based, distributed hydrologic models is important for developing reliable applications. The generalized likelihood uncertainty estimation (GLUE) method is one of the most commonly used methods in hydrology. GLUE helps reduce parametric uncertainty by deriving the probability distribution of the parameters, and helps analyze the uncertainty in model output. In GLUE, the uncertainty of model output is analyzed through Monte Carlo simulations, which require a large number of model runs. This induces a high computational demand for GLUE to characterize multi-dimensional parameter spaces, especially for complex hydrologic models with many parameters. While many variants of GLUE derive the probability distribution of the parameters, none has addressed the computational requirement of the analysis. A method to reduce this computational requirement is proposed in this study. It is envisaged that conditional sampling, while generating ensembles for GLUE, can reduce the number of model simulations; the mutual relationship between the parameters is used for conditional sampling. The method is illustrated using a case study of the Soil and Water Assessment Tool (SWAT) model on a watershed in the USA. The number of simulations required for the uncertainty analysis was reduced by 90% compared to existing methods. The proposed method also reduced uncertainty, in terms of a smaller average band width and a higher containing ratio.
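
A minimal sketch of the conditional-sampling ingredient: drawing parameters jointly from an assumed mutual correlation (via a Cholesky factor) rather than independently, so fewer samples fall in implausible corners of the parameter space. Parameter names, moments, and the correlation value are hypothetical, not the SWAT case study.

```python
import numpy as np

rng = np.random.default_rng(3)

mu = np.array([0.5, 120.0])            # e.g. CN2 scaling, soil depth (mm)
sd = np.array([0.1, 25.0])
rho = -0.7                             # assumed mutual relationship
cov = np.array([[sd[0]**2,            rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2           ]])

# Correlated Gaussian ensemble via the Cholesky factor of the covariance.
L = np.linalg.cholesky(cov)
samples = mu + rng.standard_normal((2000, 2)) @ L.T
print("sample correlation:", np.corrcoef(samples.T)[0, 1].round(2))
```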

15.
This paper develops a new method for decision-making under uncertainty. The method, Bayesian Programming (BP), addresses a class of two-stage decision problems with features that are common in environmental and water resources applications. BP is applicable to two-stage combinatorial problems characterized by uncertainty in unobservable parameters, only some of which is resolved upon observation of the outcome of the first-stage decision. The framework also naturally accommodates stochastic behavior, which has the effect of impeding uncertainty resolution. With the incorporation of systematic methods for decision search and Monte Carlo methods for Bayesian analysis, BP addresses limitations of other decision-analytic approaches for this class of problems, including conventional decision tree analysis and stochastic programming. The methodology is demonstrated with an illustrative problem of water quality pollution control. Its effectiveness for this problem is compared to alternative approaches, including a single-stage model in which expected costs are minimized and a deterministic model in which uncertain parameters are replaced by their mean values. A new term, the expected value of including uncertainty resolution (EVIUR), is introduced and evaluated for the illustrative problem. It is a measure of the worth of incorporating the experimental value of decisions into an optimal decision-making framework. For the illustrative problem, the two-stage adaptive management framework extracted up to approximately 50% of the gains of perfect information. The strengths and limitations of the method are discussed and conclusions are presented.
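
A two-state, hedged sketch of the two-stage logic with hypothetical costs and pilot-study accuracy: the single-stage decision minimizes prior expected cost, the two-stage decision updates on a pilot observation first, and the EVIUR-style gain is the difference.

```python
import numpy as np

# Unobservable pollutant loading theta in {low, high}, prior 50/50; a pilot
# study observes theta with 80% accuracy before the second-stage decision.
# cost[d, t]: cost of design d (small, large plant) under true state t.
cost = np.array([[10.0, 50.0],    # small plant: cheap, fails if loading high
                 [25.0, 30.0]])   # large plant: robust but expensive
prior = np.array([0.5, 0.5])
acc = 0.8                         # assumed P(observe t | true state t)

# Single-stage: commit now, minimizing prior expected cost.
single = (cost @ prior).min()

# Two-stage: observe pilot result y, update to posterior, then choose design.
two_stage = 0.0
for y in (0, 1):
    like = np.where(np.arange(2) == y, acc, 1 - acc)   # P(y | theta)
    p_y = like @ prior
    posterior = like * prior / p_y
    two_stage += p_y * (cost @ posterior).min()

print(f"single-stage cost {single:.1f}, adaptive cost {two_stage:.1f}, "
      f"value of uncertainty resolution = {single - two_stage:.1f}")
```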

16.
The increasing effort to develop and apply nonstationary models in hydrologic frequency analyses under changing environmental conditions can be frustrated when the additional uncertainty related to model complexity is accounted for along with the sampling uncertainty. In order to show the practical implications and possible problems of using nonstationary models, and to provide critical guidelines, in this study we review the main tools developed in this field (such as nonstationary distribution functions, return periods, and risk of failure), highlighting their advantages and disadvantages. The discussion is supported by three case studies that revisit three illustrative examples reported in the scientific and technical literature, referring to the Little Sugar Creek (at Charlotte, North Carolina), the Red River of the North (North Dakota/Minnesota), and the Assunpink Creek (at Trenton, New Jersey). The uncertainty of the results is assessed by complementing point estimates with confidence intervals (CIs) and emphasizing critical aspects such as the subjectivity affecting the choice of the models' structure. Our results show that (1) nonstationary frequency analyses should not only be based on at-site time series but require additional information and detailed exploratory data analyses (EDA); (2) as nonstationary models imply that the time-varying model structure holds true for the entire future design life period, an appropriate modeling strategy requires that EDA identify a well-defined deterministic mechanism driving the examined process; (3) when the model structure cannot be inferred in a deductive manner and nonstationary models are fitted by inductive inference, the model structure introduces an additional source of uncertainty, so that the resulting nonstationary models can provide no practical enhancement of the credibility and accuracy of the predicted extreme quantiles, whereas possible model misspecification can easily lead to physically inconsistent results; (4) when the model structure is uncertain, stationary models and a suitable assessment of the uncertainty accounting for possible temporal persistence should be retained as the more theoretically coherent and reliable options for practical applications in real-world design and management problems; and (5) a clear understanding of the actual probabilistic meaning of stationary and nonstationary return periods and risk of failure is required for correct risk assessment and communication.
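
A minimal sketch of a nonstationary fit of the kind discussed above: a GEV with a linear trend in the location parameter, fitted by maximum likelihood to synthetic annual maxima. The trend and all values are illustrative, and scipy's shape convention c = -xi (relative to the usual GEV parameterization) is noted in the code.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(11)

# Synthetic annual peak flows with a linearly drifting location parameter.
t = np.arange(60)
q = stats.genextreme.rvs(c=-0.1, loc=800 + 3.0 * t, scale=150,
                         size=60, random_state=rng)

def nll(p):
    # Negative log-likelihood for mu(t) = mu0 + mu1*t; scipy uses c = -xi.
    mu0, mu1, log_sigma, xi = p
    return -stats.genextreme.logpdf(q, c=-xi, loc=mu0 + mu1 * t,
                                    scale=np.exp(log_sigma)).sum()

fit = optimize.minimize(nll, x0=[np.median(q), 0.0, np.log(q.std()), 0.1],
                        method="Nelder-Mead")
mu0, mu1, log_sigma, xi = fit.x

# Time-varying "100-year" quantile under the fitted nonstationary model.
q100 = stats.genextreme.ppf(0.99, c=-xi, loc=mu0 + mu1 * t,
                            scale=np.exp(log_sigma))
print(f"trend {mu1:.2f} per year; q100 start {q100[0]:.0f}, end {q100[-1]:.0f}")
```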

17.
This paper presents the development of a probabilistic multi-model ensemble of statistically downscaled future projections of precipitation for a watershed in New Zealand. Climate change research based on the point estimates of a single model is considered less reliable for decision making, and multiple realizations of a single model, or outputs from multiple models, are often preferred for such purposes. Similarly, a probabilistic approach is preferable to deterministic point estimates. In the area of statistical downscaling, no single technique is considered a universal solution, because each technique has weaknesses owing to its basic working principles. Moreover, watershed-scale precipitation downscaling is quite challenging and is more prone to uncertainty issues than the downscaling of other climatological variables. Multi-model statistical downscaling studies based on a probabilistic approach are therefore required. In this paper, results from three well-established statistical downscaling methods are used to develop a Bayesian weighted multi-model ensemble. The three members of the downscaling ensemble belong to the following broad categories of statistical downscaling methods: (1) multiple linear regression, (2) multiple non-linear regression, and (3) stochastic weather generators. The results obtained in this study show that the new strategy adopted here is promising because of the many advantages it offers: it combines the outputs of multiple statistical downscaling methods, provides probabilistic downscaled climate change projections, and enables the quantification of uncertainty in these projections. This should encourage future attempts to combine the results of multiple statistical downscaling methods. Copyright © 2011 John Wiley & Sons, Ltd.
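
A hedged sketch of the Bayesian weighting step on synthetic series: weights proportional to each method's predictive likelihood over a validation period, assuming a simple Gaussian likelihood (the paper's actual weighting scheme may differ).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic validation observations and three imperfect "downscaling methods".
obs = rng.gamma(2.0, 3.0, size=200)
preds = np.stack([obs + rng.normal(0, s, 200) for s in (1.0, 2.0, 4.0)])

# Gaussian predictive likelihood per method -> normalized posterior weights.
sigmas = (preds - obs).std(axis=1, keepdims=True)
loglik = stats.norm.logpdf(obs, loc=preds, scale=sigmas).sum(axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()
print("ensemble weights:", np.round(w, 3))

ensemble_mean = w @ preds                        # weighted combination
print("ensemble RMSE:", np.sqrt(((ensemble_mean - obs) ** 2).mean()).round(2))
```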

18.
Permanent fault displacements (PFDs) caused by fault ruptures emerging at the surface are critical for the seismic design and risk assessment of continuous pipelines, as they impose significant compressive and tensile strains on the pipe cross-section at pipe-fault crossings. The complexity of fault rupture, inaccurate mapping of fault location, and uncertainties in fault-pipe crossing geometries call for probabilistic approaches to assessing the PFD hazard and mitigating pipeline failure risk against PFD. However, probabilistic approaches are currently absent from seismic design practice for pipelines. Given these facts, this paper first assesses the probabilistic PFD hazard by using Monte Carlo-based stochastic simulations, whose theory and implementation are given in detail. The computed hazard is then used in a probabilistic risk assessment approach to calculate the failure probability of continuous pipelines under different PFD levels as well as different pipe cross-section properties. Our probabilistic pipeline risk computations consider uncertainties arising from complex fault rupture and geomorphology that result in inaccurate mapping of fault locations and fault-pipe crossings. The results presented in this paper suggest a re-evaluation of the design provisions in current pipeline design guidelines to reduce the seismic risk of these geographically distributed structural systems. Copyright © 2016 John Wiley & Sons, Ltd.
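
A minimal Monte Carlo sketch in the spirit of the paper: lognormal fault displacement, an uncertain crossing angle standing in for inaccurate fault mapping, and a deliberately simplified strain demand model compared with a tensile strain capacity. Every distribution and the demand expression are illustrative assumptions, not the authors' models.

```python
import numpy as np

rng = np.random.default_rng(9)

n = 200_000
disp = rng.lognormal(mean=np.log(1.0), sigma=0.6, size=n)       # PFD, metres
beta = np.deg2rad(rng.normal(70, 10, size=n))                   # crossing angle
capacity = rng.lognormal(mean=np.log(0.02), sigma=0.2, size=n)  # strain capacity

anchor_len = 50.0                                               # metres, assumed
# Highly simplified axial + bending-like strain demand at the crossing.
demand = (disp * np.cos(beta) / anchor_len
          + (disp * np.sin(beta) / anchor_len) ** 2 / 2)

pf = (demand > capacity).mean()
print(f"P(pipe tensile failure | fault rupture) ~ {pf:.4f}")
```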

19.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of the exposure variables and of the uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis, the variables of the risk equation, along with the parameters of these variables (for example, the mean and standard deviation of a normal distribution), are described in terms of probability density functions (PDFs); a variable described in this way is called a second-order random variable. Significant data, or considerable insight into the uncertainty associated with these variables, are necessary to develop appropriate PDFs for these random parameters. Typically, the available data, and their accuracy and reliability, are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods for propagating uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the utilization of incomplete information (vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we propose a two-dimensional fuzzy Monte Carlo analysis (2D FMCA) to overcome this difficulty. In this approach, instead of describing the parameters of the PDFs used in defining the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. We compare the two approaches relative to their computational requirements and their data requirements and availability, and, for a hypothetical case, provide a comparative interpretation of the results generated.
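
A minimal 2D MCA sketch with an illustrative risk equation: the outer loop samples uncertain distribution parameters (the second-order description above), and the inner loop samples variability given those parameters; a 2D FMCA would replace the outer-loop PDFs with fuzzy numbers. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Risk = C * IR * EF * ED / (BW * AT); only C is treated as variable here,
# with its lognormal parameters themselves uncertain (second-order).
n_outer, n_inner = 200, 5_000
ir, ef, ed, bw, at = 2.0, 350.0, 30.0, 70.0, 70.0 * 365.0

percentiles_95 = np.empty(n_outer)
for i in range(n_outer):
    mu = rng.normal(np.log(1.0), 0.2)          # uncertain log-median of C
    sigma = rng.uniform(0.4, 0.8)              # uncertain lognormal sigma of C
    conc = rng.lognormal(mu, sigma, n_inner)   # variability given parameters
    risk = conc * ir * ef * ed / (bw * at)
    percentiles_95[i] = np.percentile(risk, 95)

lo, hi = np.percentile(percentiles_95, [5, 95])
print(f"95th-percentile risk: uncertainty band [{lo:.2e}, {hi:.2e}]")
```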

20.
Uncertainty Analysis in Atmospheric Dispersion Modeling
The concentration of a pollutant in the atmosphere is a random variable that cannot be predicted accurately, but it can be described using quantities such as the ensemble mean, variance, and probability distribution. There is growing recognition that modeled concentrations of hazardous contaminants in the atmosphere should be described in a probabilistic framework. This paper discusses the various types of uncertainty in atmospheric dispersion models and reviews sensitivity/uncertainty analysis methods to characterize and/or reduce them. Evaluation and quantification of the range of uncertainties in predictions yield deeper insight into the capabilities and limitations of atmospheric dispersion models, and increase our confidence in model-based decision-making.
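
A minimal sketch of such a probabilistic treatment: Monte Carlo propagation of uncertain emission, wind, and dispersion parameters through the standard ground-level centreline Gaussian-plume formula C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 sigma_z^2)); all parameter ranges are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

n, x = 50_000, 1_000.0                     # samples; downwind distance (m)
Q = rng.lognormal(np.log(10.0), 0.3, n)    # emission rate (g/s)
u = rng.lognormal(np.log(4.0), 0.25, n)    # wind speed (m/s)
sy = 0.08 * x * rng.uniform(0.8, 1.2, n)   # sigma_y with +-20% uncertainty
sz = 0.06 * x * rng.uniform(0.7, 1.3, n)   # sigma_z, assumed more uncertain
H = 50.0                                   # effective stack height (m)

# Ground-level centreline concentration with reflection at the surface.
C = Q / (np.pi * u * sy * sz) * np.exp(-H**2 / (2 * sz**2))
print(f"median {np.median(C):.2e} g/m^3, "
      f"90% interval [{np.percentile(C, 5):.2e}, {np.percentile(C, 95):.2e}]")
```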
