Similar literature
Retrieved 20 similar articles (search time: 31 ms)
1.
Probabilistic-fuzzy health risk modeling
Health risk analysis of multi-pathway exposure to contaminated water involves mechanistic models with many uncertain and highly variable parameters. Currently, the uncertainties in these models are treated using statistical approaches. However, not all uncertainties in data or model parameters are due to randomness. Other sources of imprecision that may lead to uncertainty include scarce or incomplete data, measurement error, data obtained from expert judgment, and subjective interpretation of available information. These non-random uncertainties cannot be treated by statistical methods alone. In this paper we propose the combined use of fuzzy set theory and probability theory to incorporate uncertainties into health risk analysis. We identify this approach as probabilistic-fuzzy risk assessment (PFRA). Depending on the form of the available information, fuzzy set theory, probability theory, or a combination of both can be used to incorporate parameter uncertainty and variability into mechanistic risk assessment models. In this study, tap water concentration is used as the source of contamination in the human exposure model. Ingestion, inhalation and dermal contact are considered as multiple exposure pathways. The tap water concentration of the contaminant and the cancer potency factors for ingestion, inhalation and dermal contact are treated as fuzzy variables, while the remaining model parameters are treated using probability density functions. Combined use of fuzzy and random variables produces membership functions of risk to individuals at different fractiles of risk, as well as probability distributions of risk for various alpha-cut levels of the membership function. The proposed method provides a robust approach to evaluating human health risk from exposure when there is both uncertainty and variability in model parameters.
PFRA allows utilization of certain types of information which have not been used directly in existing risk assessment methods.
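The alpha-cut propagation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the triangular fuzzy numbers, the ingestion-risk kernel and the parameter distributions are all hypothetical stand-ins.

```python
import random

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    lo, m, hi = tri
    return (lo + alpha * (m - lo), hi - alpha * (hi - m))

def risk(conc, potency, intake_rate, body_weight):
    """Toy ingestion-risk kernel: chronic daily intake times cancer potency."""
    return conc * intake_rate / body_weight * potency

def pfra(conc_fuzzy, potency_fuzzy, alpha, n=10_000, seed=1):
    """For one alpha-cut, propagate the fuzzy-interval endpoints through a
    Monte Carlo sample of the random parameters; returns (low, high) bounds
    on the mean risk at this membership level (risk is monotone in both
    fuzzy inputs, so the endpoints bound the interval)."""
    rng = random.Random(seed)
    c_lo, c_hi = alpha_cut(conc_fuzzy, alpha)
    p_lo, p_hi = alpha_cut(potency_fuzzy, alpha)
    lows, highs = [], []
    for _ in range(n):
        ir = rng.lognormvariate(0.0, 0.3)  # intake rate (L/day), random
        bw = rng.gauss(70.0, 10.0)         # body weight (kg), random
        lows.append(risk(c_lo, p_lo, ir, bw))
        highs.append(risk(c_hi, p_hi, ir, bw))
    return sum(lows) / n, sum(highs) / n

lo, hi = pfra((0.5, 1.0, 2.0), (0.01, 0.02, 0.05), alpha=0.5)
```

Repeating the call over a grid of alpha levels yields the membership function of risk; at alpha = 1 the interval collapses to the modal (crisp) estimate.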

2.
Probabilistic risk analysis is an effective tool for risk-informed decision-making related to building facilities. All sources of uncertainty should be considered in a seismic risk assessment framework, and not only the levels of these uncertainties but also their effects on the performance of buildings should be clearly identified. This paper aims to assess the impacts of the potential uncertainties on the seismic risk of a steel frame equipped with steel panel walls (SPWF). Firstly, the performance limits of SPWF structures are determined according to cyclic test results for two SPWF specimens. Then a validated numerical model of a 12-story SPWF building is developed and used to perform nonlinear time-history analyses, with the record-to-record uncertainty represented by a set of ground motions derived from the SAC project. Furthermore, fragility curves are compared for the building with and without the combined uncertainties in the structural system, in the definition of performance limits, and in the modeling technology. Finally, the annual probability and the probability in 50 years of reaching each performance limit are calculated and compared. The impacts of such uncertainties on the seismic risk of the SPWF building are quantified for risk-informed evaluation of SPWF buildings.
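A common way such fragility comparisons are set up, though not necessarily the authors' exact procedure, is a lognormal fragility whose total dispersion combines the record-to-record, modeling and performance-limit dispersions by square-root-of-sum-of-squares. A minimal sketch with hypothetical numbers:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(im, median, beta_total):
    """Lognormal fragility: probability of reaching a performance limit
    at intensity measure `im`, given the median capacity and the total
    logarithmic dispersion."""
    return normal_cdf(math.log(im / median) / beta_total)

def combine_betas(*betas):
    """SRSS combination of independent dispersion sources."""
    return math.sqrt(sum(b * b for b in betas))

# hypothetical dispersions: record-to-record, modeling, performance limits
beta = combine_betas(0.35, 0.20, 0.15)
p_base = fragility(0.4, median=0.6, beta_total=0.35)  # record-to-record only
p_full = fragility(0.4, median=0.6, beta_total=beta)  # combined uncertainties
```

Below the median capacity, the combined dispersion flattens the fragility curve and raises the exceedance probability, which is exactly the kind of shift the abstract's with/without comparison quantifies.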

3.
Discussion on uncertainties, attenuation of ground motion and aseismic design criterion. Tian-Zhong ZHANG (张天中), Yun-Sheng MA (马云生) and Xi SHU (舒...

4.
Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis
The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell–McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang–Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

5.
Although artificial neural networks (ANNs) have been applied in rainfall-runoff modelling for many years, many important issues remain unresolved and have prevented this powerful non-linear tool from wide application in operational flood forecasting. This paper describes three ANN configurations; it is found that a dedicated ANN for each lead-time step performs best and a multiple-output form worst, while the most popular form, with multiple inputs and a single output, performs in between. In comparison with a linear transfer function (TF) model, ANN models are uncompetitive in short-range predictions and should not be used in operational flood forecasting owing to their complicated calibration process. For longer-range predictions, ANN models have an improved chance of outperforming the TF model; however, this is highly dependent on the arrangement of the training data, and undesirable uncertainties are involved, as demonstrated by bootstrap analysis in the study. To tackle the uncertainty issue, two novel approaches are proposed: distance analysis and response analysis. Instead of discarding the training data after the model's calibration, the data should be retained as an integral part of the model during its prediction stage, and the uncertainty of each prediction can be judged in real time by measuring distances against the training data. The response analysis is based on an extension of the traditional unit hydrograph concept and has useful potential to reveal the hydrological characteristics of ANN models, hence improving user confidence in applying them in real time. Copyright © 2006 John Wiley & Sons, Ltd.
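The proposed distance analysis (judging each real-time prediction by how close its input lies to the retained training data) can be sketched minimally; the two-dimensional input vectors below are hypothetical, and a real model would use the full calibration input space.

```python
import math

def nearest_distance(x, training_inputs):
    """Distance from a new input vector to its nearest training example;
    a large value flags a prediction made outside the training domain,
    and hence one whose uncertainty should be judged higher."""
    return min(math.dist(x, t) for t in training_inputs)

# hypothetical (rainfall, antecedent flow) training inputs
train = [(10.0, 12.0), (14.0, 15.0), (20.0, 22.0)]
d_in = nearest_distance((13.0, 14.0), train)   # close to the training data
d_out = nearest_distance((40.0, 45.0), train)  # far outside it
```

In operation, a threshold on this distance (or a distance-weighted confidence) can accompany each forecast issued by the ANN.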

6.
A Monte Carlo simulation-based method for calculating earthquake occurrence probability
Xing GUO (郭星) and Hua PAN (潘华). Acta Seismologica Sinica (《地震学报》), 2016, 38(5): 785-793
Addressing the uncertainties in the calculation of the occurrence probability of large earthquakes, this paper discusses these uncertainties and methods for handling them. In view of the complexity of the uncertainty components, a Monte Carlo simulation-based method for calculating the occurrence probability of large earthquakes is proposed, taking the Tazang segment of the East Kunlun fault zone as a computational example and using the Monte Carlo method to handle the various uncertainties in the occurrence-probability calculation. The results show that the incompleteness of the paleo-earthquake data has a large influence on the calculated results. Accounting for this incompleteness with a logic-tree method, the probability of a large earthquake on the Tazang segment in the next 100 years is estimated to be 0.12.
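A minimal sketch of this kind of calculation, assuming hypothetical logic-tree branches and a simple Poisson occurrence model; the paper's actual recurrence model, branch weights and fault parameters are not reproduced here.

```python
import math
import random

def occurrence_probability(branches, horizon=100.0, n=100_000, seed=0):
    """Monte Carlo estimate of the probability of at least one large
    earthquake within `horizon` years.  Each logic-tree branch is
    (weight, mean_ri, sigma): a branch weight and a lognormal description
    of the mean recurrence interval under that branch's assumption about
    paleo-earthquake record completeness."""
    rng = random.Random(seed)
    weights = [w for w, _, _ in branches]
    total = 0.0
    for _ in range(n):
        # pick a logic-tree branch according to its weight
        _, mean_ri, sigma = rng.choices(branches, weights=weights)[0]
        ri = rng.lognormvariate(math.log(mean_ri), sigma)  # sampled interval
        total += 1.0 - math.exp(-horizon / ri)             # Poisson chance
    return total / n

# two hypothetical branches: complete vs. incomplete paleo-seismic record
p = occurrence_probability([(0.6, 800.0, 0.3), (0.4, 1500.0, 0.5)])
```

The logic-tree weights express how much credence each completeness assumption receives; the spread of per-sample probabilities (not shown) quantifies the resulting uncertainty in the final estimate.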

7.
The specific objective of this paper is to propose a new flood frequency analysis method that considers both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles or design floods coupling these two kinds of uncertainty is derived, so that not only a point estimator but also a confidence interval of the quantiles can be provided. Markov chain Monte Carlo is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that an approach considering only model uncertainty or only parameter uncertainty cannot fully account for uncertainty in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian-based method provides not only various quantile estimators but also a quantitative assessment of the uncertainties of flood frequency analysis.
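The coupling of the two uncertainty sources can be illustrated with a toy sketch: given posterior parameter draws for each candidate distribution (in practice these would come from MCMC) and posterior model weights, draws of the quantile mix both sources. All distributions, draws and weights below are hypothetical.

```python
import math
import random

def gumbel_quantile(mu, beta, p):
    """Gumbel (EV1) quantile: x_p = mu - beta * ln(-ln(p))."""
    return mu - beta * math.log(-math.log(p))

def quantile_sample(model_posteriors, model_weights, p, n=20_000, seed=0):
    """Sample the design-flood quantile x_p, mixing model uncertainty
    (which distribution) and parameter uncertainty (posterior draws of
    that distribution's parameters).  `model_posteriors[k]` is a list of
    (quantile_fn, params) posterior draws for candidate model k."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        k = rng.choices(range(len(model_weights)), weights=model_weights)[0]
        qfn, params = rng.choice(model_posteriors[k])
        draws.append(qfn(*params, p))
    draws.sort()
    return draws

# hypothetical posterior draws: two candidate models (here both Gumbel with
# different fitted parameters, standing in for two distribution families)
post_a = [(gumbel_quantile, (100.0 + d, 30.0)) for d in (-5.0, 0.0, 5.0)]
post_b = [(gumbel_quantile, (110.0 + d, 40.0)) for d in (-8.0, 0.0, 8.0)]
draws = quantile_sample([post_a, post_b], [0.7, 0.3], p=0.99)
point = draws[len(draws) // 2]                             # posterior median
ci = (draws[len(draws) // 40], draws[-len(draws) // 40])   # ~95% interval
```

The sorted draws approximate the coupled sampling distribution of the 1-in-100-year flood, from which both the point estimator and the confidence interval follow.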

8.
The objective of this article is to study as extensively as possible the uncertainties affecting the annual energy produced by a windmill. In the literature, the general approach is to estimate the mean annual energy from a transformation of a Weibull distribution law, so the issue is reduced to estimating the coefficients of this distribution by classical statistical methods. The uncertainties considered are therefore mostly limited to those resulting from the statistical procedures. In fact, however, the real uncertainty of the random variable representing the annual energy cannot be reduced to the uncertainty on its mean and to the uncertainties induced by the estimation procedure. We propose here a model which takes advantage of the fact that the annual energy production is the sum of many random variables representing the 10-min energy production during the year. Under some assumptions, we make use of the central limit theorem and show that an intrinsic uncertainty of wind power, usually not considered, carries an important risk. We also explain an observation from practice that the forecasted annual production is always overestimated, which creates a risk of reducing the profitability of the operation.
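The sum-of-10-minute-productions view can be sketched by simulation: the annual energy is a sum of many short-interval productions, so by the central limit theorem it is approximately normal with a nonzero spread that a mean-only Weibull fit ignores. The power curve and all parameters below are hypothetical.

```python
import math
import random

def power(v, rated=2000.0, cut_in=3.0, rated_v=12.0, cut_out=25.0):
    """Hypothetical turbine power curve (kW): cubic between cut-in and
    rated speed, constant at rated power, zero outside the operating range."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return rated
    return rated * ((v - cut_in) / (rated_v - cut_in)) ** 3

def annual_energy(rng, steps=52_560, k=2.0, lam=8.0):
    """One realization of the energy over `steps` 10-minute intervals (kWh),
    with Weibull-distributed wind speeds (scale lam, shape k)."""
    return sum(power(rng.weibullvariate(lam, k)) / 6.0 for _ in range(steps))

rng = random.Random(0)
# 200 realizations of a shortened (1/10-year) production period
samples = [annual_energy(rng, steps=5_256) for _ in range(200)]
mean = sum(samples) / len(samples)
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
```

The ratio sd/mean is the intrinsic relative uncertainty of the production total; it is small but not zero, and it is exactly the spread that disappears when only the mean annual energy is reported.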

9.
Abstract

The investigation is concerned with the impact of initial uncertainties on predictions. The problem can be solved exactly for sufficiently simple non-linear systems where an exact solution to the deterministic problem is known. In this paper we shall use the advective equation as an example.

It is found that the large-time behavior of the system depends on the initial uncertainty and on the nature of the probability density function.

In applications it is normally necessary to introduce a closure approximation because exact analytical solutions are unknown. Such a closure scheme, based on the neglect of third and higher moments, will be used in the example, and solutions from the closure scheme will be compared with the exact solutions.

It is found that the asymptotic values of the uncertainty may be less than the initial uncertainty.

10.
In this study, a multistage scenario-based interval-stochastic programming (MSISP) method is developed for water-resources allocation under uncertainty. MSISP improves upon existing multistage optimization methods with advantages in uncertainty reflection, dynamics facilitation, and risk analysis. It can directly handle uncertainties presented as both interval numbers and probability distributions, and can support assessment of the reliability of satisfying (or the risk of violating) system constraints within a multistage context. It can also reflect the dynamics of system uncertainties and decision processes under a representative set of scenarios. The developed MSISP method is then applied to a case of water resources management planning within a multi-reservoir system associated with joint probabilities. A range of violation levels for capacity and environmental constraints is analyzed under uncertainty, and solutions associated with different risk levels of constraint violation are obtained. These can be used to generate decision alternatives and thus help water managers identify desired policies under various economic, environmental and system-reliability conditions. In addition, sensitivity analyses demonstrate that violation of the environmental constraint has a significant effect on the system benefit.

11.
Flood risk assessment is customarily performed using a design flood. Observed past flows are used to derive a flood frequency curve, which forms the basis for construction of a design flood. Simulation of a distributed model with the 1-in-T year design flood as input gives information on the possible inundation areas, which are used to derive flood risk maps. The procedure is usually performed in a deterministic fashion, and its extension to take into account the design-flood and flow-routing model uncertainties is computationally expensive. In this study we propose a different approach to flood risk assessment, which consists of direct simulation of a distributed flow routing model for an observed series of annual maximum flows and derivation of maps of inundation probability for the desired return period directly from the simulated water levels at the model cross-sections, through an application of Flood Level Frequency Analysis. The hydraulic model and water level quantile uncertainties are jointly taken into account in the flood risk uncertainty evaluation using the Generalized Likelihood Uncertainty Estimation (GLUE) approach. An additional advantage of the proposed approach lies in the smaller uncertainty of inundation predictions for long return periods compared with the standard approach. The approach is illustrated using a design flood level and a steady-state solution of a hydraulic model to derive maps of inundation probabilities. Copyright © 2016 John Wiley & Sons, Ltd.

12.
Binomial model on seismic risk analysis. Jian WANG (王健) and Zhen-Liang SHI (时振梁) (Institute of Geophysics, State Seismological Bureau, Beijing 100...

13.
Characterizing, understanding and better estimating uncertainties are key concerns for drawing robust conclusions when analyzing changing socio-hydrological systems. Here we suggest developing a perceptual model of uncertainty that is complementary to the perceptual model of the socio-hydrological system, and we provide an example application to flood risk change analysis. Such a perceptual model aims to make all relevant uncertainty sources, and different perceptions thereof, explicit in a structured way. It is a first step toward assessing uncertainty in system outcomes that can help to prioritize research efforts and to structure dialogue and communication about uncertainty in interdisciplinary work.

14.
In this paper we extend the generalized likelihood uncertainty estimation (GLUE) technique to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. Untransformed binary pattern data already have been used within GLUE to estimate domain‐averaged (zero‐dimensional) likelihoods, yet the pattern information embedded within such sources has not been used to estimate distributed uncertainty. Where pattern information has been used to map distributed uncertainty it has been transformed into a continuous function prior to use, which may introduce additional errors. To solve this problem we use here ‘raw’ binary pattern data to define a zero‐dimensional global performance measure for each simulation in a Monte Carlo ensemble. Thereafter, for each pixel of the distributed model we evaluate the probability that this pixel was inundated. This probability is then weighted by the measure of global model performance, thus taking into account how well a given parameter set performs overall. The result is a distributed uncertainty measure mapped over real space. The advantage of the approach is that it both captures distributed uncertainty and contains information on global likelihood that can be used to condition predictions of further events for which observed data are not available. The technique is applied to the problem of flood inundation prediction at two test sites representing different hydrodynamic conditions. In both cases, the method reveals the spatial structure in simulation uncertainty and simultaneously enables mapping of flood probability predicted by the model. Spatially distributed uncertainty analysis is shown to contain information over and above that available from global performance measures. Overall, the paper highlights the different types of information that may be obtained from mappings of model uncertainty over real and n‐dimensional parameter spaces. 
Copyright © 2002 John Wiley & Sons, Ltd.
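The core weighting step of the approach can be sketched as follows, using an F-type global performance measure computed directly on raw binary patterns; the specific measure and the tiny example maps are illustrative assumptions, not the paper's exact likelihood function.

```python
def f_measure(obs, sim):
    """Global performance of a simulated inundation map against an observed
    binary pattern: wet-cell agreement divided by the union of wet cells."""
    both = sum(1 for o, s in zip(obs, sim) if o and s)
    union = sum(1 for o, s in zip(obs, sim) if o or s)
    return both / union if union else 0.0

def glue_inundation(obs, ensemble):
    """GLUE-style distributed uncertainty: weight each Monte Carlo
    simulation by its global likelihood, then map, pixel by pixel, the
    weighted probability that the pixel is inundated."""
    weights = [f_measure(obs, sim) for sim in ensemble]
    total = sum(weights)
    return [sum(w * sim[i] for w, sim in zip(weights, ensemble)) / total
            for i in range(len(obs))]

# hypothetical 5-pixel observed map and a 3-member Monte Carlo ensemble
obs = [1, 1, 0, 0, 1]
ensemble = [[1, 1, 0, 0, 0],
            [1, 0, 0, 1, 1],
            [1, 1, 1, 0, 1]]
probs = glue_inundation(obs, ensemble)
```

Because the per-pixel probabilities are weighted by the global measure, the retained weights can also be reused to condition predictions of further events for which no observations exist.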

15.
ABSTRACT

This paper assesses how various sources of uncertainty propagate through the uncertainty cascade from emission scenarios through climate models and hydrological models to impacts, with a particular focus on groundwater aspects from a number of coordinated studies in Denmark. Our results are similar to those from surface water studies showing that climate model uncertainty dominates the results for projections of climate change impacts on streamflow and groundwater heads. However, we found uncertainties related to geological conceptualization and hydrological model discretization to be dominant for projections of well field capture zones, while the climate model uncertainty here is of minor importance. How to reduce the uncertainties on climate change impact projections related to groundwater is discussed, with an emphasis on the potential for reducing climate model biases through the use of fully coupled climate–hydrology models.
Editor D. Koutsoyiannis; Associate editor not assigned

16.
Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster–Shafer (D–S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D–S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D–S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D–S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster–Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. 
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D–S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D–S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change.
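Dempster's rule of combination, together with the belief and plausibility measures the framework reports, can be sketched directly over frozenset focal elements; the drought-class focal elements and masses below are hypothetical, not the study's fitted bpa's.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two basic probability assignments (bpa's)
    keyed by frozenset focal elements; mass assigned to conflicting
    (empty-intersection) pairs is renormalized away."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def belief(m, hypothesis):
    """Belief = total mass of focal elements contained in the hypothesis."""
    return sum(v for a, v in m.items() if a <= hypothesis)

def plausibility(m, hypothesis):
    """Plausibility = total mass of focal elements intersecting it."""
    return sum(v for a, v in m.items() if a & hypothesis)

# hypothetical bpa's from two GCM/scenario evidence sources
DRY, NORM, WET = "dry", "normal", "wet"
m_gcm1 = {frozenset({DRY}): 0.5, frozenset({DRY, NORM}): 0.3,
          frozenset({DRY, NORM, WET}): 0.2}
m_gcm2 = {frozenset({DRY}): 0.4, frozenset({NORM, WET}): 0.4,
          frozenset({DRY, NORM, WET}): 0.2}
m = dempster_combine(m_gcm1, m_gcm2)
```

The gap between belief and plausibility for a hypothesis such as "drought" is the explicit representation of ignorance that a single probability cannot carry, which is the advantage the abstract highlights over the purely probabilistic treatment.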

17.
CO2 capture and storage is recognized as a promising solution, among others, for tackling greenhouse gas emissions. This technology requires robust risk assessment and management from the early stages of a project (i.e. during the site selection phase, prior to injection), which is a challenging task owing to the high level of aleatory and epistemic uncertainty. This paper implements and compares two frameworks for dealing with uncertainties: a classical probabilistic framework and a probabilistic-fuzzy one (jointly combining fuzzy sets and probabilities). The comparison is illustrated by assessing the risk of brine leakage through an abandoned well at a realistic site in the Paris basin (France). For computing the brine leakage flow, a semi-analytical model requiring 25 input parameters is used. Depending on the framework, the available data are represented differently (using either classical probability laws or interval-valued tools). Though the fuzzy-probabilistic framework for uncertainty propagation is computationally more expensive, it has the major advantage of highlighting situations with a high degree of epistemic uncertainty: this makes it possible to nuance an overly optimistic decision supported only by a single probabilistic curve (i.e. by the Monte Carlo results). On this basis, we demonstrate how fuzzy-based sensitivity analysis can help identify how to reduce the imprecision effectively, which has useful applications for further studies. This study highlights the importance of the choice of mathematical tools for representing lack of knowledge, especially in the early phases of a project, when data are scarce, incomplete and imprecise.

18.
Seismic risk assessment requires adoption of appropriate models for the earthquake hazard, the structural system and its performance, and quantification of the uncertainties involved in these models through appropriate probability distributions. Characterization of the seismic hazard undoubtedly comprises the most critical component of this process, the one associated with the largest amount of uncertainty. For applications involving dynamic analysis, this hazard is frequently characterized through stochastic ground motion models. This paper discusses a novel global sensitivity analysis for seismic risk, with emphasis on such stochastic ground motion modeling. The analysis aims to identify the overall (i.e. global) importance of each of the uncertain model parameters, or of groups of them, towards the total risk. The methodology is based on the definition of an auxiliary density (distribution) function, proportional to the integrand of the integral quantifying seismic risk, and on comparison of this density to the initial probability distribution for the model parameters of interest. Uncertainty in the remaining model parameters is explicitly addressed through integration of their joint auxiliary distribution to calculate the corresponding marginal distributions. The relative information entropy is used to quantify the difference between the compared density functions, and an efficient approach based on stochastic sampling is introduced for estimating this entropy for all quantities of interest. The framework is illustrated in an example that adopts a source-based stochastic ground motion model, and valuable insight is provided for its implementation within structural engineering applications.
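The entropy-based comparison can be sketched with a simple histogram estimator: weight prior samples of one parameter by the risk integrand to approximate its auxiliary (marginal) density, then compute the relative entropy against the prior. The Gaussian prior and the exponential risk integrand below are hypothetical stand-ins, not the paper's ground motion model.

```python
import math
import random

def relative_entropy_importance(samples, weights, bins=10):
    """Histogram estimate of D(pi_aux || p) for one model parameter:
    compare the risk-weighted (auxiliary) histogram of the samples with
    the unweighted prior histogram.  A larger divergence means the
    parameter matters more for the total risk."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    idx = [min(int((s - lo) / width), bins - 1) for s in samples]
    total_w = sum(weights)
    n = len(samples)
    d = 0.0
    for b in range(bins):
        p = sum(1 for i in idx if i == b) / n                         # prior
        q = sum(w for i, w in zip(idx, weights) if i == b) / total_w  # auxiliary
        if q > 0 and p > 0:
            d += q * math.log(q / p)
    return d

rng = random.Random(0)
xs = [rng.gauss(0.0, 1.0) for _ in range(5_000)]      # prior samples
ws_sensitive = [math.exp(2.0 * x) for x in xs]        # hypothetical risk integrand
ws_flat = [1.0 for _ in xs]                           # risk insensitive to x
```

A parameter whose risk-weighted histogram coincides with its prior (flat weights) scores zero and can safely be fixed; a parameter that strongly tilts the auxiliary density scores high and deserves refined modeling.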

19.
The sensitivity and overall uncertainty in peak ground acceleration (PGA) estimates have been calculated for the city of Tabriz, northwestern Iran, by using a specific randomized-blocks design. Eight seismic hazard models and parameters with randomly selected uncertainties at two levels have been considered, and a linear model has been fitted between the predicted PGA at a given probability level and the uncertainties. The input models and parameters are those related to the attenuation, magnitude rupture-length and recurrence relationships, with their uncertainties. Application of this procedure to the studied area indicates that the effects of the simultaneous variation of all eight input models and parameters on the sensitivity of the seismic hazard can be investigated with a decreasing number of computations over all possible combinations at a fixed annual probability. The results show that the choice of a mathematical model of the source mechanism, the attenuation relationships and the definition of seismic parameters are most critical in estimating the sensitivity of seismic hazard evaluation, in particular at low levels of probability of exceedance. The overall uncertainty in the expected PGA for an annual probability of 0.0021 (10% exceedance in 50 yr) is expressed by a coefficient of variation (CV) of about 34% at the 68% confidence level for a distance of about 5 km from the field of the major faults. The CV decreases with increasing site-source distance and remains constant, CV = 15%, for distances larger than 15 km. Finally, the effect of treating alternative models on the overall uncertainty is investigated through additional outliers in the input decisions.

20.
A mature mathematical technique, the copula joint function, commonly used in financial risk analysis to estimate uncertainty, is introduced in this paper. The joint function is generalized to the n-dimensional Frank copula. In addition, we adopt two attenuation models, proposed by YU and by Boore et al respectively, and construct a two-dimensional copula joint probability function as an example to illustrate the uncertainty treatment at low probability. The results show that copula joint func...
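The bivariate Frank copula used as the example can be written down directly; the joint-exceedance helper is an illustrative addition showing how a joint low-probability level is obtained from the two marginal probabilities, and the parameter value below is arbitrary.

```python
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v; theta) for theta != 0:
    C = -(1/theta) * ln(1 + (e^{-theta*u} - 1)(e^{-theta*v} - 1)
                             / (e^{-theta} - 1))."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) from the copula by inclusion-exclusion:
    1 - u - v + C(u, v)."""
    return 1.0 - u - v + frank_copula(u, v, theta)
```

With the marginal non-exceedance probabilities supplied by each attenuation model, the copula couples them into a joint probability whose dependence strength is controlled entirely by theta.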
