Similar Literature
20 similar documents found (search time: 31 ms)
1.
A method is presented for incorporating the uncertainties associated with hypocentral locations into the formulation of probabilistic models of the time and space distributions of the activity of potential seismic sources, as well as of the resulting seismic hazard functions at nearby sites. For this purpose, a Bayesian framework of analysis is adopted, in which the probabilistic models considered are assumed to have known forms and uncertain parameters; the distribution of the latter results from an a priori assessment and its updating through the incorporation of direct statistical information, including the uncertainty associated with the relations between the actual hypocentral locations and the reported data. This uncertainty is incorporated in the evaluation of the likelihood function of the parameters to be estimated for a given sample of recorded locations. For illustration, the proposed method is applied to the modelling of the seismic sources near a site close to the southern coast of Mexico. The results of two alternative algorithms for incorporating location uncertainties are compared with those obtained by neglecting them. One algorithm makes use of Monte Carlo simulation, while the other is based on a closed-form analytical integration after introducing some simplifying assumptions. For the particular case studied, accounting for location uncertainties gives rise to significant changes in the probabilistic models of the seismic sources. Deviations of the same order of magnitude can be ascribed to differences in the mathematical and/or numerical tools used in the uncertainty analysis. The resulting variability of the seismic hazard at the site of interest is less pronounced than that affecting the estimates of activity of individual seismic sources.
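The Monte Carlo option mentioned in the abstract can be illustrated with a toy sketch: treat each reported epicentre as a noisy observation, estimate by simulation the probability that the true location falls inside a source zone, and use those probabilities as fractional event counts in a Poisson likelihood for the source's activity rate. Everything here (the rectangular zone, the isotropic Gaussian location error, and the names `p_inside` and `log_likelihood_rate`) is a hypothetical illustration, not the paper's actual formulation.

```python
import math
import random

def p_inside(reported_xy, sigma, zone, n_draws=2000, rng=None):
    """Monte Carlo probability that the true epicentre lies inside a
    rectangular source zone, given a reported location with isotropic
    Gaussian error (toy geometry, assumed for illustration)."""
    rng = rng or random.Random(0)
    (x0, x1), (y0, y1) = zone
    rx, ry = reported_xy
    hits = 0
    for _ in range(n_draws):
        x = rng.gauss(rx, sigma)
        y = rng.gauss(ry, sigma)
        if x0 <= x <= x1 and y0 <= y <= y1:
            hits += 1
    return hits / n_draws

def log_likelihood_rate(lam, t_years, reported, sigma, zone):
    """Poisson log-likelihood (up to an additive constant) of rate `lam`
    for events in the zone, with each event weighted by its probability
    of actually belonging to the zone."""
    n_eff = sum(p_inside(xy, sigma, zone) for xy in reported)
    return n_eff * math.log(lam * t_years) - lam * t_years
```

An event reported near the zone boundary then contributes a fractional count, which is one simple way location uncertainty reshapes the likelihood.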

2.
Incorporation of uncertainties within an urban water supply management system has been a challenging topic for many years. In this study, an acceptability-index-based two-step interval programming (AITIP) model was developed for supporting urban water supply analysis under uncertainty. AITIP improves upon traditional two-step interval programming (TIP) by incorporating the acceptability level of constraint violation into the optimization framework. A four-layer urban water supply system, including water sources, treatment facilities, reservoirs, and consuming zones, was used to demonstrate the applicability of the proposed method. The results indicated that the AITIP model was valuable for understanding the effects of uncertainties related to cost, constraints and decision makers' judgment in the water supply network, and capable of assisting urban water managers in gaining in-depth insight into the tradeoffs between system cost and constraint-violation risk. Compared with TIP, the solutions from AITIP exhibited a lower degree of uncertainty, making it more reliable for identifying effective water supply patterns by adjusting decision variable values within their solution intervals. The study is useful in helping urban water managers identify cost-effective management schemes in light of uncertainties in hydrology, environment, and decisions. The proposed optimization approach is expected to be applicable to a wide variety of water resources management problems.

3.
Different performance levels may be obtained for sidesway collapse evaluation of steel moment frames depending on the procedure used to handle uncertainties. In this article, the process of representing modelling uncertainties, record-to-record (RTR) variations and cognitive uncertainties for moment-resisting steel frames of various heights is discussed in detail. RTR uncertainty is captured through incremental dynamic analysis (IDA), modelling uncertainties are considered through component backbone curves and hysteresis loops, and cognitive uncertainty is represented at three levels of material quality. IDA is used to evaluate RTR uncertainty based on strong ground motion records selected by the k-means algorithm, which is favoured over Monte Carlo selection for its time savings. Analytical response-surface equations are fitted to the IDA results using the Cuckoo algorithm, predicting the mean and standard deviation of the collapse fragility curve. The Takagi–Sugeno–Kang fuzzy model is used to represent material quality based on the response-surface coefficients. Finally, collapse fragility curves incorporating the various sources of uncertainty mentioned are derived from a large number of material quality values and meta-variables inferred by the Takagi–Sugeno–Kang fuzzy model. It is concluded that a better risk-management strategy in countries where material quality control is weak is to account for cognitive uncertainties in the fragility curves and the mean annual frequency of collapse.
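The k-means record-selection step can be sketched in a few lines: cluster records by normalised intensity-measure features (e.g. peak ground acceleration and duration) and retain the member nearest each centroid. This is a generic sketch with deterministic farthest-point initialisation and hypothetical function names, not the authors' implementation.

```python
def _farthest_first(points, k):
    """Deterministic farthest-point initialisation of centroids."""
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points, key=lambda p: min(
            sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids)))
    return centroids

def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means over tuples of (assumed pre-normalised)
    ground-motion features; returns (centroids, labels)."""
    centroids = _farthest_first(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda j: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[j])))
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centroids[j] = tuple(sum(c) / len(members) for c in zip(*members))
    return centroids, labels

def representative_records(points, k):
    """One record per cluster: the member closest to its centroid."""
    centroids, labels = kmeans(points, k)
    reps = []
    for j in range(k):
        members = [p for p, l in zip(points, labels) if l == j]
        if members:
            reps.append(min(members, key=lambda p: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[j]))))
    return reps
```

Selecting one representative per cluster is what buys the time savings over Monte Carlo selection: the IDA suite shrinks from the full record set to k records.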

4.
Probabilistic risk analysis is an effective tool for risk-informed decision-making related to building facilities. All sources of uncertainty should be considered in a seismic risk assessment framework, and not only their levels but also their effects on building performance should be clearly identified. This paper assesses the impacts of potential uncertainties on the seismic risk of steel frames equipped with steel panel walls (SPWF). First, the performance limits of SPWF structures are determined from cyclic test results of two SPWF specimens. A validated numerical model of a 12-story SPWF building is then used to perform nonlinear time-history analyses, with record-to-record uncertainty represented by a set of ground motions derived from the SAC project. Furthermore, fragility curves for the building are compared with and without the combined uncertainties in the structural system, the definition of performance limits and the modelling technique. Finally, the annual probability and the probability in 50 years of exceeding each performance limit are calculated and compared. The impacts of these uncertainties on the seismic risk of SPWF buildings are thus quantified for risk-informed evaluation.

5.
Representation and quantification of uncertainty in climate change impact studies are difficult tasks. Several sources of uncertainty arise in studies of the hydrologic impacts of climate change, such as the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated that uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster–Shafer (D–S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D–S framework over the traditional probabilistic approach is that it allows the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) and epistemic (subjective) uncertainty. This paper shows how D–S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give quantitative measures of belief and plausibility in the results. The D–S approach is used here for information synthesis with various evidence combination rules embodying different approaches to modeling conflict. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster–Shafer structure, which represents the uncertainty associated with each SSFI-4 classification. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by D–S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of the SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification from the ensemble of GCMs and scenarios. Results from the D–S and Bayesian approaches are compared, and the relative merits of each are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change.
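Dempster's rule of combination, the core operation applied across GCMs and scenarios above, is compact enough to sketch directly. Focal elements are frozensets of classes; the drought-class labels and masses used in the test are made-up values for illustration, not the study's SSFI-4 assignments.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments (dicts mapping frozenset focal elements to mass)."""
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule undefined")
    k = 1.0 - conflict                       # normalisation constant
    return {a: w / k for a, w in combined.items()}

def belief(m, hypothesis):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(w for a, w in m.items() if a <= hypothesis)

def plausibility(m, hypothesis):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(w for a, w in m.items() if a & hypothesis)
```

The belief/plausibility pair is exactly the "quantitative measure of belief and plausibility" the abstract refers to: Bel bounds the evidence from below, Pl from above.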

6.
In urban drainage modelling, uncertainty analysis is undoubtedly necessary. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out, so several methodological aspects still need to be explored and clarified, especially regarding water-quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are absent from less formal methods such as Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in applying the Bayesian method is the formulation of a likelihood function, which is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box–Cox transformation, are generally used to ensure the homoscedasticity of residuals; however, this practice may affect the reliability of the analysis and lead to wrong uncertainty estimates. The present paper explores the influence of the Box–Cox transformation for environmental water-quality models. To this end, five cases were considered, one of which used the "real" residual distributions (i.e. those drawn from the available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The less formal methods always provide an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residual distribution. If the residuals are not normally distributed, the uncertainty is overestimated when the Box–Cox transformation is not applied or a non-calibrated parameter is used.
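For reference, the Box–Cox transformation discussed above is a one-liner. The sketch below shows the forward and inverse forms (λ = 0 reduces to the logarithm); this is the mechanism used to make residual variance more nearly homoscedastic before a Gaussian likelihood is assumed.

```python
import math

def box_cox(y, lam):
    """Box–Cox transform; requires y > 0.
    lam = 1 is (up to a shift) the identity, lam = 0 the logarithm."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

def inv_box_cox(z, lam):
    """Inverse transform, mapping model-space values back to data space."""
    if lam == 0:
        return math.exp(z)
    return (lam * z + 1.0) ** (1.0 / lam)
```

In practice λ is either fixed a priori or estimated jointly with the model parameters; the abstract's point is that a poorly chosen λ (or skipping the transform when residuals are non-normal) distorts the resulting uncertainty bands.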

7.
8.
This work examines future flood risk within the context of integrated climate and hydrologic modelling uncertainty. The research questions investigated are (1) whether hydrologic uncertainties are a significant source of uncertainty relative to other sources, such as climate variability and change, and (2) whether a statistical characterization of uncertainty from a lumped, conceptual hydrologic model is sufficient to account for hydrologic uncertainties in the modelling process. To investigate these questions, an ensemble of climate simulations is propagated through hydrologic models and then through a reservoir simulation model to delimit the range of flood protection under a wide array of climate conditions. Uncertainty in mean climate changes and internal climate variability is framed using a risk-based methodology and explored using a stochastic weather generator. To account for hydrologic uncertainty, two hydrologic models are considered: a conceptual, lumped-parameter model and a distributed, physically based model. In the conceptual model, parameter and residual error uncertainties are quantified and propagated through the analysis using a Bayesian modelling framework. The approach is demonstrated in a case study for the Coralville Dam on the Iowa River, where recent, intense flooding has raised questions about the potential impacts of climate change on flood protection adequacy. Results indicate that the uncertainty surrounding future flood risk from hydrologic modelling and internal climate variability can be of the same order of magnitude as that from climate change. Furthermore, statistical uncertainty in the conceptual hydrological model can capture the primary structural differences that emerge in flood damage estimates between the two hydrologic models. Copyright © 2014 John Wiley & Sons, Ltd.

9.
Simulations from hydrological models are affected by potentially large uncertainties stemming from various sources, including model parameters and observational uncertainty in the input/output data. Understanding the relative importance of such sources of uncertainty is essential to support model calibration, validation and diagnostic evaluation, and to prioritize efforts for uncertainty reduction. It can also support the identification of 'disinformative data' whose values are the consequence of measurement errors or inadequate observations. Sensitivity analysis (SA) provides the theoretical framework and the numerical tools to quantify the relative contribution of different sources of uncertainty to the variability of the model outputs. In traditional applications of global SA (GSA), model outputs are aggregations over the full set of values of a simulated variable. For example, many GSA applications use as model output a performance metric (e.g. the root mean squared error) that aggregates the distances of a simulated time series to the available observations. This aggregation of propagated uncertainties prior to GSA may lead to a significant loss of information and may cover up local behaviour that could be of great interest. Time-varying sensitivity analysis (TVSA), in which the aggregation and SA are repeated at different time steps, is a viable option to reduce this loss of information. In this work, we use TVSA to address two questions: (1) Can we distinguish between the relative importance of parameter uncertainty versus data uncertainty in time? (2) Do these influences change in catchments with different characteristics? To our knowledge, the results present one of the first quantitative investigations of the relative importance of parameter and data uncertainty across time. We find that the approach is capable of separating influential periods across data and parameter uncertainties, while also highlighting significant differences between the catchments analysed.
Copyright © 2016 The Authors. Hydrological Processes. Published by John Wiley & Sons Ltd.

10.
As the profession moves toward performance-based earthquake engineering design, it becomes more important and pressing to examine the uncertainty of the limit-state models used for liquefaction potential evaluation. In this paper, the model uncertainty of the Robertson and Wride model, a simplified model for liquefaction resistance and potential evaluation based on the cone penetration test, is investigated in detail within the framework of first-order reliability analysis. The uncertainties of the parameters used in the Robertson and Wride model are also examined. The model uncertainty is estimated by calibration against a fairly large set of case histories. The results show that the uncertainty of the Robertson and Wride model may be characterized by a mean-to-nominal ratio of 0.94 and a coefficient of variation of 0.15, based on the case histories examined.

11.
ABSTRACT

Prediction of design hydrographs is key in floodplain mapping using hydraulic models, which are either steady-state or unsteady. The former, which require only an input peak, substantially overestimate the volume of water entering the floodplain compared to the more realistic dynamic case simulated by unsteady models, which require the full hydrograph. Past efforts to account for the uncertainty of boundary conditions using unsteady hydraulic modeling have been based largely on joint flood frequency–shape analysis, with only a very limited number of studies using hydrological modeling to produce the design hydrographs. This study therefore presents a generic probabilistic framework that couples a hydrological model with an unsteady hydraulic model to estimate the uncertainty of flood characteristics. The framework is demonstrated on the Swannanoa River watershed in North Carolina, USA. Given its flexibility, the framework can be applied to study other sources of uncertainty in other hydrological models and watersheds.

12.
A framework for the validation of computational models used to predict seismic response based on observations from seismometer arrays is presented. The framework explicitly accounts for the epistemic uncertainty related to the unknown characteristics of the 'site' (i.e. the problem under consideration) and constitutive model parameters. A mathematical framework which makes use of multiple prediction–observation pairs is used to improve the statistical significance of inferences regarding the accuracy and precision of the computational methodology and constitutive model. The benefits of such a formal validation framework include: (i) development of consistent methods for determination of constitutive model parameters; (ii) rigorous, objective, and unbiased assessment of the validity of various constitutive models and computational methodologies for various problem types and ground motion intensities; and (iii) an improved understanding of the uncertainties in computational model assumptions, constitutive models and their parameters, relative to other seismic response uncertainties such as ground motion variability. Details regarding the implementation of such a framework to achieve the aforementioned benefits are also addressed.

13.
Errors and uncertainties in hydrological, hydraulic and environmental models are often substantial. In good modelling practice, they are quantified in order to supply decision-makers with important additional information on model limitations and sources of uncertainty. Several uncertainty analysis methods exist, often with differing underlying assumptions. One of these methods is based on variance decomposition: it allows the variance of the total error in the model results (as estimated by comparing model results with observations) to be split into its major contributing uncertainty sources. This paper discusses an advanced version of that method in which error distributions for rainfall, other inputs and parameters are propagated through the model, and the remaining uncertainty is attributed to model structural errors in different parts of the model. The i.i.d. assumption often made in model error analysis is addressed upfront using expert knowledge, and the method also addresses heteroscedasticity and serial dependence of the errors involved. The method has been applied by the author to modelling applications of sewer water quantity and quality, river water quality and river flooding.
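The variance-decomposition idea can be sketched in miniature: propagate one error source at a time through the model to estimate its output variance, then attribute the remainder of the total error variance to model structure. The toy linear model, error magnitudes, and function names below are illustrative assumptions, and the additivity relies on an independence assumption of exactly the kind the paper scrutinises (and refines for heteroscedastic, serially dependent errors).

```python
import random
from statistics import pvariance

def propagated_variance(model, base_args, name, sampler, n=4000, seed=0):
    """Monte Carlo variance of the model output when only the error
    source `name` is perturbed (all other inputs held at base values)."""
    rng = random.Random(seed)
    outs = []
    for _ in range(n):
        args = dict(base_args)
        args[name] = sampler(rng)        # draw this source from its error distribution
        outs.append(model(**args))
    return pvariance(outs)

def variance_shares(v_total, v_components):
    """Attribute total error variance to named sources; the remainder,
    under an independence assumption, is taken as model structural error."""
    v_struct = max(0.0, v_total - sum(v_components.values()))
    contribs = dict(v_components, structural=v_struct)
    return {k: v / v_total for k, v in contribs.items()}
```

For a toy runoff model q = a·r with rainfall error N(10, 1) and a = 2, the propagated rainfall variance is close to a² = 4, and whatever the total error variance is not explained by the propagated sources is booked as structural.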

14.
We study the appraisal problem for the joint inversion of seismic and controlled-source electromagnetic (CSEM) data and utilize rock-physics models to integrate these two disparate data sets. The appraisal problem is solved by adopting a Bayesian model that incorporates four representative sources of uncertainty: uncertainties in (1) seismic wave velocity, (2) electric conductivity, (3) seismic data and (4) CSEM data. The uncertainties in porosity and water saturation are quantified by posterior random sampling in the model space of porosity and water saturation in a marine one-dimensional structure. We study the relative contributions of the four individual sources of uncertainty through several statistical experiments. The uncertainties in seismic wave velocity and electric conductivity play a more significant role in the variation of posterior uncertainty than does the noise in the seismic and CSEM data. The numerical simulations also show that the uncertainty in porosity is most affected by the uncertainty in seismic wave velocity, and that the uncertainty in water saturation is most influenced by the uncertainty in electric conductivity. The uncertainty analysis framework presented in this study can be utilized to effectively reduce the uncertainty of the porosity and water saturation derived from the integration of seismic and CSEM data.

15.
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model, extended by the inclusion of a Box–Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and are compared using Bayesian model selection techniques that evaluate model probabilities. The case study, using multi-site annual rainfall data from catchments that contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box–Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated in a case study involving drought security analysis for urban water supply, which showed that ignoring parameter uncertainty results in a significant overestimate of reservoir yield and an underestimate of system vulnerability to severe drought.
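A minimal single-site version of the AR(1)-with-Box–Cox component might look like the sketch below: simulate in transformed space, then back-transform to flow/rainfall space. The λ = 0.5 value and parameter settings are assumptions for illustration; the HMM variant, the spatial correlation function and the MCMC calibration are all omitted.

```python
import math
import random

def simulate_ar1_boxcox(n, phi, mu, sigma, lam, seed=0):
    """Simulate an annual series from an AR(1) model fitted in
    Box–Cox-transformed space, then back-transform each value
    (lam = 0 corresponds to a log-space model)."""
    rng = random.Random(seed)
    z = mu                                   # start at the transformed-space mean
    out = []
    for _ in range(n):
        z = mu + phi * (z - mu) + rng.gauss(0.0, sigma)
        out.append((lam * z + 1.0) ** (1.0 / lam) if lam else math.exp(z))
    return out
```

Back-transforming guarantees positive values, and the lag-1 persistence set by phi survives the transformation, which is what makes the model usable for multi-year drought sequences.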

16.
ABSTRACT

This paper assesses how various sources of uncertainty propagate through the uncertainty cascade from emission scenarios through climate models and hydrological models to impacts, with a particular focus on groundwater aspects from a number of coordinated studies in Denmark. Our results are similar to those from surface water studies showing that climate model uncertainty dominates the results for projections of climate change impacts on streamflow and groundwater heads. However, we found uncertainties related to geological conceptualization and hydrological model discretization to be dominant for projections of well field capture zones, while the climate model uncertainty here is of minor importance. How to reduce the uncertainties on climate change impact projections related to groundwater is discussed, with an emphasis on the potential for reducing climate model biases through the use of fully coupled climate–hydrology models.
Editor D. Koutsoyiannis; Associate editor not assigned

17.
Landscape evolution models (LEMs) can characterize key aspects of geomorphological and hydrological processes, but their usefulness is hindered by model equifinality and the paucity of available calibration data. Estimating uncertainty in the parameter space and the resultant model predictions is rarely attempted, as it is computationally intensive and the uncertainties inherent in the observed data are large. Therefore, a limits-of-acceptability (LoA) uncertainty analysis approach was adopted in this study to assess the value of uncertain hydrological and geomorphic data, constrain simulations of catchment responses and explore parameter uncertainty in model predictions. We applied this approach to the River Derwent and Cocker catchments in the UK using the LEM CAESAR-Lisflood. Results show that the model was generally able to produce behavioural simulations within the uncertainty limits of the streamflow. Reliability metrics ranged from 24.4% to 41.2%, and the model captured the high-magnitude, low-frequency sediment events. Since different sets of behavioural simulations were found across different parts of the catchment, evaluating LEM performance, in quantifying and assessing both at-a-point behaviour and spatial catchment response, remains a challenge. Our results show that evaluating LEMs within an uncertainty-analysis framework, while taking into account the varying quality of different observations, constrains behavioural simulations and parameter distributions and is a step towards a full-ensemble uncertainty evaluation of such models. We believe this approach will be beneficial for reflecting uncertainties in flooding events during which channel morphological changes occur and diverse (yet often sparse) data have been collected.
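The limits-of-acceptability screening itself is simple to sketch: score each candidate parameter set by the fraction of time steps on which its simulation falls inside the observational uncertainty bounds, and keep only those exceeding a threshold. The 0.8 default threshold and the function names are illustrative assumptions, not values from the study (whose reliability metrics were far lower).

```python
def reliability(sim, lower, upper):
    """Fraction of time steps on which the simulated series lies within
    the observational uncertainty limits [lower, upper]."""
    inside = sum(1 for s, lo, hi in zip(sim, lower, upper) if lo <= s <= hi)
    return inside / len(sim)

def behavioural_sets(candidates, lower, upper, threshold=0.8):
    """Retain the parameter sets whose simulation meets the LoA threshold.
    `candidates` is a list of (params, simulated_series) pairs."""
    return [params for params, sim in candidates
            if reliability(sim, lower, upper) >= threshold]
```

The surviving "behavioural" ensemble is then used directly as the sample of acceptable parameterizations, which is how LoA turns observation uncertainty into parameter uncertainty.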

18.
Abstract

The aim of this study was to estimate the uncertainties in the streamflow simulated by a rainfall–runoff model. Two sources of uncertainty in hydrological modelling were considered: uncertainties in the model parameters and those in the model structure. The uncertainties were calculated by Bayesian statistics, and the Metropolis–Hastings algorithm was used to simulate the posterior parameter distribution. The parameter uncertainty calculated by the Metropolis–Hastings algorithm was compared to maximum likelihood estimates, which assume that both the parameters and the model residuals are normally distributed. The study was performed using the model WASMOD on 25 basins in central Sweden. Confidence intervals in the simulated discharge due to parameter uncertainty and to the total uncertainty were calculated. The results indicate that (a) the Metropolis–Hastings algorithm and the maximum likelihood method give almost identical estimates of the parameter uncertainty, and (b) for this simple model with few parameters, the uncertainties in the simulated streamflow due to parameter uncertainty are less important than those originating from other sources.
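A bare-bones random-walk Metropolis–Hastings sampler, the algorithm used above to draw from the posterior parameter distribution, can be sketched as follows. This is a generic one-parameter version with a symmetric Gaussian proposal (so the Hastings correction cancels); WASMOD itself and the multi-parameter posterior are not reproduced.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0, burn=500):
    """Random-walk Metropolis–Hastings for a 1-D target density given by
    its log `log_post` (up to a constant). Returns post-burn-in samples."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for i in range(n_samples + burn):
        prop = x + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        # accept with probability min(1, post(prop)/post(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        if i >= burn:
            samples.append(x)
    return samples
```

Run against a known target (here a standard normal, via its log density −x²/2), the sample mean and variance recover the target's moments, which is the usual sanity check before pointing the sampler at a real hydrological posterior.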

19.
Quantifying the distributional behavior of extreme events is crucial in hydrologic design. Intensity–Duration–Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. The major sources of uncertainty in IDF relationships are insufficient quantity and quality of data, which lead to parameter uncertainty in the distribution fitted to the data, and the use of multiple GCMs. It is important to quantify these uncertainties and propagate them to the future for accurate assessment of future return levels. The objective of this study is to quantify, in a Bayesian framework, the uncertainties arising from the parameters of the distribution fitted to the data and from the use of multiple GCMs. The posterior distribution of the parameters is obtained from Bayes' rule, and the parameters are transformed to obtain return levels for a specified return period. Markov chain Monte Carlo (MCMC) simulation with the Metropolis–Hastings algorithm is used to obtain the posterior distribution of the parameters. Twenty-six CMIP5 GCMs, along with four RCP scenarios, are considered for studying the effects of climate change and for obtaining projected IDF relationships for the case study of Bangalore city in India. The GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique together with the parameter uncertainty. Scale-invariance theory is employed to obtain short-duration return levels from daily data. It is observed that the uncertainty in short-duration rainfall return levels is high compared with that for longer durations. Further, the parameter uncertainty is observed to be large compared with the model uncertainty.
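The parameter-uncertainty part of this workflow can be illustrated with a much simpler stand-in: fit a Gumbel (GEV type I) distribution to annual maxima by the method of moments, compute a T-year return level, and use a parametric bootstrap as a crude proxy for the posterior spread that the paper obtains via MCMC. The function names and the bootstrap shortcut are assumptions for illustration; the study's GEV/Bayesian machinery, REA weighting and scale invariance are not reproduced.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(xs):
    """Method-of-moments Gumbel fit; returns (location mu, scale sigma)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    sigma = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * sigma
    return mu, sigma

def return_level(mu, sigma, T):
    """Depth exceeded on average once every T years (annual maxima)."""
    return mu - sigma * math.log(-math.log(1.0 - 1.0 / T))

def bootstrap_return_levels(xs, T, n_boot=1000, seed=0):
    """Parametric-bootstrap spread of the T-year return level: refit on
    synthetic samples drawn from the fitted Gumbel (inverse-CDF sampling)."""
    rng = random.Random(seed)
    mu, sigma = gumbel_fit(xs)
    levels = []
    for _ in range(n_boot):
        sample = [mu - sigma * math.log(-math.log(rng.random()))
                  for _ in range(len(xs))]
        levels.append(return_level(*gumbel_fit(sample), T))
    return levels
```

The spread of the bootstrap levels mirrors the abstract's finding qualitatively: the shorter the record (and the rarer the event), the wider the return-level uncertainty.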

20.
Decision-making in traffic regulation is challenging under environmental uncertainty. In this research we extend probabilistic engineering design concepts to policy decision-making for urban traffic, with variability taken from field data. City traffic is simulated using user equilibrium and cellular automata. A cellular automata (CA) model is developed by combining existing CA models with rules tailored to local traffic behaviors in Tainan, Taiwan. Both passenger sedans and motorcycles are considered, including the possibility of passing between different types of vehicles. The tailpipe emissions from all mobile sources are modeled as Gaussian dispersion from finite line sources. The speed limits of all roads are selected as independent policy design variables, resulting in a problem with 50 dimensions. We first study the impacts of a particular policy setting on traffic behaviors and on the environment under various sources of uncertainty. A genetic algorithm, combined with probabilistic analysis, is then used to obtain the optimal regulations with the minimal cost to the environment in compliance with the current ambient air quality standards.
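The cellular-automata ingredient can be illustrated with the minimal single-lane CA traffic model (Rule 184), a far simpler relative of the tailored mixed-vehicle model the paper builds: a vehicle advances one cell per step iff the cell ahead is free. Mean speed then falls from free flow to jammed as density rises; the helper names below are assumptions for illustration.

```python
def step_rule184(road):
    """One synchronous update of Rule 184 on a ring road.
    1 = occupied cell, 0 = empty; a car advances iff the next cell is free."""
    n = len(road)
    nxt = [0] * n
    for i in range(n):
        if road[i] == 1 and road[(i + 1) % n] == 0:
            nxt[(i + 1) % n] = 1          # car moves forward
        elif road[i] == 1:
            nxt[i] = 1                    # blocked, stays put
    return nxt

def mean_speed(road, steps=100):
    """Average fraction of cars that move per step over a run."""
    moves, cars = 0, sum(road)
    for _ in range(steps):
        moves += sum(1 for i in range(len(road))
                     if road[i] == 1 and road[(i + 1) % len(road)] == 0)
        road = step_rule184(road)
    return moves / (cars * steps)
```

At low density every car always has a free cell ahead (mean speed 1), while a near-full ring lets only one car move per step; richer CA models add multiple speeds, lanes and vehicle classes on top of this kernel.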


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号