Similar Documents
20 similar documents found.
1.
Determining the stability of existing slopes is challenging, partly because of the spatial variability of soils. Reliability-based design can incorporate these uncertainties and yield probabilities of slope failure. Field measurements can be used to constrain probabilistic analyses, thereby reducing uncertainties and, generally, the calculated probabilities of failure. A method is presented that uses pore pressure measurements to first reduce the spatial uncertainty of hydraulic conductivity, through inverse analysis linked to the Ensemble Kalman Filter. The constrained hydraulic conductivity is subsequently used to reduce the uncertainty in the strength parameters, usually increasing the calculated slope reliability.
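A minimal sketch of the ensemble analysis step behind such an inverse analysis is given below, assuming a log-conductivity ensemble and pore pressure predictions supplied by an external seepage model; the array shapes, error statistics and the linear toy "forward model" in the usage lines are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def enkf_update(log_k_ens, pred_p_ens, obs_p, obs_err_std, rng):
    """One Ensemble Kalman Filter analysis step.

    log_k_ens  : (n_cells, n_ens) ensemble of log hydraulic conductivity
    pred_p_ens : (n_obs, n_ens) pore pressures predicted by a forward
                 seepage model for each ensemble member
    obs_p      : (n_obs,) measured pore pressures
    """
    n_ens = log_k_ens.shape[1]
    dk = log_k_ens - log_k_ens.mean(axis=1, keepdims=True)
    dp = pred_p_ens - pred_p_ens.mean(axis=1, keepdims=True)
    c_kp = dk @ dp.T / (n_ens - 1)             # parameter-data cross-covariance
    c_pp = dp @ dp.T / (n_ens - 1)             # predicted-data covariance
    r = np.eye(len(obs_p)) * obs_err_std**2    # measurement-error covariance
    gain = c_kp @ np.linalg.inv(c_pp + r)      # Kalman gain
    # perturbed observations, one set per ensemble member
    obs_pert = obs_p[:, None] + rng.normal(0.0, obs_err_std,
                                           size=(len(obs_p), n_ens))
    return log_k_ens + gain @ (obs_pert - pred_p_ens)

# toy usage with a linear stand-in for the seepage forward model
rng = np.random.default_rng(0)
n_cells, n_obs, n_ens = 50, 4, 100
H = rng.random((n_obs, n_cells)) / n_cells
log_k = rng.normal(-11.0, 1.0, (n_cells, n_ens))
updated = enkf_update(log_k, H @ log_k,
                      obs_p=H @ rng.normal(-11.0, 1.0, n_cells),
                      obs_err_std=0.05, rng=rng)
```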

2.
Bay of Bengal cyclone extreme water level estimate uncertainty (total citations: 4, self-citations: 3, citations by others: 1)

3.
Oguz, Emir Ahmet; Depina, Ivan; Thakur, Vikas. Landslides, 2022, 19(1): 67-83

Uncertainties in the parameters of landslide susceptibility models often prevent them from providing accurate spatial and temporal predictions of landslide occurrences. A substantial contribution to the uncertainties in landslide assessment originates from spatially variable geotechnical and hydrological parameters. These input parameters may vary significantly through space, even within the same geological deposit, and there is a need to quantify the effects of the uncertainties in these parameters. This study addresses this issue with a new three-dimensional probabilistic landslide susceptibility model. The spatial variability of the model parameters is modeled with the random field approach and coupled with the Monte Carlo method to propagate uncertainties from the model parameters to landslide predictions (i.e., the factor of safety). The resulting uncertainties in landslide predictions allow the effects of spatial variability in the input parameters to be quantified. The performance of the proposed model in capturing the effect of spatial variability and predicting landslide occurrence has been compared with a conventional physically based landslide susceptibility model that does not account for three-dimensional effects on slope stability. The results indicate that the proposed model predicts landslides with higher accuracy and precision than the conventional model. The novelty of this study is in illustrating the effects of soil heterogeneity on the susceptibility to shallow landslides, made possible by the development of a three-dimensional slope stability model coupled with a random field model and the Monte Carlo method.
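The sketch below illustrates the general random-field-plus-Monte-Carlo idea on a deliberately simplified case: a correlated lognormal undrained-strength field and an infinite-slope factor of safety used as a crude surrogate for the paper's three-dimensional stability model. All statistics and geometry are assumed placeholder values.

```python
import numpy as np

def exp_covariance(x, z, sill, theta_h, theta_v):
    """Exponential autocovariance between all pairs of (x, z) points."""
    dx = np.abs(x[:, None] - x[None, :])
    dz = np.abs(z[:, None] - z[None, :])
    return sill * np.exp(-2.0 * dx / theta_h - 2.0 * dz / theta_v)

rng = np.random.default_rng(0)
nx, nz = 20, 10
xx, zz = np.meshgrid(np.arange(nx) * 1.0, np.arange(nz) * 0.5)
pts_x, pts_z = xx.ravel(), zz.ravel()

# lognormal random field of undrained shear strength (placeholder statistics)
mean_su, cov_su = 50.0, 0.3                      # kPa, coefficient of variation
sd_ln = np.sqrt(np.log(1 + cov_su**2))
mu_ln = np.log(mean_su) - 0.5 * sd_ln**2
C = exp_covariance(pts_x, pts_z, sd_ln**2, theta_h=10.0, theta_v=2.0)
L = np.linalg.cholesky(C + 1e-10 * np.eye(C.shape[0]))

gamma, depth, beta = 19.0, 5.0, np.radians(30)   # unit weight, depth, slope angle
n_sim, failures = 2000, 0
for _ in range(n_sim):
    su = np.exp(mu_ln + L @ rng.standard_normal(C.shape[0]))
    # crude surrogate for the 3-D analysis: infinite-slope factor of safety
    # with the strength averaged along the deepest row of cells
    su_slip = su.reshape(nz, nx)[-1].mean()
    fs = su_slip / (gamma * depth * np.sin(beta) * np.cos(beta))
    failures += fs < 1.0
print("probability of failure ~", failures / n_sim)
```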


4.
The Soil and Water Assessment Tool (SWAT) is a river-basin-scale model widely used to study the impact of land management practices in large, complex watersheds. Even though model output uncertainties are generally recognized to affect watershed management decisions, those uncertainties are largely ignored in model applications. The uncertainties of SWAT simulations are quantified using various methods, but simultaneous attempts to calibrate the model so as to reduce the uncertainty are seldom made. This study aims to use an uncertainty reduction procedure that helps calibrate the SWAT model. The Shuffled Complex Evolution Metropolis algorithm for uncertainty analysis is employed for this purpose and is demonstrated using data from the St. Joseph River basin, USA. The performance indices r² and the Nash–Sutcliffe efficiency (NSE) were both 0.81 for the calibration period and 0.79 for the validation period, indicating a good simulation by the model. The results also indicate that the algorithm helps reduce the uncertainty (percentage of coverage = 62%, average width = 19.2 m³/s) and identifies the plausible range of parameters that simulate the processes with less uncertainty. Confidence bands of the simulations are obtained that can be employed in making uncertainty-based decisions on watershed management practices.
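The two headline diagnostics quoted above can be computed as sketched below; this is generic post-processing of observed flows against an ensemble of simulations, not part of SWAT or the calibration algorithm itself, and the demo data are invented.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash–Sutcliffe efficiency of simulated vs. observed flows."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def band_statistics(obs, ens, lo=2.5, hi=97.5):
    """Coverage (%) and mean width of an ensemble prediction band."""
    lower = np.percentile(ens, lo, axis=0)
    upper = np.percentile(ens, hi, axis=0)
    coverage = 100.0 * np.mean((obs >= lower) & (obs <= upper))
    width = np.mean(upper - lower)
    return coverage, width

# tiny invented example: 5 observations, 100 ensemble simulations
obs = np.array([5.0, 7.0, 6.5, 8.0, 9.0])
ens = obs + np.random.default_rng(0).normal(0, 1.0, size=(100, 5))
print(nash_sutcliffe(obs, ens.mean(axis=0)), band_statistics(obs, ens))
```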

5.
The past 12 years have seen significant steps forward in the science and practice of coastal flood analysis. This paper aims to recount and critically assess these advances, while helping identify next steps for the field. This paper then focuses on a key problem, connecting the probabilistic characterization of flood hazards to their physical mechanisms. Our investigation into the effects of natural structure on the probabilities of storm surges shows that several different types of spatial-, temporal-, and process-related organizations affect key assumptions made in many of the methods used to estimate these probabilities. Following a brief introduction to general historical methods, we analyze the two joint probability methods used in most tropical cyclone hazard and risk studies today: the surface response function and Bayesian quadrature. A major difference between these two methods is that the response function creates continuous surfaces, which can be interpolated or extrapolated on a fine scale if necessary, and the Bayesian quadrature optimizes a set of probability masses, which cannot be directly interpolated or extrapolated. Several examples are given here showing significant impacts related to natural structure that should not be neglected in hazard and risk assessment for tropical cyclones including: (1) differences between omnidirectional sampling and directional-dependent sampling of storms in near coastal areas; (2) the impact of surge probability discontinuities on the treatment of epistemic uncertainty; (3) the ability to reduce aleatory uncertainty when sampling over larger spatial domains; and (4) the need to quantify trade-offs between aleatory and epistemic uncertainties in long-term stochastic sampling.  相似文献   
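As a toy illustration of the joint probability method with a continuous response surface, the sketch below interpolates a synthetic surge surface over two storm parameters and integrates exceedance over an assumed parameter distribution and storm rate; the surface, rates and distributions are invented for illustration and are not taken from the study.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator
from scipy.stats import norm

# synthetic response surface: peak surge (m) vs. central-pressure deficit dp (hPa)
# and along-coast landfall distance x (km); all values are invented
dp_nodes = np.linspace(20, 120, 6)
x_nodes = np.linspace(-150, 150, 7)
DP, X = np.meshgrid(dp_nodes, x_nodes, indexing="ij")
eta = RegularGridInterpolator((dp_nodes, x_nodes),
                              0.04 * DP * np.exp(-(X / 80.0) ** 2))

# joint probability method: integrate P(surge > threshold) over an assumed
# storm-parameter distribution, then scale by an assumed annual storm rate
dp_grid = np.linspace(20, 120, 201)
x_grid = np.linspace(-150, 150, 201)
d_dp, d_x = dp_grid[1] - dp_grid[0], x_grid[1] - x_grid[0]
w_dp = norm.pdf(dp_grid, loc=60, scale=20)                    # intensity density
w_x = np.full_like(x_grid, 1.0 / (x_grid[-1] - x_grid[0]))    # uniform landfall
rate = 0.3                                                    # storms per year

threshold = 2.0                                               # metres
pts = np.array(np.meshgrid(dp_grid, x_grid, indexing="ij")).reshape(2, -1).T
exceed = eta(pts).reshape(len(dp_grid), len(x_grid)) > threshold
p_exceed = np.sum(exceed * w_dp[:, None] * w_x[None, :]) * d_dp * d_x
print("annual exceedance rate of a 2 m surge ~", rate * p_exceed)
```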

6.
7.
Two-dimensional paper maps are well-established tsunami risk communication tools in coastal communities. Advances in GIS, geovisualization and spatial interface technologies suggest new opportunities to deliver tsunami risk communication using 3D, interactive and situated risk visualization. This paper introduces a set of geovisual interface constructs—dimensionality–interactivity–situatedness (DIS)—and evaluates their presence, absence and distribution in 129 examples of existing academic and public visual tsunami risk communication. The resulting analyses reveal structural differences in the distributions of DIS found in each of academic and public risk communication literatures, and opportunities for interactive location-aware risk communication. The second half of this paper reports on three new tsunami risk visualization interfaces informed by and developed to demonstrate how we might explore new undeveloped risk communication territory revealed by the DIS cube analysis. We discuss the design, rationale and implications of: EvacMap; ARRO3D; and Tsunamulator. These risk visualization interfaces deliver location-aware, user-centred risk maps, as well as virtual risk maps and tsunami simulations that can be viewed while standing in situ in coastal environments. This work is a first step intended to help the risk communication community systematically engage an emerging territory of interactive and location-aware 3D visualizations. This work aims to facilitate and encourage progress towards developing a new strand of interactive, situated geovisual risk communication research, by establishing these guiding constructs, their relationship to existing works and how they may inform the design of future systems and usability research.  相似文献   

8.
Comparing Training-Image Based Algorithms Using an Analysis of Distance (total citations: 1, self-citations: 1, citations by others: 0)
As additional multiple-point statistical (MPS) algorithms are developed, there is an increasing need for scientific ways of comparing them beyond the usual visual comparison or simple metrics such as connectivity measures. In this paper, we start from the general observation that any (not just MPS) geostatistical simulation algorithm represents two types of variability: (1) the within-realization variability, namely, that realizations reproduce a spatial continuity model (variogram, Boolean, or training-image based); (2) the between-realization variability, representing a model of spatial uncertainty. It is argued that any comparison of algorithms needs, at a minimum, to be based on these two randomizations. In fact, for certain MPS algorithms, it is illustrated with different examples that there is often a trade-off: increased pattern reproduction entails reduced spatial uncertainty. The subjective choice is made that the best algorithm maximizes pattern reproduction while at the same time maximizing spatial uncertainty. The discussion is limited to fairly standard multiple-point algorithms; the method does not necessarily apply to more recent or possibly future developments. In order to render these fundamental principles quantitative, this paper relies on a distance-based measure for both within-realization variability (pattern reproduction) and between-realization variability (spatial uncertainty). It is illustrated that this method is efficient and effective for two-dimensional, three-dimensional, continuous, and discrete training images.
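A simple distance-based comparison in the spirit described above can be sketched as follows, using multiple-point pattern histograms of binary images and the Jensen-Shannon divergence as one possible (assumed) distance; the paper's actual distance definition may differ.

```python
import numpy as np
from itertools import product

def pattern_histogram(image, size=2):
    """Frequencies of all size x size patterns in a binary (0/1) image."""
    counts = {}
    for i in range(image.shape[0] - size + 1):
        for j in range(image.shape[1] - size + 1):
            key = tuple(image[i:i + size, j:j + size].ravel())
            counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    keys = list(product([0, 1], repeat=size * size))
    return np.array([counts.get(k, 0) / total for k in keys])

def jsd(p, q):
    """Jensen-Shannon divergence between two pattern histograms."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def compare(training_image, realizations):
    """Within-realization variability: mean distance of realizations to the
    training image. Between-realization variability: mean pairwise distance."""
    h_ti = pattern_histogram(training_image)
    h = [pattern_histogram(r) for r in realizations]
    within = np.mean([jsd(h_ti, hi) for hi in h])
    between = np.mean([jsd(h[i], h[j])
                       for i in range(len(h)) for j in range(i + 1, len(h))])
    return within, between

rng = np.random.default_rng(0)
ti = (rng.random((50, 50)) < 0.3).astype(int)          # stand-in training image
reals = [(rng.random((50, 50)) < 0.3).astype(int) for _ in range(5)]
print(compare(ti, reals))
```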

9.
Significant uncertainties are associated with the definition of both the exploration targeting criteria and computational algorithms used to generate mineral prospectivity maps. In prospectivity modeling, the input and computational uncertainties are generally made implicit, by making a series of best-guess or best-fit decisions, on the basis of incomplete and imprecise information. The individual uncertainties are then compounded and propagated into the final prospectivity map as an implicit combined uncertainty which is impossible to directly analyze and use for decision making. This paper proposes a new approach to explicitly define uncertainties of individual targeting criteria and propagate them through a computational algorithm to evaluate the combined uncertainty of a prospectivity map. Applied to fuzzy logic prospectivity models, this approach involves replacing point estimates of fuzzy membership values by statistical distributions deemed representative of likely variability of the corresponding fuzzy membership values. Uncertainty is then propagated through a fuzzy logic inference system by applying Monte Carlo simulations. A final prospectivity map is represented by a grid of statistical distributions of fuzzy prospectivity. Such modeling of uncertainty in prospectivity analyses allows better definition of exploration target quality, as understanding of uncertainty is consistently captured, propagated and visualized in a transparent manner. The explicit uncertainty information of prospectivity maps can support further risk analysis and decision making. The proposed probabilistic fuzzy logic approach can be used in any area of geosciences to model uncertainty of complex fuzzy systems.  相似文献   
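A minimal sketch of the probabilistic fuzzy logic idea for a single grid cell is shown below: point estimates of fuzzy membership values are replaced by assumed triangular distributions and propagated through a standard fuzzy gamma operator by Monte Carlo simulation. The criteria, point estimates and spreads are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_membership(mode, spread, size):
    """Triangular distribution around an expert point estimate, clipped to [0, 1]."""
    lo = max(0.0, mode - spread)
    hi = min(1.0, mode + spread)
    return rng.triangular(lo, mode, hi, size)

n_mc = 5000
# three targeting criteria for one grid cell: point estimate +/- assumed spread
mu_fault = sample_membership(0.8, 0.15, n_mc)
mu_alteration = sample_membership(0.6, 0.25, n_mc)
mu_geochem = sample_membership(0.7, 0.20, n_mc)

# fuzzy inference: gamma operator combining fuzzy product (AND) and algebraic sum (OR)
gamma = 0.9
f_and = mu_fault * mu_alteration * mu_geochem
f_or = 1.0 - (1 - mu_fault) * (1 - mu_alteration) * (1 - mu_geochem)
prospectivity = f_and ** (1 - gamma) * f_or ** gamma

print("median fuzzy prospectivity:", np.median(prospectivity))
print("90% interval:", np.percentile(prospectivity, [5, 95]))
```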

10.
In natural hazard risk assessment, situations are encountered where information on the portfolio of exposure is available only in a spatially aggregated form, hindering a precise risk assessment. Recourse might be found in the spatial disaggregation of the portfolio of exposure to the resolution of the hazard model. Given the uncertainty inherent in any disaggregation, it is argued that the disaggregation should be performed probabilistically. In this paper, a methodology for probabilistic disaggregation of spatially aggregated values is presented. The methodology is exemplified with the disaggregation of a portfolio of buildings in two communes in Switzerland, and the results are compared to sample observations. The relevance of the disaggregation uncertainty in natural hazard risk assessment is illustrated with the example of a simple flood risk assessment.
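One straightforward way to realise such a probabilistic disaggregation is a multinomial allocation of the aggregated count over hazard-model cells, as sketched below; the cell weights, building values and damage ratios are invented for illustration and are not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)

def disaggregate(total_buildings, cell_weights, n_samples):
    """Probabilistically disaggregate an aggregated building count onto
    hazard-model cells, in proportion to assumed weights (e.g. built-up area)."""
    p = np.asarray(cell_weights, dtype=float)
    p = p / p.sum()
    return rng.multinomial(total_buildings, p, size=n_samples)

# example: 1,200 buildings in a commune spread over 5 flood-model cells
samples = disaggregate(1200, [0.5, 2.0, 1.0, 0.2, 1.3], n_samples=10_000)

# simple flood risk: loss per disaggregation sample from per-cell damage ratios
damage_ratio = np.array([0.00, 0.05, 0.02, 0.10, 0.01])
loss_per_building = 250_000              # assumed average building value
losses = samples @ (damage_ratio * loss_per_building)
print("mean loss:", losses.mean(), "5-95% range:", np.percentile(losses, [5, 95]))
```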

11.
In the European Alps, the concept of risk has increasingly been applied in order to reduce the susceptibility of society to mountain hazards. Risk is defined as a function of the magnitude and frequency of a hazard process times consequences; the latter being quantified by the value of elements at risk exposed and their vulnerability. Vulnerability is defined by the degree of loss to a given element at risk resulting from the impact of a natural hazard. Recent empirical studies suggested a dependency of the degree of loss on the hazard impact, and respective vulnerability (or damage-loss) functions were developed. However, until now, only little information is available on the spatial characteristics of vulnerability on a local scale; considerable ranges in the loss ratio for medium process intensities only provide a hint that there might be mutual reasons for lower or higher loss rates. In this paper, we therefore focus on the spatial dimension of vulnerability by searching for spatial clusters in the damage ratio of elements at risk exposed. By using the software SaTScan, we applied an ordinal data model and a normal data model in order to detect spatial distribution patterns of five individual torrent events in Austria. For both models, we detected some significant clusters of high damage ratios, and consequently high vulnerability. Moreover, secondary clusters of high and low values were found. Based on our results, the assumption that lower process intensities result in lower damage ratios, and therefore in lower vulnerability, and vice versa, has to be partly rejected. The spatial distribution of vulnerability is not only dependent on the process intensities but also on the overall land use pattern and the individual constructive characteristics of the buildings exposed. Generally, we suggest the use of a normal data model for test sites exceeding a minimum of 30 elements at risk exposed. As such, the study enhanced our understanding of spatial vulnerability patterns on a local scale.  相似文献   

12.
The uncertainty in soil characterisation is studied to assess its effect on the structural behaviour of extended structures such as sheet pile walls. A finite element model is used that integrates a numerical model of the soil–structure interaction with a stochastic model characterising the soil variability. The model serves to propagate the variability and the system parameter uncertainties. The discussion focuses on two points: (1) testing the sensitivity of the structural behaviour of a sheet pile wall to different geotechnical parameters, and (2) assessing the influence of the spatial variability of soil properties on the structural behaviour by identifying the most sensitive geotechnical parameter and the most significant correlation length values. The findings show that, in assessing the sheet pile wall's structural behaviour, some spatial variability parameters cannot be considered negligible. In this study, the soil friction angle is found to be an important parameter.

13.
Ivan G. Wong. Natural Hazards, 2014, 72(3): 1299-1309
The occurrence of several recent “extreme” earthquakes with their significant loss of life and the apparent failure to have been prepared for such disasters has raised the question of whether such events are accounted for in modern seismic hazard analyses. In light of the great 2011 Tohoku-Oki earthquake, were the questions of “how big, how bad, and how often” addressed in probabilistic seismic hazard analyses (PSHA) in Japan, one of the most earthquake-prone but most earthquake-prepared countries in the world? The guidance on how to properly perform PSHAs exists but may not be followed for a whole range of reasons, not all technical. One of the major emphases of these guidelines is that it must be recognized that there are significant uncertainties in our knowledge of earthquake processes and these uncertainties need to be fully incorporated into PSHAs. If such uncertainties are properly accounted for in PSHA, extreme events can be accounted for more often than not. This is not to say that no surprises will occur. That is the nature of trying to characterize a natural process such as earthquake generation whose properties also have random (aleatory) uncertainties. It must be stressed that no PSHA is ever final because new information and data need to be continuously monitored and addressed, often requiring an updated PSHA.  相似文献   

14.
The effect of uncertainty on cooperation between partners sharing natural resources remains unknown. Uncertainty may strengthen cooperation between partners, as it makes cooperative mitigation policies necessary; however, it may also serve as a cause of friction between parties, as it may aggravate existing trust issues or power asymmetries. Given the potential for such contrary outcomes, we provide criteria to examine empirically how uncertainties in a transboundary setting promote or impede cooperation. Taking the Israeli–Palestinian Annapolis round and post-Annapolis negotiations as a case study, this work identifies the effect of uncertainties related to water on negotiation positions. Social and political uncertainties, which tend to be associated with uncertainty of interpretation rather than a lack of information, play a much stronger role in water negotiations than do the technical or physical uncertainties that often dominate other resource issues. Many of the criteria used to assess the effect of uncertainty indicate that partners attempted to address uncertainties in an ostensibly cooperative manner, accepting negotiation venues and rules. However, confronting uncertainty stemming from the interpretation of information, often around social issues, tends to result in additional uncertainties associated with delayed negotiations, spillover effects and power implications, each with negative implications for cooperation. As such, mechanisms proposed to address these uncertainties also tend to be more disputed. The only type of mechanism that did not appear to aggravate the effects of these uncertainties, and perhaps the only one indicative of some degree of cooperation, even if low level, is that dealing with data and information exchange and research.

15.
This paper integrates random field simulation of soil spatial variability with numerical modeling of coupled flow and deformation to investigate consolidation in spatially random unsaturated soil. The spatial variability of soil properties is simulated using the covariance matrix decomposition method. The random soil properties are imported into the multiphysics software COMSOL to solve the governing partial differential equations. The effects of the spatial variability of Young's modulus and saturated permeability, together with unsaturated hydraulic parameters, on the dissipation of excess pore water pressure and on settlement are investigated using an example of consolidation in a saturated-unsaturated soil column under loading. It is found that the surface settlement and the pore water pressure profile during consolidation are significantly affected by the spatially varying Young's modulus. The mean settlement of the spatially random soil is more than 100% greater than that of the deterministic case, and the surface settlement is subject to large uncertainty, which implies that consolidation settlement is difficult to predict accurately with the conventional deterministic approach. The uncertainty of the settlement increases with the scale of fluctuation because of the averaging effect of spatial variability. The effects of the spatial variability of the saturated permeability ksat and the air entry parameters are much less significant than those of the elastic modulus. The spatial variability of the air entry value parameters affects the uncertainties of settlement and excess pore pressure mostly in the unsaturated zone. Copyright © 2016 John Wiley & Sons, Ltd.
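The covariance matrix decomposition step itself can be sketched as below for a one-dimensional column of Young's modulus; the correlation model, statistics and discretisation are assumed values, and in the paper the resulting realizations would be passed to the coupled flow-deformation model in COMSOL.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D soil column discretised into 40 elements of 0.25 m
z = np.arange(40) * 0.25
theta = 2.0                                   # vertical scale of fluctuation, m
C = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)

# covariance matrix decomposition: C = V diag(w) V^T, field = A xi with A = V sqrt(w)
w, V = np.linalg.eigh(C)
A = V @ np.diag(np.sqrt(np.clip(w, 0.0, None)))

# lognormal Young's modulus field (placeholder mean and COV)
mean_E, cov_E = 5.0e3, 0.4                    # kPa
sd_ln = np.sqrt(np.log(1 + cov_E**2))
mu_ln = np.log(mean_E) - 0.5 * sd_ln**2

n_real = 1000
E_fields = np.exp(mu_ln + sd_ln * (A @ rng.standard_normal((len(z), n_real))))
# each column of E_fields is one realization of the Young's modulus profile
print(E_fields.mean(), E_fields.std())
```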

16.
Impacts of climate change on temperature thresholds for Chinese agriculture and analysis of the associated uncertainties (total citations: 10, self-citations: 0, citations by others: 0)
Temperature thresholds for climate change impacts have become a focal issue in international negotiations. Using a regional climate model coupled with a crop model, average yield changes of China's three staple grain crops (rice, wheat and maize) were simulated on a 50 km × 50 km grid for 2011-2040, 2041-2070 and 2071-2100 under the IPCC (Intergovernmental Panel on Climate Change) SRES (Special Report on Emissions Scenarios) A2 and B2 scenarios. Combined with the projected warming over China during the same periods, the warming threshold beyond which yields of the main grain crops decline persistently was analysed, and the uncertainties in the threshold estimates were examined preliminarily. The results show that, when both warming and the CO2 fertilisation effect are considered, no temperature threshold threatening China's grain production exists within the currently projected warming range (0.9-3.9 °C); when only warming is considered, a national mean temperature rise of more than 2 °C would lead to a persistent decline in grain yields and threaten future food production. Climate change adaptation measures such as full irrigation, adjustment of sowing dates and cultivar replacement affect the determination of the threshold; full irrigation alone, for example, can postpone the warming threshold to about 2.5 °C, while adjusting sowing dates and replacing cultivars also influence yield and yield variability and thus shift the temperature threshold. The uncertainties in the current study arise mainly from three sources: scenarios, methods and adaptation measures.

17.
The Cone Penetration Test (CPT) is widely used in site investigation to obtain routine geotechnical parameters, such as the compression modulus, cohesion and internal friction angle, through a transformation model. However, it is challenging to obtain simultaneously the unknown coefficients and the error of a transformation model, given the intrinsic uncertainty (i.e., spatial variability) of the geomaterial and the epistemic uncertainty of the geotechnical investigation. A Bayesian approach is therefore proposed to calibrate the transformation model based on spatial random field theory. The approach consists of three key elements: (1) three-dimensional anisotropic spatial random field theory; (2) a classification of measurements and errors, together with the uncertainty propagation diagram of the geotechnical investigation; and (3) calibration of the unknown coefficients and error of the transformation model with a Bayesian inverse modeling method. The abundant penetration resistance data from the CPT, treated as a spatial random field variable to account for the spatial variability of the soil, are classified as type A data, while the few laboratory test data, such as the compression modulus, are defined as type B data. Based on these two types of data, the unknown coefficients and error of the transformation model are inversely calibrated, accounting for the intrinsic uncertainty of the geomaterial, epistemic uncertainties such as measurement errors, the prior uncertainty of the transformation model itself, and the statistical uncertainty of the estimated parameters. A baseline study indicates that the proposed approach can calibrate the transformation model between CPT data and routine geotechnical parameters within spatial random field theory. The calibrated transformation model was then compared with classical linear regression in cross-validation and applied to the three-dimensional site characterization of the background project.
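A compact sketch of such a Bayesian calibration for a log-linear transformation model between cone resistance and compression modulus is given below, using an assumed toy data set and a random-walk Metropolis sampler rather than the authors' specific random-field formulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# assumed data: a few laboratory compression moduli Es (type B) paired with
# CPT cone resistances qc (type A) at the same depths, both in MPa
qc = np.array([1.2, 2.5, 4.0, 5.5, 8.0, 11.0])
Es = np.array([3.1, 5.8, 8.5, 11.0, 15.5, 20.0])
x, y = np.log(qc), np.log(Es)

def log_post(theta):
    """Posterior of transformation model ln(Es) = a + b ln(qc) + eps,
    eps ~ N(0, sigma^2), with weak normal priors on a, b and log sigma."""
    a, b, log_s = theta
    s = np.exp(log_s)
    resid = y - (a + b * x)
    loglik = -len(y) * np.log(s) - 0.5 * np.sum(resid**2) / s**2
    logprior = -0.5 * (a**2 / 10**2 + b**2 / 10**2 + log_s**2 / 2**2)
    return loglik + logprior

# random-walk Metropolis over (a, b, log sigma)
theta = np.array([1.0, 1.0, -1.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])
print("posterior mean a, b, sigma:",
      chain[:, 0].mean(), chain[:, 1].mean(), np.exp(chain[:, 2]).mean())
```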

18.
Seemann, Mark; Onur, Tuna; Cloutier-Fisher, Denise. Natural Hazards, 2011, 58(3): 1253-1273
Comprehensive risk assessments are fundamental to effective emergency management. These assessments need to identify the range of hazards (or perils) an entity is exposed to and quantify the specific threats associated with each of those hazards. While hazard identification is commonly, if not formally, conducted in most circumstances, specific threat analysis is often overlooked for a variety of reasons, one of which is poor communication with subject matter experts. This poor communication is often attributable to an adherence to scientific jargon and missed opportunities to simplify information. In Canada, for example, earthquake hazard calculations have been readily available to engineers and scientists for decades. This hazard information, however, is expressed in terms of peak ground accelerations (PGA) or spectral accelerations (SA) that are foreign concepts to most emergency managers, community decision-makers and the public-at-large. There is, therefore, a need to more clearly, simply and effectively express seismic hazard information to the non-scientific community. This paper provides crustal, sub-crustal and subduction interface earthquake shaking probabilities, expressed as simple percentages for each of 57 locations across Vancouver Island, British Columbia, Canada. Calculations present the likelihood of earthquake shaking on Vancouver Island as the probabilities of exceeding each of three shaking intensity thresholds (“widely felt”; onset of “non-structurally damaging” shaking; and onset of “structurally damaging” shaking) over four timeframes (10, 25, 50 and 100 years). Results are based on the latest Geological Survey of Canada hazard models used for the 2010 national building code and are presented in both tabular and graphic formats. This simplified earthquake hazard information is offered to aid local residents, organizations and governments in understanding and assessing their risk and to encourage and facilitate sound earthquake preparedness funding decisions.  相似文献   
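Under the usual Poisson assumption, converting an annual exceedance rate into the kind of timeframe probabilities reported above is a one-line calculation; the rate in the sketch below is an assumed example, not a Vancouver Island value.

```python
import numpy as np

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance within a timeframe, assuming
    Poisson occurrence of shaking above the intensity threshold."""
    return 1.0 - np.exp(-annual_rate * np.asarray(years, dtype=float))

# assumed annual rate at which a "widely felt" intensity is exceeded
rate = 1.0 / 40.0
for t in (10, 25, 50, 100):
    print(f"{t:>3d} years: {100 * exceedance_probability(rate, t):.0f}%")
```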

19.
Hu Guohua, Xia Jun. Journal of Glaciology and Geocryology (冰川冻土), 2002, 24(4): 433-437
Based on probability theory and grey system theory, and using the basic concepts of grey probability, grey probability distributions, grey expectation and grey variance, a quantification method for non-sudden (chronic) environmental risk based on grey probability is established to address both the random uncertainty and the grey uncertainty of environmental systems. Non-sudden environmental risk is attributed to these two kinds of uncertainty; the distributions of the variables governing environmental capacity and environmental load consumption are treated as grey probability distributions, and a risk measure in grey-probability form is used to quantify the non-sudden failure risk of the environmental system. Finally, the grey-probability risk measure is converted into an ordinary system failure risk rate and computed with an improved first-order second-moment (FOSM) method. As an example, the method is applied to estimate the organic pollution risk of the Cangxi reach of the Jialing River.
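The final first-order second-moment step can be sketched as below for a capacity-load margin with assumed normal statistics; the grey-probability treatment that precedes it in the paper, and the "improved" variant of FOSM, are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def fosm_risk(mean_capacity, sd_capacity, mean_load, sd_load):
    """First-order second-moment estimate of failure risk for the margin
    g = capacity - load, assuming independent normal variables."""
    mean_g = mean_capacity - mean_load
    sd_g = np.sqrt(sd_capacity**2 + sd_load**2)
    beta = mean_g / sd_g                     # reliability index
    return beta, norm.cdf(-beta)             # risk of load exceeding capacity

# assumed environmental capacity and load statistics for a river reach
beta, risk = fosm_risk(mean_capacity=100.0, sd_capacity=15.0,
                       mean_load=70.0, sd_load=20.0)
print(f"reliability index = {beta:.2f}, failure risk = {risk:.3f}")
```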

20.
The Monte Carlo Simulation (MCS) method has been widely used in the probabilistic analysis of slope stability, and it provides a robust and simple way to assess failure probability. However, the MCS method does not offer insight into the relative contributions of various uncertainties (e.g., inherent spatial variability of soil properties and subsurface stratigraphy) to the failure probability, and it suffers from a lack of resolution and efficiency at small probability levels. This paper develops a probabilistic failure analysis approach that makes use of the failure samples generated in the MCS and analyzes these samples to assess the effects of various uncertainties on the slope failure probability. The approach contains two major components: hypothesis tests for prioritizing the effects of the various uncertainties, and Bayesian analysis for further quantifying their effects. Equations are derived for the hypothesis tests and the Bayesian analysis. The probabilistic failure analysis requires a large number of failure samples from the MCS, and an advanced Monte Carlo Simulation method called Subset Simulation is employed to improve the efficiency of generating them. As an illustration, the proposed approach is applied to a design scenario of the James Bay Dyke. The hypothesis tests show that the uncertainty of the undrained shear strength of the lacustrine clay has the most significant effect on the slope failure probability, while the uncertainty of the clay crust thickness contributes the least. The effect of the former is then further quantified by a Bayesian analysis. Both the hypothesis test results and the Bayesian analysis results are validated against independent sensitivity studies. It is shown that probabilistic failure analysis provides results equivalent to those from additional sensitivity studies, while avoiding the additional computational time and effort of repeated MCS runs.
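A toy version of the two components, a hypothesis test and a Bayesian analysis of the failure samples, is sketched below with an invented performance function and input statistics; it is meant only to show how failure samples from MCS are post-processed, not to reproduce the paper's equations or the James Bay Dyke case.

```python
import numpy as np

rng = np.random.default_rng(5)

# toy stand-in for MCS of a slope: failure when a performance function g < 0
n = 200_000
su = rng.normal(35.0, 7.0, n)            # undrained shear strength, kPa (assumed)
thick = rng.normal(4.0, 0.5, n)          # clay crust thickness, m (assumed)
g = 0.9 * su + 2.0 * thick - 32.0        # invented performance function
fail = g < 0
p_f = fail.mean()

# hypothesis test: compare the failure-sample mean of each input with its
# unconditional mean; a larger |z| signals a stronger effect on failure
for name, v in (("undrained strength", su), ("crust thickness", thick)):
    z = (v[fail].mean() - v.mean()) / (v.std() / np.sqrt(fail.sum()))
    print(f"{name}: z = {z:.1f}")

# Bayesian analysis: P(failure | su in bin) = P(su in bin | failure) * P_f / P(su in bin)
bins = np.linspace(su.min(), su.max(), 21)
p_bin, _ = np.histogram(su, bins)
p_bin_fail, _ = np.histogram(su[fail], bins)
cond_pf = (p_bin_fail / fail.sum()) * p_f / (p_bin / n)
print("P(failure | su bin):", np.round(cond_pf, 3))
```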
