Similar Literature
20 similar documents retrieved (search time: 17 ms)
1.
杜克平  薛坤 《湖泊科学》2016,28(3):654-660
The radiative transfer equation for natural waters is a complex integro-differential equation that can only be solved numerically, e.g. by Monte Carlo ray tracing, invariant imbedding, or the discrete-ordinates method; among these, the Monte Carlo method is currently the only effective approach to the three-dimensional underwater light field problem. Based on radiative transfer theory, a Monte Carlo simulation model of the underwater light field was developed, comprising four modules: atmosphere, air–water interface, stratified water body, and bottom boundary. The model simulates the underwater light field for arbitrary solar angles, differing inherent optical properties of the water, and arbitrary depths, while accounting for the atmosphere, a rough water surface, and the bottom boundary, and it yields the spatial distributions of radiometric quantities such as radiance and irradiance. Raman scattering, polarization, and internal light sources are not yet considered. A GPU-accelerated version of the underwater light-field Monte Carlo simulation was implemented and validated against problems 1–6 of the ocean-optics standard problems proposed by Mobley et al. Comparing CPU and GPU run times and speed-up ratios under different boundary conditions in two computing environments shows that GPU computing achieves speed-ups of several hundred to over one thousand times.
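To make the core technique concrete, here is a minimal Python sketch of Monte Carlo photon tracing in a homogeneous water column. It illustrates the general method only, not the authors' four-module model: the optical coefficients, the Henyey-Greenstein phase function and the depth tally are all assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative inherent optical properties (assumed, not the paper's values)
a, b = 0.2, 0.3            # absorption and scattering coefficients [1/m]
c = a + b                  # beam attenuation coefficient
albedo = b / c             # single-scattering albedo
g = 0.9                    # Henyey-Greenstein asymmetry parameter
depth_bins = np.zeros(50)  # photon-weight tally per 1 m depth layer

for _ in range(20_000):
    z, mu, w = 0.0, 1.0, 1.0            # depth, direction cosine, weight
    while w > 1e-4 and 0.0 <= z < 50.0:
        z += mu * (-np.log(rng.random()) / c)   # path length ~ Exp(rate=c)
        if not (0.0 <= z < 50.0):
            break                                # photon leaves the column
        depth_bins[int(z)] += w                  # tally at interaction depth
        w *= albedo                              # absorb part of the weight
        # scattering: sample cos(theta) from the Henyey-Greenstein function
        u = rng.random()
        cos_t = (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * u))**2) / (2 * g)
        phi = 2 * np.pi * rng.random()
        mu = np.clip(mu * cos_t
                     + np.sqrt(1 - mu**2) * np.sqrt(1 - cos_t**2) * np.cos(phi),
                     -1.0, 1.0)

print("relative light-level proxy, top 5 m:",
      (depth_bins[:5] / depth_bins[0]).round(3))
```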

2.
A. Veihe  J. Quinton 《水文研究》2000,14(5):915-926
Knowledge about model uncertainty is essential for erosion modelling and provides important information when it comes to parameterizing models. In this paper a sensitivity analysis of the European soil erosion model (EUROSEM) is carried out using Monte Carlo simulation, suitable for complex non‐linear models, using time‐dependent driving variables. The analysis revealed some important characteristics of the model. The variability of the static output parameters was generally high, with the hydrologic parameters being the most important ones, especially saturated hydraulic conductivity and net capillary drive, followed by the percentage basal area among the hydrological and vegetation parameters, and detachability and cohesion among the soil erosion parameters. Overall, sensitivity to vegetation parameters was insignificant. The coefficient of variation for the sedigraph was higher than for the hydrograph, especially from the beginning of the rainstorm and up to the peak, and may explain difficulties encountered when trying to match simulated hydrographs and sedigraphs with observed ones. The findings from this Monte Carlo simulation call for improved within‐storm modelling of erosion processes in EUROSEM. Information about model uncertainty will be incorporated in a new EUROSEM user interface. Copyright © 2000 John Wiley & Sons, Ltd.
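The Monte Carlo sensitivity analysis described here can be sketched generically: sample the inputs from assumed distributions, run the model, and rank inputs by rank correlation with the output. EUROSEM itself is not callable here, so `toy_model` below is a hypothetical stand-in, as are the parameter ranges.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 5000

# Hypothetical input distributions (stand-ins, not EUROSEM's actual ranges)
ksat     = rng.lognormal(np.log(10.0), 0.5, size=n)  # sat. hydraulic conductivity [mm/h]
capdrive = rng.uniform(50, 300, size=n)              # net capillary drive [mm]
cohesion = rng.uniform(2, 12, size=n)                # soil cohesion [kPa]

def toy_model(ksat, capdrive, cohesion, rain=40.0):
    """Stand-in for an erosion-model run: returns peak sediment discharge."""
    runoff = np.maximum(rain - 0.05 * ksat - 0.01 * capdrive, 0.0)
    return runoff**1.5 / (1.0 + 0.3 * cohesion)

y = toy_model(ksat, capdrive, cohesion)

# Rank inputs by the strength of their monotone association with the output
for name, x in [("ksat", ksat), ("capdrive", capdrive), ("cohesion", cohesion)]:
    rho, _ = spearmanr(x, y)
    print(f"{name:9s} Spearman rho = {rho:+.2f}")
print(f"output CV = {y.std() / y.mean():.2f}")
```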

3.
High concentrations of ammonia in a river can cause fish kills and harm to other aquatic organisms. A simple water quality model is needed to predict the probability of ammonia concentration violations as compared to the US Environmental Protection Agency's ammonia criteria. A spreadsheet with Random Monte Carlo (RMC) simulations to model ammonia concentrations at the mixing point (between a river and the effluent of a wastewater treatment plant) was developed with the use of Microsoft Excel and Crystal Ball add-in software. The model uses effluent and river ammonia, alkalinity, and total carbonate data to determine the probability density functions (PDFs) for the Monte Carlo simulations. Normal, lognormal, exponential and uniform probability distributions were tested using the Chi-square method and the associated p-value to choose the best fit to the random data selected from the East Burlington wastewater treatment plant in North Carolina and the Clinch River in Tennessee. It is suggested that different options be tested with a minimum of three classes and a maximum of n/5 classes (n = number of data points), choosing the PDF with the highest probability (p-value). The results indicated that six violations of the EPA criterion for maximum concentration (CMC) were predicted when using 2000 RMC simulations and PDFs fitted to the available data, which violates the current criterion of no more than one violation over 3 years. All violations occur when the pH of the blend ranges from 8.0 to 9.0. No violations of the criterion for chronic concentration (CCC) were found using RMC.
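A hedged sketch of the distribution-fitting and violation-counting workflow the abstract describes: fit candidate PDFs, pick the one with the highest chi-square p-value, then draw 2000 Random Monte Carlo samples and count exceedances. The synthetic observations and the 5 mg/L CMC value are illustrative assumptions, not the study's data or criteria.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical effluent ammonia observations [mg/L] (illustrative, not plant data)
obs = rng.lognormal(np.log(1.2), 0.4, size=60)

def chi2_pvalue(data, dist, k=6):
    """Chi-square goodness of fit with k equal-probability classes
    (k between 3 and n/5, as the abstract suggests)."""
    params = dist.fit(data)
    edges = dist.ppf(np.linspace(0, 1, k + 1), *params)
    edges[0], edges[-1] = -np.inf, np.inf      # cover the full support
    observed, _ = np.histogram(data, bins=edges)
    expected = np.full(k, len(data) / k)
    # lose one extra degree of freedom per fitted parameter
    _, p = stats.chisquare(observed, expected, ddof=len(params))
    return p, params

candidates = {"lognorm": stats.lognorm, "norm": stats.norm, "expon": stats.expon}
best = max(candidates, key=lambda name: chi2_pvalue(obs, candidates[name])[0])
p, params = chi2_pvalue(obs, candidates[best])
print(f"best fit: {best} (p = {p:.2f})")

# 2000 Random Monte Carlo draws; count violations of an assumed CMC of 5 mg/L
draws = candidates[best].rvs(*params, size=2000, random_state=rng)
print("predicted CMC violations:", np.sum(draws > 5.0))
```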

4.
The goal of the presented research was the derivation of flood hazard maps, using Monte Carlo simulation of flood propagation at an urban site in the UK, specifically an urban area of the city of Glasgow. A hydrodynamic model describing the propagation of flood waves, based on the De Saint Venant equations in two‐dimensional form, capable of accounting for the topographic complexity of the area (preferential outflow paths, buildings, manholes, etc.) and for the prevailing imperviousness typical of urban areas, has been used to derive the hydrodynamic characteristics of flood events (i.e. water depths and flow velocities). The knowledge of the water depth distribution and of the current velocities derived from the propagation model, along with the knowledge of the topographic characteristics of the urban area from digital map data, allowed for the production of hazard maps based on properly defined hazard indexes. These indexes are evaluated in a probabilistic framework to overcome the classical problem of a single deterministic prediction of flood extent for the design event and to introduce the concept of the likelihood of flooding at a given point as the sum of data uncertainty, model structural error and parameterization uncertainty. Copyright © 2011 John Wiley & Sons, Ltd.

5.
A sensitivity analysis of the surface and catchment characteristics in the European soil erosion model (EUROSEM) was carried out with special emphasis on rills and rock fragment cover. The analysis focused on the use of Monte Carlo simulation but was supplemented by a simple sensitivity analysis in which input variables were increased and decreased by 10%. The study showed that rock fragments have a significant effect upon the static output parameters of total runoff, peak flow rate, total soil loss and peak sediment discharge, but with a high coefficient of variation. The same applied to the average hydrographs and sedigraphs, although the peak of the graphs was associated with a low coefficient of variation. On average, however, the model was able to simulate the effect of rock fragment cover quite well. The sensitivity analysis through the Monte Carlo simulation showed that the model is particularly sensitive to changes in parameters describing rills and the length of the plane when no rock fragments are simulated, but that the model is also sensitive to changes in the fraction of non‐erodible material and interrill slope when rock fragments are embedded in the topsoil. For rock fragments resting on the surface, changes in parameter values did not affect model output significantly. The simple sensitivity analysis supported the findings from the Monte Carlo simulation and illustrates the importance of choosing input parameters that describe both rills and rock fragment cover when modelling with EUROSEM. Copyright © 2000 John Wiley & Sons, Ltd.

6.
The expected head and standard deviation of the head from the first order Taylor series approximation are compared to Monte Carlo simulation, for steady flow in a confined aquifer with transmissivity as a random variable. Emphasis is on the effect of changes in the covariance structure of the transmissivity, and in pumping rates, on the errors in the first order Taylor series approximation. The accuracy of the first order Taylor series approximation is found to be particularly sensitive to pumping rates. With significant pumping the approximation is found to underestimate both the expected drawdown and the head variance, and the error increases as the pumping rate increases. This can lead to large errors in probability constraints based on moments from the first order Taylor series approximation.
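The comparison can be reproduced in miniature for a steady radial-flow head formula with random transmissivity. The configuration below (a Thiem-type head equation with lognormal T) is an assumed illustration, not the paper's aquifer model; it reproduces the qualitative finding that the first-order approximation understates both expected drawdown and head variance.

```python
import numpy as np

rng = np.random.default_rng(7)

# Thiem steady-state head at radius r (illustrative configuration):
# h(T) = H0 - Q/(2 pi T) * ln(R/r), with transmissivity T random
H0, Q, R, r = 50.0, 0.05, 500.0, 10.0   # m, m^3/s, m, m
mu_T, cv_T = 5e-3, 0.5                  # mean and CV of T [m^2/s]

def head(T):
    return H0 - Q / (2 * np.pi * T) * np.log(R / r)

# First-order Taylor series about the mean of T
dh_dT = Q * np.log(R / r) / (2 * np.pi * mu_T**2)   # dh/dT evaluated at mu_T
mean_taylor = head(mu_T)                            # first-order mean = h(E[T])
std_taylor = abs(dh_dT) * cv_T * mu_T               # |dh/dT| * std(T)

# Monte Carlo with lognormal T (a common choice for transmissivity)
sigma_ln = np.sqrt(np.log(1 + cv_T**2))
T = rng.lognormal(np.log(mu_T) - 0.5 * sigma_ln**2, sigma_ln, size=200_000)
h = head(T)

print(f"Taylor: mean = {mean_taylor:.3f} m, std = {std_taylor:.3f} m")
print(f"MC    : mean = {h.mean():.3f} m, std = {h.std():.3f} m")
# With stronger pumping (larger Q), the Taylor moments fall further below
# the Monte Carlo drawdown and variance, as the abstract reports.
```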

7.
Monte Carlo procedures were used to evaluate the effects of spatial variations in the values of the infiltration parameter on the results of the ANSWERS distributed runoff and erosion model. Simulation results obtained were compared with measured values. Field infiltration measurements indicated spatial correlation at much smaller distances than the size of an element. Therefore, at first only the error of the mean had to be taken into consideration for block infiltration rates. Consequently, not only single hydrographs were produced, but also error bands. Secondly, nine other hypothetical spatial correlation structures were also evaluated using Monte Carlo methods. In particular at low nugget variances, increasing spatial correlation of infiltration resulted in increasing coefficients of variation in model outputs. In general, rainstorms with low rainfall intensities were more difficult to simulate accurately than extreme events with high rainfall intensities. This is explained by the greater influence of the infiltration uncertainties at low rainfall intensities.
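Spatially correlated infiltration fields of the kind evaluated here can be generated by Cholesky factorization of an assumed covariance matrix. The sketch below uses a hypothetical exponential covariance with a nugget term and a stand-in runoff function (not ANSWERS) to show why low-intensity storms produce higher relative output variability.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D transect of 30 model elements, 10 m apart (illustrative grid)
x = np.arange(30) * 10.0
mean_K, var_K, nugget, corr_len = 12.0, 9.0, 1.0, 40.0   # assumed values

# Exponential covariance with a nugget on the diagonal
d = np.abs(x[:, None] - x[None, :])
C = (var_K - nugget) * np.exp(-d / corr_len) + nugget * np.eye(len(x))
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))        # jitter for stability

def toy_runoff(K, rain):
    """Stand-in for a distributed model: total Hortonian excess [mm/h]."""
    return np.maximum(rain - K, 0.0).sum()

for rain in (10.0, 40.0):          # low- vs high-intensity storm
    Q = np.array([toy_runoff(mean_K + L @ rng.standard_normal(len(x)), rain)
                  for _ in range(2000)])
    print(f"rain = {rain:4.0f} mm/h  CV of runoff = {Q.std() / Q.mean():.2f}")
```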

8.
Fragility curves represent the conditional probability that a structure's response may exceed the performance limit for a given ground motion intensity. Conventional methods for computing building fragilities are either based on statistical extrapolation of detailed analyses on one or two specific buildings or make use of Monte Carlo simulation with these models. However, the Monte Carlo technique usually requires a relatively large number of simulations to obtain a sufficiently reliable estimate of the fragilities, and it is computationally expensive and time consuming to run the required thousands of time history analyses. In this paper, a high‐dimensional model representation based response surface method, together with Monte Carlo simulation, is used to develop the fragility curve, which is then compared with that obtained by using Latin hypercube sampling. The response surface replaces the algorithmic performance‐function with an explicit functional relationship, fitting a functional approximation and thereby reducing the number of expensive numerical analyses. After the functional approximation has been made, Monte Carlo simulation is used to obtain the fragility curve of the system. Copyright © 2012 John Wiley & Sons, Ltd.
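The two-stage idea — fit a cheap response surface to a few expensive analyses, then run Monte Carlo on the surrogate — can be sketched as follows. The "expensive" drift function, the quadratic basis and the drift limit are all hypothetical, and a plain least-squares quadratic is used in place of the paper's more general HDMR formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical "expensive" structural analysis: peak drift as a function of
# ground-motion intensity im and a material strength factor s (stand-in only).
def expensive_drift(im, s):
    return 0.8 * im**1.2 / s + 0.05 * rng.standard_normal()

# 1) A small design-of-experiments grid to fit a quadratic response surface
X, Y = [], []
for im in np.linspace(0.1, 1.5, 8):
    for s in np.linspace(0.7, 1.3, 8):
        X.append([1, im, s, im * s, im**2, s**2])
        Y.append(expensive_drift(im, s))
coef, *_ = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)

def surrogate(im, s):
    """Explicit functional approximation replacing the expensive analysis."""
    return (coef[0] + coef[1] * im + coef[2] * s
            + coef[3] * im * s + coef[4] * im**2 + coef[5] * s**2)

# 2) Cheap Monte Carlo on the surrogate: P(drift > limit | IM)
limit = 0.9
for im in (0.4, 0.8, 1.2):
    s = rng.lognormal(0.0, 0.15, size=50_000)   # random strength factor
    print(f"IM = {im:.1f}: fragility = {np.mean(surrogate(im, s) > limit):.3f}")
```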

9.
Previous studies comparing sediment fingerprinting un-mixing models report large differences in their accuracy. The representation of tracer concentrations in source groups is perhaps the largest difference between published studies. However, the importance of decisions concerning the representation of tracer distributions has not been explored explicitly. Accordingly, potential sediment sources in four contrasting catchments were intensively sampled. Virtual sample mixtures were formed using between 10 and 100% of the retrieved samples to simulate sediment mobilization and delivery from subsections of each catchment. Source apportionment used models with a transformed multivariate normal distribution, normal distribution, 25th–75th percentile distribution and a distribution replicating the retrieved source samples. The accuracy and precision of model results were quantified and the reasons for differences were investigated. The 25th–75th percentile distribution produced the lowest mean inaccuracy (8.8%) and imprecision (8.5%), with the Sample Based distribution being next best (11.5%; 9.3%). The transformed multivariate (16.9%; 17.3%) and untransformed normal distributions (16.3%; 20.8%) performed poorly. When only a small proportion of the source samples formed the virtual mixtures, accuracy decreased with the 25th–75th percentile and Sample Based distributions, so that when <20% of source samples were used, the actual mixture composition infrequently fell outside of the range of uncertainty shown in un-mixing model outputs. Poor performance was due to the combined random Monte Carlo numbers generated for all tracers not being viable for the retrieved source samples. Trialling the use of a 25th–75th percentile distribution alongside alternatives may result in significant improvements in both the accuracy and precision of fingerprinting estimates, evaluated using virtual mixtures. Caution should be exercised when using a normal-type distribution without exploration of alternatives, as un-mixing model performance may be unacceptably poor.
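A minimal sketch of un-mixing with the best-performing 25th–75th percentile distribution: in each Monte Carlo iteration, a tracer vector is drawn uniformly between the source quartiles and the mixing proportion minimizing the relative misfit is recorded. The two sources, two tracers and all concentrations are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical tracer samples from two sources (columns = two tracers)
topsoil = rng.normal([30.0, 5.0], [4.0, 1.0], size=(40, 2))
channel = rng.normal([12.0, 9.0], [3.0, 1.5], size=(40, 2))
mixture = np.array([21.0, 7.0])     # virtual mixture to be apportioned

def sample_25_75(src):
    """Draw one tracer vector uniformly between the 25th and 75th percentiles."""
    q25, q75 = np.percentile(src, [25, 75], axis=0)
    return rng.uniform(q25, q75)

p_grid = np.linspace(0, 1, 201)     # candidate topsoil proportions
estimates = []
for _ in range(5000):
    a, b = sample_25_75(topsoil), sample_25_75(channel)
    pred = p_grid[:, None] * a + (1 - p_grid[:, None]) * b
    err = np.abs((pred - mixture) / mixture).sum(axis=1)   # relative misfit
    estimates.append(p_grid[err.argmin()])

estimates = np.array(estimates)
print(f"topsoil proportion: median = {np.median(estimates):.2f}, "
      f"IQR = ({np.percentile(estimates, 25):.2f}, "
      f"{np.percentile(estimates, 75):.2f})")
```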

10.
Entrainment of underlying bed sediment by a debris flow can significantly increase the debris‐flow magnitude. To study this phenomenon, a theoretical approach to assessing bed‐sediment entrainment is presented. The approach is based on a static approximation that bed‐sediment entrainment occurs when the shearing stress of the flow is sufficiently high to overcome the basal resistance of the bed sediment. In order to delineate erodible zones in a channel, we analyze the critical condition of this static equilibrium model, and subsequently propose a new concept of a critical line to detect the entrainment reaches in a channel. Considering the spatial and temporal uncertainties of the input parameters, the approach is further incorporated within a Monte Carlo method, and the distribution of entrainment zones and post‐entrainment volumes can be analyzed. This approach is illustrated by back‐analysis of the 2010 Yohutagawa debris‐flow event, Japan. Results from 10 000 trials of Monte Carlo simulation are compared with in situ surveys. It is shown that the present approach can be satisfactorily used to delineate erodible zones and estimate the possible entrainment volume of the event. Discussion regarding the sensitivities and limitations of the approach concludes the paper. Copyright © 2015 John Wiley & Sons, Ltd.

11.
Based on nine CGCM2 simulation experiments of the climate of the next 100 years, Monte Carlo analysis was used to simulate the response of lake water volume to climate change, and the associated probabilities, for Qinghai Lake, Daihai Lake and Hulun Lake and their catchments in the semi-arid region of China. Judging from the distributions in which the occurrence frequency of temperature increase exceeds 75% for the three periods 2020s, 2050s and 2080s, temperature will increase steadily by 2-5°C. The future increase in mean annual temperature will exceed the observational record of the past 50 years and is comparable in magnitude to the changes of warm periods during the past 10,000 years. Temperature and precipitation changes occurring with more than 75% frequency in the three periods will produce precipitation changes of -5% to +10% in the Qinghai Lake catchment, -7% to +5% in the Hulun Lake catchment, and +2% to +12% in the Daihai Lake catchment. Although the range of future annual precipitation totals does not exceed that of the instrumental record of the past 50 years, still less Holocene precipitation variability, the lake water volumes respond to climate change with greater variability than the climatic changes themselves. Under the simulated climate changes at the 75% probability level, the water volumes of the three lakes will change cumulatively by 30%-45%, with fluctuations within ±10%. Such rapid changes in lake water volume demand attention to, and vigilance over, the water resources situation of the near future.
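The lake-response Monte Carlo can be illustrated with a deliberately simplified annual water balance. The volumes, precipitation elasticity and evaporation sensitivity below are assumptions for the demo, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(21)

# Illustrative lake water-balance Monte Carlo (stand-in values)
V0      = 80.0    # present lake volume [km^3]
inflow0 = 1.6     # mean annual inflow [km^3/yr]
evap0   = 1.6     # mean annual net evaporative loss [km^3/yr]

n, years = 10_000, 80
dT = rng.uniform(2.0, 5.0, size=n)       # warming by the 2080s [deg C]
dP = rng.uniform(-0.05, 0.10, size=n)    # fractional precipitation change

# Assumed sensitivities: inflow scales with precipitation (elasticity 2),
# evaporative loss rises ~4% per degree of warming.
inflow = inflow0 * (1 + 2.0 * dP)
evap   = evap0 * (1 + 0.04 * dT)
change = years * (inflow - evap) / V0    # cumulative fractional volume change

print(f"P(|volume change| > 10%) = {np.mean(np.abs(change) > 0.10):.2f}")
print(f"median cumulative change = {np.median(change):+.0%}")
```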

12.
Permanent fault displacements (PFDs) caused by fault ruptures emerging at the surface are critical for the seismic design and risk assessment of continuous pipelines. They impose significant compressive and tensile strains on the pipe cross‐section at pipe-fault crossings. The complexity of fault rupture, inaccurate mapping of fault location and uncertainties in fault-pipe crossing geometries require probabilistic approaches for assessing the PFD hazard and mitigating pipeline failure risk against PFD. However, probabilistic approaches are currently neglected in the seismic design of pipelines. In light of these facts, this paper first assesses the probabilistic PFD hazard by using Monte Carlo‐based stochastic simulations whose theory and implementation are given in detail. The computed hazard is then used in a probabilistic risk assessment approach to calculate the failure probability of continuous pipelines under different PFD levels as well as different pipe cross‐section properties. Our probabilistic pipeline risk computations consider uncertainties arising from the complex fault rupture and geomorphology that result in inaccurate mapping of fault location and fault-pipe crossings. The results presented in this paper suggest the re‐evaluation of design provisions in current pipeline design guidelines to reduce the seismic risk of these geographically distributed structural systems. Copyright © 2016 John Wiley & Sons, Ltd.

13.
A six‐year monitoring programme characterized the migration/dispersion patterns of sediment slugs generated following typhoon‐induced disturbances in 1993 and 1997 along a single‐thread gravel‐bed stream, Oyabu Creek, on Kyushu Island, Japan. This laterally confined creek comprises riffle–pool sequences with intervening bedrock outcrops. The passage of sediment pulses associated with sediment slug processes reflected, and was controlled by, the riffle–pool structures which provided channel bed roughness, the volume of sediment stored along valley floors, and the distribution of bedrock outcrops. Changes to bed material size following major sediment inputs during the disturbance events also exerted an influence on subsequent sediment slug processes. The sequence of rainfall events, together with changes to channel bed structure, induced different phases in the sediment slug processes. The capacity of a reach to store or trap sediment, as recorded by the longitudinal structure of the channel, varied during these differing phases. Copyright © 2004 John Wiley & Sons, Ltd.

14.
Y. Chebud  A. Melesse 《水文研究》2013,27(10):1475-1483
Lake Tana is the largest fresh water body situated in the north‐western highlands of Ethiopia. In addition to its ecological services, it serves local transport, electric power generation, fishing, recreation, and dry season irrigation water supply. Evidence shows that the lake has dried at least once, at about 15,000–17,000 years before present, owing to a combination of high evaporation and low precipitation. Past attempts to understand and simulate the historical fluctuation of Lake Tana based on a simplistic water balance approach of inflow, outflow, and storage have failed to capture well‐known events of drawdown and rise of the lake that have happened in the last 44 years. This study tested different stochastic methods of lake level and volume simulation for supporting Lake Tana operational planning decision support. Three stochastic methods (a perturbations approach, Monte Carlo methods, and wavelet analysis) were employed for lake level and volume simulation, and the results were compared with the stage level measurements. Forty‐four years of daily, monthly, and mean annual lake level data showed Gaussian variation, with goodness of fit at the 0.01 significance level of the Kolmogorov–Smirnov test. The stochastic simulations predicted the lake stage level of the 1972, 1984, and 2002/2003 historical droughts 99% of the time. The information content (frequency) of the fluctuation of Lake Tana for various periods was resolved using Wigner's Time‐Frequency Decomposition method. The wavelet analysis agreed with the perturbations and Monte Carlo simulations in resolving the times (1970s, 1980s, and 2000s) in which low‐frequency, high‐spectral‐power fluctuation occurred. The Monte Carlo method showed its superiority for risk analysis over the perturbation and deterministic methods, whereas wavelet analysis reconstructed the historical record of lake stage level at daily and monthly time scales. Copyright © 2012 John Wiley & Sons, Ltd.
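The Kolmogorov–Smirnov screening and Monte Carlo stage simulation might look like the following sketch. The 44-year record is synthetic, the drought threshold is an assumption, and note that testing against fitted parameters is strictly a Lilliefors-type setting, so the p-value is approximate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)

# Stand-in for 44 years of annual mean lake stage levels [m a.s.l.]
levels = 1786.0 + rng.normal(0.0, 0.6, size=44)

# Kolmogorov-Smirnov test against a normal fit
mu, sd = levels.mean(), levels.std(ddof=1)
stat, p = stats.kstest(levels, "norm", args=(mu, sd))
print(f"K-S: D = {stat:.3f}, p = {p:.2f}"
      + (" -> Gaussian not rejected at 0.01" if p > 0.01 else " -> rejected"))

# Monte Carlo simulation of future stage levels from the fitted distribution
sims = rng.normal(mu, sd, size=(10_000, 10))   # 10k traces, 10 years each
drought_stage = mu - 2 * sd                    # assumed drought threshold
print(f"P(stage below drought threshold in any year) = "
      f"{np.mean((sims < drought_stage).any(axis=1)):.2f}")
```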

15.
Landslides constitute one of the major natural hazards that can cause significant losses of life and property. Mapping or delineating areas prone to landsliding is therefore essential for land‐use activities and management decision making in hilly or mountainous regions. A landslide hazard map can be constructed by a qualitative combination of maps of site conditions, including geology, topography and geomorphology, by statistical methods that correlate landslide occurrence with geologic and geomorphic factors, or by using safety factors from stability analysis. A landslide hazard map should provide information on both the spatial and temporal probabilities of landsliding in a certain area. However, most previous studies have focused on susceptibility mapping, rather than on hazard mapping in a spatiotemporal context. This study aims at developing a predictive model, based on both quasi‐static and dynamic variables, to determine the probability of landsliding in terms of space and time. The study area selected is about 13 km² in North Lantau, Hong Kong. The source areas of the landslides caused by the rainstorms of 18 July 1992 and 4–5 November 1993 were interpreted from multi‐temporal aerial photographs. Landslide data, lithology, digital elevation model data, land cover, and rainfall data were digitized into a geographic information system database. A logistic regression model was developed using lithology, slope gradient, slope aspect, elevation, slope shape, land cover, and rolling 24 h rainfall as independent variables, since the dependent variable could be expressed in a dichotomous way. This model achieved an overall accuracy of 87.2%, with 89.5% of landslide grid cells correctly classified, and was found to perform satisfactorily. The model was then applied to rainfalls with a variety of return periods to predict the probability of landsliding on natural slopes in space and time. It is observed that the modelling techniques described here are useful for predicting the spatiotemporal probability of landsliding and can be used by land‐use planners to develop effective management strategies. Copyright © 2003 John Wiley & Sons, Ltd.
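A logistic-regression susceptibility model of this general form is straightforward to sketch. The grid-cell predictors, generating coefficients and design storm below are synthetic stand-ins, not the Lantau dataset or the paper's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(17)

# Synthetic grid-cell data (illustrative only): slope [deg], 24 h rainfall [mm]
n = 4000
slope = rng.uniform(5, 55, size=n)
rain24 = rng.gamma(2.0, 40.0, size=n)
# Hidden generating rule used only to label the demo cells
logit = -7.0 + 0.12 * slope + 0.03 * rain24
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = landslide source cell

X = np.column_stack([slope, rain24])
model = LogisticRegression(max_iter=1000).fit(X, y)
print(f"training accuracy: {model.score(X, y):.1%}")

# Probability of landsliding for one cell under a design rainstorm
cell = np.array([[35.0, 180.0]])               # 35 deg slope, 180 mm in 24 h
print(f"P(landslide) = {model.predict_proba(cell)[0, 1]:.2f}")
```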

16.
Increasing recognition of the deleterious environmental effects of excessive fine sediment delivery to watercourses means that reliable sediment source assessment represents a fundamental component of catchment planning targeting the protection of freshwater resources and their ecological integrity. Sediment tracing or fingerprinting approaches have been increasingly used to provide catchment scale sediment source information, but there is a need to continue refining existing procedures especially with respect to uncertainty analysis during mass balance modelling. Consequently, an updated Monte Carlo numerical modelling framework was devised and tested, incorporating both conventional and robust statistics coupled with random and Latin Hypercube Sampling (LHS) together with local and genetic algorithm (GA) optimisation. A sediment sourcing study undertaken in the River Axe catchment, southwest England, suggested that the use of robust statistics and LHS with GA optimisation generated the best performance with respect to predicting measured bed sediment geochemistry in six out of eight model applications. On this basis, the catchment‐wide average median sediment source contributions were predicted to be 38 ± 1% (pasture topsoils), 3 ± 1% (cultivated topsoils), 37 ± 1% (damaged road verges) and 22 ± 1% (channel banks/subsurface sources). Using modelling frameworks which provide users with flexibility to compare local and global optimisation during uncertainty analysis is recommended for future sediment tracing studies. Copyright © 2011 John Wiley & Sons, Ltd.
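Latin Hypercube Sampling, one ingredient of this framework, is available directly in SciPy. The sketch below compares its marginal coverage with plain random sampling for assumed tracer-concentration ranges; the discrepancy measure is a standard evenness metric (lower is more even).

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(19)
n, dim = 500, 4   # 500 Monte Carlo iterations, 4 tracer parameters (illustrative)

# LHS stratifies each marginal into n equal-probability slices, so fewer
# model runs are needed for even coverage than with plain random sampling.
lhs_unit = qmc.LatinHypercube(d=dim, seed=19).random(n)
rand_unit = rng.random((n, dim))

# Scale the unit hypercube to assumed tracer-concentration ranges
lower = np.array([10.0, 0.5, 200.0, 1.0])
upper = np.array([60.0, 4.0, 900.0, 8.0])
lhs_samples = qmc.scale(lhs_unit, lower, upper)
print("first scaled sample:", lhs_samples[0].round(1))

print(f"LHS discrepancy    : {qmc.discrepancy(lhs_unit):.5f}")
print(f"random discrepancy : {qmc.discrepancy(rand_unit):.5f}")
```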

17.
Pulsed neutron gamma-ray spectroscopy logging uses multiple detectors to acquire inelastic-scattering gamma spectra, thermal-neutron capture gamma spectra and gamma time spectra, and determines saturation from the C/O ratio and the formation macroscopic capture cross-section; the information contained in the capture gamma spectrum, however, has not been well exploited. Where formation-water salinity is high, the ratio of the total gamma counts in the Si, Ca and H energy windows of the capture spectrum to the counts in the Fe window depends on the chlorine content, so determining water saturation from the capture gamma spectrum is feasible. MCNP (Monte Carlo) modelling was used to study how this three-window to Fe-window count ratio varies with saturation, lithology, porosity, shale content, borehole water holdup, formation-water salinity and borehole casing, providing theoretical support for the feasibility of determining saturation from capture gamma spectra in high-salinity regions.

18.
The conventional integral approach is very well established in probabilistic seismic hazard assessment (PSHA). However, Monte‐Carlo (MC) simulations can become an efficient and flexible alternative to conventional PSHA when more complicated factors (e.g. spatial correlation of ground shaking) are involved. This study aims at showing the implementation of MC simulation techniques for computing the annual exceedance rates of dynamic ground‐motion intensity measures (GMIMs) (e.g. peak ground acceleration and spectral acceleration). We use a multi‐scale random field technique to incorporate spatial correlation and near‐fault directivity while generating MC simulations to assess the probabilistic seismic hazard of dynamic GMIMs. Our approach is capable of producing conditional hazard curves as well. We show various examples to illustrate the potential use of the proposed procedures in the hazard and risk assessment of geographically distributed structural systems. Copyright © 2015 John Wiley & Sons, Ltd.
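A bare-bones Monte Carlo PSHA loop — Poisson occurrences, truncated Gutenberg–Richter magnitudes, and a toy ground-motion relation with lognormal variability — shows how annual exceedance rates of a GMIM fall out of simulation counts. Every numerical value here is illustrative, and the spatial correlation and directivity that are the paper's focus are omitted.

```python
import numpy as np

rng = np.random.default_rng(23)

# Single source, single site (all parameters assumed for the demo)
years = 100_000            # simulated catalogue length
rate = 0.2                 # events/yr with M >= m_min on the source
b, m_min, m_max = 1.0, 5.0, 7.5
dist_km = 30.0             # source-to-site distance

n_eq = rng.poisson(rate * years)
# Truncated Gutenberg-Richter magnitudes via inverse-CDF sampling
beta = b * np.log(10)
u = rng.random(n_eq)
mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

# Toy GMPE: ln PGA[g] = -3.5 + 0.9*M - 1.1*ln(R + 10) + 0.6*eps
ln_pga = (-3.5 + 0.9 * mags - 1.1 * np.log(dist_km + 10.0)
          + 0.6 * rng.standard_normal(n_eq))
pga = np.exp(ln_pga)

for level in (0.1, 0.2, 0.4):
    lam = np.sum(pga > level) / years          # annual exceedance rate
    print(f"PGA > {level:.1f} g: rate = {lam:.2e} /yr"
          + (f" (return period ~ {1/lam:,.0f} yr)" if lam > 0 else ""))
```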

19.
Probabilistic thresholds for triggering shallow landslides by rainfall are developed using two approaches: a logistic regression model and Iverson's physically based model. Both approaches are applied to a 180 km² area in northern Italy. For the physically based model, a Monte Carlo approach is used to obtain probabilities of slope failure associated with differing combinations of rainfall intensity and duration as well as differing topographic settings. For the logistic regression model, hourly and daily rainfall data and split‐sample testing are used to explore the effect of antecedent rainfall on triggering thresholds. It is demonstrated that both the statistical and physically based models provide stochastic thresholds that express the probability of landslide triggering. The resulting thresholds are comparable, even though the two approaches are conceptually different. The physically based model also provides an estimate of the percentage of potentially unstable areas in which failure can be triggered with a certain probability. The return period of rainfall responsible for landslide triggering is studied by using a Gumbel scaling model of rainfall intensity–duration–frequency curves. It is demonstrated that antecedent rainfall must be taken into account in landslide forecasting, and a method is proposed to correct the rainfall return period by filtering the rainfall maxima with a fixed threshold of antecedent rainfall. This correction produces an increase in the return periods, especially for rainstorms of short duration. Copyright © 2009 John Wiley & Sons, Ltd.
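The Gumbel return-period calculation underlying the intensity–duration–frequency analysis can be sketched directly with SciPy; the annual-maximum record below is synthetic, and the filtering step is noted only in a comment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(29)

# Stand-in annual-maximum 24 h rainfall depths [mm] (illustrative record)
annual_max = stats.gumbel_r.rvs(loc=60.0, scale=18.0, size=50, random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max)

def return_period(depth_mm):
    """T = 1 / (1 - F(x)) for the fitted Gumbel distribution."""
    return 1.0 / stats.gumbel_r.sf(depth_mm, loc, scale)

for x in (80.0, 110.0, 140.0):
    print(f"{x:5.0f} mm/24h: return period ~ {return_period(x):6.1f} yr")

# Filtering the annual maxima by an antecedent-rainfall threshold, as the
# abstract proposes, removes some maxima and lengthens the fitted return
# periods, most strongly for short-duration storms.
```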

20.
Much research has been conducted on physics‐based ground‐motion simulation to reproduce the seismic response of soil and structures precisely and to mitigate damage caused by earthquakes. We aimed at enabling physics‐based ground‐motion simulations of complex three‐dimensional (3D) models with multiple materials, such as a digital twin (a high‐fidelity 3D model of the physical world constructed in cyberspace). Performing a single such simulation carries a high computational cost, and many simulations are needed for parameter estimation or to account for the uncertainty of underground soil structure data. To overcome this problem, we proposed a fast simulation method using graphics processing unit (GPU) computing that enables such simulations with modest computational resources. We developed a finite‐element‐based method for large‐scale 3D seismic response analysis with small programming effort and high maintainability by using OpenACC, a directive‐based parallel programming model. A lower‐precision variable format was introduced to speed up the simulation further. As an example application, we applied the developed method to soil liquefaction analysis and conducted two sets of simulations comparing the effect of countermeasures against soil liquefaction: grid‐form ground improvement to strengthen the earthquake resistance of existing houses, and replacement of liquefiable backfill soil of river wharves for seismic reinforcement of the wharf structure. The developed method accelerates the simulation and enables us to quantitatively estimate the effect of countermeasures using high‐fidelity 3D soil‐structure models on a small cluster of computers.
