Similar Documents
20 similar documents were retrieved.
1.
Virtual California: Fault Model, Frictional Parameters, Applications (total citations: 1; self-citations: 0; cited by others: 1)
Virtual California is a topologically realistic simulation of the interacting earthquake faults in California. Inputs to the model arise from field data, and typically include realistic fault system topologies, realistic long-term slip rates, and realistic frictional parameters. Outputs from the simulations include synthetic earthquake sequences and space-time patterns, together with associated surface deformation and strain patterns that are similar to those seen in nature. Here we describe details of the data assimilation procedure we use to construct the fault model and to assign frictional properties. In addition, by analyzing the statistical physics of the simulations, we can show that the frictional failure physics, which includes a simple representation of a dynamic stress intensity factor, leads to self-organization of the statistical dynamics and produces empirical statistical distributions (probability density functions: PDFs) that characterize the activity. One type of distribution that can be constructed from empirical measurements of simulation data is the PDF of recurrence intervals on selected faults. Inputs to simulation dynamics are based on the use of time-averaged event-frequency data, and outputs include PDFs representing measurements of dynamical variability arising from fault interactions and space-time correlations. As a first step toward productively using model-based methods for earthquake forecasting, we propose that simulations be used to generate the PDFs for recurrence intervals instead of the usual practice of basing the PDFs on standard forms (Gaussian, log-normal, Pareto, Brownian passage time, and so forth). Subsequent development of simulation-based methods should include model enhancement, data assimilation and data mining methods, and analysis techniques based on statistical physics.
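As a minimal illustration of the recurrence-interval idea above: given a synthetic event catalogue for one fault segment, an empirical PDF of recurrence intervals, and conditional probabilities derived from it, can be computed directly from the inter-event times. The catalogue below is randomly generated for illustration only; it is not Virtual California output, and all numbers are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical synthetic catalogue: event times (years) on one fault segment.
# A real study would use simulator output (e.g., Virtual California) instead.
event_times = np.cumsum(rng.gamma(shape=4.0, scale=50.0, size=2000))

# Recurrence intervals are the inter-event times.
intervals = np.diff(event_times)

# Empirical PDF: normalized histogram of recurrence intervals.
density, edges = np.histogram(intervals, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Conditional probability of an event within the next 30 years,
# given that t years have already elapsed since the last event.
def conditional_prob(t, horizon=30.0):
    survived = intervals > t
    if survived.sum() == 0:
        return np.nan
    return np.mean(intervals[survived] <= t + horizon)

print("Mean recurrence interval: %.1f years" % intervals.mean())
print("P(event within 30 yr | 100 yr elapsed) = %.2f" % conditional_prob(100.0))
```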

2.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment (total citations: 15; self-citations: 4; cited by others: 11)
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of the exposure variables and of the uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo Analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis the variables of the risk equation, along with the parameters of these variables (for example, the mean and standard deviation of a normal distribution), are described in terms of probability density functions (PDFs). A variable described in this way is called a second-order random variable. Significant data or considerable insight into the uncertainty associated with these variables is necessary to develop appropriate PDFs for these random parameters. Typically, the available data, and their accuracy and reliability, are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods that propagate uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the utilization of incomplete information (vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we propose a 2D Fuzzy Monte Carlo Analysis (2D FMCA) to overcome this difficulty. In this approach, instead of describing the parameters of the PDFs used in defining the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. We provide a comparison of these two approaches relative to their computational requirements, data requirements, and data availability. For a hypothetical case, we also provide a comparative interpretation of the results generated.
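A sketch of the nested (two-dimensional) Monte Carlo idea for a generic intake-based risk equation: the outer loop samples the uncertain parameters of the exposure PDFs, the inner loop samples inter-individual variability. All distributions and values are hypothetical placeholders, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_outer, n_inner = 200, 5000  # uncertainty loop, variability loop

percentiles = []
for _ in range(n_outer):
    # Outer loop: sample the (uncertain) parameters of the exposure PDFs.
    mean_conc = rng.normal(1.0, 0.2)      # mg/L, hypothetical
    sd_conc = rng.uniform(0.1, 0.4)
    mean_ir = rng.normal(2.0, 0.3)        # L/day, hypothetical

    # Inner loop: sample inter-individual variability.
    conc = rng.lognormal(np.log(max(mean_conc, 1e-6)), sd_conc, n_inner)
    intake_rate = rng.normal(mean_ir, 0.5, n_inner).clip(min=0)
    body_weight = rng.normal(70.0, 12.0, n_inner).clip(min=20)
    ef, ed, at = 350.0, 30.0, 30.0 * 365.0  # days/yr, yr, days

    add = conc * intake_rate * ef * ed / (body_weight * at)  # average daily dose
    percentiles.append(np.percentile(add, 95))

# Spread of the 95th-percentile dose across the outer loop reflects
# parameter uncertainty; the inner loop reflects population variability.
print("95th-percentile dose: median %.3g, 90%% CI (%.3g, %.3g) mg/kg-day"
      % (np.median(percentiles), np.percentile(percentiles, 5),
         np.percentile(percentiles, 95)))
```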

3.
As Part II of a sequence of two papers, the L-moments previously developed by Hosking (1990) and the LH-moments of Wang (1997) are investigated further. The LH-moments (L to L4) are used to develop the regional parameters of the generalized extreme value, generalized Pareto (GPA), and generalized logistic (GLO) distributions. These probability distribution functions (PDFs) are evaluated in terms of their performance. Flood peaks estimated by the corresponding PDFs are compared with those generated by Monte Carlo simulation of randomized data, considering the respective LH-moments. The influence of the LH-moments on the estimated PDFs is studied by evaluating the relative bias (RBIAS) in quantile estimation due to variability of the k parameter. The Karkhe watershed, located in western Iran, was used as the case study area; Part I of this study divided it into regions A and B. The minimum calculated relative root mean square error (RRMSE) and RBIAS between simulated flood peaks and the flood peaks given by the corresponding PDFs were used in PDF selection, considering the respective LH-moments. The boxplots of the RRMSE tests identified the L3 level of the GPA distribution as the suitable PDF for sample sizes of 20 and 80 in region A; similar results were found for the RBIAS test. For region B, the boxplots of the RRMSE tests indicated similar results for the three PDFs, but the boxplots of the RBIAS tests identified the L4 level of the GLO as most suitable for sample sizes of 20 and 80. Relative efficiencies of the LH-moments were investigated, measured as RRMSE ratios of the L-moments over the respective LH-moments. For the most part, the findings of this part of the study were similar to those of Part I.
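The two selection metrics named above (RRMSE and RBIAS) are straightforward to compute; a minimal sketch with generic quantile arrays (not the Karkhe data) follows.

```python
import numpy as np

def rrmse(simulated, reference):
    """Relative root mean square error between quantile estimates."""
    rel_err = (simulated - reference) / reference
    return np.sqrt(np.mean(rel_err ** 2))

def rbias(simulated, reference):
    """Relative bias between quantile estimates."""
    return np.mean((simulated - reference) / reference)

# Hypothetical flood quantiles (m^3/s) for a handful of return periods.
reference = np.array([120.0, 180.0, 250.0, 340.0, 430.0])
fitted = np.array([115.0, 185.0, 265.0, 330.0, 450.0])

print("RRMSE = %.3f, RBIAS = %.3f" % (rrmse(fitted, reference), rbias(fitted, reference)))
```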

4.
In many fields of study, and certainly in hydrogeology, uncertainty propagation is a recurring subject. Usually, parametrized probability density functions (PDFs) are used to represent data uncertainty, which limits their use to particular distributions. Often, this problem is solved by Monte Carlo simulation, with the disadvantage that a large number of calculations is needed to achieve reliable results. In this paper, a method is proposed based on a piecewise linear approximation of PDFs. Uncertainty propagation with these discretized PDFs is distribution independent. The method is applied to the upscaling of transmissivity data and is carried out in two steps: the vertical upscaling of conductivity values from borehole data to aquifer scale, and the spatial interpolation of the transmissivities. The result of the first step is complete PDFs of the transmissivities at the borehole locations, reflecting the uncertainties in the conductivities and the layer thicknesses. The second step results in a spatially distributed transmissivity field with a complete PDF at every grid cell. We argue that the proposed method is applicable to a wide range of uncertainty propagation problems.
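A toy version of the discretized-PDF idea: uncertainty is propagated through T = K × d (conductivity times layer thickness) by combining two discretized marginal PDFs without assuming any parametric form. The histogram discretization used here is a simpler stand-in for the paper's piecewise linear scheme, and the input distributions are invented.

```python
import numpy as np

def discretize(samples, n_bins=60):
    """Approximate a PDF by bin centres and probability masses."""
    prob, edges = np.histogram(samples, bins=n_bins, density=False)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres, prob / prob.sum()

rng = np.random.default_rng(1)

# Hypothetical marginals: conductivity K (m/day) and layer thickness d (m).
k_vals, k_prob = discretize(rng.lognormal(mean=0.0, sigma=0.8, size=50_000))
d_vals, d_prob = discretize(rng.uniform(5.0, 15.0, size=50_000))

# Propagate T = K * d by enumerating all bin combinations (independence assumed).
t_vals = np.outer(k_vals, d_vals).ravel()
t_prob = np.outer(k_prob, d_prob).ravel()

# Re-bin onto a regular grid to obtain the complete PDF of T.
grid = np.linspace(t_vals.min(), t_vals.max(), 80)
t_pdf, _ = np.histogram(t_vals, bins=grid, weights=t_prob, density=True)

mean_t = np.sum(t_vals * t_prob)
print("Mean transmissivity %.2f m^2/day" % mean_t)
```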

5.
This paper proposes an approach to estimating the uncertainty related to the EPA Storm Water Management Model (SWMM) parameters percentage routed (PR) and saturated hydraulic conductivity (Ksat), which are used to calculate stormwater runoff volumes. The methodology addresses uncertainty by developing probability distributions for urban hydrologic parameters through extensive calibration to observed flow data in the Philadelphia collection system. The established probability distributions are then applied to the Philadelphia Southeast district model through a Monte Carlo approach to estimate the uncertainty in predicted combined sewer overflow volumes attributable to hydrologic model parameter estimation. Understanding urban hydrology is critical to defining urban water resource problems. A variety of land use types within Philadelphia, coupled with a history of cut and fill, has resulted in a patchwork of urban fill and native soils. The complexity of urban hydrology can make model parameter estimation and defining model uncertainty a difficult task. The development of probability distributions for hydrologic parameters applied through Monte Carlo simulations provided a significant improvement in estimating model uncertainty over traditional model sensitivity analysis. Copyright © 2013 John Wiley & Sons, Ltd.

6.
With the potentially devastating consequences of flooding, it is crucial that uncertainties in the modelling process are quantified in flood simulations. In this paper, the impact of uncertainties in design losses on peak flow estimates is investigated. Simulations were carried out using a conceptual rainfall–runoff model called RORB in four catchments along the east coast of New South Wales, Australia. Monte Carlo simulation was used to evaluate parameter uncertainty in design losses for three loss models (initial loss–continuing loss, initial loss–proportional loss, and a soil water balance model). The results show that the uncertainty originating from each loss model differs and can be quite significant in some cases. The uncertainty in the initial loss–proportional loss model was found to be the highest, with estimates up to 2.2 times the peak flow, whilst the uncertainty in the soil water balance model was significantly less, with up to 60% variability in peak flows for an annual exceedance probability of 0.02. Applying Monte Carlo simulation gives a better understanding of the predicted flows, thus providing further support for planning and managing river systems.
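A minimal sketch of propagating loss-parameter uncertainty for an initial loss–continuing loss model applied to a design storm. The hyetograph, the assumed parameter distributions, and the use of peak rainfall excess as a crude stand-in for routed peak flow are all illustrative assumptions; this is not the RORB setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical design hyetograph: rainfall depth (mm) per 1-h step.
rain = np.array([2.0, 6.0, 14.0, 28.0, 18.0, 9.0, 4.0, 1.0])

def rainfall_excess(rain, initial_loss, continuing_loss):
    """Initial loss - continuing loss model; returns excess per step (mm)."""
    excess = np.zeros_like(rain)
    remaining_il = initial_loss
    for i, r in enumerate(rain):
        absorbed = min(r, remaining_il)
        remaining_il -= absorbed
        excess[i] = max(0.0, r - absorbed - continuing_loss)
    return excess

# Assumed parameter distributions (mm and mm/h).
il_samples = rng.normal(25.0, 8.0, 5000).clip(min=0)
cl_samples = rng.lognormal(np.log(2.5), 0.4, 5000)

peaks = np.array([rainfall_excess(rain, il, cl).max()
                  for il, cl in zip(il_samples, cl_samples)])

print("Peak rainfall excess: median %.1f mm/h, 90%% interval (%.1f, %.1f) mm/h"
      % (np.median(peaks), np.percentile(peaks, 5), np.percentile(peaks, 95)))
```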

7.
The Aki-Utsu maximum likelihood method is widely used for estimating the Gutenberg-Richter b-value, but not all authors are conscious of the method's limitations and implicit requirements. The Aki-Utsu method requires a representative estimate of the population mean magnitude, a requirement seldom satisfied in b-value studies, particularly in those that use data from small geographic and/or time windows, such as b-mapping and b-vs-time studies. Monte Carlo simulation methods are used to determine how large a sample is necessary to achieve representativity, particularly for rounded magnitudes. The size of a representative sample depends weakly on the actual b-value. It is shown that, for commonly used precisions, small samples give meaningless estimates of b. Our results estimate the probability of obtaining a correct estimate of b, to a given desired precision, for samples of different sizes. We submit that all published studies reporting b-value estimations should include information about the size of the samples used.
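The sample-size question can be explored with a short Monte Carlo experiment: draw Gutenberg-Richter magnitudes, round them to a given precision, estimate b with the Aki-Utsu formula (including Utsu's half-bin correction for rounded magnitudes), and record how often the estimate lands within a desired tolerance. The true b, cutoff magnitude, rounding interval, and tolerance below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def b_estimate(mags, m_min, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value with Utsu's correction for binning."""
    return np.log10(np.e) / (mags.mean() - (m_min - dm / 2.0))

def fraction_within(n, b_true=1.0, m_min=2.0, dm=0.1, tol=0.05, trials=2000):
    """Fraction of samples of size n whose b estimate is within +/- tol of b_true."""
    beta = b_true * np.log(10.0)
    hits = 0
    for _ in range(trials):
        mags = m_min + rng.exponential(1.0 / beta, size=n)
        mags = np.round(mags / dm) * dm          # rounded (binned) magnitudes
        if abs(b_estimate(mags, m_min, dm) - b_true) <= tol:
            hits += 1
    return hits / trials

for n in (25, 100, 400, 1600):
    print("n = %4d  P(|b_hat - b| <= 0.05) = %.2f" % (n, fraction_within(n)))
```

For the small sample sizes often used in b-mapping, the fraction of estimates within tolerance drops sharply, which mirrors the warning in the abstract.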

8.
In weather forecasting, current and past observational data are routinely assimilated into numerical simulations to produce ensemble forecasts of future events in a process termed “model steering”. Here we describe a similar approach that is motivated by analyses of previous forecasts of the Working Group on California Earthquake Probabilities (WGCEP). Our approach is adapted to the problem of earthquake forecasting using topologically realistic numerical simulations for the strike-slip fault system in California. By systematically comparing simulation data to observed paleoseismic data, a series of spatial probability density functions (PDFs) can be computed that describe the probable locations of future large earthquakes. We develop this approach and show examples of PDFs associated with magnitude M > 6.5 and M > 7.0 earthquakes in California.

9.
The correlation between the b-values of acoustic emissions (AEs) and the phase of the moon was investigated at the Underground Research Laboratory (URL) in Canada. The same data as those used in Iwata (2002), which showed that the occurrence of AEs is correlated with the phase of the moon, were examined. It was expected, therefore, that the b-value of the AEs would also be sensitive to tidal stress/strain fluctuations. We investigated the variation of the b-values as a function of the phase of the moon. Results show that b-values immediately following the times of full/new moon are higher than those at other times. Using the AIC (Akaike Information Criterion) and random (Monte Carlo) simulations, it was confirmed that this feature is statistically significant. We also investigated whether there was a change in the b-values immediately before the times of full/new moon, but no statistically significant change was observed. The results suggest that the effect of stress/strain fluctuations on AE occurrences at the URL is asymmetric about the times of full/new moon.

10.
In risk analysis, a complete characterization of the concentration distribution is necessary to determine the probability of exceeding a threshold value. The most popular method for predicting the concentration distribution is Monte Carlo simulation, which samples the cumulative distribution function with a large number of repeated operations. In this paper, we first review the three most commonly used Monte Carlo (MC) techniques: standard Monte Carlo, Latin hypercube sampling, and quasi-Monte Carlo. The performance of these three MC approaches is investigated. We then apply the stochastic collocation method (SCM) to risk assessment. Unlike the MC simulations, the SCM does not require a large number of simulations of the flow and solute equations. In particular, the sparse grid collocation method and the probabilistic collocation method are employed to represent the concentration in terms of polynomials and unknown coefficients. The sparse grid collocation method takes advantage of Lagrange interpolation polynomials, while the probabilistic collocation method relies on polynomial chaos expansions. In both methods, the stochastic equations are reduced to a system of decoupled equations, which can be solved with existing solvers and whose results are used to obtain the expansion coefficients. The cumulative distribution function is then obtained by sampling the approximate polynomials. Our synthetic examples show that, among the MC methods, quasi-Monte Carlo gives the smallest variance for the predicted threshold probability owing to its superior convergence properties, and that the stochastic collocation method is an accurate and efficient alternative to MC simulations.
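A compact way to see the convergence differences among the three sampling strategies is to estimate the same threshold-exceedance probability with each and compare the spread over repeated runs. The toy concentration function and threshold are placeholders; the Latin hypercube and Sobol' samplers come from scipy's quasi-Monte Carlo module.

```python
import numpy as np
from scipy.stats import qmc, norm

def concentration(u):
    """Toy response: lognormal-like concentration from two uniform inputs in [0,1]."""
    z1, z2 = norm.ppf(u[:, 0]), norm.ppf(u[:, 1])
    return np.exp(0.6 * z1 + 0.3 * z2)

def exceedance(u, threshold=2.5):
    return np.mean(concentration(u) > threshold)

n, repeats = 1024, 50
rng = np.random.default_rng(11)
estimates = {"MC": [], "LHS": [], "QMC": []}

for i in range(repeats):
    estimates["MC"].append(exceedance(rng.random((n, 2))))
    estimates["LHS"].append(exceedance(qmc.LatinHypercube(d=2, seed=i).random(n)))
    estimates["QMC"].append(exceedance(qmc.Sobol(d=2, scramble=True, seed=i).random(n)))

for name, vals in estimates.items():
    print("%s: mean %.4f, std %.5f" % (name, np.mean(vals), np.std(vals)))
```

The standard deviation across repeats is the quantity to compare: it typically shrinks from plain MC to LHS to the scrambled Sobol' sequence for smooth problems like this one.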

11.
To address the limitation that existing channel flood-routing models can simulate only a single variable (discharge or stage), the Muskingum model was generalized on the basis of the flow continuity equation and two different expressions for reach storage (storage equals the mean cross-sectional flow area multiplied by the reach length, and storage equals the mean reach discharge multiplied by the travel time), and a coupled two-variable general routing model is proposed. Flood-season data from 16 reaches in four major river systems (including inland rivers and rivers discharging to the sea) were selected for model testing. The validation considered geographic extent, different reach and hydraulic characteristics, and flood magnitude, providing a comprehensive check of the rationality of the model structure and of its effectiveness in simulating actual floods. The coupled two-variable general routing model was compared with the traditional Muskingum method; the results show that the proposed model is more accurate than the Muskingum method, somewhat more stable, and has good general applicability.
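For reference, the classical single-variable Muskingum routing that the proposed model generalizes can be sketched in a few lines. The K, x, Δt, and inflow hydrograph below are invented, and the coupled two-variable scheme of the paper is not reproduced here.

```python
import numpy as np

def muskingum_route(inflow, K=12.0, x=0.2, dt=6.0):
    """Classical Muskingum routing: storage S = K[x*I + (1-x)*O] (hours, m^3/s)."""
    denom = 2.0 * K * (1.0 - x) + dt
    c0 = (dt - 2.0 * K * x) / denom
    c1 = (dt + 2.0 * K * x) / denom
    c2 = (2.0 * K * (1.0 - x) - dt) / denom
    outflow = np.zeros_like(inflow)
    outflow[0] = inflow[0]                      # assume initial steady state
    for t in range(1, len(inflow)):
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow

# Hypothetical flood hydrograph at the upstream section (m^3/s).
inflow = np.array([100, 160, 350, 620, 780, 700, 560, 420, 320, 250, 200, 160, 130], float)
print(np.round(muskingum_route(inflow), 1))
```

For numerical stability the time step should satisfy 2Kx ≤ Δt ≤ 2K(1 − x), which the example values respect.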

12.
A statistical riverine litter propagation (RLP) model based on importance sampling Monte Carlo (ISMC) simulation was developed in order to predict the frequency distribution of certain litter types in river reaches. The model was preliminarily calibrated for plastic sheeting in a pilot study conducted on the River Taff, Wales (UK). Litter movement was predominantly controlled by reach characteristics, such as vegetation overhang and water-course obstructions. These effects were modeled in the simulations by utilizing geometric distributions of river reaches in the time domain. The proposed model satisfactorily simulated the dosing experiments performed on the River Taff. It was concluded from the preliminary calibrations that the RLP model can be efficiently utilized to portray litter propagation at any arbitrarily selected river site, provided that the stream flows and reach characteristics are calibrated with representative probability distributions of similar sections. Therefore, the RLP model can be considered a new statistical technique for predicting litter propagation in river sections.

13.
Breeding ornamental fish in wastewater was a successful solution, not only to decrease sanitary risks but also to encourage fish growth. The secondary treated effluent was used to grow a walking catfish (Clarias batrachus), a western mosquitofish (Gambusia affinis; Poeciliidae), and a leopard pleco (Glyptoperichthys gibbiceps). The growth rate of fish reared in the final treated wastewater was significantly higher than in the 25% and 50% treated-effluent dilutions, and the relative growth rate over 2 months reached 2, 4, and 2.5, respectively. Bacterial loads were higher in the gills than in the other fish organs (intestine, skin, and edible muscles). However, total aerobic counts ranged between 2 × 10³ and 3.4 × 10³ cfu/g in the edible fish species cultured in the secondary treated effluent, and the pathogenic bacterium Aeromonas hydrophila was absent from all examined fish muscles. The presence of the tested fish did not prevent the biological treatment parameters (BOD and COD) from being reduced by half in the three treated-wastewater proportions (25, 50, and 100%); the fish therefore clearly contributed to the tertiary biological treatment of the used water. Further bacteriological and physico-chemical analyses indicated that the use of treated wastewater in aquaculture is safe and that risks to human health are reduced.

14.
Three kinds of widely used cloudiness parameterizations are compared with data produced by cloud-resolving model (CRM) simulations of a tropical cloud system. The investigated schemes include those based on relative humidity (RH), the semi-empirical scheme using cloud condensate as a predictor, and the statistical scheme based on probability distribution functions (PDFs). Results show that all three schemes reproduce the timing of cloud generation, although the RH-based scheme artificially produces low-level clouds during cloudless days. In contrast, the low-level clouds are well simulated by the semi-empirical and PDF-based statistical schemes, both of which are close to the explicit CRM simulations. In addition to the Gaussian PDF, two alternative PDFs are also explored to investigate the impact of different PDFs on cloud parameterizations. All the PDF-based parameterizations are found to be inaccurate for high-cloud simulations, in either magnitude or structure. The primary reason is that the investigated PDFs are assumed to be symmetric, yet the skewness factors in deep convective cloud regimes are highly significant, indicating that the symmetry assumption is not well satisfied in those regimes. The results imply the need for a skewed PDF in statistical schemes so that they can perform better in high-cloud simulations.
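The contrast between an RH-based scheme and a PDF-based statistical scheme can be made concrete with two textbook-style closures: a Sundqvist-type cloud fraction driven by relative humidity, and a Gaussian PDF scheme in which cloud fraction is the probability that total water exceeds saturation. Both forms and the numbers used are generic illustrations, not the specific schemes evaluated in the paper.

```python
import numpy as np
from scipy.special import erfc

def cloud_fraction_rh(rh, rh_crit=0.8):
    """Sundqvist-type relative-humidity closure."""
    rh = np.clip(rh, 0.0, 1.0)
    return np.where(rh > rh_crit, 1.0 - np.sqrt((1.0 - rh) / (1.0 - rh_crit)), 0.0)

def cloud_fraction_gaussian(qt_mean, qs, sigma):
    """Gaussian-PDF closure: fraction of the grid box where total water exceeds qs."""
    return 0.5 * erfc((qs - qt_mean) / (np.sqrt(2.0) * sigma))

qs = 8.0e-3                     # saturation specific humidity (kg/kg), hypothetical
qt = np.array([5.0e-3, 6.5e-3, 7.5e-3, 8.2e-3])   # grid-mean total water
sigma = 0.8e-3                  # assumed sub-grid standard deviation

print("RH scheme:      ", np.round(cloud_fraction_rh(qt / qs), 3))
print("Gaussian scheme:", np.round(cloud_fraction_gaussian(qt, qs, sigma), 3))
```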

15.
Community-scale simulations were performed to investigate the risk to groundwater and indoor-air receptors downgradient of a contaminated site following the remediation of a long-term source. Six suites of Monte Carlo simulations were performed using a numerical model that accounted for groundwater flow, reactive solute transport, soil gas flow, and vapour intrusion into buildings. The model was applied to a three-dimensional, community-scale (250 m × 1000 m × 14 m) domain containing heterogeneous, spatially correlated distributions of the hydraulic conductivity, fraction of organic carbon, and biodegradation rate constant, which were varied between realizations. The analysis considered results from individual realizations as well as from the suite of Monte Carlo simulations, expressed through several novel, integrated parameters, such as the probability of exceeding a regulatory standard in either groundwater or indoor air. Results showed that exceedance probabilities varied considerably with the consideration of biodegradation in the saturated zone, and were less sensitive to changes in the variance of hydraulic conductivity or the incorporation of heterogeneous distributions of organic carbon at this spatial scale. A sharp gradient in exceedance probability existed at the lateral edges of the plumes due to variability in lateral dispersion, which defined a narrow region of exceedance uncertainty. Differences in exceedance probability between realizations (i.e., due to heterogeneity uncertainty) were similar to differences attributed to changes in the variance of hydraulic conductivity or fraction of organic carbon. Simulated clean-up times, defined by reaching an acceptable exceedance probability, were found to be on the order of decades to centuries in these community-scale domains. Results also showed that the choice of the acceptable exceedance probability level (e.g., 1% vs. 5%) would likely affect clean-up times by decades. Moreover, in the scenarios examined here, the risk of exceeding indoor-air standards was greater than that of exceeding groundwater standards at all times and places. Overall, simulations of coupled transport processes, combined with novel spatial and temporal quantification metrics for Monte Carlo analyses, provide practical tools for assessing risk in wider communities when considering site remediation.
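The exceedance-probability metric itself is simple to compute once an ensemble is available: at each location it is the fraction of realizations in which the simulated concentration exceeds the regulatory standard. In the sketch below a randomly generated array (realizations × locations) stands in for the reactive-transport output, and the standard and acceptable levels are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical ensemble: 500 Monte Carlo realizations x 200 downgradient locations.
# In the study this would be concentrations from the coupled flow/transport model.
concentration = rng.lognormal(mean=-1.0, sigma=1.2, size=(500, 200))
standard = 0.5  # regulatory standard, arbitrary units

# Probability of exceeding the standard at each location.
p_exceed = np.mean(concentration > standard, axis=0)

# Locations considered "clean" under two acceptable exceedance levels.
for acceptable in (0.01, 0.05):
    n_ok = np.sum(p_exceed <= acceptable)
    print("Acceptable probability %.0f%%: %d of %d locations comply"
          % (100 * acceptable, n_ok, p_exceed.size))
```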

16.
Soils in post-wildfire environments are often characterized by a low infiltration capacity with a high degree of spatial heterogeneity relative to unburned areas. Debris flows are frequently initiated by run-off in recently burned steeplands, making it critical to develop and test methods for incorporating spatial variability in infiltration capacity into hydrologic models. We use Monte Carlo simulations of run-off generation over a soil with a spatially heterogeneous saturated hydraulic conductivity (Ks) to derive an expression for an areally averaged saturated hydraulic conductivity that depends on the rainfall rate, the statistical properties of Ks, and the spatial correlation length scale associated with Ks. The proposed method for determining this effective value is tested by simulating run-off on synthetic topography over a wide range of spatial scales. The results provide a simplified expression for an effective saturated hydraulic conductivity that can be used to relate a distribution of small-scale Ks measurements to infiltration and run-off generation over larger spatial scales. Finally, we use a hydrologic model based on this effective conductivity to simulate run-off and debris flow initiation at a recently burned catchment in the Santa Ana Mountains, CA, USA, and compare the results to those obtained using an infiltration model based on the Soil Conservation Service Curve Number.
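The idea of an areally averaged conductivity can be illustrated by requiring a homogeneous soil to produce the same mean infiltration-excess run-off as the heterogeneous field at a given rainfall rate. The lognormal Ks field, the rainfall rates, and this simple matching rule are assumptions of the sketch, not the expression derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical heterogeneous field of saturated hydraulic conductivity (mm/h).
ks = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=100_000)

def effective_ks(ks, rainfall):
    """Ks of a homogeneous soil producing the same mean infiltration-excess run-off."""
    mean_runoff = np.mean(np.maximum(0.0, rainfall - ks))
    if mean_runoff == 0.0:
        return rainfall  # no run-off anywhere; any Ks >= rainfall is equivalent
    return rainfall - mean_runoff

for rain in (5.0, 15.0, 40.0, 80.0):  # rainfall rates, mm/h
    print("rain %5.1f mm/h  effective Ks %.1f mm/h" % (rain, effective_ks(ks, rain)))
```

Even this crude matching shows the effective value rising with rainfall rate, consistent with the abstract's statement that the areally averaged conductivity depends on rainfall intensity as well as on the Ks statistics.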

17.
Data assimilation is widely used to improve flood forecasting capability, especially through parameter inference, which requires statistical information on the uncertain input parameters (upstream discharge, friction coefficient) as well as on the variability of the water level and its sensitivity with respect to the inputs. For particle filters or ensemble Kalman filters, stochastically estimating probability density functions and covariance matrices from Monte Carlo random sampling requires a large ensemble of model evaluations, limiting their use in real-time applications. To tackle this issue, fast surrogate models based on polynomial chaos and Gaussian processes can be used to represent the spatially distributed water level in place of solving the shallow water equations. This study investigates the use of these surrogates to estimate probability density functions and covariance matrices at a reduced computational cost and without loss of accuracy, in the perspective of ensemble-based data assimilation. The study focuses on 1-D steady-state flow simulated with MASCARET over the Garonne River (south-west France). Results show that both surrogates perform similarly to Monte Carlo random sampling, but for a much smaller computational budget; a few MASCARET simulations (on the order of 10–100) are sufficient to accurately retrieve covariance matrices and probability density functions all along the river, even where the flow dynamics are more complex due to heterogeneous bathymetry. This paves the way for the design of surrogate strategies suitable for representing unsteady open-channel flows in data assimilation.

18.
In the past, arithmetic and geometric means have both been used to characterise pathogen densities in samples used for microbial risk assessment models. The calculation of total (annual) risk is based on cumulative independent (daily) exposures and the use of an exponential dose–response model, such as that used for exposure to Giardia or Cryptosporidium. Mathematical analysis suggests that the arithmetic mean is the appropriate measure of central tendency for microbial concentration with respect to repeated samples of daily exposure in risk assessment. This is despite the frequent characterisation of microbial density by the geometric mean, since microbial distributions may be log-normal or otherwise skewed in nature. Mathematical derivations supporting the use of the arithmetic mean have been based on deterministic analysis, prior assumptions and definitions, and the use of point estimates of probability, and have not included from the outset the influence of an actual distribution for microbial densities. We address these issues by experiments using two real-world pathogen datasets, together with Monte Carlo simulation, and show that the arithmetic mean also holds in the case of a daily dose with a finite distribution in microbial density, even when the distribution is very highly skewed, as often occurs in environmental samples. Further, for simplicity, in many risk assessment models the daily infection risk is assumed to be the same for each day of the year and is represented by a single value, which is then used in the calculation of p_Σ, a numerical estimate of the annual risk P_Σ; we highlight the fact that p_Σ is simply a function of the geometric mean of the daily complementary risk probabilities (although it is sometimes approximated by the arithmetic mean of daily risk in the low-dose case). Finally, the risk estimate is an imprecise probability with no indication of error, and we investigate and clarify the distinction between risk assessment and uncertainty assessment with respect to the predictive model used for total risk assessment.
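The relationship described here between daily risks, their complements, and the annual risk is easy to verify numerically. The dose-response parameter and the daily dose distribution below are placeholders; the exponential model P(d) = 1 − exp(−r·d) is the form mentioned for Giardia/Cryptosporidium exposure.

```python
import numpy as np

rng = np.random.default_rng(2024)

r = 0.02                          # hypothetical exponential dose-response parameter
# Highly skewed daily doses (organisms ingested per day) over one year.
daily_dose = rng.lognormal(mean=-1.0, sigma=2.0, size=365)

daily_risk = 1.0 - np.exp(-r * daily_dose)        # exponential dose-response model

# Exact annual risk from cumulative independent daily exposures.
annual_risk = 1.0 - np.prod(1.0 - daily_risk)

# The same number via the geometric mean of the daily complementary risks.
geo_mean_complement = np.exp(np.mean(np.log(1.0 - daily_risk)))
annual_from_geo = 1.0 - geo_mean_complement ** 365

# Common low-dose approximation using the arithmetic mean of daily risk.
annual_approx = 1.0 - (1.0 - daily_risk.mean()) ** 365

print("annual risk (exact)                %.4f" % annual_risk)
print("annual risk (geometric-mean form)  %.4f" % annual_from_geo)
print("annual risk (arithmetic-mean appx) %.4f" % annual_approx)
```

The first two lines agree by construction; the third shows how closely the arithmetic-mean shortcut tracks the exact value for a given dose distribution.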

19.
Conceptual hydrological models are popular tools for simulating the land phase of the hydrological cycle. Uncertainty arises from a variety of sources, such as input error, calibration, and parameters; hydrologic modeling research indicates that parametric uncertainty is one of the most important sources. The objective of this study was to evaluate parameter uncertainty and its propagation in rainfall-runoff modeling. The study modeled daily flows and calculated uncertainty bounds for the Karoon-III basin, south-west Iran, using HEC-HMS (SMA). The parameters were represented by probability distribution functions (PDFs), and the effect on simulated runoff was investigated using Latin Hypercube Sampling (LHS) within a Monte Carlo (MC) framework. The three parameters chosen, based on a sensitivity analysis, were saturated hydraulic conductivity (Ks), the Clark storage coefficient (R), and the time of concentration (tc). Uncertainty associated with the parameters was accounted for by representing each with a probability distribution. Uncertainty bounds were calculated by drawing parameter sets with LHS from the parameter PDFs of the sub-basins and propagating them through the model. Results showed that the maximum reliability (11%) resulted from propagating Ks. For all three parameters, underestimation exceeded overestimation. The maximum sharpness and standard deviation (STD) also resulted from propagating Ks. The cumulative distribution function (CDF) of flow and the uncertainty bounds showed that, as flow increased, the width of the uncertainty bounds increased for all parameters.
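A schematic of the LHS-based workflow: draw parameter sets by Latin hypercube sampling from assumed marginal PDFs, run a model for each set, and read uncertainty bounds off the resulting ensemble of flows. The parameter distributions and the deliberately trivial runoff model are invented; HEC-HMS itself is not reproduced.

```python
import numpy as np
from scipy.stats import qmc, norm, lognorm

# Assumed marginal PDFs for the three sensitive parameters:
# Ks (mm/h), Clark storage coefficient R (h), time of concentration tc (h).
sampler = qmc.LatinHypercube(d=3, seed=42)
u = sampler.random(1000)

ks = lognorm(s=0.5, scale=8.0).ppf(u[:, 0])
R = norm(loc=10.0, scale=2.0).ppf(u[:, 1]).clip(min=1.0)
tc = norm(loc=6.0, scale=1.5).ppf(u[:, 2]).clip(min=1.0)

def toy_peak_flow(rain_rate, ks, R, tc):
    """Trivial stand-in for the rainfall-runoff model (not HEC-HMS)."""
    excess = np.maximum(0.0, rain_rate - ks)       # infiltration-excess intensity
    return excess * 25.0 / (R + tc)                # crude attenuation by storage/lag

flows = toy_peak_flow(rain_rate=20.0, ks=ks, R=R, tc=tc)

lower, median, upper = np.percentile(flows, [5, 50, 95])
print("Simulated peak flow: median %.2f, 90%% uncertainty bounds (%.2f, %.2f)"
      % (median, lower, upper))
```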

20.
The results of a comparison between chemical water quality determinants and river water fluorescence on the River Tyne, NE England, demonstrate that tryptophan-like fluorescence intensity shows statistically significant relationships with nitrate, phosphate, ammonia, biochemical oxygen demand (BOD), and dissolved oxygen. Tryptophan-like fluorescence intensity at the 280 nm excitation/350 nm emission wavelength fluorescence centre correlates with both phosphate (r = 0.80) and nitrate (r = 0.87), whereas tryptophan-like fluorescence intensity at the 220 nm excitation/350 nm emission wavelength centre correlates with BOD (r = 0.85), ammonia (r = 0.70), and dissolved oxygen (r = −0.65). The strongest correlations are between tryptophan-like fluorescence intensity and nitrate and phosphate, which in the Tyne catchment derive predominantly from point- and diffuse-source sewage inputs. The correlation between BOD and tryptophan-like fluorescence intensity suggests that this fluorescence centre is related to the bioavailable or labile dissolved organic matter pool. The weakest correlations are observed between tryptophan-like fluorescence intensity and ammonia concentration and dissolved oxygen. The weaker correlation with ammonia is due to removal of the ammonia signal by wastewater treatment, and that with dissolved oxygen to the natural aeration of the river, which makes dissolved oxygen a poor indicator of water quality. The observed correlations only hold true when treated sewage, sewerage overflows or cross connections, or agricultural organic pollutants dominate the water quality; this is not true for two sites where airport de-icer (propylene glycol, which is non-fluorescent) or landfill leachate (which contains high concentrations of humic- and fulvic-like fluorescent DOM) dominates the dissolved organic matter in the river. Mean annual tryptophan-like fluorescence intensity agrees well with the General Water Quality Assessment determined by the environmental regulator for England and Wales, the Environment Agency. Copyright © 2004 John Wiley & Sons, Ltd.

