Similar Documents
20 similar records found.
1.
Chemical etching of pure melt-grown forsterite crystals is reported here for the first time. Cleaved forsterite crystals of known purity, and polished sections of the same crystals, have been successfully etched, revealing dislocations, subgrain boundaries, inclusions and growth imperfections.

2.
This paper proposes a new method for computing the probability distribution function of the number of ore-bearing cells. On this basis, the ore-bearing grid cells obtained from weights-of-evidence mineral resource assessment are taken as the statistical objects of a Monte Carlo model, yielding a Monte Carlo method for mineral resource potential assessment that is coupled with the weights-of-evidence prospecting method. The method introduces into the Monte Carlo framework the spatial distribution of mineralization in the study area, the metallogenic and prospecting models, and spatial exploration information; it requires no estimate of the distribution of the number of deposits, which reduces the number of working steps and the assessment error and improves efficiency and automation. Application to a real case shows that the method is feasible.
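The core counting step — treating each grid cell as a Bernoulli trial with its posterior ore probability and simulating the distribution of the number of ore-bearing cells — can be sketched as follows (the cell probabilities here are hypothetical, not from the paper):

```python
import random
from collections import Counter

# Hypothetical posterior ore probabilities for a few grid cells, e.g. as
# produced by a weights-of-evidence model (illustrative values only).
cell_probs = [0.8, 0.6, 0.3, 0.1, 0.05]

def simulate_ore_cell_count(probs, n_trials=20000, seed=42):
    """Monte Carlo estimate of the distribution of the number of
    ore-bearing cells, treating each cell as an independent Bernoulli trial."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_trials):
        n_ore = sum(1 for p in probs if rng.random() < p)
        counts[n_ore] += 1
    # Convert frequencies to an empirical probability mass function.
    return {k: v / n_trials for k, v in sorted(counts.items())}

pmf = simulate_ore_cell_count(cell_probs)
mean_cells = sum(k * p for k, p in pmf.items())  # near sum(cell_probs) = 1.85
```

No distribution of the number of deposits is assumed anywhere: the distribution of the count emerges directly from the per-cell probabilities, which is the point the abstract makes.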

3.
《岩土力学》(Rock and Soil Mechanics), 2017(11): 3341-3346
Multivariate distribution models describing correlated variables have wide engineering applications. Using a multivariate linear autoregressive model and a rank-ordering algorithm, an algorithm is developed for generating multidimensional random variables with specified marginal distributions and a specified correlation structure, and is implemented in MATLAB. The algorithm is applied to direct-sampling Monte Carlo simulation of structural reliability and slope reliability, resolving the difficulty traditional Monte Carlo simulation has in accounting for cross-correlation among variables. Numerical experiments show that the method is accurate, relaxes the idealized requirements of theoretical models, and reflects real problems more faithfully. Comparison with the first-order second-moment method shows that the latter often overestimates reliability. In addition, the failure probability is sensitive to the distribution types of the variables, and the same factor of safety may correspond to different failure probabilities. The proposed method can be applied to reliability computation of complex structural systems and extended to the simulation of dependent variables in other fields.
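A minimal sketch of the rank-reordering idea for two variables (the paper's multivariate autoregressive scheme generalizes this to many dimensions; the marginals and the target correlation below are hypothetical):

```python
import random, math

def correlated_samples(n, rho, marginal1, marginal2, seed=1):
    """Generate paired samples with specified marginals and (approximately)
    a specified rank correlation, by reordering independent marginal samples
    to follow the ranks of a correlated bivariate normal."""
    rng = random.Random(seed)
    # Step 1: correlated standard normal scores.
    z1 = [rng.gauss(0, 1) for _ in range(n)]
    z2 = [rho * a + math.sqrt(1 - rho**2) * rng.gauss(0, 1) for a in z1]
    # Step 2: independent draws from the target marginals, sorted.
    x = sorted(marginal1(rng) for _ in range(n))
    y = sorted(marginal2(rng) for _ in range(n))
    # Step 3: reorder the sorted marginal samples by the normal-score ranks.
    rank1 = sorted(range(n), key=lambda i: z1[i])
    rank2 = sorted(range(n), key=lambda i: z2[i])
    xs, ys = [0.0] * n, [0.0] * n
    for order, i in enumerate(rank1):
        xs[i] = x[order]
    for order, i in enumerate(rank2):
        ys[i] = y[order]
    return xs, ys

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / math.sqrt(va * vb)

# Example: a lognormal "cohesion" paired with a uniform "friction angle",
# target correlation 0.7 (hypothetical marginals for illustration).
xs, ys = correlated_samples(
    2000, 0.7,
    lambda r: math.exp(r.gauss(2.0, 0.3)),
    lambda r: r.uniform(25.0, 35.0),
)
r_xy = pearson(xs, ys)  # close to, though not exactly, the target 0.7
```

The marginals are preserved exactly (the sample values are only reordered), which is why this family of methods suits direct-sampling Monte Carlo with correlated inputs.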

4.
5.
Field-observed performance of slopes can be used to back-calculate input soil properties and to evaluate the uncertainty of a slope stability analysis model. In this paper, a new probabilistic method is proposed for back analysis of slope failure. The method is formulated from Bayes' theorem and solved with Markov chain Monte Carlo simulation using a Metropolis-Hastings algorithm. The method is very flexible, as any type of prior distribution can be used, and it is computationally efficient when a response surface method is employed to approximate the slope stability model. An illustrative example of back analysis of a hypothetical slope failure is presented. Effects of jumping distribution functions and number of samples on the efficiency of the Markov chains are studied. It is found that the covariance matrix of the jumping function can be set to one half of the covariance of the prior distribution to achieve a reasonable acceptance rate, and that 80,000 samples seem sufficient to obtain robust posterior statistics for the example. It is also found that the correlation of cohesion and friction angle of the soil does not significantly affect the posterior statistics or the remediation design of the slope, while the type of prior distribution has much influence on the remediation design.
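A minimal random-walk Metropolis-Hastings sketch of the back-analysis idea, with a toy one-parameter response surface standing in for the slope stability model (all numbers are illustrative, not the paper's):

```python
import random, math

def log_posterior(phi):
    """Toy log-posterior: a normal prior on the friction angle phi, and a
    likelihood expressing that the slope failed, i.e. the predicted factor
    of safety equals 1 within model error (hypothetical response surface)."""
    fs_pred = phi / 30.0                              # toy FS(phi)
    log_prior = -0.5 * ((phi - 32.0) / 4.0) ** 2      # prior: N(32, 4^2)
    log_like = -0.5 * ((fs_pred - 1.0) / 0.05) ** 2   # observed failure, FS = 1
    return log_prior + log_like

def metropolis_hastings(logp, x0, step, n, seed=7):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2), accept
    with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, samples, accepted = x0, [], 0
    lp = logp(x)
    for _ in range(n):
        xp = x + rng.gauss(0, step)
        lpp = logp(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
            accepted += 1
        samples.append(x)
    return samples, accepted / n

samples, acc_rate = metropolis_hastings(log_posterior, 32.0, 2.0, 20000)
posterior_mean = sum(samples[5000:]) / len(samples[5000:])  # after burn-in
```

The jumping (proposal) scale plays the role discussed in the abstract: too small and the chain mixes slowly, too large and the acceptance rate collapses.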

6.
This paper presents a new methodology for slope reliability analysis by integrating an updated support vector machine (SVM) with Monte Carlo simulation (MCS). MCS is a powerful tool for a broad range of reliability problems and is therefore widely used in slope reliability analysis, but it often requires a large number of computationally expensive slope stability analyses. The updated SVM is introduced to build the relationship between the factor of safety and the random variables of the slope, substantially reducing the number of direct stability computations and enlarging the feasible problem scale and sample size of MCS. In the updated SVM algorithm, particle swarm optimization is adopted to seek the optimal SVM parameters, enhancing SVM performance on complex slope stability problems. Finally, the integrated method is applied to a classic slope reliability problem. The results indicate that the new methodology obtains results consistent with classic solutions; it is therefore a powerful and effective tool for slope reliability analysis.
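The surrogate-plus-MCS workflow can be sketched as follows; a least-squares quadratic stands in for the PSO-tuned SVM so the example stays dependency-free, and the response function and input statistics are hypothetical:

```python
import random

def true_fs(x):
    """Stand-in for an expensive slope stability analysis returning the
    factor of safety as a function of one standardized random variable."""
    return 1.2 - 0.3 * x + 0.05 * x * x

def fit_quadratic(xs, ys):
    """Least-squares quadratic surrogate via the 3x3 normal equations
    (the paper trains an updated SVM tuned by particle swarm optimization;
    a quadratic is used here only to keep the sketch self-contained)."""
    a = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c in range(col, 3):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * 3
    for r in (2, 1, 0):
        coef[r] = (b[r] - sum(a[r][c] * coef[c] for c in range(r + 1, 3))) / a[r][r]
    return lambda x: coef[0] + coef[1] * x + coef[2] * x * x

# A handful of expensive model runs train the surrogate...
train_x = [-2.0, -1.0, 0.0, 1.0, 2.0]
surrogate = fit_quadratic(train_x, [true_fs(x) for x in train_x])

# ...then the cheap surrogate carries the full Monte Carlo sample.
rng = random.Random(23)
n = 50000
p_f = sum(surrogate(rng.gauss(0, 1)) < 1.0 for _ in range(n)) / n
```

Only five expensive evaluations were needed; the 50,000 MCS evaluations all hit the surrogate, which is the computational point of the paper.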

7.
Dislocation modelling of an earthquake fault is of great importance because the ground surface response may be predicted by the model. However, the geological features of a fault cannot be measured exactly, so these features and data involve uncertainties. This paper presents a Monte Carlo-based random model of faults using the finite element method with a split-node technique to impose the effects of discontinuities. Length and orientation of the fault are selected as random parameters in the domain model, so geometrical uncertainties are accounted for. Mean and standard deviation values, as well as the probability density function, of ground surface responses due to the dislocation are computed. Based on analytical and numerical calculation of the dislocation, two approaches to Monte Carlo simulation are proposed, and various comparisons are examined to illustrate the capability of both methods for random simulation of faults.
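The propagation of geometric uncertainty can be sketched as follows, with a toy closed-form response standing in for the finite element dislocation model (the fault statistics and response formula are illustrative only):

```python
import random, math

def surface_uplift(length, dip_deg, obs_x):
    """Toy proxy for surface displacement from a buried dislocation
    (illustrative; the paper uses FEM with split nodes and analytical
    dislocation solutions)."""
    dip = math.radians(dip_deg)
    return length * math.sin(dip) / (1.0 + obs_x**2)

def monte_carlo_fault(n=50000, seed=3):
    """Propagate uncertainty in fault length and dip to the ground-surface
    response: sample the random inputs, evaluate the model, and accumulate
    output statistics."""
    rng = random.Random(seed)
    vals = []
    for _ in range(n):
        length = rng.gauss(10.0, 1.0)   # hypothetical fault length, km
        dip = rng.gauss(45.0, 5.0)      # hypothetical dip angle, degrees
        vals.append(surface_uplift(length, dip, obs_x=2.0))
    mean = sum(vals) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in vals) / (n - 1))
    return mean, std

mean_u, std_u = monte_carlo_fault()
```

A histogram of `vals` would give the probability density function of the surface response that the abstract describes.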

8.
Geotechnical models are usually associated with considerable model uncertainty. In this study, the model uncertainty of a geotechnical model is characterised through a systematic comparison between model predictions and past performance data. During such a comparison, model input parameters (such as soil properties) may also be uncertain, and the observed performance may be subject to measurement errors. To account for these uncertainties, the model uncertainty parameters, the uncertain model input parameters, and the actual performance variables are modelled as random variables, and their distributions are updated simultaneously using Bayes' theorem. When the number of variables to update is large, solving the Bayesian updating problem is computationally challenging. A hybrid Markov chain Monte Carlo simulation is employed in this paper to decompose the high-dimensional Bayesian updating problem into a series of lower-dimensional updating problems. To increase the efficiency of the Markov chain, the model uncertainty is first characterised approximately with a first-order second-moment method, and the knowledge learned from the approximate solution is then used to design key parameters of the Markov chain. Two examples illustrate the proposed methodology for model uncertainty characterisation, with insights, discussion, and comparison with previous methods.
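The approximate screening step can be sketched with a generic first-order second-moment (FOSM) routine; the bias-factor model and statistics below are hypothetical, serving only to show the mechanics of the approximation:

```python
import math

def fosm(model, means, stds, eps=1e-4):
    """First-order second-moment approximation: linearize the model about
    the mean point with finite-difference gradients, then propagate
    variances assuming independent inputs."""
    g0 = model(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        shifted = list(means)
        shifted[i] = m + eps * max(abs(m), 1.0)
        grad = (model(shifted) - g0) / (shifted[i] - m)
        var += (grad * s) ** 2
    return g0, math.sqrt(var)

# Hypothetical bias-factor model: ratio of observed to predicted capacity
# as a function of two uncertain soil parameters (illustrative numbers).
model = lambda x: x[0] * x[1] / 600.0
mean_bias, std_bias = fosm(model, [30.0, 20.0], [3.0, 2.0])
```

The cheap FOSM moments give a first picture of the model uncertainty, which (per the abstract) can then inform the design of the Markov chain for the full Bayesian updating.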

9.

Deformation rate is an important parameter for measuring the intensity of tectonic activity, and its accurate constraint has long been a focus of tectonic geomorphology and active tectonics research. Using fluvial landforms as reference surfaces to constrain tectonic deformation rates is currently the most common approach. Based on recent research experience and specific case studies, we argue that although reliable displacement and landform-age data can now be obtained, deriving a reasonable and reliable deformation rate requires attention to the matching between these two types of data and to the plausibility of the fault slip history they construct. Compared with constraining the deformation rate by the ratio of offset to deformation time, or by linear regression of the two, the Monte Carlo method, by simulating plausible fault slip histories, can effectively reduce the uncertainty in deformation-rate estimates and make them more reasonable, thereby providing reliable basic data for seismic hazard assessment and related work.

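The Monte Carlo slip-rate estimate for a single offset landform can be sketched as follows (the offset and age values and their uncertainties are illustrative, not from a specific study):

```python
import random

def slip_rate_distribution(offset_m, offset_sd, age_ka, age_sd, n=50000, seed=11):
    """Monte Carlo estimate of a fault slip rate: sample displacement and
    landform age from their (assumed normal) error distributions and form
    the ratio, yielding a full rate distribution rather than a single value."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n):
        d = rng.gauss(offset_m, offset_sd)
        t = rng.gauss(age_ka, age_sd)
        if t > 0:
            rates.append(d / t)   # mm/yr when offset is in m and age in ka
    rates.sort()
    median = rates[len(rates) // 2]
    lo, hi = rates[int(0.025 * len(rates))], rates[int(0.975 * len(rates))]
    return median, (lo, hi)

# Hypothetical terrace riser: 25 ± 2 m offset, 10 ± 1 ka age.
median_rate, ci95 = slip_rate_distribution(25.0, 2.0, 10.0, 1.0)
```

Extending this to several offset/age pairs, and rejecting samples that imply an implausible slip history (e.g. younger markers with larger offsets), is the essence of the approach the abstract advocates.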

10.
Establishment and Application of a Monte Carlo-Based Dual-Index System for Slope Stability
桂勇, 邓通发, 罗嗣海, 周军平. 《岩土力学》(Rock and Soil Mechanics), 2014, 35(7): 1979-1986
A slope is a system with marked uncertainty, fuzziness and time-dependence, and the factor-of-safety and reliability approaches each have advantages and disadvantages for slope stability evaluation. The dual-index system is a comprehensive evaluation framework built on a deterministic index (the factor of safety) and an uncertainty index (reliability); it combines the advantages of both and has important theoretical and practical value. Considering that slope material parameters follow interval distributions and that the factor of safety of a stable slope cannot fall below its critical value, the purely mathematical model is modified and a dual-index evaluation system for slope stability that better matches engineering practice is proposed. Monte Carlo simulation is adopted and the dual-index system is embedded in the GeoStudio software, exploiting its computational capability to form a complete and efficient dual-index analysis method for slope stability. The method is applied to the stability analysis of a granite residual-soil slope under rainfall, yielding useful conclusions and verifying its feasibility and efficiency.
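The two indices of the dual system — a deterministic factor of safety at the mean parameter values and a Monte Carlo failure probability — can be sketched together; the FS expression and parameter statistics below are toy stand-ins, not the paper's GeoStudio model:

```python
import random, math

def dual_index(n=50000, seed=5):
    """Monte Carlo evaluation of the dual slope-stability indices: the
    deterministic factor of safety at mean parameter values, and the
    failure probability P(FS < 1) from sampled parameters."""
    def fs(c, phi_deg):
        # Toy factor-of-safety model: normalized resisting over driving terms.
        return (c / 20.0 + math.tan(math.radians(phi_deg))
                / math.tan(math.radians(30.0))) / 2.0
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = max(rng.gauss(20.0, 4.0), 0.0)   # cohesion, kPa (illustrative)
        phi = rng.gauss(32.0, 3.0)           # friction angle, deg (illustrative)
        if fs(c, phi) < 1.0:
            failures += 1
    return fs(20.0, 32.0), failures / n

fs_det, p_fail = dual_index()
```

Note how the two indices can disagree: the deterministic FS sits just above 1, yet the failure probability is far from negligible — precisely the situation in which a dual evaluation is more informative than either index alone.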

11.
Physically-based distributed models are implemented for landslide susceptibility and hazard assessment around the world. Probabilistic methodologies are considered appropriate for studying and quantifying the uncertainties derived from the input parameters of these models. In this paper, three sets of Monte Carlo simulations, each with 10,000 iterations, were applied to a slope stability analysis in a small basin of Envigado (Colombia), using the TRIGRS model, to characterise the uncertainty in the landslide assessment. Different parameters for determining the minimum number of realizations required to ensure a small variation in the failure probability were proposed and analyzed, and the quality of the landslide susceptibility assessment was studied. Unexpected and probably erroneous results that may be common in maps generated with this and similar methodologies were identified and explained. Additionally, the distribution of the factor of safety was calculated for different grid cells of the basin, showing that the probability density function that best fits the frequency histogram of the factor of safety can vary between grid cells. The assumption of a normal distribution for the factor of safety would therefore be inappropriate and would lead to miscalculations in this case study.
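The convergence check on the number of realizations can be sketched by tracking the running failure-probability estimate (a Bernoulli toy stands in for a TRIGRS grid-cell stability computation; the probability value is hypothetical):

```python
import random

def running_failure_probability(n=10000, seed=9):
    """Track how the Monte Carlo failure-probability estimate stabilizes
    with the number of iterations -- the quantity used to decide the minimum
    number of realizations required."""
    rng = random.Random(seed)
    true_pf = 0.15              # hypothetical cell failure probability
    failures, history = 0, []
    for i in range(1, n + 1):
        failures += rng.random() < true_pf
        history.append(failures / i)
    return history

history = running_failure_probability()
# Later estimates fluctuate less: compare the spread of the running
# estimate over the first and last thousand iterations.
early_spread = max(history[:1000]) - min(history[:1000])
late_spread = max(history[-1000:]) - min(history[-1000:])
```

Plotting `history` against iteration count gives the convergence curve from which a minimum realization count can be read off for a chosen tolerance.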

12.
李响, 贾明涛, 王李管, 白云飞. 《岩土力学》(Rock and Soil Mechanics), 2009, 30(4): 1186-1190
Based on measured spatial geometric parameters of joint surfaces, the joint network in mining area III of a large nickel mine was simulated with the Monte Carlo method. The in-house 3D ore-rock block size prediction software MAKEBLOCK was then used to predict and analyse the block size distribution. The results show that most ore-rock blocks in the area are smaller than 0.2 m³, the great majority of blocks are tabular or blocky in shape, and equivalent block sizes mostly range from 0.2 m to 1.5 m. The predictions can serve as a reference for the design and implementation of block caving mining.

13.
A new technique, the temperature scaling method combined with conventional Gibbs Ensemble Monte Carlo simulation, was used to study liquid-vapor phase equilibria of the methane-ethane (CH4-C2H6) system. With this efficient method, a new set of united-atom Lennard-Jones potential parameters for pure C2H6 was found to be more accurate than previous models in predicting phase equilibria. Using the optimized potentials for liquid simulations (OPLS) potential for CH4 and the potential of this study for C2H6, together with a simple mixing rule, we simulated the equilibrium compositions and densities of CH4-C2H6 mixtures with accuracy close to experiment. The simulated data supplement experiments and may cover a larger temperature-pressure-composition space than experiments can. Compared with well-established equations of state such as the Peng-Robinson equation of state (PR-EOS), the simulated results are closer to experiments, at least in some temperature and pressure ranges.
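The potential ingredients can be sketched directly; the Lorentz-Berthelot combination below is one simple mixing rule of the kind the abstract mentions, and the parameter values are illustrative united-atom numbers, not the optimized set reported in the paper:

```python
import math

def lj_energy(r, sigma, epsilon):
    """Lennard-Jones pair energy u(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)

def lorentz_berthelot(s1, e1, s2, e2):
    """Simple mixing rule for unlike pairs: arithmetic mean of the sigmas,
    geometric mean of the epsilons."""
    return 0.5 * (s1 + s2), math.sqrt(e1 * e2)

# Hypothetical united-atom parameters (sigma in angstrom, epsilon/kB in K).
sigma_ch4, eps_ch4 = 3.73, 148.0
sigma_c2h6, eps_c2h6 = 3.95, 243.0
sigma_mix, eps_mix = lorentz_berthelot(sigma_ch4, eps_ch4,
                                       sigma_c2h6, eps_c2h6)

# The minimum of u(r) sits at r = 2^(1/6) * sigma with depth -epsilon.
r_min = 2 ** (1 / 6) * sigma_mix
u_min = lj_energy(r_min, sigma_mix, eps_mix)
```

In a Gibbs Ensemble run these pair energies enter the particle-displacement, volume-exchange and particle-transfer acceptance rules; the sketch covers only the potential and the mixing step.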

14.
Spatial probabilistic modeling of slope failure using a combined Geographic Information System (GIS), infinite-slope stability model and Monte Carlo simulation approach is proposed and applied in the landslide-prone area of Sasebo city, southern Japan. A digital elevation model (DEM) for the study area has been created at a scale of 1/2500. Calculated results of slope angle and slope aspect derived from the DEM are discussed. Through the spatial interpolation of the identified stream network, the thickness distribution of the colluvium above Tertiary strata is determined with precision. Finally, by integrating an infinite-slope stability model and Monte Carlo simulation with GIS, and applying spatial processing, a slope failure probability distribution map is obtained for the case of both low and high water levels.
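The per-cell computation can be sketched as follows: the standard infinite-slope factor of safety with Monte Carlo sampling of the strength parameters (the parameter statistics are illustrative, not the Sasebo case values):

```python
import random, math

def infinite_slope_fs(c, phi_deg, gamma, depth, slope_deg, m, gamma_w=9.81):
    """Factor of safety of an infinite slope; m is the fraction of the soil
    column below the water table (standard limit-equilibrium expression)."""
    a = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma * depth - m * gamma_w * depth) \
        * math.cos(a) ** 2 * math.tan(phi)
    driving = gamma * depth * math.sin(a) * math.cos(a)
    return resisting / driving

def failure_probability(m, n=20000, seed=13):
    """Monte Carlo failure probability of one grid cell: sample uncertain
    strength parameters and count FS < 1."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        c = max(rng.gauss(8.0, 2.0), 0.0)   # cohesion, kPa (illustrative)
        phi = rng.gauss(30.0, 3.0)          # friction angle, deg (illustrative)
        fs = infinite_slope_fs(c, phi, gamma=18.0, depth=2.0,
                               slope_deg=35.0, m=m)
        fails += fs < 1.0
    return fails / n

# Low versus high water level for the same cell, as in the mapped scenarios.
p_low, p_high = failure_probability(m=0.2), failure_probability(m=0.9)
```

Repeating this for every DEM cell, with slope angle and colluvium depth taken from the GIS layers, yields the failure probability maps the abstract describes.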

15.
The phenomenon of normal grain growth in pure single-phase systems is modeled with the Monte Carlo technique, and a series of simulations are performed in two and three dimensions. The results are compared with natural and experimental monomineralic rock samples. In these simulations, various lattice models with different anisotropic features in grain boundary energy are examined in order to check the universality of the simulation results. The obtained microstructure varies with the artificial parameter T in each lattice model, where T is a scaled temperature that controls thermal fluctuation of grain boundary motion. As T (thermal fluctuation) increases, the boundary energy anisotropy characterizing each lattice model becomes less important for the evolution of the microstructure. As a result, the difference in grain size distribution among the lattice models, which is significantly large for low T, is reduced with increasing T. A distribution independent of both the lattice model and the dimension is obtained at sufficiently high T and is very close to the normal distribution when the weighting procedure is carried out with a weight of the square of each grain radius. A comparison of the planar grain size distribution of the natural and experimental rock samples with the 3-D simulation results reveals that the simulations reproduce very well the distributions observed in real rock samples. Although various factors not included in the simulation modeling, such as the presence of secondary minerals and a fluid phase, are generally considered to influence real grain growth behavior, the good agreement of the distributions indicates that the overall grain growth behavior in real rocks may still be described by the simplified model used in the present simulations.
Thus the grain size distribution obtained from the present simulations possesses the universal form characterizing real normal grain growth, whose driving force is essentially grain boundary energy reduction through grain boundary migration. Received: 7 January 1997 / Accepted: 25 August 1997
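A minimal 2-D Q-state Potts sketch of this kind of grain-growth Monte Carlo step (the lattice size, Q and T below are arbitrary illustrative choices, and the square lattice is just one of the lattice models the paper compares):

```python
import math, random

def potts_sweep(grid, q, T, rng):
    """One Monte Carlo sweep of a 2-D Q-state Potts model of grain growth:
    n*n random single-site flip attempts, each accepted with the Metropolis
    rule; the energy counts unlike nearest-neighbour pairs (grain-boundary
    segments)."""
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        old, new = grid[i][j], rng.randrange(q)
        if new == old:
            continue
        neigh = [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                 grid[i][(j - 1) % n], grid[i][(j + 1) % n]]
        d_e = sum(s != new for s in neigh) - sum(s != old for s in neigh)
        if d_e <= 0 or rng.random() < math.exp(-d_e / T):
            grid[i][j] = new

def boundary_length(grid):
    """Total number of unlike nearest-neighbour pairs under periodic
    boundaries (a proxy for total grain-boundary length)."""
    n = len(grid)
    return sum(grid[i][j] != grid[(i + 1) % n][j]
               for i in range(n) for j in range(n)) \
         + sum(grid[i][j] != grid[i][(j + 1) % n]
               for i in range(n) for j in range(n))

rng = random.Random(29)
n, q = 32, 16
grid = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]
b0 = boundary_length(grid)
for _ in range(30):
    potts_sweep(grid, q, T=0.3, rng=rng)
b1 = boundary_length(grid)  # boundary length shrinks as grains coarsen
```

The parameter T here plays exactly the role discussed in the abstract: it sets the thermal fluctuation on boundary motion, and the driving force for coarsening is the reduction of total boundary length.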

16.
System effects should be considered in the probabilistic analysis of a layered soil slope due to the potential existence of multiple failure modes. This paper presents a system reliability analysis approach for layered soil slopes based on multivariate adaptive regression splines (MARS) and Monte Carlo simulation (MCS). The proposed approach proceeds in two phases. First, MARS is constructed from a group of training samples generated by Latin hypercube sampling (LHS) and validated with testing samples randomly generated from the underlying distributions. Second, the established MARS is integrated with MCS to estimate the system failure probability of slopes. Two types of multi-layered soil slopes (a cohesive slope and a c-φ slope) are examined to assess the capability and validity of the proposed approach; each type includes two examples with different statistics and system failure probability levels. The proposed approach provides an accurate estimation of the system failure probability of a soil slope, and it is more accurate than the quadratic response surface method (QRSM) and the second-order stochastic response surface method (SRSM) for slopes with highly nonlinear limit state functions (LSFs). The results show that the proposed MARS-based MCS is a favorable and useful tool for the system reliability analysis of soil slopes.
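The Latin hypercube sampling step used to generate the training samples can be sketched on the unit hypercube (mapping each coordinate through a marginal inverse CDF would then give physical parameter values):

```python
import random

def latin_hypercube(n, dims, seed=17):
    """Latin hypercube sampling on [0,1)^dims: stratify each dimension into
    n equal intervals, draw one point per interval, and shuffle the strata
    independently per dimension so each sample is a random pairing."""
    rng = random.Random(seed)
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        for i in range(n):
            samples[i][d] = (strata[i] + rng.random()) / n
    return samples

pts = latin_hypercube(10, 2)
# Each dimension gets exactly one point per stratum [k/10, (k+1)/10).
col0_strata = sorted(int(p[0] * 10) for p in pts)
```

The stratification guarantees that even a small training set spans each input's full range, which is why LHS is preferred over plain random sampling for building surrogates such as MARS.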

17.
Geotechnical and Geological Engineering - The railway embankment slope is a complex open system including uncertainty of soil parameters. Considering the influencing factors with randomness,...

18.
19.
We explore the ability of the greedy algorithm to serve as an effective tool for the construction of reduced-order models for the solution of fully saturated groundwater flow in the presence of randomly distributed transmissivities. The use of a reduced model is particularly appealing in the context of numerical Monte Carlo (MC) simulations that are typically performed, e.g., within environmental risk assessment protocols. In this context, model order reduction techniques enable one to construct a surrogate model to reduce the computational burden associated with the solution of the partial differential equation governing the evolution of the system. These techniques approximate the model solution with a linear combination of spatially distributed basis functions calculated from a small set of full model simulations. The number and the spatial behavior of these basis functions determine the computational efficiency of the reduced model and the accuracy of the approximated solution. The greedy algorithm provides a deterministic procedure to select the basis functions and build the reduced-order model. Starting from a single basis function, the algorithm enriches the set of basis functions until the largest error between the full and the reduced model solutions is lower than a predefined tolerance. The comparison between the standard MC and the reduced-order approach is performed through a two-dimensional steady-state groundwater flow scenario in the presence of a uniform (in the mean) hydraulic head gradient. The natural logarithm of the aquifer transmissivity is modeled as a second-order stationary Gaussian random field. The accuracy of the reduced basis model is assessed as a function of the correlation scale and variance of the log-transmissivity. 
We explore the performance of the reduced model in terms of the number of iterations of the greedy algorithm and selected metrics quantifying the discrepancy between the sample distributions of hydraulic heads computed with the full and the reduced model. Our results show that the reduced model is accurate and highly efficient in the presence of a small variance and/or a large correlation length of the log-transmissivity field. Flow scenarios associated with large variances and small correlation lengths require an increased number of basis functions to accurately describe the collection of the MC solutions, thus significantly reducing the computational advantages of the reduced model.
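The greedy selection loop can be sketched in a few lines, with small vectors standing in for full-model snapshots (the snapshot data below are illustrative):

```python
import math

def project_error(v, basis):
    """Residual of v after subtracting its projection onto an orthonormal
    basis; the residual norm is the best-approximation error."""
    r = list(v)
    for q in basis:
        c = sum(a * b for a, b in zip(r, q))
        r = [a - c * b for a, b in zip(r, q)]
    return math.sqrt(sum(a * a for a in r)), r

def greedy_basis(snapshots, tol):
    """Greedy reduced-basis construction: repeatedly add the (normalized)
    residual of the snapshot the current basis approximates worst, until
    the largest error drops below tol."""
    basis = []
    while True:
        err, resid = max((project_error(s, basis) for s in snapshots),
                         key=lambda t: t[0])
        if err < tol:
            return basis
        basis.append([a / err for a in resid])

# Snapshots lying in a 2-D subspace of R^4 (illustrative stand-ins for
# full-model groundwater flow solutions).
snaps = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0],
         [1.0, 1.0, 1.0, 1.0], [2.0, -1.0, 2.0, -1.0]]
basis = greedy_basis(snaps, tol=1e-8)
n_basis = len(basis)  # two basis functions suffice for a 2-D snapshot space
```

The number of iterations before the tolerance is met is exactly the basis count examined in the study: smooth (large correlation length, low variance) fields need few basis functions, rough ones need many.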

20.
Hurst's rescaled range analysis is a useful tool in the examination of a time series, designed to measure memory content and determine its fractal texture. This study applies the Hurst method to a new earthquake catalogue for Greece, and adopts Monte Carlo simulations to provide a statistical test underpinning the Hurst analyses. Together these reveal basic temporal fractal characteristics in the memory of the earthquake occurrence time-histories. Three regions are considered, approximately: the whole of Greece with some surrounding areas, and the sub-zones of the Hellenic Arc and the Gulf of Corinth. Three temporal textures are considered: elapsed time between earthquakes, strain energy release, and earthquake frequency. The elapsed-time textures for the whole-Greece zone indicate distinct characteristics in chronological order and possess long memory; they belong to the class of non-random patterns. However, these characteristics generally disappear when the sub-zones are considered, becoming random patterns, and the Monte Carlo simulations support this. Therefore, memoryless statistical seismic hazard estimates may not be suitable for the whole of Greece but could be useful for the sub-zones. The strain energy release textures for the whole of Greece and for the sub-zones, although they seem to possess long memory at first analysis, are all random patterns; in other words, the Monte Carlo simulations demonstrate that these patterns are much more likely to happen by chance. The seismic frequency textures for the whole of Greece and for the sub-zones suggest long memory; however, only the texture for the Hellenic Arc zone (MS ≥ 5.0) and that for the whole of Greece (MS ≥ 4.0) approach demonstrably non-random patterns. The other patterns happen by chance.
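The rescaled-range computation and the Hurst-exponent estimate can be sketched as follows; applied to white noise, the estimate should sit near 0.5, the memoryless case that the abstract's Monte Carlo test contrasts against:

```python
import random, math

def rescaled_range(series):
    """R/S statistic of one series: range of the cumulative mean-adjusted
    sum divided by the standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, dev = 0.0, []
    for x in series:
        cum += x - mean
        dev.append(cum)
    r = max(dev) - min(dev)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return r / s if s > 0 else 0.0

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent as the slope of log(R/S) versus
    log(window size), averaging R/S over non-overlapping windows; H near
    0.5 suggests a memoryless (random) texture, H above 0.5 long memory."""
    xs, ys = [], []
    for w in window_sizes:
        rs_vals = [rescaled_range(series[i:i + w])
                   for i in range(0, len(series) - w + 1, w)]
        rs = sum(rs_vals) / len(rs_vals)
        xs.append(math.log(w))
        ys.append(math.log(rs))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

rng = random.Random(19)
white_noise = [rng.gauss(0, 1) for _ in range(1024)]
h = hurst_exponent(white_noise)
```

The Monte Carlo test in the study amounts to repeating this on many shuffled or synthetic random series and asking whether the catalogue's H falls outside that null distribution; small-sample R/S estimates are known to be biased somewhat above 0.5, which is one reason such a test is needed.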
