Similar Literature
1.
The active layer is the top layer of permafrost soils that thaws during the summer season due to increased ambient temperatures and solar radiation inputs. This layer is important because almost all biological activity takes place there during the summer. The depth of active layer thaw is influenced by climatic conditions. Vegetation has also been found to have a strong impact on active layer thaw, because it can intercept incoming radiation, thereby insulating the soil from ambient conditions. In order to look at the role of vegetation and climate on active layer thaw, we measured thaw depth and the Normalized Difference Vegetation Index (NDVI; a proxy for aboveground plant biomass) along a latitudinal temperature gradient in arctic Alaska and Canada. At each site several measurements of thaw and NDVI were taken in areas with high amounts of vegetation and areas with little to no vegetation. Results show that the warmest regions, which had the greatest levels of NDVI, had relatively shallow thaw depths, and the coldest regions, which had the lowest levels of NDVI, also had relatively shallow thaw depths. The intermediate regions, which had moderate levels of NDVI and air temperature, had the greatest depth of thaw. These results indicate that temperature and vegetation interact to control the depth of the active layer across a range of arctic ecosystems. By developing a relationship to explain thaw depth through NDVI and temperature or latitude, the possibility exists to extrapolate thaw depth over large scales via remote sensing applications.

2.
This study aims to extend the multivariate adaptive regression splines (MARS)-Monte Carlo simulation (MCS) method for reliability analysis of slopes in spatially variable soils. This approach is used to explore the influences of the multiscale spatial variability of soil properties on the probability of failure (P_f) of the slopes. In the proposed approach, the relationship between the factor of safety and the soil strength parameters characterized with spatial variability is approximated by the MARS, with the aid of Karhunen-Loeve expansion. MCS is subsequently performed on the established MARS model to evaluate P_f. Finally, a nominally homogeneous cohesive-frictional slope and a heterogeneous cohesive slope, which are both characterized with different spatial variabilities, are utilized to illustrate the proposed approach. Results showed that the proposed approach can estimate the P_f of the slopes efficiently in spatially variable soils with sufficient accuracy. Moreover, the approach is relatively robust to the influence of different statistics of soil properties, thereby making it an effective and practical tool for addressing slope reliability problems concerning time-consuming deterministic stability models with low levels of P_f. Furthermore, disregarding the multiscale spatial variability of soil properties can overestimate or underestimate the P_f. Although the difference is small in general, the multiscale spatial variability of the soil properties must still be considered in the reliability analysis of heterogeneous slopes, especially for those highly related to cost-effective and accurate designs.
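
A minimal sketch of the second stage of this workflow (Monte Carlo simulation on a fitted surrogate) is given below. The surrogate function, the number of retained Karhunen-Loeve terms and the sample size are hypothetical stand-ins, since the paper's actual MARS model and slope geometry are not reproduced here.

```python
import numpy as np

# Monte Carlo simulation on a surrogate of the factor of safety (FS):
# a placeholder response surface stands in for the trained MARS model,
# and the independent standard-normal Karhunen-Loeve (KL) variables are
# sampled to estimate P_f = P(FS < 1).
rng = np.random.default_rng(1)
n_kl = 8          # number of retained KL terms (assumed)
n_mcs = 200_000   # Monte Carlo sample size (assumed)

def surrogate_fs(xi):
    """Placeholder response surface FS(xi); a real study would fit MARS
    to deterministic slope-stability runs at selected training points."""
    return 1.25 - 0.10 * xi[:, 0] + 0.04 * xi[:, 1] ** 2 - 0.03 * xi[:, 2]

xi = rng.standard_normal((n_mcs, n_kl))         # KL random variables
fs = surrogate_fs(xi)
p_f = np.mean(fs < 1.0)
cov_pf = np.sqrt((1.0 - p_f) / (p_f * n_mcs))   # coefficient of variation of the estimator
print(f"P_f ~ {p_f:.4f} (estimator c.o.v. ~ {cov_pf:.2f})")
```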

3.
This paper presents probabilistic assessment of seismically-induced slope displacements considering uncertainties of seismic ground motions and soil properties. A stochastic ground motion model representing both the temporal and spectral non-stationarity of earthquake shakings and a three-dimensional rotational failure mechanism are integrated to assess Newmark-type slope displacements. A new probabilistic approach that incorporates machine learning in metamodeling technique is proposed, by combining relevance vector machine with polynomial chaos expansions (RVM-PCE). Compared with other PCE methods, the proposed RVM-PCE is shown to be more effective in estimating failure probabilities. The sensitivity and relative influence of each random input parameter to the slope displacements are discussed. Finally, the fragility curves for slope displacements are established for site-specific soil conditions and earthquake hazard levels. The results indicate that the slope displacement is more sensitive to the intensities and strong shaking durations of seismic ground motions than the frequency contents, and a critical Arias intensity that leads to the maximum annual failure probabilities can be identified by the proposed approach.

4.
Grain-size distribution data, as a substitute for measuring hydraulic conductivity (K), have often been used to obtain K values indirectly. Taking grain-size distribution data of 150 sets of samples as input, this study combined the Artificial Neural Network technology (ANN) and the Markov Chain Monte Carlo method (MCMC), which replaced the Monte Carlo method (MC) of Generalized Likelihood Uncertainty Estimation (GLUE), to establish the GLUE-ANN model for hydraulic conductivity prediction and uncertainty analysis. Applying the GLUE-ANN model to a typical piedmont region and a central region of the North China Plain and comparing the predictions with measured values of hydraulic conductivity, the relative error ranges are between 1.55% and 23.53% and between 14.08% and 27.22% respectively, an accuracy that meets the requirements of groundwater resources assessment. The global best parameters obtained from the posterior distribution test indicate that the GLUE-ANN model, which has satisfying sampling efficiency and optimization capability, is able to reasonably reflect the uncertainty of hydrogeological parameters. Furthermore, the influence of stochastic observation error (SOE) in grain-size analysis upon prediction of hydraulic conductivity was discussed, and it is believed that this influence cannot be neglected.
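
A hedged sketch of the GLUE weighting step is shown below, with a toy power-law model standing in for the trained ANN and with synthetic grain-size data; the prior ranges, behavioural threshold and likelihood measure are all assumptions for illustration.

```python
import numpy as np

# GLUE sketch: candidate parameter sets are sampled, scored with an
# informal likelihood (Nash-Sutcliffe efficiency), the "behavioural"
# sets are retained, and their likelihood-weighted predictions give an
# uncertainty band for hydraulic conductivity K.
rng = np.random.default_rng(2)

# Synthetic "observed" data: K (m/d) loosely increasing with a grain-size feature d.
d_obs = np.linspace(0.1, 1.0, 20)            # e.g. effective grain size (mm), assumed
k_obs = 8.0 * d_obs ** 1.5 + rng.normal(0, 0.3, d_obs.size)

def model_k(d, a, b):
    """Toy power-law model K = a * d**b, a stand-in for the ANN."""
    return a * d ** b

n_sets = 20_000
a = rng.uniform(1.0, 15.0, n_sets)           # prior ranges are assumptions
b = rng.uniform(0.5, 2.5, n_sets)

sse = np.array([np.sum((model_k(d_obs, ai, bi) - k_obs) ** 2) for ai, bi in zip(a, b)])
nse = 1.0 - sse / np.sum((k_obs - k_obs.mean()) ** 2)   # Nash-Sutcliffe efficiency

behavioural = nse > 0.8                      # behavioural threshold (assumed)
w = nse[behavioural] - 0.8
w /= w.sum()                                 # normalized GLUE weights

k_pred = np.array([model_k(0.5, ai, bi) for ai, bi in zip(a[behavioural], b[behavioural])])
order = np.argsort(k_pred)
cdf = np.cumsum(w[order])
lo, hi = k_pred[order][np.searchsorted(cdf, [0.05, 0.95])]
print(f"90% GLUE uncertainty band for K(d=0.5): [{lo:.2f}, {hi:.2f}] m/d")
```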

5.
It is well known that the compressibility of crushable granular materials increases with the moisture content, due to the decrease of particle strength in a humid environment. An existing approach to take into account the effect of grain breakage in constitutive modeling consists in linking the evolution of the grain size distribution to the plastic work. But how the material humidity can affect this relationship is not clear, and experimental evidence is quite scarce. Based on compression tests on dry and saturated crushable sand recently reported by the present authors, a new non-linear relationship is proposed between the amount of particle breakage and the plastic work. The expression contains two parameters: (1) a material constant dependent on the grain characteristics and (2) a constant depending on the wetting condition (in this study, dry or saturated). A key finding is that the relationship does not depend on the stress path and, for a given wetting condition, only one set of parameters is necessary to reproduce the results of isotropic, oedometric, and triaxial compression tests. The relationship has been introduced into an elastoplastic constitutive model based on the critical state concept with a double yield surface for plastic sliding and compression. The breakage ratio is introduced into the expression of the elastic stiffness, the critical state line and the hardening compression pressure. Incremental stress-strain computations with the model allow the plastic work to be calculated and, therefore, the evolution of particle crushing can be predicted through the proposed non-linear relationship and reintroduced into the constitutive equations. Accurate predictions of the experimental results in terms of both stress-strain relationships and breakage ratio were obtained.

6.
Knowledge of pore-water pressure (PWP) variation is fundamental for slope stability. A precise prediction of PWP is difficult due to complex physical mechanisms and in situ natural variability. To explore the applicability and advantages of recurrent neural networks (RNNs) on PWP prediction, three variants of RNNs, i.e., standard RNN, long short-term memory (LSTM) and gated recurrent unit (GRU), are adopted and compared with a traditional static artificial neural network (ANN), i.e., multi-layer perceptron (MLP). Measurements of rainfall and PWP of representative piezometers from a fully instrumented natural slope in Hong Kong are used to establish the prediction models. The coefficient of determination (R^2) and root mean square error (RMSE) are used for model evaluations. The influence of input time series length on the model performance is investigated. The results reveal that MLP can provide acceptable performance but is not robust. The uncertainty bounds of RMSE of the MLP model range from 0.24 kPa to 1.12 kPa for the selected two piezometers. The standard RNN can perform better but the robustness is slightly affected when there are significant time lags between PWP changes and rainfall. The GRU and LSTM models can provide more precise and robust predictions than the standard RNN. The effects of the hidden layer structure and the dropout technique are investigated. The single-layer GRU is accurate enough for PWP prediction, whereas a double-layer GRU brings extra time cost with little accuracy improvement. The dropout technique is essential to overfitting prevention and improvement of accuracy.
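
To make the GRU architecture concrete, here is a minimal PyTorch sketch of a single-layer GRU regressor with dropout; the window length, feature set (rainfall and antecedent PWP), hidden size and the random training batch are assumptions, not the study's instrumentation data.

```python
import torch
import torch.nn as nn

# Single-layer GRU regressor for next-step pore-water pressure (PWP).
class GRURegressor(nn.Module):
    def __init__(self, n_features: int = 2, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.dropout = nn.Dropout(0.2)    # dropout helps prevent overfitting
        self.head = nn.Linear(hidden, 1)  # predict PWP at the next time step

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.gru(x)
        return self.head(self.dropout(out[:, -1, :]))  # last hidden state -> PWP

model = GRURegressor()
loss_fn = nn.MSELoss()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic training batch: 64 windows of 24 time steps with 2 features
# (hourly rainfall, previous PWP); targets are the next-step PWP in kPa.
x = torch.randn(64, 24, 2)
y = torch.randn(64, 1)

for _ in range(5):                        # a few illustrative training steps
    optim.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optim.step()
print(f"training RMSE: {loss.sqrt().item():.3f} kPa")
```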

7.
The authors have studied the statistical characteristics of China's molybdenum deposits to establish the grade-tonnage model based on data updated to the end of 2010. The results showed that each type of China's molybdenum deposits complies approximately with Lasky's law and that the grade-tonnage characteristics obey the lognormal distribution; however, the correlation between grade and tonnage is poor. Finally, the grade-tonnage model was fitted through the known distribution function, the cumulative probability curves were drawn, and the undiscovered mineral resources of China's molybdenum deposits were evaluated by means of Monte Carlo simulation integrated in MRAS.
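
A small illustration of the grade-tonnage / Monte Carlo idea follows: a lognormal distribution is fitted to deposit tonnages and then sampled to simulate an undiscovered endowment. The deposit data and the assumed number of undiscovered deposits are synthetic, and MRAS itself is not reproduced.

```python
import numpy as np
from scipy import stats

# Fit a lognormal tonnage model to known deposits and sample it to
# simulate the total endowment of a set of undiscovered deposits.
rng = np.random.default_rng(3)
tonnage_kt = rng.lognormal(mean=3.0, sigma=1.2, size=80)    # synthetic known deposits (kt Mo)

shape, loc, scale = stats.lognorm.fit(tonnage_kt, floc=0)   # fit lognormal (location fixed at 0)
print(f"fitted lognormal: sigma={shape:.2f}, median={scale:.1f} kt")

n_undiscovered = 25                                         # assumed count of undiscovered deposits
n_trials = 50_000
sims = stats.lognorm.rvs(shape, loc=0, scale=scale,
                         size=(n_trials, n_undiscovered), random_state=42).sum(axis=1)

p10, p50, p90 = np.percentile(sims, [10, 50, 90])
print(f"undiscovered endowment: P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f} kt")
```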

8.
With the rising need for better prediction of the load-displacement performance of grouted anchors in an era of large-scale underground infrastructure development, the existing methods in the literature lack an accurate analytical model for real-life projects and a rigorous understanding of parameters such as grouting pressures. This paper proposes Fast ICA-MARS as a novel data-driven approach for the prediction of the load-displacement performance of uplift-resisting grouted anchors. The hybrid, data-driven Fast ICA-MARS approach integrates the multivariate adaptive regression splines (MARS) technique with the Fast ICA algorithm for Independent Component Analysis (ICA). A database of 4315 observations for 479 different anchors from 7 different projects is established. The database is then used to train, validate and compare the Fast ICA-MARS approach with the classical MARS approach. The developed Fast ICA-MARS model can provide more accurate predictions than MARS. Moreover, the developed Fast ICA-MARS model is easy to interpret, since the evaluation of the parameter importance of the independent components can be conducted along with consideration of the correlations with the original variables. It is noteworthy that the grouting pressures play a central role in the proposed model; they are considered of paramount importance in engineering practice but have not been properly taken into account in any prior analytical or empirical predictive models for the load-displacement relationships.
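
The hybrid idea can be sketched as a two-stage pipeline: FastICA extracts independent components from the anchor features, and a regressor is trained on those components. Gradient boosting is used here only as a convenient stand-in for MARS, and the three features and synthetic data are assumptions rather than the paper's 4315-record database.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Stage 1: FastICA maps features to independent components.
# Stage 2: a regressor (stand-in for MARS) is trained on those components.
rng = np.random.default_rng(4)
n = 800
X = np.column_stack([
    rng.uniform(0.2, 1.2, n),    # grouting pressure (MPa), assumed range
    rng.uniform(3.0, 12.0, n),   # bond length (m), assumed range
    rng.uniform(100, 600, n),    # applied load (kN), assumed range
])
y = 0.02 * X[:, 2] / (X[:, 0] * X[:, 1]) + rng.normal(0, 0.1, n)  # synthetic displacement (mm)

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)                        # independent components

X_tr, X_te, y_tr, y_te = train_test_split(S, y, test_size=0.25, random_state=0)
reg = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, reg.predict(X_te)) ** 0.5
print(f"test RMSE on independent components: {rmse:.3f} mm")
```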

9.
This paper presents spatial variation of seismic hazard at the surface level for India, covering 6-38°N and 68-98°E. The most recent knowledge on seismic activity in the region has been used to evaluate the hazard, incorporating uncertainties associated with the seismicity parameters using different modeling methodologies. Three types of seismic source models, viz. linear sources, gridded seismicity model and areal sources, were considered to model the seismic sources, and different sets of ground motion prediction equations were used for different tectonic provinces to characterize the attenuation properties. The hazard estimation at bedrock level has been carried out using a probabilistic approach, and the results obtained from various methodologies were combined in a logic tree framework. The seismic site characterization of India was done using a topographic slope map derived from Digital Elevation Model data. This paper presents estimation of the hazard at surface level, using appropriate site amplification factors corresponding to various site classes based on V_(S30) values derived from the topographic gradient. Spatial variation of surface level peak horizontal acceleration (PHA) for return periods of 475 years and 2475 years is presented as contour maps.
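
The logic-tree combination step can be illustrated with a toy calculation: each branch supplies an annual exceedance rate for a set of PHA levels, and the branch curves are combined with the logic-tree weights. The three branch curves and the weights below are placeholders, not the study's source or ground-motion models.

```python
import numpy as np

# Weighted combination of branch hazard curves and extraction of the
# PHA corresponding to a 475-year return period.
pha = np.array([0.05, 0.1, 0.2, 0.4, 0.8])           # peak horizontal acceleration (g)

branch_rates = np.array([                             # annual exceedance rates per branch (assumed)
    [2.0e-2, 8.0e-3, 2.0e-3, 4.0e-4, 5.0e-5],         # e.g. linear-source branch
    [3.0e-2, 1.0e-2, 3.0e-3, 6.0e-4, 8.0e-5],         # e.g. gridded-seismicity branch
    [2.5e-2, 9.0e-3, 2.5e-3, 5.0e-4, 6.0e-5],         # e.g. areal-source branch
])
weights = np.array([0.4, 0.3, 0.3])                   # logic-tree weights (assumed), sum to 1

mean_rate = weights @ branch_rates                    # weighted mean hazard curve

# Probability of exceedance in 50 years (Poisson assumption), and the PHA
# with 10% probability of exceedance in 50 years (475-year return period).
p50 = 1.0 - np.exp(-mean_rate * 50.0)
pha_475 = np.interp(1.0 / 475.0, mean_rate[::-1], pha[::-1])
print("P(exceedance in 50 yr):", np.round(p50, 4))
print(f"PHA for a 475-yr return period ~ {pha_475:.3f} g")
```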

10.
Compression index Cc is an essential parameter in geotechnical design for which the effectiveness of correlation is still a challenge. This paper suggests a novel modelling approach using machine learning (ML) techniques. The performance of five commonly used ML algorithms, i.e., back-propagation neural network (BPNN), extreme learning machine (ELM), support vector machine (SVM), random forest (RF) and evolutionary polynomial regression (EPR), in predicting Cc is comprehensively investigated. A database with a total number of 311 datasets including three input variables, i.e., initial void ratio e0, liquid limit water content wL and plasticity index Ip, and one output variable Cc is first established. Genetic algorithm (GA) is used to optimize the hyper-parameters in the five ML algorithms, and the average prediction error for the 10-fold cross-validation (CV) sets is set as the fitness function in the GA for enhancing the robustness of ML models. The results indicate that ML models outperform empirical prediction formulations with lower prediction error. RF yields the lowest error, followed by BPNN, ELM, EPR and SVM. If the ranges of input variables in the database are large enough, BPNN and RF models are recommended to predict Cc. Furthermore, if the distribution of input variables is continuous, the RF model is the best one. Otherwise, the EPR model is recommended if the ranges of input variables are small. The predicted correlations between input and output variables using the five ML models show great agreement with the physical explanation.
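
A minimal sketch of the cross-validated model selection follows, with a random forest tuned on (e0, wL, Ip) -> Cc data using 10-fold CV; a small grid search stands in for the genetic-algorithm hyper-parameter optimization, and the data are synthetic rather than the 311-record database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, KFold

# Tune a random forest on (e0, wL, Ip) -> Cc with 10-fold cross-validation.
rng = np.random.default_rng(5)
n = 311
e0 = rng.uniform(0.5, 2.0, n)          # initial void ratio (assumed range)
wL = rng.uniform(25, 90, n)            # liquid limit (%), assumed range
Ip = rng.uniform(5, 50, n)             # plasticity index (%), assumed range
Cc = 0.009 * (wL - 10) + 0.15 * (e0 - 0.5) + rng.normal(0, 0.03, n)  # synthetic target

X = np.column_stack([e0, wL, Ip])
grid = {"n_estimators": [100, 300], "max_depth": [None, 6, 10]}
search = GridSearchCV(RandomForestRegressor(random_state=0), grid,
                      scoring="neg_root_mean_squared_error",
                      cv=KFold(n_splits=10, shuffle=True, random_state=0))
search.fit(X, Cc)
print("best hyper-parameters:", search.best_params_)
print(f"10-fold CV RMSE: {-search.best_score_:.4f}")
```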

11.
Oguz, Emir Ahmet; Depina, Ivan; Thakur, Vikas. Landslides, 2022, 19(1): 67-83.

Uncertainties in parameters of landslide susceptibility models often hinder them from providing accurate spatial and temporal predictions of landslide occurrences. Substantial contribution to the uncertainties in landslide assessment originates from spatially variable geotechnical and hydrological parameters. These input parameters may often vary significantly through space, even within the same geological deposit, and there is a need to quantify the effects of the uncertainties in these parameters. This study addresses this issue with a new three-dimensional probabilistic landslide susceptibility model. The spatial variability of the model parameters is modeled with the random field approach and coupled with the Monte Carlo method to propagate uncertainties from the model parameters to landslide predictions (i.e., factor of safety). The resulting uncertainties in landslide predictions allow the effects of spatial variability in the input parameters to be quantified. The performance of the proposed model in capturing the effect of spatial variability and predicting landslide occurrence has been compared with a conventional physical-based landslide susceptibility model that does not account for three-dimensional effects on slope stability. The results indicate that the proposed model has better performance in landslide prediction with higher accuracy and precision than the conventional model. The novelty of this study is illustrating the effects of soil heterogeneity on the susceptibility of shallow landslides, which was made possible by the development of a three-dimensional slope stability model coupled with a random field model and the Monte Carlo method.


12.
This study proposes a probabilistic analysis method for modeling rainfall-induced shallow landslide susceptibility by combining a transient infiltration flow model and Monte Carlo simulations. The spatiotemporal change in pore water pressure over time caused by rainfall infiltration is one of the most important factors causing landslides. Therefore, the transient infiltration hydrogeological model was adopted to estimate the pore water pressure within the hill slope and to analyze landslide susceptibility. In addition, because of the inherent uncertainty and variability caused by complex geological conditions and the limited number of available soil samples over a large area, this study utilized probabilistic analysis based on Monte Carlo simulations to account for the variability in the input parameters. The analysis was performed in a geographic information system (GIS) environment because GIS can deal efficiently with a large volume of spatial data. To evaluate its effectiveness, the proposed analysis method was applied to a study area that had experienced a large number of landslides in July 2006. For the susceptibility analysis, a spatial database of input parameters and a landslide inventory map were constructed in a GIS environment. The results of the landslide susceptibility assessment were compared with the landslide inventory, and the proposed approach demonstrated good predictive performance. In addition, the probabilistic method exhibited better performance than the deterministic alternative. Thus, analysis methods that account for uncertainties in input parameters are more appropriate for analysis of an extensive area, for which uncertainties may significantly affect the predictions because of the large area and limited data.
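
For a single grid cell, the Monte Carlo part of such an analysis can be sketched with an infinite-slope factor of safety, treating pore pressure as the output of an infiltration model at a given time. The infinite-slope expression is a common simplification, and all parameter values below are assumptions; the paper's transient infiltration model and GIS coupling are not reproduced.

```python
import numpy as np

# Monte Carlo estimate of P(FS < 1) for one grid cell using the
# infinite-slope factor of safety with sampled shear-strength parameters.
rng = np.random.default_rng(6)
n = 100_000

beta = np.radians(32.0)     # slope angle (assumed)
z = 1.5                     # depth of potential slip surface (m), assumed
gamma = 19.0                # unit weight of soil (kN/m3), assumed
u = 6.0                     # pore-water pressure from an infiltration model (kPa), assumed

c = rng.lognormal(np.log(5.0), 0.3, n)                    # cohesion (kPa), lognormal
phi = np.radians(rng.normal(30.0, 3.0, n))                # friction angle (deg -> rad), normal

sigma_n = gamma * z * np.cos(beta) ** 2                   # total normal stress on the slip plane
tau = gamma * z * np.sin(beta) * np.cos(beta)             # driving shear stress
fs = (c + (sigma_n - u) * np.tan(phi)) / tau              # infinite-slope factor of safety

print(f"P(FS < 1) for this cell ~ {np.mean(fs < 1.0):.3f}")
```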

13.
In this paper, the authors present a probabilistic back-analysis of a recent slope failure at a site on Freeway No. 3 in northern Taiwan. Post-event investigations of this failure found uncertain strength parameters and deteriorating anchor systems as the most likely causes for failure. Field measurement after the event indicated an average slip surface of inclination 15°. To account for the uncertainties in input parameters, the probabilistic back analysis approach was adopted. First, the Markov Chain Monte Carlo (MCMC) simulation was used to back-calculate the geotechnical strength parameters and the anchor force. These inverse analysis results, which agreed closely with the findings of the post-event investigations, were then used to validate the maximum likelihood (ML) method, a computationally more efficient back-analysis approach. The improved knowledge of the geotechnical strength parameters and the anchor force gained through the probabilistic inverse analysis better elucidated the slope failure mechanism, which provides a basis for a more rational selection of remedial measures.
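
The Markov Chain Monte Carlo back-analysis can be sketched with a Metropolis-Hastings sampler that updates prior estimates of cohesion and friction angle so that the computed factor of safety is close to 1 at failure. The simplified planar-slip forward model, the priors and the likelihood standard deviation are assumptions for illustration, not the paper's slope model (which also included the anchor force).

```python
import numpy as np

# Metropolis-Hastings back-analysis of (c, phi) conditioned on FS ~ 1 at failure.
rng = np.random.default_rng(7)

beta = np.radians(15.0)          # observed slip-surface inclination (from the abstract)
gamma, z = 22.0, 20.0            # unit weight (kN/m3) and average slip depth (m), assumed

def fs(c, phi_deg):
    """Simplified planar infinite-slope FS used as the forward model."""
    phi = np.radians(phi_deg)
    return (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (gamma * z * np.sin(beta) * np.cos(beta))

def log_post(c, phi_deg):
    if not (0.0 < c < 100.0 and 10.0 < phi_deg < 40.0):
        return -np.inf
    prior = -0.5 * ((c - 30.0) / 15.0) ** 2 - 0.5 * ((phi_deg - 25.0) / 5.0) ** 2  # Gaussian priors (assumed)
    like = -0.5 * ((fs(c, phi_deg) - 1.0) / 0.05) ** 2                              # observation: FS ~ 1 at failure
    return prior + like

chain, state = [], np.array([30.0, 25.0])
lp_state = log_post(*state)
for _ in range(20_000):
    prop = state + rng.normal(0.0, [2.0, 0.5])        # random-walk proposal
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp_state:
        state, lp_state = prop, lp_prop
    chain.append(state.copy())

chain = np.array(chain[5000:])                         # discard burn-in
print("posterior mean c   = %.1f kPa" % chain[:, 0].mean())
print("posterior mean phi = %.1f deg" % chain[:, 1].mean())
```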

14.
A first‐order Taylor series method including direct derivative coding (DDC) is presented as a computationally efficient method for producing the probability distribution associated with calculated geotechnical performance. The probability distribution is employed in reliability analyses to calculate the probability of failure, valuable information that is not typically associated with deterministic analyses. The probability distribution also is used to identify important input parameters and to direct sampling efforts. Another approach to generate the probability distribution is the Monte Carlo (MC) method; however, Taylor series results generally are calculated in less time than the MC approach. One key to the implementation of the Taylor series approach is efficient approximation of the sensitivities required by the Taylor series calculation. DDC provides the technique to produce an efficient Taylor series algorithm. Directly coding the sensitivity analysis into the engineering model is accomplished by automatic and hand programming of derivatives. ADIFOR 2.0 was employed to automatically add derivatives to an existing engineering analysis model. For this paper a meshing program and a 3D FEM for soil deformation are used to demonstrate the DDC approach. Although DDC requires a large up‐front programming effort, it is not site or data specific. Therefore, once the derivative programming has been performed, the numerical model can be applied to a wide variety of problems without additional user intervention.
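
The first-order Taylor series (FOSM) propagation that the coded derivatives feed can be illustrated with a toy settlement function whose derivatives are written out by hand, mimicking the direct-derivative-coding idea; the forward model and all parameter statistics below are assumptions, not the paper's 3D FEM.

```python
import numpy as np

# First-order Taylor series (FOSM) propagation: for a performance
# function g(x), the mean and variance of g are approximated from the
# hand-coded derivatives dg/dx_i evaluated at the mean point.
def settlement(q, E, H):
    """Settlement of a compressible layer: s = q * H / E (toy forward model)."""
    return q * H / E

def settlement_derivs(q, E, H):
    """Hand-coded derivatives of s with respect to (q, E, H)."""
    return np.array([H / E, -q * H / E ** 2, q / E])

mu = np.array([100.0, 8000.0, 5.0])      # means: load q (kPa), modulus E (kPa), thickness H (m), assumed
sd = np.array([15.0, 2000.0, 0.25])      # standard deviations (assumed, inputs independent)

g_mu = settlement(*mu)
grad = settlement_derivs(*mu)
var_g = np.sum((grad * sd) ** 2)          # first-order variance for independent inputs

print(f"mean settlement   ~ {g_mu:.4f} m")
print(f"std of settlement ~ {np.sqrt(var_g):.4f} m")
print("relative sensitivity (|dg/dx|*sd):", np.round(np.abs(grad) * sd, 4))
```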

15.
陈彦, 吴吉春. 水科学进展, 2005, 16(4): 482-487.
Numerical simulation of groundwater is currently an important means of quantitatively studying groundwater quantity and quality. In this work, the Monte Carlo method, which is based on stochastic theory, is used for groundwater numerical simulation; this method can better account for the spatial variability of hydrogeological parameters. The simulation results of the Monte Carlo method and the deterministic modeling approach are compared in terms of the hydraulic conductivity field, hydraulic head field, velocity field and concentration field. The results show that, when simulating solute transport in a three-dimensional heterogeneous aquifer, the Monte Carlo method, which fully accounts for the spatial variability of the aquifer's hydraulic conductivity, is more effective than the deterministic method: the simulation accuracy is greatly improved, and the simulation errors and their sources have a reasonable mathematical explanation.

16.
A review of probabilistic and deterministic liquefaction evaluation procedures reveals that there is a need for a comprehensive approach that accounts for different sources of uncertainty in liquefaction evaluations. For the same set of input parameters, different models provide different factors of safety and/or probabilities of liquefaction. To account for the different uncertainties, including both the model and measurement uncertainties, reliability analysis is necessary. This paper presents a review and comparative study of such reliability approaches that can be used to obtain the probability of liquefaction and the corresponding factor of safety. Using a simplified deterministic Seed method, this reliability analysis has been performed. The probability of liquefaction along with the corresponding factor of safety has been determined based on a first order second moment (FOSM) method, an advanced FOSM (Hasofer–Lind) reliability method, a point estimation method (PEM) and a Monte Carlo simulation (MCS) method. A combined method that uses both FOSM and PEM is presented and found to be simple and reliable for liquefaction analysis. Based on the FOSM reliability approach, the minimum safety factor value to be adopted for soil liquefaction analysis (depending on the variability of soil resistance, shear stress parameters and acceptable risk) has been studied and a new design safety factor based on a reliability approach is proposed.
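
As an illustration of the FOSM-type reliability computation, the sketch below treats the cyclic resistance ratio (capacity) and the cyclic stress ratio (demand) as independent lognormal variables and computes the reliability index and probability of liquefaction in closed form; the statistics are assumed values, not those of the paper.

```python
import numpy as np
from scipy.stats import norm

# Lognormal FOSM-type reliability: beta = ln(median FS) / std of ln(FS),
# with FS = CRR / CSR, and probability of liquefaction P = Phi(-beta).
crr_mean, crr_cov = 0.22, 0.25      # mean and c.o.v. of resistance (CRR), assumed
csr_mean, csr_cov = 0.16, 0.20      # mean and c.o.v. of load (CSR), assumed

fs_median = (crr_mean / np.sqrt(1 + crr_cov ** 2)) / (csr_mean / np.sqrt(1 + csr_cov ** 2))
zeta = np.sqrt(np.log(1 + crr_cov ** 2) + np.log(1 + csr_cov ** 2))   # std of ln(FS)
beta = np.log(fs_median) / zeta
p_liq = norm.cdf(-beta)

print(f"central factor of safety    = {crr_mean / csr_mean:.2f}")
print(f"reliability index beta      = {beta:.2f}")
print(f"probability of liquefaction ~ {p_liq:.3f}")
```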

17.
Uncertainty in surfactant–polymer flooding is an important challenge to the wide-scale implementation of this process. Any successful design of this enhanced oil recovery process will necessitate a good understanding of uncertainty. Thus, it is essential to have the ability to quantify this uncertainty in an efficient manner. Monte Carlo simulation is the traditional uncertainty quantification approach that is used for quantifying parametric uncertainty. However, the convergence of Monte Carlo simulation is relatively low, requiring a large number of realizations to converge. This study proposes the use of the probabilistic collocation method in parametric uncertainty quantification for surfactant–polymer flooding using four synthetic reservoir models. Four sources of uncertainty were considered: the chemical flood residual oil saturation, surfactant and polymer adsorption, and the polymer viscosity multiplier. The output parameter approximated is the recovery factor. The output metrics were the input–output model response relationship, the probability density function, and the first two moments. These were compared with the results obtained from Monte Carlo simulation over a large number of realizations. Two methods for solving for the coefficients of the output parameter polynomial chaos expansion are compared: Gaussian quadrature and linear regression. The linear regression approach used two types of sampling: full-tensor product nodes and Chebyshev-derived nodes. In general, the probabilistic collocation method was applied successfully to quantify the uncertainty in the recovery factor. Applying the method using the Gaussian quadrature produced more accurate results compared with using the linear regression with full-tensor product nodes. Applying the method using the linear regression with Chebyshev-derived sampling also performed relatively well. Possible enhancements to improve the performance of the probabilistic collocation method were discussed. These enhancements include improved sparse sampling, approximation order-independent sampling, and using arbitrary random input distribution that could be more representative of reality.
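
The Gaussian-quadrature (non-intrusive spectral projection) route to the polynomial chaos coefficients can be sketched for a single standard-normal input: c_k = E[Y He_k(xi)] / k!, with the expectation evaluated by Gauss-Hermite quadrature. The recovery-factor function below is a toy placeholder for the reservoir simulator, and the one-dimensional setting is a simplification of the four uncertain inputs considered in the study.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def recovery_factor(xi):
    """Toy simulator response to one uncertain input (e.g. polymer viscosity multiplier)."""
    return 0.35 + 0.05 * xi - 0.01 * xi ** 2 + 0.002 * xi ** 3

order = 4
nodes, weights = He.hermegauss(10)         # Gauss-Hermite(E) rule, weight exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the standard-normal density

coeffs = []
y_nodes = recovery_factor(nodes)
for k in range(order + 1):
    basis_k = He.hermeval(nodes, [0] * k + [1])   # probabilists' Hermite He_k at the nodes
    coeffs.append(np.sum(weights * y_nodes * basis_k) / factorial(k))

mean_rf = coeffs[0]                                # PCE mean
var_rf = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))  # PCE variance
print("PCE coefficients:", np.round(coeffs, 5))
print(f"mean RF = {mean_rf:.4f}, std RF = {np.sqrt(var_rf):.4f}")
```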

18.
The failure probability of geotechnical structures with spatially varying soil properties is generally computed using Monte Carlo simulation (MCS) methodology. This approach is well known to be very time-consuming when dealing with small failure probabilities. One alternative to MCS is the subset simulation approach. This approach was mainly used in the literature in cases where the uncertain parameters are modelled by random variables. In this article, it is employed in the case where the uncertain parameters are modelled by random fields. This is illustrated through the probabilistic analysis at the serviceability limit state (SLS) of a strip footing resting on a soil with a spatially varying Young's modulus. The probabilistic numerical results have shown that the probability of exceeding a tolerable vertical displacement (P_e) calculated by subset simulation is very close to that computed by MCS methodology but with a significant reduction in the number of realisations. A parametric study to investigate the effect of the soil variability (coefficient of variation and the horizontal and vertical autocorrelation lengths of the Young's modulus) on P_e was presented and discussed. Finally, a reliability-based design of strip footings was presented. It allows one to obtain the probabilistic footing breadth for a given soil variability.

19.
Probabilistic and fuzzy reliability analysis of a sample slope near Aliano
Slope stability assessment is a geotechnical problem characterized by many sources of uncertainty. Some of these are connected, for example, to the variability of the soil parameters involved in the analysis. Starting from a correct geotechnical characterization of the examined site, only a complete treatment of uncertainty can lead to a significant result. The purpose of this paper is to demonstrate how to model data uncertainty in order to perform slope stability analysis with a good degree of significance.

Once the input data have been determined, a probabilistic stability assessment (first-order second moment and Monte Carlo analysis) is performed to obtain the variation of failure probability vs. the correlation coefficient between soil parameters. A first result is the demonstration of the stability of the first-order second moment (FOSM) solutions (under both normal and lognormal distribution assumptions) and the Monte Carlo (MC) solutions, which follows from correct uncertainty modelling. The paper presents a simple algorithm (Fuzzy First Order Second Moment, FFOSM), which uses a fuzzy-based analysis applied to data processing.


20.
This study presents the response of a vertically loaded pile in undrained clay considering spatially distributed undrained shear strength. The probabilistic study is performed considering undrained shear strength as a random variable, and the analysis is conducted using random field theory. The inherent soil variability is considered as the source of variability, and the field is modeled as a two-dimensional non-Gaussian homogeneous random field. The random field is simulated using the Cholesky decomposition technique within the finite difference program, and the Monte Carlo simulation approach is considered for the probabilistic analysis. The influence of the variance and spatial correlation of undrained shear strength on the ultimate capacity, as the summation of ultimate skin friction and end bearing resistance of the pile, is examined. It is observed that the coefficient of variation and the spatial correlation distance are the most important parameters that affect the pile ultimate capacity.
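
A condensed sketch of the Cholesky-based random field simulation and the Monte Carlo capacity estimate is given below; the lognormal marginal, exponential autocorrelation, alpha-method skin friction and all parameter values are illustrative assumptions, and the paper's two-dimensional finite-difference model is reduced here to a one-dimensional strength profile along the pile.

```python
import numpy as np

# Cholesky decomposition of the correlation matrix generates spatially
# correlated undrained shear strength profiles; Monte Carlo simulation
# then gives the statistics of the pile ultimate capacity.
rng = np.random.default_rng(8)

L, D = 15.0, 0.6                      # pile length (m) and diameter (m), assumed
n_el = 30                             # number of soil elements along the shaft
z = (np.arange(n_el) + 0.5) * L / n_el

su_mean, su_cov, theta = 40.0, 0.3, 2.0   # mean s_u (kPa), c.o.v., vertical correlation length (m), assumed

# Correlation matrix of the underlying Gaussian field (exponential model).
rho = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)
chol = np.linalg.cholesky(rho)

# Lognormal marginal: transform correlated standard normals.
zeta = np.sqrt(np.log(1 + su_cov ** 2))
lam = np.log(su_mean) - 0.5 * zeta ** 2

n_mc, alpha, nc = 20_000, 0.6, 9.0        # adhesion factor and bearing capacity factor (assumed)
g = chol @ rng.standard_normal((n_el, n_mc))
su = np.exp(lam + zeta * g)               # (n_el, n_mc) realizations of s_u

skin = alpha * su * np.pi * D * (L / n_el)      # skin friction per element (kN)
tip = nc * su[-1, :] * np.pi * D ** 2 / 4.0     # end bearing from the tip element (kN)
capacity = skin.sum(axis=0) + tip

print(f"mean capacity = {capacity.mean():.0f} kN, c.o.v. = {capacity.std() / capacity.mean():.2f}")
```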
