The prediction of the time of slope failure (TOF) is one of the most pivotal concerns for both geological risk researchers and practitioners. The conventional inverse velocity method (IVM), based on the analysis of displacement monitoring data, has become an effective way to address this problem because it is easy to perform and its predictions are generally acceptable. In practice, however, random instrumental noise, environmental noise, and measurement error are ubiquitous factors that hamper the reliability of the prediction. In this work, the traditional IVM and modified IVMs with three different filters are applied to velocity time series from a landslide event in an open-pit coal mine, with the purpose of improving, in retrospect, the accuracy of failure predictions. The effects of noise on the interpretation of IVM plots are also assessed and explained. The results demonstrate that the sliding process of landslides can be divided into three characteristic stages on the basis of the IVM. Notably, the critical point of slope failure occurs at the end of the progressive stage and generally coincides with a major acceleration event in which almost all of the slope's integrity is lost, after which the inverse velocity follows a linear trend. Additionally, the short-term smoothing filter (SSF) and long-term smoothing filter (LSF) models can provide more accurate and useful information about the probable failure time. Finally, with the intention of enhancing the practical use of the method and supporting pre-determined response plans, a two-level alert procedure combining the SSF and LSF is proposed.
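The core of the linear IVM can be sketched in a few lines: fit a straight line to the inverse velocity series and extrapolate it to zero, the time at which velocity formally diverges. The Python sketch below is an illustration of the Fukuzono-type linear case only, not the paper's filtered procedure; the function name and the synthetic data are assumptions for illustration.

```python
import numpy as np

def predict_failure_time(t, v):
    """Fukuzono-style inverse velocity extrapolation (linear case).

    Fits a straight line to 1/v versus t and returns the time at which the
    fitted inverse velocity reaches zero, i.e. the predicted time of failure.
    Assumes the data already cover the accelerating (tertiary-creep) phase.
    """
    t = np.asarray(t, dtype=float)
    inv_v = 1.0 / np.asarray(v, dtype=float)
    slope, intercept = np.polyfit(t, inv_v, 1)
    if slope >= 0:
        raise ValueError("inverse velocity is not decreasing; no failure trend")
    return -intercept / slope  # time where the fitted 1/v line crosses zero

# Synthetic example: velocity grows as v = 1/(t_f - t) with t_f = 100 days
t = np.arange(50.0, 95.0, 1.0)
v = 1.0 / (100.0 - t)
print(predict_failure_time(t, v))  # ≈ 100.0 (days)
```

In practice the velocity series would first be smoothed (e.g. with short- and long-window moving averages playing the roles of the SSF and LSF) before the fit, since raw 1/v data are dominated by the noise sources discussed above.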
The paper 'Guidelines on the use of inverse velocity method as a tool for setting alarm thresholds and forecasting landslides and structure collapses' by T. Carlà, E. Intrieri, F. Di Traglia, T. Nolesini, G. Gigli and N. Casagli deals with a sensitive topic for landslide risk management. Exploring the pre-failure behaviour of four different case histories, the authors proposed standard procedures for the application of the inverse velocity method (INV, Fukuzono 1985). Specifically, they suggested guidelines for the filtering of velocity data and an original, simple approach to automatically set the first and second alarm thresholds using the inverse velocity method. The present discussion addresses three different topics: (1) data filter selection according to the characteristics of the monitoring instrument; (2) the importance of data sampling frequency for the forecasting analysis; and (3) the influence of the starting point (SP in this discussion) of the INV analysis. Building on this last point, a new method is proposed to update the INV analysis on an ongoing basis.
Sequence convolution formulae, based on the B-splines of I. J. Schoenberg, provide simple and effective methods for smoothing and differentiating data sequences. Their time- and frequency-domain properties allow calculation of the degree of smoothing and noise rejection, and their z-transforms lead to the rapid derivation of formulae from a simple sequence of polynomials. As an example of their use, numerical differentiation is applied to produce smooth velocity-depth profiles and to delineate major velocity discontinuities from time-depth data logged at a well.
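A minimal illustration of the smooth-then-differentiate idea: discrete B-spline kernels can be built by repeated convolution of boxcar kernels (order 1 is a moving average; higher orders approach a Gaussian), after which a central difference estimates the derivative. This Python sketch is a simplification of the paper's formulae; the kernel order and width are arbitrary illustrative choices, not the paper's designs.

```python
import numpy as np

def bspline_kernel(order, width):
    """Discrete B-spline smoothing kernel: convolve `order` boxcars of
    length `width`. Normalised to unit sum so the DC gain is 1."""
    box = np.ones(width) / width
    k = box
    for _ in range(order - 1):
        k = np.convolve(k, box)
    return k

def smoothed_derivative(x, dx, order=3, width=5):
    """Smooth the sequence with a B-spline kernel, then apply a central
    difference to estimate the derivative. Edge values are distorted by
    the convolution's boundary handling and should be discarded."""
    k = bspline_kernel(order, width)
    xs = np.convolve(x, k, mode="same")
    return np.gradient(xs, dx)

# Synthetic time-depth data: t = 0.5 d^2, so the true velocity dt/dd = d
depth = np.linspace(0.0, 10.0, 201)
time = 0.5 * depth**2
vel = smoothed_derivative(time, depth[1] - depth[0])
```

For interior samples the recovered derivative matches the true slope, since a symmetric smoothing kernel shifts a quadratic only by an additive constant.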
To solve large-deformation geotechnical problems, a novel strain-smoothed particle finite element method (SPFEM) is proposed that incorporates a simple and effective edge-based strain smoothing method within the framework of the original PFEM. Compared with the original PFEM, the proposed SPFEM overcomes the volumetric locking problem, like the previously developed node-based smoothed PFEM, when lower-order triangular elements are used. Compared with the node-based smoothed PFEM, which is known to be "overly soft" and to underestimate, the proposed SPFEM offers super-convergent and very accurate solutions owing to the edge-based strain smoothing. To guarantee computational stability, the proposed SPFEM uses an explicit time integration scheme with an adaptively updated time step. The performance of the proposed SPFEM for geotechnical problems is first examined through four benchmark numerical examples: (a) bar vibration, (b) large settlement of a strip footing, (c) collapse of a column of aluminium bars, and (d) failure of a homogeneous soil slope. Finally, the progressive failure of a sensitive-clay slope is simulated using the proposed SPFEM to show its outstanding performance in solving large-deformation geotechnical problems. All results demonstrate that the novel SPFEM is a powerful and easily extensible numerical method for analysing large-deformation problems in geotechnical engineering.
A maximum-likelihood procedure for segmenting digital well-log data is presented. The method is based on a univariate state-variable model in which an observed log is treated as a time series consisting of two terms: a Gauss-Markov signal remaining constant over a segment, and an additive Gaussian, but not necessarily stationary, noise. The signal jumps by a random amount at a segment boundary. The inverse problem of log segmentation consists of detecting the segment boundaries from a given log. The problem is solved using a Bayesian approach in which the unknown parameters, the locations of segment boundaries and the jumps in the signal value, are estimated by maximizing the likelihood function for the observed data. An algorithm based on Kalman smoothing and the single most likely replacement (SMLR) procedure is proposed. The performance of the method is illustrated with a case study comprising multisuite log data from an exploratory well. The method is found to be rapid and robust, and the resulting segments are geologically consistent.
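As a much simplified stand-in for the Kalman-smoothing/SMLR machinery, the idea of segment-boundary detection can be illustrated by choosing the split of a log that minimises the within-segment sum of squares. The helper name and the single-boundary restriction below are illustrative assumptions only; the paper's method handles multiple boundaries and non-stationary noise.

```python
import numpy as np

def best_boundary(log):
    """Simplified single-boundary segmentation: pick the split index that
    minimises the total within-segment sum of squared deviations from the
    segment means (a crude piecewise-constant maximum-likelihood fit)."""
    log = np.asarray(log, dtype=float)
    best, cost = None, np.inf
    for b in range(1, len(log)):
        left, right = log[:b], log[b:]
        c = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if c < cost:
            best, cost = b, c
    return best

# Synthetic log: constant level 2.0 jumping to 5.0 at sample 50, plus noise
rng = np.random.default_rng(0)
log = np.r_[np.full(50, 2.0), np.full(50, 5.0)] + 0.1 * rng.standard_normal(100)
boundary = best_boundary(log)  # expected near sample 50
```

Extending this to many boundaries by greedy or dynamic-programming search recovers the flavour of iterative replacement schemes such as SMLR.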
Sequential Gaussian Simulation (SGSIM) is a stochastic method developed to avoid the smoothing effect of deterministic methods by generating multiple stochastic realizations. One of the main issues with this technique, however, is the intensive computation associated with the inverse operation in solving the kriging system, which significantly limits its application when several realizations must be produced for uncertainty quantification. In this paper, a physics-informed machine learning (PIML) model is proposed to improve the computational efficiency of SGSIM. To this end, only a small amount of data produced by SGSIM is used as the training dataset, from which the model can discover the spatial correlations between available data and unsampled points. To achieve this, the governing equations of the SGSIM algorithm are incorporated into the proposed network. The quality of the realizations produced by the PIML model is compared for both 2D and 3D cases, visually and quantitatively, and computational performance is evaluated on different grid sizes. Our results demonstrate that the proposed PIML model can reduce the computational time of SGSIM by several orders of magnitude while producing similar results in a matter of seconds.
On September 5, 2019, the Veslemannen unstable rock slope (54,000 m3) in Romsdalen, Western Norway, failed catastrophically after 5 years of continuous monitoring. During this period, the rock slope weakened while the precursor movements increased progressively, in particular from 2017. Measured displacement prior to the failure was around 19 m in the upper parts of the instability and 4–5 m in the toe area. The pre-failure movements were usually associated with precipitation events, with peak velocities occurring 2–12 h after maximum precipitation. This indicates that the pore-water pressure in the sliding zones had a large influence on the slope stability. The sensitivity to rainfall increased greatly from spring to autumn, suggesting a thermal control on the pore-water pressure. Transient modelling of temperatures suggests near-permafrost conditions, and deep seasonal frost was certainly present. We propose that a frozen surface layer prevented water percolation to the sliding zone during spring snowmelt and early summer rainfalls. The modelling also indicates a transition of the landslide body from possible permafrost to a seasonal frost setting after 2000, which may have affected the slope stability. Repeated rapid accelerations during late summers and autumns caused a total of 16 events at the red (high) hazard level and evacuations of the hazard zone. Velocity thresholds were used in risk management when raising or lowering hazard levels. The inverse velocity method was initially of little value; in the final phase before the failure, however, it proved useful for forecasting the time of failure. Risk communication was important for maintaining public trust in early-warning systems, and especially critical is communicating the difference between issuing the red hazard level and predicting a landslide.
A mesh-free limit analysis approach is proposed to determine upper-bound solutions for the collapse loads associated with cohesive soils under plane strain conditions. In the presented technique, the problem geometry is represented solely by nodes, and there is no need for a mesh in the traditional sense. The process of finding an upper-bound solution combines limit analysis theory with a mesh-free numerical technique as a discretisation tool. To satisfy the conditions required for admissibility of the discretised velocity field over the entire problem domain, a strain rate smoothing technique is adopted. The outcome of the proposed combination is a nonlinear optimisation problem, which is solved by a direct iterative technique; the solution found by the iterative algorithm is an upper bound for the limit load of the stability problem. The efficiency of the proposed method is demonstrated at the end of the paper by solving several example problems from soil mechanics.
A combination of empirical and physically based hydrological models has been used to analyze historical data on rainfall and debris-flow occurrence in western Campania, to examine the correlation between rainfall and debris-flow events.
Rainfall data from major storms recorded in recent decades in western Campania were compiled, including daily series from several rain gauges located inside landslide areas, supplemented by hourly rainfall data from some of the principal storms.
A two-phase approach is proposed. In phase 1, soil moisture levels are modelled as the hydrological balance between precipitation and evapotranspiration, on a daily scale, using the method of Thornthwaite [Geograph. Rev. 38 (1948) 55].
Phase 2 is related to the accumulation of surplus moisture from intense rainfall, leading to the development of positive pore pressures. These interactions are modelled on an hourly time scale with the "leaky barrel" (LB) model described by Wilson and Wieczorek [Env. Eng. Geoscience 1 (1995) 11]. In combination with hourly rainfall records, the LB model has been used to compare the hydrological effects of different storms. The critical level of retained rainwater has been fixed from the timing of debris-flow activity during recorded storm events.
New rainfall intensity–duration thresholds for debris-flow initiation in western Campania are proposed. These thresholds are specific to individual rain gauges and assume a previously satisfied field-capacity condition. The new thresholds are somewhat higher than those published by previous authors, but are considered more accurate and thus require less conservatism.
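The "leaky barrel" idea of phase 2 can be sketched as a storage that gains each hour's rainfall and drains at a rate proportional to its current content, so that storms of equal total rainfall but different intensity produce different peak storage. The drainage coefficient and the linear-leakage form below are illustrative assumptions, not calibrated values from the study.

```python
def leaky_barrel(rain_mm_per_h, k_drain=0.05, dt=1.0):
    """Toy 'leaky barrel' storage model: each step adds the hour's
    rainfall to storage, then leaks a fraction proportional to the
    current storage. Returns the storage time series in mm."""
    storage, out = 0.0, []
    for r in rain_mm_per_h:
        storage += r * dt                   # add the hour's rainfall
        storage -= k_drain * storage * dt   # linear leakage
        out.append(storage)
    return out

# 6 h of 10 mm/h rain followed by 6 dry hours: storage peaks at the end
# of the rain, then decays, mimicking the delayed pore-pressure response
series = leaky_barrel([10.0] * 6 + [0.0] * 6)
```

Comparing the peak of `series` against a fixed critical storage level reproduces, in miniature, how the timing of debris-flow activity fixes the threshold in the study.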
The problem of multiphase flow in heterogeneous subsurface porous media involves many uncertainties. In particular, the permeability of the medium is an important aspect of the model that is inherently uncertain. Properly quantifying these uncertainties is essential for making reliable probabilistic predictions and future decisions. In this work, a measure-theoretic framework is employed to quantify uncertainties in a two-phase subsurface flow model in high-contrast media. Given uncertain saturation data from observation wells, the stochastic inverse problem is solved numerically to obtain a probability measure on the space of unknown permeability parameters characterizing the two-phase flow. As solving the stochastic inverse problem requires a number of forward model solves, we also incorporate a conservative version of the generalized multiscale finite element method for added efficiency. The parameter-space probability measure is used to predict saturation values where measurements are not available and to validate the effectiveness of the proposed approach in the context of fine and coarse model solves. A number of numerical examples illustrate the measure-theoretic methodology for solving the stochastic inverse problem using both fine and coarse solution schemes.