Similar Documents
20 similar documents found (search time: 15 ms)
1.
A number of problems in geology can be formulated so that they consist of optimizing a real-valued function (termed the objective function) on some interval or over some region. Many methods are available for solution if the function is unimodal within the domain of interest. Direct methods, involving only function evaluations, are particularly useful in geological problems where the objective function may be strongly nonlinear and constructed from sampled data. In practical problems, the objective function often is not unimodal. Standard optimization routines are not capable of distinguishing between local extrema or of locating the global extremum, which is the point of interest in most cases. The usual approach—trying several different starting points in the hope that the best local extremum found is the global extremum—is inefficient and unreliable. An ancillary algorithm has been developed which avoids these problems and which couples with a variety of local optimization routines. The algorithm first constructs a grid of objective function values over some feasible region. The region dimensions and grid spacings are based on specific problem considerations. First differences are then calculated for successive points along each grid line and monitored in sign only, which rapidly locates extrema. User interaction determines how many of these extrema will undergo further investigation, which is carried out by passing locations to a local optimization subroutine. The algorithm has proved successful on a number of problems. A geological example—determination of benthic mixing parameters in deep-sea sediments via minimization of stratigraphic offset between ¹⁸O signals from two different species of planktonic foraminifera—is given. FORTRAN code is provided for the global optimization routine, a golden section search subroutine for one-dimensional objective functions, and a simplex subroutine for multidimensional problems.
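The grid-scan idea in this abstract (monitor first differences in sign only, then refine each candidate with a local routine) can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's FORTRAN code; the multimodal test function is an assumption, and the local stage is a golden section search as the abstract suggests for one-dimensional objectives.

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden section search."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi, about 0.618
    x1, x2 = b - invphi * (b - a), a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

def grid_then_refine(f, a, b, n=100):
    """Grid the interval, detect sign changes of first differences
    (negative then non-negative marks a local minimum), then refine
    each candidate with a local golden section search."""
    xs = [a + i * (b - a) / n for i in range(n + 1)]
    fs = [f(x) for x in xs]
    found = []
    for i in range(1, n):
        if fs[i] - fs[i - 1] < 0 <= fs[i + 1] - fs[i]:
            found.append(golden_section_minimize(f, xs[i - 1], xs[i + 1]))
    return found

# Illustrative multimodal objective with minima at x = 2 and x = 4
mins = grid_then_refine(lambda x: (x - 2.0) ** 2 * (x - 4.0) ** 2, 0.0, 5.0)
```

Both minima are located by the sign scan and refined locally, which is the behaviour a single-start local optimizer cannot provide.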

2.
Slope stability analysis of any natural or artificial slope aims at determining the factor of safety of the slip surface that possesses the lowest factor of safety. In this study, an ant colony optimization (ACO) algorithm is developed to solve this factor-of-safety minimization problem. Factors of safety of slip surfaces are found by using the Morgenstern–Price method, which satisfies both force and moment equilibrium. Nonlinear equations from the Morgenstern–Price method are solved numerically by the Newton–Raphson method. In the proposed ACO algorithm, the initiation point and the shape of the slip surface are treated as the search variables. The proposed heuristic algorithm represents slip surfaces as piecewise-linear curves and solves for the optimal curve yielding the minimum factor of safety. To demonstrate its applicability and to investigate the validity and effectiveness of the algorithm, four examples with varying complexity are presented. The obtained results are compared with the available literature and are found to be in agreement.

3.
Measuring strain from deformed xenoliths is problematic due mainly to the large initial shape variations of these markers. A method is described which allows the mean initial shape to be determined for a number of xenolith populations by displaying their logarithmic ranges (log Rf max − log Rf min) on a Range diagram. The diagram contains a check on validity and allows bad samples to be recognised. Xenolith data from a deformed granite in northwest Ireland are analysed using the method.

4.
Li  Bingyao  Hou  Jingming  Ma  Yongyong  Bai  Ganggang  Wang  Tian  Xu  Guoxin  Wu  Binzhong  Jiao  Yongbao 《Natural Hazards》2022,110(1):607-628

Flooding is now one of the most frequent and widely distributed natural hazards, causing significant losses of human life and property around the world. Evacuation of pedestrians during flooding events is a crucial factor in flood risk management, saving lives and increasing the time available for rescue. The key objective of this work is to propose a shortest-evacuation-path planning algorithm that considers the evacuable areas and human instability during floods. A shortest-route optimization algorithm based on cellular automata is established, using diagonal-distance calculation methods from heuristic search algorithms. The Morpeth flood event that occurred in 2008 in the UK is used as a case study, and a highly accurate and efficient 2D hydrodynamic model is adopted to characterize the flood behaviour on the floodplain. Two flood hazard assessment approaches, an empirical method and a mechanics-based, experimentally calibrated (M&E) method, are chosen to study human instability. A comprehensive analysis shows that extreme events are better identified with the mechanics-based, experimentally calibrated method than with the empirical method. The result of the M&E method is used as the initial condition for the Morpeth evacuation scenario. Evacuation path planning in Morpeth shows that the algorithm can realize shortest-route planning with multiple starting and ending points at the microscale. These findings are of significance for flood risk management and emergency evacuation research.
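A minimal sketch of grid-based shortest-route search with a diagonal-distance heuristic: here a plain A* on an 8-connected grid. The grid, costs, and obstacle convention below are illustrative assumptions, not the Morpeth cellular-automata model.

```python
import heapq
import math

def diagonal_distance(a, b):
    """Octile heuristic: straight moves cost 1, diagonal moves cost sqrt(2)."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return dx + dy + (math.sqrt(2.0) - 2.0) * min(dx, dy)

def a_star(grid, start, goal):
    """Shortest route on an 8-connected grid; cells marked 1 are impassable
    (e.g. flooded cells where pedestrians would be unstable)."""
    rows, cols = len(grid), len(grid[0])
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    heap = [(diagonal_distance(start, goal), 0.0, start, None)]
    parent, best_g = {}, {start: 0.0}
    while heap:
        _, g, cur, prev = heapq.heappop(heap)
        if cur in parent:
            continue                          # already expanded with a better cost
        parent[cur] = prev
        if cur == goal:                       # rebuild the route
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in moves:
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + (math.sqrt(2.0) if dr and dc else 1.0)
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(heap, (ng + diagonal_distance(nxt, goal), ng, nxt, cur))
    return None                               # goal unreachable

# Illustrative 5 x 5 dry grid: the optimal route is purely diagonal
grid = [[0] * 5 for _ in range(5)]
path = a_star(grid, (0, 0), (4, 4))
```

Running the search with several start cells against one exit (or vice versa) gives the multiple-start, multiple-end planning the abstract describes.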


5.
Due to the diversity of mineral types in shale gas reservoirs, it is difficult to establish a volumetric reservoir-parameter model with conventional log interpretation methods. Optimization-based log interpretation can evaluate complex-lithology reservoirs effectively, and its key lies in the optimization algorithm. Using the newly proposed seagull optimization algorithm (SOA), we calculate the mineral and physical parameters of the shale gas reservoir in Well H of the Yuxi block, Sichuan Basin, and compare the results with a genetic algorithm and a genetic algorithm-complex hybrid algorithm. The results of SOA-based optimization log interpretation match the core analysis data well, with small calculation error and fast calculation speed. The SOA also overcomes the premature convergence and susceptibility to local optima of the genetic algorithm, as well as the need for secondary optimization and the slow search speed of the genetic-complex hybrid algorithm. This provides a reference for applying the seagull optimization algorithm in other shale gas reservoir regions.

6.
Point-feature cartographic label placement (PFCLP) involves placing labels adjacent to their corresponding point features on a map. A widely accepted goal of PFCLP is to maximize the number of conflict-free labels. This paper presents an algorithm for PFCLP based on the four-slider (4S) model. The algorithm is composed of two phases: an initialization phase, during which an initial solution is constructed by an exact algorithm and a heuristic method that maximizes the probability of conflict-free labels, followed by an improvement phase that adopts a backtracking greedy search. The exact algorithm can find a portion of the conflict-free labels in an optimal solution, and an extension of the exact algorithm is provided that can find additional conflict-free labels. Computational tests were performed on instances based on standard sets. The two-phase algorithm generated better solutions than all methods previously reported in the literature; it also executes at a reasonable speed and is more stable than most other methods.

7.
Stationary segments in well log sequences can be automatically detected by searching for change points in the data. These change points, which correspond to abrupt changes in the statistical nature of the underlying process, can be identified by analysing the probability density functions of two adjacent sub-samples as they move along the data sequence. A statistical test is used to set a significance level of the probability that the two distributions are the same, thus providing a means to decide how many segments comprise the data by keeping those change points that yield low probabilities. Data from the Ocean Drilling Program were analysed, where a high correlation between the available core-log lithology interpretation and the statistical segmentation was observed. Results show that the proposed algorithm can be used as an auxiliary tool in the analysis and interpretation of geophysical log data for the identification of lithology units and sequences.
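The adjacent-window idea can be sketched as follows. Welch's t statistic stands in for the paper's density-based test, and the window length and threshold are illustrative assumptions.

```python
import math

def welch_t(x, y):
    """Welch's t statistic for the difference of two sample means."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return abs(mx - my) / math.sqrt(vx / nx + vy / ny)

def change_points(data, window=20, threshold=4.0):
    """Slide two adjacent sub-samples along the sequence and keep local
    maxima of the statistic that exceed the significance threshold."""
    stats = [0.0] * len(data)
    for i in range(window, len(data) - window):
        stats[i] = welch_t(data[i - window:i], data[i:i + window])
    return [i for i in range(1, len(data) - 1)
            if stats[i] > threshold and stats[i - 1] <= stats[i] >= stats[i + 1]]

# Synthetic two-segment "log": a jump in level at index 50
log = [0.1 * (-1) ** i for i in range(50)] + [5.0 + 0.1 * (-1) ** i for i in range(50)]
cps = change_points(log)
```

The statistic peaks where the two sub-samples straddle the boundary, so keeping thresholded local maxima recovers the segment boundary.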

8.
The genetic algorithm is an adaptive, heuristic, global-optimization search algorithm developed in recent years that simulates the process of biological evolution. It does not depend entirely on the initial guess and converges globally, so it can be applied to a variety of complex practical problems, such as engineering design optimization, artificial intelligence and decision systems, and geophysical inversion. Although the genetic algorithm is a highly efficient global optimization method, many simulation results show that it suffers from long computation times and weak local search ability. The conjugate gradient method, by contrast, is a non-heuristic, gradient-based search method: it converges quickly but easily falls into local extrema and depends heavily on the initial guess. Exploiting these complementary characteristics, this paper proposes a hybrid genetic algorithm for geophysical inversion that combines the global convergence of the genetic algorithm with the fast convergence of the conjugate gradient method. Practical application has produced good results.
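A minimal sketch of the hybrid idea under stated assumptions: a genetic-algorithm global stage followed by a simple gradient-descent local stage (standing in here for the conjugate gradient step), applied to an illustrative multimodal test function rather than a geophysical misfit.

```python
import math
import random

def f(x):
    """1-D Rastrigin-type multimodal objective; global minimum at x = 0."""
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def grad(x, h=1e-6):
    """Central-difference gradient, as an inversion code might approximate it."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def hybrid_minimize(seed=0, pop_size=40, generations=80):
    random.seed(seed)
    pop = [random.uniform(-5.0, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        elite = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(0.5 * (a + b) + random.gauss(0.0, 0.3))  # crossover + mutation
        pop = elite + children
    best = min(pop, key=f)
    # Local stage: plain gradient descent stands in for conjugate gradient.
    x = best
    for _ in range(200):
        x -= 0.001 * grad(x)
    return x

x_best = hybrid_minimize()
```

The global stage escapes the side basins that would trap a purely gradient-based start, and the local stage then converges quickly within the basin the GA selects.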

9.
Li Shengqing 《地质与勘探》 (Geology and Exploration), 2022, 58(4): 887-894
When selecting the key points of potential slip surfaces, traditional methods do not comprehensively weigh the influence of shear-failure points and tensile-failure points on the search process, so the search easily falls into a local optimum and loses other search data of the same rank. To address this problem, this study designs a method for searching the most unfavorable slip surface of a slope based on the GA-Sarma algorithm. The method extracts potential slip surfaces by computing actual values of the slope stability coefficient, and then selects the key points of the potential slip surfaces according to the characteristics of shear-failure and tensile-failure points. On this basis, an objective function and a fitness function are established with the GA-Sarma algorithm, and the most unfavorable slip surface is searched for with mutation taken into account. The test region contained 3 most unfavorable slip surfaces, 5 newly formed slip surfaces of relatively low hazard, and 14 potential slip surfaces. Over 20 test rounds, the average numbers of most unfavorable slip surfaces found by the proposed method and by two traditional methods were 3, 1.4, and 0.8, respectively, verifying that the GA-Sarma algorithm strengthens the search for the most unfavorable slip surface. The method can provide more reliable technical support for geological work such as slope support, reinforcement, and modification.

10.
The method discussed in the present paper permits detailed determination of mesoclimatic conditions on the basis of standard data provided by a network of measurement stations. The passage from point data obtained at meteorological stations to spatial representations proceeds through determining the relationships between altitude, various relief forms, and the thermal indices of climate. Among these, the duration of the frost-free period proves a sensitive index at the mesoclimatic scale, as do the other indices connected with minimum air temperatures. The nomographs applied by the authors make it possible to construct detailed climatic maps, which, together with maps of other elements of the geographical environment, form a basis for elaborating, for example, an agroecological map of habitats. The presented method of evaluating climatic conditions can be applied to any mountain territory in which a network of measurement stations enables the determination of the interdependencies between the chosen thermal parameters and altitude, and the distinction of the impact of convex and concave landforms.
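The station-network step, relating a thermal index to altitude, can be sketched with ordinary least squares; the station elevations and temperatures below are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical stations: elevation (m) and mean minimum temperature (deg C)
elev = [200.0, 450.0, 700.0, 950.0, 1200.0]
tmin = [8.7, 7.1, 5.5, 3.9, 2.3]
a, b = fit_line(elev, tmin)
t_800 = a + b * 800.0   # estimated thermal index for an unmeasured 800 m site
```

With the fitted relation in hand, a thermal index can be mapped over the whole relief from elevation alone, which is the passage from point data to spatial maps that the abstract describes.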

11.
A variety of geodetic measurements can be combined, in network fashion, to yield adjusted velocities of elevation change. However, it is not always apparent which network junctions have solvable point velocities. When a velocity surface is desired, it is not always apparent how many coefficients should be used. A solvability algorithm, devised to operate on observation equations, answers these questions, and therefore permits the adjustment process to continue with the assurance that the result will be mathematically justified. Using both the hyperboloid and the reciprocal hyperboloid as quadric forms, multiquadric (MQ) analysis has been applied to leveling and tide gauge data in the vicinity of Puget Sound, to obtain heights corresponding to a selected date, and coefficients which collectively define a velocity surface. The solvability algorithm was used to tell which junctions in the level network had solvable point velocities, and consequently where MQ nodal points should be placed for an optimized solution. Networks of simulated data were also used with the solvability algorithm to help determine data requirements for height—velocity adjustments, and to evaluate the ability of MQ analysis to predict velocities.
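A minimal sketch of multiquadric (MQ) analysis with the hyperboloid kernel sqrt(r² + c²). The nodes, values, and shape parameter c are illustrative assumptions, and a naive Gaussian elimination stands in for a production solver.

```python
import math

def mq_kernel(p, q, c=0.5):
    """Hyperboloid (Hardy multiquadric) basis: sqrt(r^2 + c^2)."""
    r2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return math.sqrt(r2 + c * c)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        piv = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[piv] = M[piv], M[k]
        for i in range(k + 1, n):
            fac = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= fac * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def mq_fit(nodes, values, c=0.5):
    """Solve for one MQ coefficient per nodal point."""
    A = [[mq_kernel(p, q, c) for q in nodes] for p in nodes]
    return solve(A, values)

def mq_eval(coeffs, nodes, p, c=0.5):
    return sum(a * mq_kernel(p, q, c) for a, q in zip(coeffs, nodes))

nodes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
values = [x + y for x, y in nodes]   # heights from a simple test surface
coeffs = mq_fit(nodes, values)
```

The fitted surface reproduces the data at every nodal point, and the same coefficient machinery applies whether the surface represents heights or velocities.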

12.
Hydraulic fracturing treatments aimed at raising oilfield production rates place higher demands on the conventional well-log methods now widely used to obtain rock mechanical parameters. This paper proposes a method for stochastic simulation and prediction of a three-dimensional rock-mechanics parameter field: rock-mechanics parameter curves computed from the limited available conventional logs are input as hard data; known seismic attribute volumes together with shale hydration and rock environment parameters serve as soft-data constraints; and sequential Gaussian simulation is used to generate a three-dimensional spatial field of dynamically varying rock mechanical parameters. From this 3D field, the well-log response of the rock mechanical parameters at any position in space can be predicted, and the dynamically varying response is output in the specific formats required by oilfield exploration and development for use in production and engineering. The method opens a new route for obtaining rock mechanical parameters and provides important guidance for formulating sound drilling, completion, and hydrocarbon development plans and technical measures.

13.
The rubber-membrane technique on computers
Because constructing and modifying three-dimensional bodies of arbitrary shape is technically difficult, interactive 3D inversion in geophysical exploration has long gone without practical application. This paper presents a technique, called the rubber-membrane technique, that can flexibly construct 3D bodies of arbitrary shape. Its characteristics: with the mouse, the user alternately moves control points set around a closed surface; each control point drags the nearby part of the surface, deforming it into a body model of the desired complex shape; and the number of control points can be increased repeatedly to adjust the model more finely. The paper describes in detail the core of the technique, the smoothing algorithm. Through repeated smoothing, a …
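The repeated-smoothing idea can be illustrated in two dimensions with simple Laplacian smoothing of a closed polyline. This is a hedged stand-in, since the paper's own smoothing algorithm is not reproduced in the abstract; the polygon and pass count are illustrative.

```python
import math

def smooth_closed(points, passes=10, alpha=0.5):
    """Each pass moves every vertex toward the midpoint of its two
    neighbours on the closed curve (Jacobi-style Laplacian smoothing)."""
    pts = list(points)
    n = len(pts)
    for _ in range(passes):
        pts = [((1.0 - alpha) * pts[i][0] + 0.5 * alpha * (pts[i - 1][0] + pts[(i + 1) % n][0]),
                (1.0 - alpha) * pts[i][1] + 0.5 * alpha * (pts[i - 1][1] + pts[(i + 1) % n][1]))
               for i in range(n)]
    return pts

# Octagon with one vertex pulled far out; smoothing draws the spike back in
octagon = [(math.cos(2.0 * math.pi * k / 8), math.sin(2.0 * math.pi * k / 8)) for k in range(8)]
spiky = [(3.0, 0.0)] + octagon[1:]
smoothed = smooth_closed(spiky, passes=5)
```

Each new vertex is a convex combination of old vertices, so repeated passes progressively "polish" a control-point-distorted shape, which is the behaviour the rubber-membrane description suggests.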

14.
Three sandstone body types—progradational (coarsening upward), aggradational (coarsening downward), and block—can be identified by computer using the least-squares method. The basic data are the digital gamma-ray (GR) and sonic log data. Other logs such as SP and resistivity (or conductivity) may also be used with minor changes in the computer program. A statistical program to compute the percentage of each of these three types in a specified geologic unit was also written and run for several wells in the Beaufort Basin, Canada. An areal mapping of the calculated results shows the features of progradational sedimentation in this area: the percentage of progradational sandstones increases northward, in a seaward direction, while the total amount of sandstone usually decreases in that direction.
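The least-squares identification of body types can be sketched by classifying the sign of a fitted trend. The GR values and the slope threshold are illustrative assumptions, not the paper's calibration.

```python
def lsq_slope(values):
    """Least-squares slope of the values against sample index."""
    n = len(values)
    mx = (n - 1) / 2.0
    my = sum(values) / n
    num = sum((i - mx) * (v - my) for i, v in enumerate(values))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den

def classify_body(gr, tol=0.05):
    """Classify a sandstone body from GR samples ordered bottom to top.
    GR typically falls as grain size coarsens; tol is an illustrative threshold."""
    s = lsq_slope(gr)
    if s < -tol:
        return "progradational"   # coarsening upward
    if s > tol:
        return "aggradational"    # coarsening downward
    return "block"

body = classify_body([92.0, 81.0, 70.5, 61.0, 49.5])
```

Counting the classifications over all bodies in a unit gives the percentage statistics mapped in the study.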

15.
The question whether paleoclimatic systems are governed by a small number of significant variables (low-dimensional systems) is of importance for modeling such systems. As indicators for global Plio-/Pleistocene climate variability, four marine, sedimentary oxygen isotope time series are analyzed with regard to their dimensionality using a modified Grassberger-Procaccia algorithm. An artificial, low-dimensional chaotic time series (Hénon map) is included for the validation of the method. In order to extract equidistant data the raw data are interpolated with the Akima-subspline method since this method minimizes the change in variance due to the interpolation. The nonlinear least-squares Gauss-Marquardt regression method is used instead of the linear least-squares fit to the logarithmically transformed points, in order to acquire an unbiased estimate of the correlation dimension. The dependences of the estimated correlation dimension on the embedding dimension do not indicate a small number (i.e., less than 5) of influencing variables on the investigated paleoclimatic system, whereas the low dimension for the Hénon map is verified (dimension 1.22–1.28).  Because of the limited amount of data in the oxygen isotope records, dimensions greater than about 5 cannot be examined.
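The Grassberger-Procaccia correlation sum, here in its basic (unmodified) form, can be sketched for the Hénon map. The point count and the two radii used for the slope estimate are illustrative choices, not the authors' settings.

```python
import math

def henon(n, a=1.4, b=0.3, skip=100):
    """Generate n points on the Hénon attractor after discarding a transient."""
    x, y, pts = 0.1, 0.1, []
    for i in range(n + skip):
        x, y = 1.0 - a * x * x + y, b * x
        if i >= skip:
            pts.append((x, y))
    return pts

def correlation_sum(pts, r):
    """C(r): fraction of distinct point pairs closer than r."""
    n, r2, count = len(pts), r * r, 0
    for i in range(n):
        xi, yi = pts[i]
        for j in range(i + 1, n):
            dx, dy = xi - pts[j][0], yi - pts[j][1]
            if dx * dx + dy * dy < r2:
                count += 1
    return 2.0 * count / (n * (n - 1))

pts = henon(1000)
r_small, r_large = 0.05, 0.2
c_small, c_large = correlation_sum(pts, r_small), correlation_sum(pts, r_large)
# Two-point slope estimate of the correlation dimension D2
dim = math.log(c_large / c_small) / math.log(r_large / r_small)
```

In the scaling region C(r) grows roughly as r^D2, so the two-radius slope lands near the Hénon value of about 1.2; in practice the slope is fitted over many radii rather than two.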

16.
Efficiency analysis of spline functions for simulating slip surfaces of soil slopes
In soil-slope stability analysis, a two-dimensional slip surface is usually generated by connecting a series of points in order, and the coordinates of those points are treated as optimization variables in the search for the critical slip surface; when the number of points is large, the number of optimization variables becomes large as well. Given only a few control nodes, a cubic spline function can instead produce a smooth curve as an arbitrary slip surface, and for slopes containing weak interlayers this approach requires no pre-specified straight-line segments. The factor of safety of a given slip surface is calculated by the unbalanced thrust force method, and the harmony search algorithm is used to determine the critical slip surface with the minimum factor of safety. Three soil-slope examples are analysed to compare the efficiency of the conventional strategy and the spline-function method in simulating slip surfaces.

17.
The nearest neighbor search algorithm is one of the major factors that influence the efficiency of grid interpolation. This paper introduces the KD-tree, a two-dimensional index structure, for use in grid interpolation. It also proposes an improved J-nearest neighbor search strategy based on the concepts of a "priority queue" and "neighbor lag". In this strategy, two types of J-nearest neighbor search algorithms can be used, corresponding to a fixed number of points and to a fixed radius. Using the KD-tree and the proposed strategy, interpolation can be performed with methods such as Inverse Distance Weighting and Kriging. Experimental results show that the proposed algorithms have high operating efficiency, especially when the data volume is enormous, and high practical value for increasing the efficiency of grid interpolation.
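The fixed-number J-nearest query feeding Inverse Distance Weighting can be sketched as follows. A brute-force priority-queue search stands in for the KD-tree, whose role in the paper is to prune the candidate set; the data points are illustrative.

```python
import heapq
import math

def j_nearest(points, q, j):
    """Fixed-number J-nearest query via a priority queue (brute force here;
    a KD-tree would restrict the candidates examined)."""
    return heapq.nsmallest(j, points,
                           key=lambda p: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def idw(points, values, q, j=4, power=2.0):
    """Inverse Distance Weighting from the J nearest data points."""
    vmap = dict(zip(points, values))
    num = den = 0.0
    for p in j_nearest(points, q, j):
        d = math.hypot(p[0] - q[0], p[1] - q[1])
        if d == 0.0:
            return vmap[p]        # query coincides with a data point
        w = 1.0 / d ** power
        num += w * vmap[p]
        den += w
    return num / den

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.0, 1.0, 1.0, 2.0]      # samples of v = x + y
centre = idw(pts, vals, (0.5, 0.5))
```

The fixed-radius variant differs only in the neighbour query: it keeps every point within a cutoff distance instead of the J closest.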

18.
A very fast and efficient approach to self-potential (SP) data inversion for ore exploration has been developed. This approach is based on Tikhonov regularization and the conjugate gradient method, and simultaneously inverts for the depth (z), electric dipole moment (k), and angle of polarization (θ) of a buried anomalous body from SP data measured along a profile. The inversion algorithm works iteratively, solving for z and k in logarithmic space (log(z) and log(k)) and for θ in linear (non-logarithmic) space. The original formulation that uses the model parameters themselves (z, k and θ) is found to be unstable and divergent, as is the formulation that uses the logarithm of all model parameters (log(z), log(k) and log(θ)). The new scheme based on the mixed log-linear combination of the model parameters (log(z), log(k), and θ) eliminates these instability and divergence problems. Sensitivity analysis and numerical experiments indicate that the new approach has a much better minimization search direction. The technique fits the observed data with a geometrically simple body from the restricted class of vertical cylinder, horizontal cylinder, and sphere models. The applicability of the algorithm has been demonstrated on various synthetic data sets with and without noise, and it has been carefully and successfully applied to six real data examples with ore bodies buried at various depths in different complex geologic settings. The method is highly applicable to mineral exploration, and is of particular value where the observed SP data are due to an ore body embedded in the subsurface. On average, it took about 40 s of elapsed (not CPU) time on a 1 GHz PC.

19.
A maximum-likelihood procedure for segmenting digital well-log data is presented. The method is based on a univariate state variable model in which an observed log is treated as a time-series consisting of two terms: a Gauss-Markov signal remaining constant over a segment, and an additive Gaussian, but not necessarily stationary, noise. The signal jumps by a random amount at a segment boundary. The inverse problem of log segmentation consists of detecting the segment boundaries from a given log. The problem is solved using a Bayesian approach in which the unknown parameters, the locations of segment boundaries and the jumps in the signal value, are estimated by maximizing the likelihood function for the observed data. An algorithm based on Kalman smoothing and the single most likelihood replacement (SMLR) procedure is proposed. The performance of the method is illustrated with a case study comprising multisuite log data from an exploratory well. The method is found to be rapid and robust. The resulting segments are found to be geologically consistent.

20.
Shear wave velocity is a critical physical property of rock, which provides significant data for geomechanical and geophysical studies. This study proposes a multi-step strategy to construct a model estimating shear wave velocity from conventional well log data. During the first stage, three correlation structures, including power law, exponential, and trigonometric, were designed to formulate conventional well log data into shear wave velocity. Then, a Genetic Algorithm-Pattern Search tool was used to find the optimal coefficients of these correlations. Due to the different natures of these correlations, they might overestimate or underestimate in some regions relative to each other. Therefore, a neuro-fuzzy algorithm is employed to combine the results of the intelligently derived formulas. The neuro-fuzzy technique can compensate for the effect of overestimation and underestimation to some extent through the use of fuzzy rules. One set of data points was used for constructing the model and another set of unseen data points was employed to assess the reliability of the proposed model. Results have shown that the hybrid genetic algorithm-pattern search technique is a robust tool for finding the most appropriate form of correlations meant to estimate shear wave velocity. Furthermore, the neuro-fuzzy combination of the derived correlations was capable of significantly improving the accuracy of the final prediction.
