Search results: 34 articles (search time: 15 ms)
1.
The determination of the optimal type and placement of a nonconventional well in a heterogeneous reservoir represents a challenging optimization problem. This determination is significantly more complicated if uncertainty in the reservoir geology is included in the optimization. In this study, a genetic algorithm is applied to optimize the deployment of nonconventional wells. Geological uncertainty is accounted for by optimizing over multiple reservoir models (realizations) subject to a prescribed risk attitude. To reduce the excessive computational requirements of the base method, a new statistical proxy (which provides fast estimates of the objective function) based on cluster analysis is introduced into the optimization process. This proxy provides an estimate of the cumulative distribution function (CDF) of the scenario performance, which enables the quantification of proxy uncertainty. Knowledge of the proxy-based performance estimate in conjunction with the proxy CDF enables the systematic selection of the most appropriate scenarios for full simulation. Application of the overall method for the optimization of monobore and dual-lateral well placement demonstrates the performance of the hybrid optimization procedure. Specifically, it is shown that by simulating only 10% or 20% of the scenarios (as determined by application of the proxy), optimization results very close to those achieved by simulating all cases are obtained.
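As a rough illustration of the scenario-selection idea described in this abstract (not the authors' implementation), the sketch below clusters scenarios on cheap features, uses the empirical values of already-simulated scenarios in each cluster as a statistical proxy, and picks the most promising untested scenarios for full simulation. The function name, the simple k-means clustering, and the median-based proxy estimate are all illustrative assumptions.

```python
import numpy as np

def select_scenarios_for_simulation(features, trained_idx, trained_values,
                                    n_clusters=5, fraction=0.2, seed=0):
    """Pick the untested scenarios most worth full simulation, using a
    cluster-based statistical proxy built from already-simulated scenarios."""
    features = np.asarray(features, dtype=float)        # cheap per-scenario attributes
    trained_idx = np.asarray(trained_idx)
    trained_values = np.asarray(trained_values, dtype=float)
    rng = np.random.default_rng(seed)

    # Simple k-means on the cheap features (a stand-in for the paper's cluster analysis).
    centers = features[rng.choice(len(features), n_clusters, replace=False)]
    for _ in range(25):
        d2 = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(d2, axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = features[labels == k].mean(axis=0)

    # Proxy estimate: median objective of the simulated scenarios in the same cluster
    # (the cluster's empirical values also provide a crude CDF of expected performance).
    proxy = np.empty(len(features))
    for k in range(n_clusters):
        in_cluster = np.isin(trained_idx, np.where(labels == k)[0])
        vals = trained_values[in_cluster]
        proxy[labels == k] = np.median(vals) if len(vals) else np.median(trained_values)

    # Select the top `fraction` of untested scenarios (by proxy value) for full simulation.
    untested = np.setdiff1d(np.arange(len(features)), trained_idx)
    n_pick = max(1, int(fraction * len(untested)))
    return untested[np.argsort(-proxy[untested])][:n_pick]
```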
2.
A new method for upscaling fine scale permeability fields to general quadrilateral-shaped coarse cells is presented. The procedure, referred to as the conforming scale up method, applies a triangle-based finite element technique, capable of accurately resolving both the coarse cell geometry and the subgrid heterogeneity, to the solution of the local fine scale problem. An appropriate averaging of this solution provides the equivalent permeability tensor for the coarse scale quadrilateral cell. The general level of accuracy of the technique is demonstrated through application to a number of flow problems. The real strength of the conforming scale up method is demonstrated when the method is applied in conjunction with a flow-based gridding technique. In this case, the approach is shown to provide results that are significantly more accurate than those obtained using standard techniques.
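The conforming, triangle-based finite element solver is beyond a short snippet, but the underlying flow-based upscaling idea (solve a local fine-scale flow problem over the coarse block, then average the solution to obtain an equivalent permeability) can be illustrated on a simple Cartesian block with a five-point finite-difference scheme. This is a generic sketch, not the conforming scale up method itself.

```python
import numpy as np

def equivalent_kxx(k_fine, dx=1.0, dy=1.0):
    """Equivalent x-permeability of a 2-D block of fine-scale cells: impose p=1 on the
    left face, p=0 on the right face, no-flow on top/bottom, solve the local pressure
    equation, and back out kxx from the total flux through the block."""
    ny, nx = k_fine.shape
    harm = lambda a, b: 2.0 * a * b / (a + b)            # harmonic inter-cell average
    idx = lambda i, j: j * nx + i
    N = nx * ny
    A = np.zeros((N, N))
    b = np.zeros(N)
    for j in range(ny):
        for i in range(nx):
            m = idx(i, j)
            # x-direction transmissibilities (half-cell value at the Dirichlet faces)
            tw = (harm(k_fine[j, i], k_fine[j, i - 1]) if i > 0 else 2.0 * k_fine[j, i]) * dy / dx
            te = (harm(k_fine[j, i], k_fine[j, i + 1]) if i < nx - 1 else 2.0 * k_fine[j, i]) * dy / dx
            A[m, m] += tw + te
            if i > 0:
                A[m, idx(i - 1, j)] -= tw
            else:
                b[m] += tw * 1.0                          # left face: p = 1
            if i < nx - 1:
                A[m, idx(i + 1, j)] -= te                 # right face: p = 0 adds nothing to b
            # y-direction transmissibilities (no-flow on the top/bottom boundaries)
            if j > 0:
                t = harm(k_fine[j, i], k_fine[j - 1, i]) * dx / dy
                A[m, m] += t
                A[m, idx(i, j - 1)] -= t
            if j < ny - 1:
                t = harm(k_fine[j, i], k_fine[j + 1, i]) * dx / dy
                A[m, m] += t
                A[m, idx(i, j + 1)] -= t
    p = np.linalg.solve(A, b).reshape(ny, nx)
    q_total = (2.0 * k_fine[:, 0] * dy / dx * (1.0 - p[:, 0])).sum()   # flux through left face
    return q_total * (nx * dx) / (ny * dy)                # Darcy: k_eq = q L / (A dp), dp = 1
```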
3.
4.
5.
An adjoint formulation for the gradient-based optimization of oil–gas compositional reservoir simulation problems is presented. The method is implemented within an automatic differentiation-based compositional flow simulator (Stanford's Automatic Differentiation-based General Purpose Research Simulator, AD-GPRS). The development of adjoint procedures for general compositional problems is much more challenging than for oil–water problems due to the increased complexity of the code and the underlying physics. The treatment of nonlinear constraints, an example of which is a maximum gas rate specification in injection or production wells, when the control variables are well bottom-hole pressures, poses a particular challenge. Two approaches for handling these constraints are presented—a formal treatment within the optimizer and a simpler heuristic treatment in the forward model. The relationship between discrete and continuous adjoint formulations is also elucidated. Results for four example cases of increasing complexity are presented. Improvements in the objective function (cumulative oil produced) relative to reference solutions range from 4.2 to 11.6 %. The heuristic treatment of nonlinear constraints is shown to offer a cost-effective means for obtaining feasible solutions, which are, in some cases, better than those obtained using the formal constraint handling procedure.
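The AD-GPRS constraint machinery is not reproducible here, but the "heuristic treatment in the forward model" can be illustrated schematically: if a well violates its maximum gas rate, its bottom-hole pressure is nudged toward the local reservoir pressure (reducing drawdown) and the well model is re-evaluated, so the optimizer only ever sees feasible controls. The function, the relaxation rule, and the toy linear well model below are assumptions for illustration only.

```python
import numpy as np

def enforce_max_gas_rate(bhp, p_res, well_gas_rate, q_gas_max, max_iter=20, relax=0.5):
    """Heuristic forward-model treatment of a maximum-gas-rate constraint: wells whose
    simulated gas rate exceeds the limit have their bottom-hole pressure moved toward
    the local reservoir pressure (reducing drawdown), instead of passing the nonlinear
    constraint to the optimizer. `well_gas_rate(bhp)` stands in for the well model."""
    bhp = np.asarray(bhp, dtype=float).copy()
    for _ in range(max_iter):
        q = well_gas_rate(bhp)
        over = q > q_gas_max
        if not np.any(over):
            break
        bhp[over] += relax * (p_res[over] - bhp[over])    # shrink drawdown of violating wells
    return bhp

# Toy usage with a linear well model q = PI * (p_res - bhp); all values are illustrative.
PI = np.array([2.0, 3.0])
p_res = np.array([300.0, 320.0])
q_max = np.array([150.0, 100.0])
bhp_feasible = enforce_max_gas_rate(np.array([100.0, 100.0]), p_res,
                                    lambda b: PI * (p_res - b), q_max)
```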
6.
A new low-dimensional parameterization based on principal component analysis (PCA) and convolutional neural networks (CNN) is developed to represent complex geological models. The CNN–PCA method is inspired by recent developments in computer vision using deep learning. CNN–PCA can be viewed as a generalization of an existing optimization-based PCA (O-PCA) method. Both CNN–PCA and O-PCA entail post-processing a PCA model to better honor complex geological features. In CNN–PCA, rather than use a histogram-based regularization as in O-PCA, a new regularization involving a set of metrics for multipoint statistics is introduced. The metrics are based on summary statistics of the nonlinear filter responses of geological models to a pre-trained deep CNN. In addition, in the CNN–PCA formulation presented here, a convolutional neural network is trained as an explicit transform function that can post-process PCA models quickly. CNN–PCA is shown to provide both unconditional and conditional realizations that honor the geological features present in reference SGeMS geostatistical realizations for a binary channelized system. Flow statistics obtained through simulation of random CNN–PCA models closely match results for random SGeMS models for a demanding case in which O-PCA models lead to significant discrepancies. Results for history matching are also presented. In this assessment CNN–PCA is applied with derivative-free optimization, and a subspace randomized maximum likelihood method is used to provide multiple posterior models. Data assimilation and significant uncertainty reduction are achieved for existing wells, and physically reasonable predictions are also obtained for new wells. Finally, the CNN–PCA method is extended to a more complex nonstationary bimodal deltaic fan system, and is shown to provide high-quality realizations for this challenging example.
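The CNN post-processing step requires a trained network, but the underlying PCA parameterization that both O-PCA and CNN–PCA start from is simple to sketch: new models are generated as m = m_mean + Phi @ xi, with Phi obtained from an SVD of centered prior realizations. The snippet below is a generic construction under that assumption, not the paper's code.

```python
import numpy as np

def build_pca_model(realizations, energy=0.95):
    """Low-dimensional PCA parameterization that O-PCA and CNN-PCA post-process:
    new models are generated as m = m_mean + Phi @ xi, with Phi built from an SVD
    of centered prior realizations."""
    M = np.asarray(realizations, dtype=float)             # (n_real, n_cells)
    m_mean = M.mean(axis=0)
    U, s, Vt = np.linalg.svd(M - m_mean, full_matrices=False)
    k = int(np.searchsorted(np.cumsum(s ** 2) / np.sum(s ** 2), energy)) + 1
    Phi = Vt[:k].T * (s[:k] / np.sqrt(len(M) - 1))        # scaled basis, shape (n_cells, k)
    return m_mean, Phi

def sample_pca(m_mean, Phi, rng=None):
    """Unconditional PCA realization from xi ~ N(0, I); in CNN-PCA this smooth model
    would then be passed through the trained post-processing network."""
    if rng is None:
        rng = np.random.default_rng(0)
    xi = rng.standard_normal(Phi.shape[1])
    return m_mean + Phi @ xi
```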
7.
This paper describes a novel approach for creating an efficient, general, and differentiable parameterization of large-scale non-Gaussian, non-stationary random fields (represented by multipoint geostatistics) that is capable of reproducing complex geological structures such as channels. Such parameterizations are appropriate for use with gradient-based algorithms applied to, for example, history-matching or uncertainty propagation. It is known that the standard Karhunen–Loève (K–L) expansion, also called linear principal component analysis or PCA, can be used as a differentiable parameterization of input random fields defining the geological model. The standard K–L model is, however, limited in two respects. It requires an eigen-decomposition of the covariance matrix of the random field, which is prohibitively expensive for large models. In addition, it preserves only the two-point statistics of a random field, which is insufficient for reproducing complex structures. In this work, kernel PCA is applied to address the limitations associated with the standard K–L expansion. Although widely used in machine learning applications, it does not appear to have found any application for geological model parameterization. With kernel PCA, an eigen-decomposition of a small matrix called the kernel matrix is performed instead of the full covariance matrix. The method is much more efficient than the standard K–L procedure. Through use of higher order polynomial kernels, which implicitly define a high-dimensionality feature space, kernel PCA further enables the preservation of high-order statistics of the random field, instead of just two-point statistics as in the K–L method. The kernel PCA eigen-decomposition proceeds using a set of realizations created by geostatistical simulation (honoring two-point or multipoint statistics) rather than the analytical covariance function. We demonstrate that kernel PCA is capable of generating differentiable parameterizations that reproduce the essential features of complex geological structures represented by multipoint geostatistics. The kernel PCA representation is then applied to history match a water flooding problem. This example demonstrates that kernel PCA can be used with gradient-based history matching to provide models that match production history while maintaining multipoint geostatistics consistent with the underlying training image.
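A minimal sketch of the kernel PCA construction described above follows: the eigen-decomposition is performed on a small n_real x n_real polynomial kernel matrix built from geostatistical realizations, rather than on the full covariance matrix. The pre-image step (mapping feature-space coordinates back to a model) is the harder part and is omitted; the kernel choice and normalization details are assumptions.

```python
import numpy as np

def kernel_pca_fit(realizations, degree=3, n_keep=20):
    """Eigen-decompose the centered polynomial kernel matrix of prior realizations;
    this small n_real x n_real problem replaces the covariance eigen-decomposition
    required by the standard K-L expansion."""
    X = np.asarray(realizations, dtype=float)             # (n_real, n_cells)
    n = len(X)
    K = (X @ X.T / X.shape[1] + 1.0) ** degree            # polynomial kernel matrix
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                                        # centering in feature space
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_keep]
    w, V = np.clip(w[order], 1e-12, None), V[:, order]
    return {"X": X, "degree": degree, "alphas": V / np.sqrt(w),
            "row_mean": K.mean(axis=1), "all_mean": K.mean()}

def kernel_pca_project(model, kpca):
    """Coordinates of a new model along the retained kernel principal directions
    (the pre-image step needed to map coordinates back to a model is omitted)."""
    X, d = kpca["X"], kpca["degree"]
    k = (X @ np.asarray(model, dtype=float) / X.shape[1] + 1.0) ** d
    k_c = k - kpca["row_mean"] - k.mean() + kpca["all_mean"]
    return kpca["alphas"].T @ k_c
```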
8.
9.
The advantages of the simultaneous integration of production and time-lapse seismic data for history matching have been demonstrated in a number of previous studies. Production data provide accurate observations at particular spatial locations (wells), while seismic data enable global, though filtered/noisy, estimates of state variables. In this work, we present an efficient computational tool for bi-objective history matching, in which data misfits for both production and seismic measurements are minimized using an adjoint-gradient approach. This enables us to obtain a set of Pareto optimal solutions defining the optimal trade-off between production and seismic data misfits (which are, to some extent, conflicting). The impact of noise structure and noise level on Pareto optimal solutions is investigated in detail. We discuss the existence of the “best” trade-off solution, or least-conflicting posterior model, which corresponds to the history-matched model that is expected to provide the least-conflicting forecast of future reservoir performance. The overall framework is successfully applied in 2D and 3D compositional simulation problems to provide a single least-conflicting posterior model and, for the 2D case, multiple samples from the posterior distribution using the randomized maximum likelihood method.
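As a schematic illustration of the bi-objective setup (not the paper's adjoint machinery), the snippet below shows a weighted-sum scalarization of the two misfits and a simple filter that extracts the non-dominated (Pareto optimal) points from a set of candidate solutions.

```python
import numpy as np

def pareto_front(misfits):
    """Indices of the non-dominated points among (production_misfit, seismic_misfit) pairs."""
    F = np.asarray(misfits, dtype=float)
    keep = []
    for i, f in enumerate(F):
        # f is dominated if some other point is no worse in both misfits and strictly better in one
        dominated = np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

def weighted_sum(w, prod_misfit, seis_misfit):
    """Scalarized objective used to trace the trade-off curve: minimizing it for a sweep
    of weights w in [0, 1] (with adjoint gradients in the paper, or any gradient method)
    yields candidate Pareto optimal models."""
    return w * prod_misfit + (1.0 - w) * seis_misfit

# Example: the point (3, 3) is dominated by (2, 2) and is filtered out.
print(pareto_front([[1.0, 5.0], [2.0, 2.0], [4.0, 1.0], [3.0, 3.0]]))   # -> [0 1 2]
```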
10.
Thermal recovery can entail considerably higher costs than conventional oil recovery, so the use of computational optimization techniques in designing and operating these processes may be beneficial. Optimization, however, requires many simulations, which results in substantial computational cost. Here, we implement a model-order reduction technique that aims at large reductions in computational requirements. The technique considered, trajectory piecewise linearization (TPWL), entails the representation of new solutions in terms of linearizations around previously simulated (and saved) training solutions. The linearized representation is projected into a low-dimensional space, with the projection matrix constructed through proper orthogonal decomposition of solution “snapshots” generated in the training step. Two idealized problems are considered here: primary production of oil driven by downhole heaters and a simplified model for steam-assisted gravity drainage, where water and steam are treated as a single “effective” phase. The strong temperature dependence of oil viscosity is included in both cases. TPWL results for these systems demonstrate that the method can provide accurate predictions relative to full-order reference solutions. Observed runtime speedups are very substantial, over 2 orders of magnitude for the cases considered. The overhead associated with TPWL model construction is equivalent to the computation time for several full-order simulations (the precise overhead depends on the number of training runs), so the method is only applicable if many simulations are to be performed.
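A compact sketch of the two TPWL ingredients mentioned above is given below, under the usual formulation in which the reduced state is advanced by linearizing about the nearest saved training point: a POD basis built from training snapshots, and one reduced linearized step using saved (projected) Jacobians. The variable names and the exact residual linearization are generic assumptions, not the implementation used in the paper.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """POD basis from saved training 'snapshots' (full-order states stored as columns):
    keep the left singular vectors capturing the requested fraction of snapshot energy."""
    S = np.asarray(snapshots, dtype=float)                # (n_state, n_snap)
    U, s, _ = np.linalg.svd(S, full_matrices=False)
    k = int(np.searchsorted(np.cumsum(s ** 2) / np.sum(s ** 2), energy)) + 1
    return U[:, :k]                                       # projection matrix Phi

def tpwl_step(z_n, u, tp):
    """One reduced TPWL step about the selected training point i:
        z_{n+1} = z_{i+1} - Jr^{-1} [ Jnr (z_n - z_i) + Jur (u - u_i) ],
    where Jr, Jnr, Jur are the saved residual Jacobians with respect to the new state,
    the previous state, and the controls, projected with the POD basis Phi."""
    rhs = tp["Jnr"] @ (z_n - tp["z_i"]) + tp["Jur"] @ (u - tp["u_i"])
    return tp["z_ip1"] - np.linalg.solve(tp["Jr"], rhs)
```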