Found 20 similar records (search time: 31 ms)
1.
Integrated Interpretation of Interwell Connectivity Using Injection and Production Fluctuations
A method to characterize reservoirs, based on matching temporal fluctuations in injection and production rates, has recently
been developed. The method produces two coefficients for each injector–producer pair; one parameter, λ, quantifies the connectivity and the other, τ, quantifies the fluid storage in the vicinity of the pair. Previous analyses used λ and τ separately to infer the presence of transmissibility barriers and conduits in the reservoir, but several common conditions
could not be easily distinguished. This paper describes how λ and τ can be jointly interpreted to enhance inference about preferential transmissibility trends and barriers. Two different combinations
are useful: one is a plot of log (λ) versus log (τ) for a producer and nearby injectors, and the second is a Lorenz-style flow capacity (F) versus storativity (C) plot. These techniques were tested against the results of a numerical simulator and applied to data from the North Buck
Draw field. Using the simulated data, we find that the F–C plots and the λ–τ plots are capable of identifying whether the connectivity of an injector–producer well pair is through fractures, a high-permeability
layer, multiple layers or through partially completed wells. Analysis of data from the North Buck Draw field shows a reasonable
correspondence between τ and the tracer breakthrough times. Of two possible geological models for Buck Draw, the F–C and λ–τ plots support the model that has less connectivity in the field. The wells in fluvial deposits show better communication
than those wells in more estuarine-dominated regions.
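A Lorenz-style F–C curve of the kind described can be sketched from the fitted (λ, τ) pairs. This is a hedged illustration, not the authors' exact construction: the function name `lorenz_fc`, the ordering of pairs by increasing τ (fast, low-storage connections first), and the use of λ for flow-capacity increments and λτ for storativity increments are assumptions made for this sketch.

```python
import numpy as np

def lorenz_fc(lams, taus):
    # Lorenz-style flow capacity (F) versus storativity (C) curve from
    # interwell (lambda, tau) pairs. Assumptions for this sketch:
    # flow-capacity increments are proportional to lambda, storativity
    # increments to lambda * tau, and pairs are ordered by increasing
    # tau so that fast, low-storage connections plot first.
    lams = np.asarray(lams, dtype=float)
    taus = np.asarray(taus, dtype=float)
    order = np.argsort(taus)
    f_inc = lams[order]
    c_inc = (lams * taus)[order]
    F = np.concatenate(([0.0], np.cumsum(f_inc) / f_inc.sum()))
    C = np.concatenate(([0.0], np.cumsum(c_inc) / c_inc.sum()))
    return F, C
```

With this ordering the F curve lies on or above the C curve, and strong curvature away from the diagonal flags heterogeneous (e.g., fracture-dominated) connectivity.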
2.
The Nu Expression for Probabilistic Data Integration
The general problem of data integration is expressed as that of combining probability distributions conditioned to each individual
datum or data event into a posterior probability for the unknown conditioned jointly to all data. Any such combination of
information requires taking into account data interaction for the specific event being assessed. The nu expression provides
an exact analytical representation of such a combination. This representation allows a clear and useful separation of the
two components of any data integration algorithm: individual data information content and data interaction, the latter being
different from data dependence. Any estimation workflow that fails to address data interaction is not only suboptimal, but
may result in severe bias. The nu expression reduces the possibly very complex joint data interaction to a single multiplicative correction parameter ν₀, which is difficult to evaluate but whose exact analytical expression is given; the availability of such an expression provides avenues for its determination or approximation. The case ν₀ = 1 is more comprehensive than data conditional independence; it delivers a preliminary robust approximation in the presence of actual data interaction. An experiment where the exact results are known allows the results of the ν₀ = 1 approximation to be checked against the traditional estimators based on the assumption of data independence.
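Under the framework described, the ν₀ = 1 case can be sketched by combining each elementary probability through its odds-like distance x = (1 − P)/P, with the combined distance given by x/x0 = ν₀ ∏ (xi/x0). A minimal sketch; the function name and interface are invented for illustration:

```python
def combine_nu(prior, conds, nu0=1.0):
    # Combine a prior P(A) with conditional probabilities P(A|D_i)
    # using the nu expression. Each probability p maps to a distance
    # x = (1 - p) / p; combined distance: x = nu0 * x0 * prod(x_i / x0).
    x0 = (1.0 - prior) / prior
    x = nu0 * x0
    for p in conds:
        x *= ((1.0 - p) / p) / x0
    return 1.0 / (1.0 + x)
```

With ν₀ = 1 and a single datum, the combination returns that datum's conditional probability unchanged; two concordant data reinforce each other beyond either one alone.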
3.
The study computes time-dependent earthquake probabilities on the basis of seismicity data deriving mainly from historical records. It provides a methodological approach useful for those countries where the scarcity of instrumental data and/or paleoseismological evidence requires that historical information be given particular weight. Thus, the conditional probability that damaging earthquakes (M ≥ 6) may occur in Italy in the next 30 years is shown, and the potential for the main internationally known Italian cities with
a cultural heritage is outlined. Earthquake probabilities are computed referring to the application of renewal processes,
where the periodicity is analytically modelled by means of the Brownian Passage Time function; an estimate of the dispersion
(i.e., uncertainty) introduced on probabilities is provided making use of Monte Carlo simulations. The computed probabilities
refer to seismic source zones deriving from the spatial clustering of the historically documented seismicity. The computation
of probabilities based on the interaction of earthquakes occurring in nearby zones has also been attempted for a test area
to explore the influence exerted by the stress transfer effect. The main findings of this study are that (1) seismic source
zones in Southern Italy are the most prone to experience damaging earthquakes in the next 30 years, with conditional probabilities as large as 10%; and (2) the influence exerted by earthquake interaction in increasing such probabilities does not seem to be relevant, because the mean recurrence times of large earthquakes (above the threshold magnitude of six chosen in this study) are in general much longer than the time shortening produced by the stress transfer.
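The conditional probability computed in such renewal models can be sketched with the Brownian Passage Time distribution, which coincides with the inverse Gaussian distribution with mean recurrence time mu and shape lam = mu / alpha², where alpha is the aperiodicity. This is a generic sketch of the renewal computation, not the paper's code, and the parameter values in the usage note are illustrative only:

```python
from math import sqrt, exp, erf

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    # Brownian Passage Time CDF: inverse Gaussian with mean mu and
    # shape lam = mu / alpha**2 (alpha = aperiodicity).
    lam = mu / alpha ** 2
    a = sqrt(lam / t)
    return (norm_cdf(a * (t / mu - 1.0))
            + exp(2.0 * lam / mu) * norm_cdf(-a * (t / mu + 1.0)))

def conditional_prob(elapsed, window, mu, alpha):
    # P(event in (elapsed, elapsed + window] | no event up to elapsed).
    f_t = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_t) / (1.0 - f_t)
```

For example, `conditional_prob(300.0, 30.0, 700.0, 0.5)` gives the 30-year conditional probability for a source 300 years into a 700-year mean cycle with aperiodicity 0.5.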
4.
Physical phenomena are observed in many fields (science and engineering) and are often studied by time-consuming computer codes. These codes are analyzed with statistical models, often called emulators. In many situations, the physical system (computer model output) may be known to satisfy inequality constraints with respect to some or all input variables. The aim is to build a model capable of incorporating both data interpolation and inequality constraints into a Gaussian process emulator. By using a functional decomposition, a finite-dimensional approximation of Gaussian processes such that all conditional simulations satisfy the inequality constraints in the entire domain is proposed. To show the performance of the proposed model, some conditional simulations with inequality constraints such as boundedness, monotonicity or convexity conditions in one and two dimensions are given. A simulation study to investigate the efficiency of the method in terms of prediction is included.
5.
The reproduction of the non-stationary distribution and detailed characteristics of geological bodies is the main difficulty of reservoir modeling. Recently developed multiple-point geostatistics can represent a stationary geological body more effectively than traditional methods. When restricted to a stationary hypothesis, multiple-point geostatistical methods cannot simulate a non-stationary geological body effectively, especially when using non-stationary training images (TIs). According to geologic principles, the non-stationary distribution of geological bodies is controlled by a sedimentary model. Therefore, in this paper, we propose auxiliary variables based on the sedimentary model, namely geological vector information (GVI). GVI can characterize the non-stationary distribution of TIs and simulation domains before sequential simulation, and the precision of data event statistics will be enhanced by limiting the sequential simulation's data-event search area under the guidance of GVI. Consequently, the reproduction of non-stationary geological bodies will be improved. The key features of this method are as follows: (1) obtain TIs and geological vector information for simulated areas restricted by sedimentary models; (2) truncate TIs into a number of sub-TIs using a set of cut-off values such that each sub-TI is stationary and the adjacent sub-TIs have a certain similarity; (3) truncate the simulation domain into a number of sub-regions with the same cut-off values used in TI truncation, so that each sub-region corresponds to a number of sub-TIs; (4) use an improved method to scan the TI or TIs and construct a single search tree to store replicates of data events located in different sub-TIs; and (5) use an improved conditional probability distribution function to perform sequential simulation. A FORTRAN program is implemented based on SNESIM.
6.
Michele Catti 《Physics and Chemistry of Minerals》1989,16(6):582-590
The method of crystal static deformation, including inner strain effects, was applied to calculate the structure configuration
and the elastic constants of forsterite under anisotropic and isotropic pressure. A Born type interatomic potential is used,
with optimized atomic charges and repulsive radii; SiO4 tetrahedra are approximated as rigid units. Computations were carried out in the range 1–8 GPa, with steps of 1 GPa, for
the three uniaxial stresses τ1, τ2, τ3 and for pressure p. By interpolation of results, interatomic distances and elastic tensor components are shown to depend quadratically on stress.
A non-linear behaviour generally appears above 4 GPa; the importance of inner strain and non-linear effects is analyzed. Mg-O
bond lengths and O-O edges of coordination polyhedra respond differently to anisotropic and to isotropic stresses, according
to the topological features of the structure. Elastic and structural results for hydrostatic pressure are compared to experimental
literature data, discussing the range of validity of the rigid body approximation for SiO4 groups.
7.
Piotr W. Mirowski, Daniel M. Tetzlaff, Roy C. Davies, David S. McCormick, Nneka Williams, Claude Signer 《Mathematical Geosciences》2009,41(4):447-474
This research introduces a novel method to assess the validity of training images used as an input for Multipoint Geostatistics,
alternatively called Multiple Point Simulation (MPS). MPS are a family of spatial statistical interpolation algorithms that
are used to generate conditional simulations of property fields such as geological facies. They are able to honor absolute
“hard” constraints (e.g., borehole data) as well as “soft” constraints (e.g., probability fields derived from seismic data,
and rotation and scale). These algorithms require 2D or 3D training images or analogs whose textures represent a spatial arrangement
of geological properties that is presumed to be similar to that of a target volume to be modeled. To use the current generation
of MPS algorithms, statistically valid training images are required as input. In this context, “statistical validity” includes
a requirement of stationarity, so that one can derive from the training image an average template pattern. This research focuses
on a practical method to assess stationarity requirements for MPS algorithms, i.e., that statistical density or probability
distribution of the quantity shown on the image does not change spatially, and that the image shows repetitive shapes whose
orientation and scale are spatially constant. This method employs image-processing techniques based on measures of stationarity
of the category distribution, the directional (or orientation) property field and the scale property field of those images.
It was successfully tested on a set of two-dimensional images representing geological features and its predictions were compared
to actual realizations of MPS algorithms. An extension of the algorithms to 3D images is also proposed. As MPS algorithms
are being used increasingly in hydrocarbon reservoir modeling, the methods described should facilitate screening and selection
of the input training images.
8.
Covariance models provide the basic measure of spatial continuity in geostatistics. Traditionally, a closed-form analytical model is fitted to allow for interpolation of sample covariance values while ensuring the positive definiteness condition. For cokriging, the modeling task is made even more difficult because of the restriction imposed by the linear coregionalization model. Bochner's theorem maps the positive definite constraints into much simpler constraints on the Fourier transform of the covariance, that is the density spectrum. Accordingly, we propose to transform the experimental (cross) covariance tables into quasidensity spectrum tables using Fast Fourier Transform (FFT). These quasidensity spectrum tables are then smoothed under constraints of positivity and unit sum. A backtransform (inverse FFT) yields permissible (jointly) positive definite (cross) covariance tables. At no point is any analytical modeling called for and the algorithm is not restricted by the linear coregionalization model. A case study shows the proposed covariance modeling to be easier and much faster than the traditional analytical covariance modeling, yet yields comparable kriging or simulation results.
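The FFT route to a permissible covariance table can be sketched in a few lines. This sketch simply clips negative spectral values and rescales, whereas the paper smooths the quasi-density spectrum under positivity and unit-sum constraints, which is more refined than clipping:

```python
import numpy as np

def fft_covariance_smooth(cov_table):
    # Forward FFT of the experimental covariance table gives a
    # quasi-density spectrum (Bochner: positive definiteness of the
    # covariance corresponds to a non-negative spectrum).
    spec = np.fft.fftn(np.asarray(cov_table, dtype=float)).real
    # Enforce permissibility by clipping negative spectral values,
    # then rescale so the spectral sum (hence the zero-lag covariance,
    # which equals the mean of the spectrum) is preserved.
    spec_pos = np.clip(spec, 0.0, None)
    spec_pos *= spec.sum() / spec_pos.sum()
    # Back-transform (inverse FFT) yields a permissible table.
    return np.fft.ifftn(spec_pos).real
```

The input table is assumed to be circularly symmetric (as an experimental covariance grid is) with positive variance at lag zero.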
9.
Comparative performance of indicator algorithms for modeling conditional probability distribution functions 总被引:1,自引:0,他引:1
P. Goovaerts 《Mathematical Geology》1994,26(3):389-411
This paper compares the performance of four algorithms (full indicator cokriging, adjacent cutoffs indicator cokriging, multiple indicator kriging, median indicator kriging) for modeling conditional cumulative distribution functions (ccdf). The latter three algorithms are approximations to the theoretically better full indicator cokriging in the sense that they disregard cross-covariances between some indicator variables or they consider that all covariances are proportional to the same function. Comparative performance is assessed using a reference soil data set that includes 2649 locations at which both topsoil copper and cobalt were measured. For all practical purposes, indicator cokriging does not perform better than the other simpler algorithms, which involve less variogram modeling effort and smaller computational cost. Furthermore, the number of order relation deviations is found to be higher for cokriging algorithms, especially when constraints on the kriging weights are applied.
10.
A. D. Kuz’min, Yu. A. Belyatsky, D. V. Dumsky, V. A. Izvekova, K. A. Lapaev, S. V. Logvinenko, B. Ya. Losovsky, V. D. Pugachev 《Astronomy Reports》2011,55(5):416-424
Results of long-term (2002–2010) monitoring of giant radio pulses of the pulsar PSR B0531+21 in the Crab Nebula at ν = 44, 63, and 111 MHz are reported. The observations were conducted on the LPA and DKR-1000 radio telescopes of the Lebedev
Physical Institute. The giant pulses were analyzed using specialized software for calculating the magnitude of the scattering τ_sc, signal-to-noise ratio, and other required parameters by modeling the propagation of a pulse in the scattering interstellar
medium. Three pronounced sharp increases in the scattering were recorded in 2002–2010. Analysis of the dependence between
the variations of the scattering and dispersion measure (data of Jodrell Bank Observatory) shows a strong correlation at all
frequencies, ≈0.9. During periods of anomalous increase in scattering and the dispersion measure, the index γ in the frequency dependence of the scattering in the Crab Nebula, τ_sc(ν) ∝ ν^(−γ), was smaller than the generally accepted values γ = 4.0 for a Gaussian and γ = 4.4 for a Kolmogorov distribution. This difference in combination with the piece-wise power-law spectrum may be due to
the presence of a dense plasma structure with developed Langmuir turbulence in the nebula, along the pulsar’s line of sight.
The magnetic field in the Crab Nebula estimated from measurements of the rotation measure toward the pulsar is 100 μG.
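The scattering index in τ_sc(ν) ∝ ν^(−γ) can be estimated from monitoring data by a log-log least-squares fit; a minimal sketch (the function name and the use of `np.polyfit` are choices made for this illustration):

```python
import numpy as np

def fit_scattering_index(freqs_mhz, tau_sc):
    # Least-squares fit of log(tau_sc) = const - gamma * log(nu),
    # i.e. the power law tau_sc(nu) proportional to nu**(-gamma).
    slope, _ = np.polyfit(np.log(freqs_mhz), np.log(tau_sc), 1)
    return -slope
```

An estimate well below 4.0 at the observing frequencies (44, 63 and 111 MHz) would correspond to the anomalous episodes the abstract describes.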
11.
Extended Probability Perturbation Method for Calibrating Stochastic Reservoir Models
Lin Y. Hu 《Mathematical Geosciences》2008,40(8):875-885
Calibrating a stochastic reservoir model defined on a large, fine grid to hydrodynamic data requires consistent methods to modify the
petrophysical properties of the model. Several methods have been developed to address this problem. Recent methods include
the Gradual Deformation Method (GDM) and the Probability Perturbation Method (PPM). The GDM has been applied to pixel-based
models of continuous and categorical variables, as well as object-based models. Initially, the PPM has been applied to pixel-based
models of categorical variables generated by sequential simulation. In addition, the PPM relies on an analytical formula (known
as the tau-model) to approximate conditional probabilities. In this paper, an extension of the PPM to any type of probability
distributions (discrete, continuous, or mixed) is presented. This extension is still constrained by the approximation using
the tau-model. However, when applying the method to white noises, this approximation is no longer necessary. The result is
an entirely new and rigorous method for perturbing any type of stochastic model, a modified PPM employed in a manner similar to the GDM.
12.
John A. Goff 《Mathematical Geology》2000,32(7):765-786
Stratigraphic modeling based on physical and geologic principles has been improved by more sophisticated process models and increased computer power. However, such efforts may reach a limit in their predictive power because of the stochastic, multiscaled nature of the physical processes involved. Building on techniques from the geostatistical literature, a conditional simulation method, dubbed SimStrat, has been developed to improve predictions of stratigraphic architecture from limited data. No physical processes are invoked. Rather, the prediction is based solely on geometric and statistical principles. The method takes as input sonar bathymetry, seismically defined stratigraphic horizons, and core-defined horizons. Each stratigraphic horizon is characterized using spectral modeling and coherence modeling for adjacent horizons. Predictions of subsurface horizons are improved where seafloor bathymetry conforms with the underlying strata. Conditional simulations can then be generated that conform to available data constraints and statistical characterization. Tests with synthetic data in one and two dimensions for differing spectral models confirm the reliability of the method.
13.
Eulogio Pardo-Igúzquiza 《Mathematical Geology》1998,30(1):95-108
In this paper, the maximum likelihood method for inferring the parameters of spatial covariances is examined. The advantages of maximum likelihood estimation are discussed and it is shown that this method, derived assuming a multivariate Gaussian distribution for the data, gives a sound criterion for fitting covariance models irrespective of the multivariate distribution of the data. However, this distribution is impossible to verify in practice when only one realization of the random function is available. Then, the maximum entropy method is the only sound criterion for assigning probabilities in the absence of information. Because the multivariate Gaussian distribution has the maximum entropy property for a fixed vector of means and covariance matrix, the multinormal distribution is the most logical choice as a default distribution for the experimental data. Nevertheless, it should be clear that the assumption of a multivariate Gaussian distribution is maintained only for the inference of spatial covariance parameters and not necessarily for other operations such as spatial interpolation, simulation or estimation of spatial distributions. Various results from simulations are presented to support the claim that the simultaneous use of the maximum likelihood method and the classical nonparametric method of moments can considerably improve results in the estimation of geostatistical parameters.
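The Gaussian likelihood criterion discussed above can be sketched for a one-dimensional exponential covariance model; the covariance model choice, the zero-mean assumption, and the grid search over candidate ranges are simplifications made for this illustration:

```python
import numpy as np

def neg_log_likelihood(z, coords, range_param, sill=1.0):
    # Gaussian negative log-likelihood of zero-mean data z under an
    # exponential covariance C(h) = sill * exp(-h / range_param).
    h = np.abs(coords[:, None] - coords[None, :])
    C = sill * np.exp(-h / range_param)
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + z @ np.linalg.solve(C, z)
                  + len(z) * np.log(2.0 * np.pi))

def ml_range(z, coords, candidates):
    # Grid search: the range parameter maximizing the likelihood.
    return min(candidates, key=lambda a: neg_log_likelihood(z, coords, a))
```

In practice the grid search would be replaced by a numerical optimizer, and the sill could be profiled out analytically.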
14.
A. Pavese 《Physics and Chemistry of Minerals》2002,29(1):43-51
The P–V–T equation of state (EoS) models of Birch–Murnaghan, Vinet and Poirier–Tarantola have been compared with one another and discussed
in the light of their ability to reproduce thermoelastic functions and parameters by means of fitting to pressure–volume–temperature
data artificially generated for spinel, corundum and forsterite. Numerical simulations relying upon semi-empirical potentials,
lattice dynamics and the quasiharmonic approximation have been used to generate P–V–T data. The results obtained indicate that all the P–V–T EoSs tested predict bulk modulus at ambient conditions with errors confined, at worst, within a few percent, and reproduce
correctly its dependence on temperature. The derivatives of the bulk modulus versus P and T are less satisfactorily modelled. The bulk thermal expansion is determined by EoSs within a few percent error, but the deviations
increase significantly if the approximation of linear dependence of EoS on temperature is used (linearised thermal pressure
model).
Received: 30 January 2001 / Accepted: 16 June 2001
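The third-order Birch-Murnaghan P(V) form underlying such comparisons can be written down directly; a minimal sketch of the isothermal form only, without the thermal-pressure terms the paper also considers:

```python
def birch_murnaghan_3rd(V, V0, K0, K0p):
    # Third-order Birch-Murnaghan equation of state P(V), with
    # zero-pressure volume V0, bulk modulus K0 and its pressure
    # derivative K0p; x = (V0/V)**(1/3) is the compression variable.
    x = (V0 / V) ** (1.0 / 3.0)
    return (1.5 * K0 * (x ** 7 - x ** 5)
            * (1.0 + 0.75 * (K0p - 4.0) * (x ** 2 - 1.0)))
```

By construction P(V0) = 0, and compression (V < V0) yields positive pressure; P and K0 share the same units (e.g., GPa).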
15.
A Fixed-Path Markov Chain Algorithm for Conditional Simulation of Discrete Spatial Variables
Weidong Li 《Mathematical Geology》2007,39(2):159-176
The Markov chain random field (MCRF) theory provided the theoretical foundation for nonlinear Markov chain geostatistics.
In a MCRF, the single Markov chain is also called a “spatial Markov chain” (SMC). This paper introduces an efficient fixed-path SMC algorithm for conditional simulation of discrete spatial variables
(i.e., multinomial classes) on point samples with incorporation of interclass dependencies. The algorithm considers four nearest
known neighbors in orthogonal directions. Transiograms are estimated from samples and are model-fitted to provide parameter
input to the simulation algorithm. Results from a simulation example show that this efficient method can effectively capture
the spatial patterns of the target variable and fairly generate all classes. Because of the incorporation of interclass dependencies
in the simulation algorithm, simulated realizations are relatively imitative of each other in patterns. Large-scale patterns
are well produced in realizations. Spatial uncertainty is visualized as occurrence probability maps, and transition zones
between classes are demonstrated by maximum occurrence probability maps. Transiogram analysis shows that the algorithm can
reproduce the spatial structure of multinomial classes described by transiograms with some ergodic fluctuations. A special
characteristic of the method is that when simulation is conditioned on a number of sample points, simulated transiograms have
the tendency to follow the experimental ones, which implies that conditioning sample data play a crucial role in determining
spatial patterns of multinomial classes. The efficient algorithm may provide a powerful tool for large-scale structure simulation
and spatial uncertainty analysis of discrete spatial variables.
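Experimental transiograms of the kind the algorithm takes as input can be estimated from a one-dimensional class sequence; a minimal sketch (the estimator below is a simple lag-wise transition frequency, not the paper's fitted transiogram models):

```python
import numpy as np

def transiogram(seq, classes, max_lag):
    # Experimental transiogram p_ij(h): frequency of class j at lag h
    # given class i at the origin, estimated from a 1-D class sequence.
    # Returns an array of shape (max_lag, n_classes, n_classes).
    seq = np.asarray(seq)
    k = len(classes)
    t = np.zeros((max_lag, k, k))
    for h in range(1, max_lag + 1):
        head, tail = seq[:-h], seq[h:]
        for i, ci in enumerate(classes):
            mask = head == ci
            if mask.sum():
                for j, cj in enumerate(classes):
                    t[h - 1, i, j] = np.mean(tail[mask] == cj)
    return t
```

Each row t[h, i, :] sums to one wherever class i occurs, which is the order relation a fitted transiogram model must also honor.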
16.
Characterization of complex geological features and patterns remains one of the most challenging tasks in geostatistics. Multiple point statistics (MPS) simulation offers an alternative to accomplish this aim by going beyond classical two-point statistics. Reproduction of features in the final realizations is achieved by borrowing high-order spatial statistics from a training image. Most MPS algorithms use one training image at a time chosen by the geomodeler. This paper proposes the use of multiple training images simultaneously for spatial modeling through a scheme of data integration for conditional probabilities known as a linear opinion pool. The training images (TIs) are based on the available information and not on conceptual geological models; one image comes from modeling the categories by a deterministic approach and another comes from the application of conventional sequential indicator simulation. The first is too continuous and the second too random. The mixing of TIs requires weights for each of them. A methodology for calibrating the weights based on the available drillholes is proposed. A measure of multipoint entropy along the drillholes is matched by the combination of the two TIs. The proposed methodology reproduces geologic features from both TIs with the correct amount of continuity and variability. There is no need for a conceptual training image from another modeling technique; the data-driven TIs permit a robust inference of spatial structure from reasonably spaced drillhole data.
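The linear opinion pool named in the abstract is a weighted average of the conditional probability vectors derived from the different TIs; a minimal sketch (the weights are assumed given here, whereas the paper calibrates them against drillhole multipoint entropy):

```python
def linear_opinion_pool(prob_lists, weights):
    # Weighted linear combination of conditional probability vectors
    # (one vector per training image); weights must sum to 1, so the
    # pooled vector is again a valid probability distribution.
    assert abs(sum(weights) - 1.0) < 1e-9
    k = len(prob_lists[0])
    return [sum(w * p[j] for w, p in zip(weights, prob_lists))
            for j in range(k)]
```

Unlike multiplicative pooling, the linear pool never assigns zero probability to a category unless every TI does.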
17.
Eva R. Myers 《Physics and Chemistry of Minerals》1998,25(6):465-468
A statistical-thermodynamic approximation for order-disorder phase transitions in aluminosilicate solid solutions is presented.
The approximation involves estimating the number of configurations with long range order parameter Q and short range order parameter σ, using an approximation to the probability that a configuration with the correct amount
of long range order will also have the correct amount of short range order. This estimate is then used to give a free energy
F(Q, σ, x, T), where x is the concentration of Al in the structure, and hence quantities such as T_c(x) are estimated. The predictions of the model for T_c(x) and the critical concentration x_c at which T_c falls to zero are shown to be in good agreement with the results of Monte Carlo simulations.
Received: 14 May 1997 / Revised, accepted: 2 June 1997
18.
Weon Shik Han, Kue-Young Kim, Sungwook Choung, Jina Jeong, Na-Hyun Jung, Eungyu Park 《Environmental Earth Sciences》2014,71(6):2739-2752
The present study focuses on understanding the leakage potentials of the stored supercritical CO2 plume through caprocks generated in geostatistically created heterogeneous media. For this purpose, two hypothetical cases with different geostatistical features were developed, and two conditional geostatistical simulation models (i.e., sequential indicator simulation or SISIM and generalized coupled Markov chain or GCMC) were applied for the stochastic characterizations of the heterogeneities. Then, predictive CO2 plume migration simulations based on stochastic realizations were performed and summarized. In the geostatistical simulations, the results from the GCMC model showed better performance than those of the SISIM model for the strongly non-stationary case, while SISIM models showed reasonable performance for the weakly non-stationary case in terms of low-permeability lenses characterization. In the subsequent predictive simulations of CO2 plume migration, the observations in the geostatistical simulations were confirmed and the GCMC-based predictions showed underestimations in CO2 leakage in the stationary case, while the SISIM-based predictions showed considerable overestimations in the non-stationary case. The overall results suggest that: (1) proper characterization of low-permeability layering is significantly important in the prediction of CO2 plume behavior, especially for the leakage potential of CO2 and (2) appropriate geostatistical techniques must be selectively employed considering the degree of stationarity of the targeting fields to minimize the uncertainties in the predictions.
19.
A. Pavese 《Physics and Chemistry of Minerals》1999,26(8):649-657
Numerical simulations, using empirical interatomic potentials within the framework of lattice dynamics and quasi-harmonic
approximation, have been carried out to model the behaviour of the structure and of some thermoelastic properties of pyrope
at high pressure and high temperature conditions (0–50 GPa, 300–1500 K). Comparison with observed data, available as a function
either of P or of T, suggests that the pressure effects are satisfactorily modelled, whilst the effect of T on the simulations is underestimated. The cell edge, bond lengths and polyhedral volumes have been studied as a function
of P along five isotherms, spaced by 300 K steps. These isotherms tend to converge at high pressure, which demonstrates that the
pressure effects become dominant compared to those of thermal origin in affecting the structural properties far from ambient
conditions. The cell parameter, bond distances, and other structural and thermoelastic quantities determined through simulations
have been parametrised as a function of P and T by polynomial expansions. Bulk modulus and thermal expansion have been discussed in the light of the high-temperature Birch-Murnaghan and the Vinet P-V-T equations of state. The predictions of the bulk modulus versus P and T from the present calculations and from the Vinet EoS agree up to 10 GPa, but they differ at higher pressure.
Received: 23 October, 1998 / Revised, accepted: 23 April, 1999
20.
This paper describes a geostatistical technique based on conditional simulations to assess confidence intervals of local
estimates of lake pH values on the Canadian Shield. This geostatistical approach has been developed to deal with the estimation
of phenomena with a spatial autocorrelation structure among observations. It uses the autocorrelation structure to derive
minimum-variance unbiased estimates for points that have not been measured, or to estimate average values for new surfaces.
A survey for lake water chemistry has been conducted by the Ministère de l'Environnement du Québec between 1986 and 1990, to assess surface water quality and delineate the areas affected by acid precipitation on the southern
Canadian Shield in Québec. The spatial structure of lake pH was modeled using two nested spherical variogram models, with
ranges of 20 km and 250 km, accounting respectively for 20% and 55% of the spatial variation, plus a random component accounting
for 25%. The pH data have been used to construct a number of geostatistical simulations that produce plausible realizations
of a given random function model, while 'honoring' the experimental values (i.e., the real data points are among the simulated
data), and that correspond to the same underlying variogram model. Post-processing of a large number of these simulations,
that are equally likely to occur, enables the estimation of mean pH values, the proportion of affected lakes (lakes with pH≤5.5),
and the potential error of these parameters within small regions (100 km×100 km). The method provides a procedure to establish
whether acid rain control programs will succeed in reducing acidity in surface waters, allowing one to consider small areas
with particular physiographic features rather than large drainage basins with several sources of heterogeneity. This judgment
on the reduction of surface water acidity will be possible only if the amount of uncertainty in the estimation of mean pH
is properly quantified.
Received: 3 March 1997 · Accepted: 16 November 1998
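The post-processing step described, estimating mean pH and the proportion of affected lakes across equally likely realizations, can be sketched as follows (the array layout and function name are assumptions of this illustration):

```python
import numpy as np

def summarize_region(realizations, threshold=5.5):
    # Post-process equiprobable conditional simulations of lake pH
    # within a region. `realizations` has shape (n_sim, n_lakes).
    # Returns: overall mean pH, mean proportion of affected lakes
    # (pH <= threshold), and the spread of that proportion across
    # realizations (a simple uncertainty measure).
    realizations = np.asarray(realizations, dtype=float)
    mean_ph = realizations.mean()
    prop_per_sim = (realizations <= threshold).mean(axis=1)
    return mean_ph, prop_per_sim.mean(), prop_per_sim.std()
```

The spread across realizations is the quantity that tells whether an apparent change in acidity within a 100 km x 100 km block is meaningful.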