Geostatistical models should be checked for consistency with the conditioning data and statistical inputs; these checks constitute
minimum acceptance criteria. Often the first- and second-order statistics, such as the histogram and variogram of simulated geological
realizations, are compared to the input parameters to check the reasonableness of the simulation implementation. Reproduction
of statistics beyond second order is often not assessed because the “correct” higher-order statistics are
rarely known. With multiple-point simulation (MPS) geostatistical methods, practitioners now explicitly model higher-order
statistics taken from a training image (TI). This article explores methods for extending minimum acceptance criteria to multiple
point statistical comparisons between geostatistical realizations made with MPS algorithms and the associated TI. The intent
is to assess how well the geostatistical models have reproduced the input statistics of the TI, akin to assessing the histogram
and variogram reproduction in traditional semivariogram-based geostatistics. A number of metrics are presented to compare
the input multiple point statistics of the TI with the statistics of the geostatistical realizations. These metrics are (1)
first and second-order statistics, (2) trends, (3) the multiscale histogram, (4) the multiple point density function, and
(5) the missing bins in the multiple point density function. A case study using MPS realizations is presented to demonstrate
the proposed metrics; however, the metrics are not limited to specific MPS realizations. Comparisons could be made between
any reference numerical analogue model and any simulated categorical variable model.
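As an illustration of metrics (4) and (5), the multiple-point density function of a categorical grid can be computed by scanning a small template and tabulating pattern frequencies, and two such density functions can then be compared. This is a minimal sketch for a 2D grid and a square template; the function names and the total-variation-style distance are illustrative choices, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def mp_density(grid, tw=2):
    """Multiple-point density function: relative frequency of each
    categorical pattern seen in a tw x tw scanning template."""
    counts = Counter()
    nr, nc = grid.shape
    for i in range(nr - tw + 1):
        for j in range(nc - tw + 1):
            pattern = tuple(grid[i:i + tw, j:j + tw].ravel())
            counts[pattern] += 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

def mpdf_distance(ti, realization, tw=2):
    """Total-variation-style distance between the MP density functions
    of a TI and a realization, plus the number of 'missing bins'
    (patterns present in the TI but absent from the realization)."""
    f_ti = mp_density(ti, tw)
    f_re = mp_density(realization, tw)
    patterns = set(f_ti) | set(f_re)
    dist = 0.5 * sum(abs(f_ti.get(p, 0.0) - f_re.get(p, 0.0)) for p in patterns)
    missing = sum(1 for p in f_ti if p not in f_re)
    return dist, missing
```

A realization that reproduces the TI statistics perfectly yields a distance of zero and no missing bins; a large count of missing bins flags patterns of the TI that the simulation never reproduced.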
Probability field simulation is increasingly used to generate geostatistical realizations. The method can be faster than conventional simulation algorithms, and it is well suited to integrating prior soft information in the form of local probability distributions. The theoretical basis of probability field simulation has been established for the case of no conditioning data; however, no such basis has been established in the presence of conditioning data. Realizations generated by probability field simulation show two severe artifacts near conditioning data. We document these artifacts and show theoretically why they exist. The two artifacts investigated are (1) local conditioning data appear as local minima or maxima of the simulated values, and (2) the variogram model is not honored within the range of the conditioning data; the simulated values have significantly greater continuity than intended. Both artifacts are predicted by theory. An example flow simulation study is presented to illustrate that they affect more than the visual appearance of the simulated realizations. Notwithstanding the flexibility of the probability field simulation method, these two artifacts suggest that it be used with caution in the presence of conditioning data. Future research may overcome these limitations.
Mathematical Geosciences - Modeling the semivariogram to characterize spatial continuity requires expert geostatistical knowledge and domain expertise about the spatial phenomenon of interest....
Kriging-based geostatistical models require a semivariogram model. After the initial decision of stationarity, the choice of an appropriate semivariogram model is the most important decision in a geostatistical study. Common practice consists of fitting experimental semivariograms with a nested combination of proven models such as the spherical, exponential, and Gaussian models. These models work well in most cases; however, some shapes found in practice are difficult to fit. We introduce a family of semivariogram models, based on geometric shapes analogous to the spherical semivariogram, that are known to be conditionally negative definite and provide additional flexibility to fit semivariograms encountered in practice. A methodology to calculate the associated geometric shapes to match semivariograms defined in any number of directions is presented. Greater flexibility is available through the application of these geometric semivariogram models.
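For context, the "proven" nested models that common practice fits can be evaluated as below. This is a generic sketch of the standard spherical/exponential nested combination with illustrative function names; it is not the geometric family introduced in the paper.

```python
import numpy as np

def spherical(h, c, a):
    """Spherical semivariogram structure: 1.5(h/a) - 0.5(h/a)^3,
    reaching its sill contribution c exactly at the range a."""
    h = np.asarray(h, dtype=float)
    return c * np.where(h < a, 1.5 * h / a - 0.5 * (h / a) ** 3, 1.0)

def exponential(h, c, a):
    """Exponential semivariogram structure: reaches about 95% of c
    at the practical range a (hence the factor of 3)."""
    h = np.asarray(h, dtype=float)
    return c * (1.0 - np.exp(-3.0 * h / a))

def nested(h, nugget, structures):
    """Nested model: nugget effect (zero at h = 0) plus a sum of
    (model, contribution, range) structures."""
    g = nugget * (np.asarray(h, dtype=float) > 0)
    for model, c, a in structures:
        g = g + model(h, c, a)
    return g
```

The nugget plus the structure contributions should sum to the sill; the nested model then rises from zero at h = 0 toward that sill at large lags.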
Spatial data analytics provides new opportunities for automated detection of anomalous data for data quality control and for subsurface segmentation to reduce uncertainty in spatial models. Purely data-driven anomaly detection methods do not fully integrate spatial concepts such as spatial continuity and data sparsity, and they struggle to incorporate critical geoscience and engineering expert knowledge. The proposed spatial anomaly detection method is based on the semivariogram spatial continuity model derived from sparsely sampled well data and geological interpretations. The method calculates the lag joint cumulative probability for each matched pair of spatial data, given their lag vector and the semivariogram, under the assumption of a bivariate Gaussian distribution. For each pair of spatial data, the associated head and tail Gaussian standardized values are mapped to the joint probability density function informed by the lag vector and semivariogram. The paired data are classified as anomalous if the associated head and tail Gaussian standardized values fall within a low-probability zone. The anomaly decision threshold can be chosen based on a loss function quantifying the cost of overestimation or underestimation. The proposed spatial correlation anomaly detection method integrates domain expert knowledge, through trend and correlogram models, with sparse spatial data to identify anomalous samples, regions, segmentation boundaries, or facies transition zones. This is a useful automation tool for identifying samples in big spatial data on which to focus professional attention.
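The pairwise check described above can be sketched as follows, assuming a standardized variable with unit sill so that the lag correlation is rho = 1 - gamma(h). The function names and the density threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bigauss_density(z1, z2, rho):
    """Standard bivariate Gaussian pdf with correlation rho,
    evaluated at head/tail standardized values (z1, z2)."""
    det = 1.0 - rho ** 2
    q = (z1 ** 2 - 2.0 * rho * z1 * z2 + z2 ** 2) / det
    return np.exp(-0.5 * q) / (2.0 * np.pi * np.sqrt(det))

def pair_anomaly(z_head, z_tail, gamma_h, threshold=0.01):
    """Flag a data pair as anomalous if its density under the
    bivariate Gaussian implied by the semivariogram (rho = 1 - gamma
    for a unit-sill standardized variable) falls below the threshold.
    rho is clipped to keep the density well defined."""
    rho = max(min(1.0 - gamma_h, 0.999), -0.999)
    dens = bigauss_density(z_head, z_tail, rho)
    return dens < threshold, dens
```

At a short lag (small gamma, high correlation), two similar values sit in the high-density zone, while strongly discordant values fall in the low-probability zone and are flagged; the threshold plays the role of the loss-function-based decision rule described in the abstract.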
Stratigraphic rule-based reservoir models approximate sedimentary dynamics to generate numerical models of reservoir architecture with realistic spatial distributions of petrophysical properties for reservoir forecasting and to support development decision making. A few intuitive rules for the sequential placement of surfaces bounding reservoir units render realistic reservoir heterogeneity, continuity, and spatial organization of petrophysical property distributions that are difficult to obtain with conventional geostatistical pixel- and object-based subsurface methods. While these methods are emerging in applications for deepwater and fluvial clastic reservoirs, some obstacles to broad application remain, such as selection of rule parameters and addressing emergent non-stationarities over the sequence of the placed surfaces. Firstly, rule parameters must be tuned to ensure the models honor natural heterogeneities; we demonstrate this for the compensational rule. Secondly, non-stationarities over the model sequence (from the base to the top of the model) may emerge with respect to the shape, volume, undulation, and gradients of surfaces. For example, for a stack of compensational lobes, the volume of individual lobes may decrease due to onlapping of the previous bathymetry, and surface undulation may increase over the model sequence. In addition, the interfacial width and average gradient of the composite surface may initially increase, then saturate and stabilize. Such non-stationarities represent numerical artifacts that may bias the results of these rule-based models. It is essential that these features are quantified and mitigated as a prerequisite for robust application of rule-based aggradational lobe methods for reservoir modeling.
Selecting a training image (TI) that is representative of the target spatial phenomenon (reservoir, mineral deposit, soil
type, etc.) is essential for an effective application of multiple-point statistics (MPS) simulation. It is often possible
to narrow potential TIs to a general subset based on the available geological knowledge; however, this is largely subjective.
A method is presented that compares the distribution of runs and the multiple-point density function from available exploration
data and TIs. The difference in the MPS can be used to select the TI that is most representative of the data set. This tool
may be applied to further narrow a suite of TIs for a more realistic model of spatial uncertainty. In addition, significant
differences between the spatial statistics of local conditioning data and a TI may lead to artifacts in MPS realizations. This
tool identifies such contradictions between conditioning data and TIs. TI selection is demonstrated for a deepwater
reservoir with 32 wells.
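The distribution of runs used in the comparison can be computed from a 1D facies sequence, such as a well log, as follows. This is a generic sketch with illustrative names, not the authors' code; the summed absolute difference is one simple way to score how well a TI matches the data.

```python
from collections import Counter

def runs_distribution(sequence):
    """Distribution of run lengths (consecutive identical facies
    codes) along a 1D categorical sequence, e.g. a well."""
    counts = Counter()
    run = 1
    for prev, cur in zip(sequence, sequence[1:]):
        if cur == prev:
            run += 1
        else:
            counts[run] += 1
            run = 1
    counts[run] += 1  # close the final run
    total = sum(counts.values())
    return {length: c / total for length, c in counts.items()}

def runs_mismatch(data_seq, ti_seq):
    """Summed absolute difference between two run-length
    distributions; smaller values suggest a TI more consistent
    with the conditioning data."""
    f_d = runs_distribution(data_seq)
    f_t = runs_distribution(ti_seq)
    lengths = set(f_d) | set(f_t)
    return sum(abs(f_d.get(n, 0.0) - f_t.get(n, 0.0)) for n in lengths)
```

Scoring each candidate TI against the exploration data with such a mismatch, alongside the multiple-point density function comparison, supports ranking and narrowing the suite of TIs.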
Fitting semivariograms with analytical models can be tedious and restrictive. There are many smooth functions that could be
used for the semivariogram; however, arbitrary interpolation of the semivariogram will almost certainly create an invalid
function. A spectral correction, that is, taking the Fourier transform of the corresponding covariance values, resetting all
negative terms to zero, standardizing the spectrum to sum to the sill, and inverse transforming, is a valuable method for constructing
valid discrete semivariogram models. This paper addresses some important implementation details and provides a methodology
for working with spectrally corrected semivariograms.
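The correction described above can be sketched as follows, assuming a unit-lag discrete semivariogram and numpy's FFT normalization convention; the rescaling step enforces a corrected covariance of exactly the sill at lag zero. This is a sketch of the stated procedure, not the authors' code.

```python
import numpy as np

def spectral_correction(gamma, sill):
    """Spectral correction of a discrete semivariogram: transform the
    covariance counterpart, zero out negative spectral terms, rescale,
    and transform back to obtain a valid discrete model."""
    cov = sill - np.asarray(gamma, dtype=float)  # covariance counterpart
    spec = np.real(np.fft.fft(cov))              # discrete (real) spectrum
    spec[spec < 0] = 0.0                         # reset negative terms to zero
    # rescale so the corrected covariance at lag 0 equals the sill
    # (with numpy's ifft convention, C(0) = spec.sum() / N)
    spec *= len(cov) * sill / spec.sum()
    cov_corr = np.real(np.fft.ifft(spec))
    return sill - cov_corr                       # corrected semivariogram
```

Because any negative spectral mass is removed, the corrected covariance sequence has a non-negative spectrum, which is the discrete analogue of positive definiteness that arbitrary interpolation of the semivariogram fails to guarantee.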
Natural Resources Research - Constructing subsurface models that accurately reproduce geological heterogeneity and their associated uncertainty is critical to many geoscience and engineering...