Similar documents (20 records found)
1.
Most existing reservoir models are based on 2D outcrop studies; 3D aspects are inferred from correlation between wells, and so are inadequately constrained for reservoir simulations. To overcome these deficiencies, we have initiated a multidimensional characterization of reservoir analogues in the Cretaceous Ferron Sandstone in Utah. Detailed sedimentary facies maps of cliff faces define the geometry and distribution of reservoir flow units, barriers and baffles at the outcrop. High-resolution 2D and 3D ground-penetrating radar (GPR) images extend these reservoir characteristics into 3D to allow the development of realistic 3D reservoir models. Models use geometric information from mapping and the GPR data, combined with petrophysical data from surface and cliff-face outcrops, and laboratory analyses of outcrop and core samples. The site of the field work is Corbula Gulch, on the western flank of the San Rafael Swell, in east-central Utah. The outcrop consists of an 8–17 m thick sandstone body which contains various sedimentary structures, such as cross-bedding, inclined stratification and erosional surfaces, which range in scale from less than a metre to hundreds of metres. 3D depth migration of the common-offset GPR data produces data volumes within which the inclined surfaces and erosional surfaces are visible. Correlation between fluid permeability, clay content, instantaneous frequency and instantaneous amplitude of the GPR data provides estimates of the 3D distribution of fluid permeability and clay content.
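As an illustration of the final correlation step described above, the sketch below derives instantaneous amplitude and frequency from a GPR trace with a Hilbert transform and regresses permeability against the windowed amplitude. All arrays (trace, sampling interval, window averages, permeability values) are synthetic placeholders, not the Corbula Gulch data.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)

# Synthetic stand-ins for one migrated GPR trace and co-located plug permeabilities.
dt = 0.5e-9                                   # sample interval [s]
t = np.arange(512) * dt
trace = np.sin(2 * np.pi * 100e6 * t) * np.exp(-t / 8e-8) + 0.05 * rng.standard_normal(t.size)

# Instantaneous attributes from the analytic signal (Hilbert transform).
analytic = hilbert(trace)
inst_amp = np.abs(analytic)                                                 # envelope
inst_freq = np.gradient(np.unwrap(np.angle(analytic)), dt) / (2 * np.pi)    # [Hz]

# Hypothetical permeability samples tied to depth windows along the trace.
windows = np.array_split(np.arange(t.size), 8)
amp_win = np.array([inst_amp[w].mean() for w in windows])
perm = 100.0 + 5.0 * amp_win + rng.standard_normal(amp_win.size)   # fake permeability [mD]

# Linear attribute-permeability relation used to populate the 3D model between control points.
r = np.corrcoef(amp_win, perm)[0, 1]
slope, intercept = np.polyfit(amp_win, perm, 1)
print(f"correlation r = {r:.2f};  k ~ {slope:.2f} * amp + {intercept:.1f} mD")
```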

2.
The area east of Varanasi is one of numerous places along the watershed of the Ganges River with groundwater concentrations of arsenic surpassing the maximum value of 10 parts per billion (ppb) recommended by the World Health Organization in drinking water. Here we apply geostatistics and compositional data analysis for the mapping of arsenic and iron to help in understanding the conditions leading to the occurrence of elevated levels of arsenic in groundwater. The methodology allows for displaying concentrations of arsenic and iron as maps consistent with the limited information from 95 water wells across an area of approximately 210 km2; visualization of the uncertainty associated with the sampling; and summary of the findings in the form of probability maps. For thousands of years, Varanasi has been on the erosional side in a meander of the river that is free of arsenic values above 10 ppb. Maps reveal two anomalies of high arsenic concentrations on the depositional side of the valley, which has started seeing urban development. The methodology using geostatistics combined with compositional data analysis is completely general, so this study could be used as a prototype for hydrochemistry mapping in other areas.
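A minimal sketch of the two ingredients named above, assuming synthetic well data rather than the Varanasi measurements: a centred log-ratio transform as the compositional step, and an inverse-distance-weighted indicator average as a simple stand-in for the geostatistical estimation of P(As > 10 ppb).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical well data: coordinates [km] and As/Fe concentrations [ppb]; not the Varanasi set.
xy = rng.uniform(0.0, 15.0, size=(95, 2))
As = rng.lognormal(mean=1.5, sigma=1.0, size=95)
Fe = rng.lognormal(mean=6.0, sigma=0.8, size=95)

# Compositional step: close the composition and take centred log-ratios (clr). Shown only to
# indicate the transform on which the geostatistical analysis would operate.
total = 1e9                                  # concentrations as parts of a whole (ppb of 1e9)
comp = np.column_stack([As, Fe, total - As - Fe])
clr = np.log(comp) - np.log(comp).mean(axis=1, keepdims=True)

# Indicator of exceeding the WHO guideline value.
indicator = (As > 10.0).astype(float)

def exceedance_probability(grid_pt, xy, indicator, power=2.0):
    """Inverse-distance-weighted indicator average: a simple stand-in for indicator
    kriging of P(As > 10 ppb) at an unsampled location."""
    d = np.hypot(*(xy - grid_pt).T) + 1e-6
    w = 1.0 / d ** power
    return float(np.sum(w * indicator) / np.sum(w))

# Probability map on a coarse grid.
gx, gy = np.meshgrid(np.linspace(0, 15, 30), np.linspace(0, 15, 30))
pmap = np.array([exceedance_probability(p, xy, indicator)
                 for p in np.column_stack([gx.ravel(), gy.ravel()])]).reshape(gx.shape)
print("max mapped P(As > 10 ppb):", pmap.max().round(2))
```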

3.
Updating of reservoir models by history matching of 4D seismic data along with production data gives us a better understanding of changes to the reservoir, reduces risk in forecasting and leads to better management decisions. This process of seismic history matching requires an accurate representation of predicted and observed data so that they can be compared quantitatively when using automated inversion. Observed seismic data is often obtained as a relative measure of the reservoir state or its change, however. The data, usually attribute maps, need to be calibrated to be compared to predictions. In this paper we describe an alternative approach where we normalize the data by scaling to the model data in regions where predictions are good. To remove measurements of high uncertainty and make normalization more effective, we use a measure of repeatability of the monitor surveys to filter the observed time-lapse data. We apply this approach to the Nelson field. We normalize the 4D signature by deriving a least squares regression equation between the observed and synthetic data, which consist of attributes representing measured acoustic impedances and predictions from the model. Two regression equations are derived as part of the analysis. For one, the whole 4D signature map of the reservoir is used, while in the second, 4D seismic data is used from the vicinity of wells with a good production match. The repeatability of time-lapse seismic data is assessed using the normalized root mean square (NRMS) of measurements outside of the reservoir. Where NRMS is high, observations and predictions are ignored. Net:gross and permeability are modified to improve the match. The best results are obtained by using the NRMS-filtered maps of the 4D signature, which better constrain the normalization. The misfit of the first six years of history data is reduced by 55 per cent, while the forecast misfit for the following three years is reduced by 29 per cent. The well-based normalization uses fewer data when repeatability is used as a filter, and the result is poorer. The value of the seismic data is demonstrated by comparison with matching production data only, where the history and forecast misfit reductions are 45% and 20% respectively while the seismic misfit increases by 5%; in the best case using seismic data, it dropped by 6%. We conclude that normalization with repeatability-based filtering is a useful approach in the absence of full calibration and improves the reliability of seismic data.
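The normalization and repeatability filtering can be sketched as below; the attribute maps, the NRMS proxy and the 80% retention cut-off are all hypothetical, not the Nelson data or the paper's exact workflow.

```python
import numpy as np

rng = np.random.default_rng(1)

def nrms(a, b):
    """Normalized RMS difference between two maps (0 = identical, 2 = anti-correlated)."""
    return 2.0 * np.sqrt(np.mean((a - b) ** 2)) / (
        np.sqrt(np.mean(a ** 2)) + np.sqrt(np.mean(b ** 2)))

# Hypothetical attribute maps: the 4D signature predicted by the model and an observed,
# relative measurement with an unknown scale and offset plus noise.
predicted = rng.normal(0.0, 1.0, size=(50, 50))
observed = 0.7 * predicted + 0.2 + rng.normal(0.0, 0.3, size=(50, 50))

# Repeatability filter: a proxy NRMS map (computed outside the reservoir in the paper);
# the least repeatable 20% of samples are ignored. Both proxy and cut-off are assumptions.
nrms_proxy = np.abs(observed - predicted)
keep = nrms_proxy < np.quantile(nrms_proxy, 0.8)

# Least-squares regression of the retained observed samples onto the predictions gives the
# scale and offset that map the observed data into model units.
A = np.column_stack([observed[keep], np.ones(keep.sum())])
scale, offset = np.linalg.lstsq(A, predicted[keep], rcond=None)[0]
normalized = scale * observed + offset
print(f"scale = {scale:.2f}, offset = {offset:.2f}, "
      f"residual NRMS = {nrms(normalized, predicted):.2f}")
```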

4.
A need for more accurate flood inundation maps has recently arisen because of the increasing frequency and extremity of flood events. The accuracy of flood inundation maps is determined by the uncertainty propagated from all of the variables involved in the overall process of flood inundation modelling. Despite our advanced understanding of flood progression, it is impossible to eliminate the uncertainty because of the constraints involving cost, time, knowledge, and technology. Nevertheless, uncertainty analysis in flood inundation mapping can provide useful information for flood risk management. The twin objectives of this study were firstly to estimate the propagated uncertainty rates of key variables in flood inundation mapping by using the first-order approximation method and secondly to evaluate the relative sensitivities of the model variables by using the Hornberger–Spear–Young (HSY) method. Monte Carlo simulations using the Hydrologic Engineering Center's River Analysis System and triangle-based interpolation were performed to investigate the uncertainty arising from discharge, topography, and Manning's n in the East Fork of the White River near Seymour, Indiana, and in Strouds Creek in Orange County, North Carolina. We found that the uncertainty of a single variable is propagated differently to the flood inundation area depending on the effects of other variables in the overall process. The uncertainty was linearly/nonlinearly propagated corresponding to valley shapes of the reaches. In addition, the HSY sensitivity analysis revealed the topography of Seymour reach and the discharge of Strouds Creek to be major contributors to the change of flood inundation area.
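A hedged sketch of the first-order approximation (FOA) step: the inundation model is replaced by a toy power-law function flood_area, and the means and standard deviations of discharge, Manning's n and slope are invented for illustration.

```python
import numpy as np

def flood_area(discharge, manning_n, slope):
    """Toy stand-in for the inundation model chain (not HEC-RAS): flooded area grows with
    discharge and roughness and shrinks for steeper valleys. Illustration only."""
    return 2.5 * discharge ** 0.6 * manning_n ** 0.3 / slope ** 0.4

# Hypothetical means and standard deviations of the uncertain inputs.
means = np.array([850.0, 0.035, 0.002])      # discharge [m3/s], Manning's n, valley slope
sigmas = np.array([120.0, 0.007, 0.0004])

# First-order approximation: Var(A) ~ sum_i (dA/dx_i)^2 * sigma_i^2, gradients by central differences.
grads = np.empty(3)
for i in range(3):
    step = 1e-4 * means[i]
    up, dn = means.copy(), means.copy()
    up[i] += step
    dn[i] -= step
    grads[i] = (flood_area(*up) - flood_area(*dn)) / (2.0 * step)

var_foa = np.sum(grads ** 2 * sigmas ** 2)
contrib = grads ** 2 * sigmas ** 2 / var_foa
print("FOA standard deviation of flood area:", round(float(np.sqrt(var_foa)), 1))
print("relative contributions (discharge, n, slope):", np.round(contrib, 2))
```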

5.
It is the goal of remote sensing to infer information about objects or a natural process from a remote location. This implies that uncertainty in measurement should be viewed as central to remote sensing. In this study, the uncertainty associated with water stages derived from a single SAR image for the Alzette (G.D. of Luxembourg) 2003 flood is assessed using a stepped GLUE procedure. Main uncertain input factors to the SAR processing chain for estimating water stages include geolocation accuracy, spatial filter window size, image thresholding value, DEM vertical precision and the number of river cross sections at which water stages are estimated. Initial results show that even with plausible parameter values, uncertainty in water stages over the entire river reach is 2.8 m on average. Adding spatially distributed field water stages to the GLUE analysis following a one-at-a-time approach helps to considerably reduce SAR water stage uncertainty (0.6 m on average), thereby identifying appropriate value ranges for each uncertain SAR water stage processing factor. For the GLUE analysis a Nash-like efficiency criterion adapted to spatial data is proposed, whereby acceptable SAR model simulations are required to outperform a simpler regression model based on the field-surveyed average river bed gradient. Weighted CDFs for all factors based on the proposed efficiency criterion allow the generation of reliable uncertainty quantile ranges and 2D maps that show the uncertainty associated with SAR-derived water stages. The stepped GLUE procedure demonstrated that not all field data collected are necessary to achieve maximum constraining. A possible efficient way to decide on relevant locations at which to sample in the field is proposed. It is also suggested that the resulting uncertainty ranges and flood extent or depth maps may be used to evaluate 1D or 2D flood inundation models in terms of water stages, depths or extents. For this, the extended GLUE approach, which copes with the presence of uncertainty in the observed data, may be adopted.
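The GLUE logic with a Nash-like efficiency measured against a bed-gradient regression benchmark might look like the sketch below; sar_stage_model, the factor ranges, the benchmark coefficients and the observed stages are toy stand-ins for the actual SAR processing chain and field survey.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical field-surveyed water stages along a reach and a simple regression benchmark
# based on the average bed gradient (values invented, not the Alzette survey).
x = np.linspace(0.0, 10_000.0, 40)                      # chainage [m]
observed = 300.0 - 0.0008 * x + rng.normal(0.0, 0.15, x.size)
benchmark = 300.2 - 0.00082 * x                         # simpler bed-gradient regression model

def sar_stage_model(threshold, filter_size, dem_bias):
    """Toy stand-in for the SAR water-stage processing chain (not the actual chain)."""
    return 300.0 - 0.0008 * x + dem_bias + 0.02 * threshold - 0.01 * filter_size

def efficiency(sim):
    """Nash-like criterion: 1 - SSE(SAR model) / SSE(benchmark regression)."""
    return 1.0 - np.sum((observed - sim) ** 2) / np.sum((observed - benchmark) ** 2)

# GLUE: Monte Carlo sampling of the uncertain processing factors; behavioural runs must
# outperform the benchmark (efficiency > 0).
samples = np.column_stack([rng.uniform(-3.0, 3.0, 2000),     # image thresholding value
                           rng.uniform(3.0, 11.0, 2000),     # filter window size
                           rng.uniform(-0.5, 0.5, 2000)])    # DEM vertical bias [m]
scores = np.array([efficiency(sar_stage_model(*s)) for s in samples])
behavioural = scores > 0.0

# Likelihood-weighted CDF of the stage at one cross-section -> uncertainty quantile range.
stages = np.array([sar_stage_model(*s)[20] for s in samples[behavioural]])
w = scores[behavioural] / scores[behavioural].sum()
order = np.argsort(stages)
cdf = np.cumsum(w[order])
q05, q95 = np.interp([0.05, 0.95], cdf, stages[order])
print(f"behavioural runs: {int(behavioural.sum())}, "
      f"90% stage band at cross-section 20: {q95 - q05:.2f} m")
```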

6.
Estimating and mapping spatial uncertainty of environmental variables is crucial for environmental evaluation and decision making. For a continuous spatial variable, estimation of spatial uncertainty may be conducted in the form of estimating the probability of (not) exceeding a threshold value. In this paper, we introduced a Markov chain geostatistical approach for estimating threshold-exceeding probabilities. This approach differs from the conventional indicator approach in its nonlinear estimators (Markov chain random field models) and in its incorporation of interclass dependencies through transiograms. We estimated threshold-exceeding probability maps of clay layer thickness through simulation (i.e., using a number of realizations simulated by Markov chain sequential simulation) and interpolation (i.e., direct conditional probability estimation using only the indicator values of sample data), respectively. To evaluate the approach, we also estimated those probability maps using sequential indicator simulation and indicator kriging interpolation. Our results show that (i) the Markov chain approach provides an effective alternative for spatial uncertainty assessment of environmental spatial variables, and the probability maps from this approach are more reasonable than those from conventional indicator geostatistics, and (ii) the probability maps estimated through sequential simulation are more realistic than those through interpolation, because the latter display some uneven transitions caused by spatial structures of the sample data.
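A small sketch of the simulation route to threshold-exceeding probabilities, assuming a synthetic ensemble in place of Markov chain sequential simulation output.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble standing in for realizations from Markov chain sequential simulation
# of clay-layer thickness [m]; the spatial model itself is not reproduced here.
n_real, ny, nx = 100, 40, 40
trend = 2.0 + 1.5 * np.sin(np.linspace(0.0, np.pi, nx))[None, :]
realizations = trend + rng.normal(0.0, 0.6, size=(n_real, ny, nx))

# Threshold-exceeding probability map from simulation: the fraction of equiprobable
# realizations in which the thickness exceeds the threshold at each cell.
threshold = 3.0
p_exceed = (realizations > threshold).mean(axis=0)

print("P(thickness > 3 m): min %.2f, max %.2f" % (p_exceed.min(), p_exceed.max()))
# The interpolation route in the paper estimates these probabilities directly from the sample
# indicator values rather than from realizations; that estimator is not reproduced here.
```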

7.
The Itajaí River basin is one of the areas most affected by flood-related disasters in Brazil. Flood hazard maps based on digital elevation models (DEM) are an important alternative in the absence of detailed hydrological data and for application in large areas. We developed a flood hazard mapping methodology by combining flow frequency analysis with the Height Above the Nearest Drainage (HAND) model – f2HAND – and applied it in three municipalities in the Itajaí River basin. The f2HAND performance was evaluated through comparison with observed 2011 flood extent maps. Model performance and sensitivity were tested for different DEM resolutions, return periods and streamflow data from stations located upstream and downstream on the main river. The flood hazard mapping with our combined approach matched 92% of the 2011 flood event. We found that the f2HAND model has low sensitivity to DEM resolution and high sensitivity to area threshold of channel initiation.
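The f2HAND idea of combining flow-frequency analysis with a HAND grid can be sketched as follows; the discharge record, the power-law rating curve (stage_from_discharge) and the HAND raster are hypothetical, not the Itajaí data.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(4)

# Hypothetical annual-maximum discharges [m3/s] at a gauge (not the Itajaí record).
annual_max_q = rng.gumbel(loc=900.0, scale=250.0, size=45)

# Flow-frequency analysis: fit a Gumbel distribution and take the T-year discharge.
loc, scale = gumbel_r.fit(annual_max_q)
T = 50.0
q_T = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

# Convert discharge to a water depth above the drainage with a hypothetical rating curve.
def stage_from_discharge(q, a=0.05, b=0.6):
    return a * q ** b          # placeholder power-law rating, not a calibrated curve

h_T = stage_from_discharge(q_T)

# HAND grid [m above nearest drainage], e.g. derived from a DEM; synthetic here.
hand = rng.uniform(0.0, 20.0, size=(200, 200))

# f2HAND-style hazard map: cells whose HAND value lies below the T-year stage are flooded.
flood_mask = hand <= h_T
print(f"Q{int(T)} = {q_T:.0f} m3/s, stage = {h_T:.1f} m, "
      f"flooded fraction = {flood_mask.mean():.2%}")
```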

8.
Heating heavy oil reservoirs is a common method for reducing the high viscosity of heavy oil and thus increasing the recovery factor. Monitoring of these viscosity changes in the reservoir is essential for delineating the heated region and controlling production. In this study, we present an approach for estimating viscosity changes in a heavy oil reservoir. The approach consists of three steps: measuring seismic wave attenuation between reflections from above and below the reservoir, constructing time-lapse Q and Q⁻¹ factor maps, and interpreting these maps using Kelvin–Voigt and Maxwell viscoelastic models. We use a 4D relative spectrum method to measure changes in attenuation. The method is tested with synthetic seismic data that are noise free and data with additive Gaussian noise to show the robustness and the accuracy of the estimates of the Q-factor. The results of the application of the method to a field data set exhibit alignment of high attenuation zones along the steam-injection wells, and indicate that temperature dependent viscosity changes in the heavy oil reservoir can be explained by the Kelvin–Voigt model.
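The attenuation measurement can be illustrated with a standard spectral-ratio estimate of Q between a window above and a window below the reservoir (the paper's 4D relative spectrum method is related but not identical); the wavelet, interval travel time and true Q below are synthetic.

```python
import numpy as np

def spectral_ratio_q(top_window, base_window, dt, travel_time, fmin=10.0, fmax=60.0):
    """Estimate an interval Q from the slope of the log spectral ratio:
    ln|A_base/A_top| = const - (pi * f * t) / Q."""
    freqs = np.fft.rfftfreq(top_window.size, dt)
    a_top = np.abs(np.fft.rfft(top_window)) + 1e-12
    a_base = np.abs(np.fft.rfft(base_window)) + 1e-12
    band = (freqs >= fmin) & (freqs <= fmax)
    slope, _ = np.polyfit(freqs[band], np.log(a_base[band] / a_top[band]), 1)
    return -np.pi * travel_time / slope

# Synthetic demonstration (not field data): a Ricker wavelet and an attenuated copy.
dt = 0.002
t = np.arange(-0.25, 0.25, dt)
f0 = 30.0
top = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)

true_q, t_interval = 40.0, 0.3                     # interval travel time [s]
freqs = np.fft.rfftfreq(t.size, dt)
atten = np.exp(-np.pi * freqs * t_interval / true_q)
base = np.fft.irfft(np.fft.rfft(top) * atten, n=t.size)

q_est = spectral_ratio_q(top, base, dt, t_interval)
print(f"recovered Q ~ {q_est:.1f} (true {true_q})")
# A 4D workflow repeats this per bin for baseline and monitor volumes and maps the change
# in Q (or 1/Q) to delineate the heated, low-viscosity zone.
```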

9.
The high-resolution nonlinear 3D global inversion method is based on nonlinear theory. Under horizon control, well-log data from multiple wells (or all wells) in the study area, together with the seismic traces adjacent to those wells, are fed into a multi-input, multi-output network and trained jointly, yielding adaptive weight functions for the whole area and establishing an integrated nonlinear mapping relationship. This mapping is then updated according to the lateral and vertical geological variation of the reservoir, which constrains and controls the inversion process and its results, producing stable, high-resolution seismic inversion sections (velocity, acoustic impedance and density sections) and achieving a global inversion. Tests on synthetic models and applications to field data gave good geological results, demonstrating that the method is accurate and practical and can be used for quantitative reservoir analysis.
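A minimal sketch of the joint multi-well training idea using a generic multi-layer perceptron (scikit-learn's MLPRegressor) as the multi-input, multi-output network; the seismic windows and impedance targets are synthetic, and horizon control and the lateral updating of the mapping are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Hypothetical training set (not the paper's data): windowed near-well seismic amplitudes as
# inputs and well-log acoustic impedance as the target, pooled over several wells.
n_samples, n_lags = 2000, 21
seismic = rng.normal(size=(n_samples, n_lags))
impedance = (6000.0 + 800.0 * seismic[:, n_lags // 2]
             + 300.0 * seismic[:, n_lags // 2 - 1]
             + rng.normal(0.0, 50.0, n_samples))

X = StandardScaler().fit_transform(seismic)

# A generic multi-input network trained jointly on all wells.
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X[:1600], impedance[:1600])
print("held-out R^2:", round(net.score(X[1600:], impedance[1600:]), 3))

# Applying net.predict trace by trace across the survey would yield an impedance (or velocity,
# or density) inversion section for the whole area.
```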

10.
This study compares the accuracy of two types of water table maps, both of which were constructed with the object of optimizing future mapping efforts in similar environments. The first type of map is based solely on office information, with no field verification. The second type of map is based on careful field mapping using numerous measurement points.
The office-derived maps were based on topography, surface water features, existing reports, maps and data in the files of the Wisconsin Geological and Natural History Survey; the data were not field-verified. The field-derived maps used a dense network of 236 piezometers at 176 sites in an area of approximately 170 square miles. The field project was much more expensive and labor-intensive than was the construction of office-derived maps for the same area.
The two methods produce water table maps which agree to an appreciable extent, the greatest agreement being in areas having ground water-fed streams. Differences in water table elevations indicated by the two methods range from negligible to approximately 5 feet. Thus, depending upon the availability of existing information, relatively accurate water table elevations can be delineated in similar sandy unconfined aquifers without time-consuming and expensive field work that drilling and piezometer installation entails.
Preliminary construction of office-derived water table maps enables researchers to use their resources efficiently. In some situations, expensive installation of wells and piezometers for a regional monitoring network may add little accuracy to the regional map. For localized problems, collection of additional field data will always be necessary, but can be guided by the office-derived maps. The authors caution that this technique may only be applicable to sandy, unconfined aquifers in humid climates.

11.
A set of geophysical data collected in an area in Iran is analyzed to check the validity of a geological map that was prepared in connection with a mineral prospecting project and also to image the spatial electrical resistivity distribution. The data set includes helicopter electromagnetic (HEM), airborne magnetic and ground electrical resistivity measurements. The Occam approach was used to invert the HEM data to model the resistivity using a layered earth model with fixed thicknesses. The algorithm is based on a nonlinear inverse problem in a least-squares sense. The algorithm was tested on a part of an HEM dataset acquired with a DIGHEM helicopter EM system at Kalat-e-Reshm, Semnan, in Iran. The area contains a resistive porphyry andesite that is covered by Eocene sedimentary units. The results are shown as resistivity sections and maps confirming the existence of an arc-like resistive structure in the survey area. The resistive andesite seems to be thicker than indicated in the geological maps. The results are compared with the reduced-to-the-pole (RTP) airborne magnetic anomaly field data as well as with two ground resistivity profiles. We found reasonable correlations between the HEM 1D resistivity models and 2D models from electrical resistivity tomography (ERT) inversions. A 3D visualization of the 1D models along all flight lines provided a useful tool for the study of spatial variations of the resistivity structure in the investigation area.

12.
In this study, the potential land use planning for rural communities and agricultural development is examined with multi-criteria analysis and a Geographical Information System. For this purpose, geological, geomorphological and socio-economic data and natural hazard maps were chosen as the major factors affecting both land uses. The Analytical Hierarchical Process method was applied to evaluate these factors, and the uncertainty associated with alterations of their weights was estimated. Three scenarios were developed for each land use to examine the effect of this uncertainty on the suitability assessment results, leading to the corresponding potential suitability maps. The areas of very high suitability are distributed mainly in the plain part of the study area. The proposed methodology constitutes a case application combining physical factors with natural hazard maps in the land use planning procedure.
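The AHP weighting step can be sketched as below: the weights come from the principal eigenvector of a pairwise comparison matrix and a consistency ratio is checked; the comparison values and the factor layers are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for four factors, e.g. geology,
# geomorphology, socio-economic data, natural hazard maps; values are illustrative only.
A = np.array([[1.0,   3.0,   5.0,  2.0],
              [1/3.,  1.0,   3.0,  1/2.],
              [1/5.,  1/3.,  1.0,  1/4.],
              [1/2.,  2.0,   4.0,  1.0]])

# AHP weights: principal eigenvector of the comparison matrix, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.90 for n = 4).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))

# Suitability map = weighted sum of standardized factor layers (random placeholders here).
layers = np.random.default_rng(6).uniform(0.0, 1.0, size=(n, 100, 100))
suitability = np.tensordot(weights, layers, axes=1)
```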

13.
Time-lapse seismic data is useful for identifying fluid movement and pressure and saturation changes in a petroleum reservoir and for monitoring of CO2 injection. The focus of this paper is estimation of time-lapse changes with uncertainty quantification using full-waveform inversion. The purpose of also estimating the uncertainty in the inverted parameters is to be able to use the inverted seismic data quantitatively for updating reservoir models with ensemble-based methods. We perform Bayesian inversion of seismic waveform data in the frequency domain by combining an iterated extended Kalman filter with an explicit representation of the sensitivity matrix in terms of Green functions (acoustic approximation). Using this method, we test different strategies for inversion of the time-lapse seismic data with uncertainty. We compare the results from a sequential strategy (making a prior from the monitor survey using the inverted baseline survey) with a double difference strategy (inverting the difference between the monitor and baseline data). We apply the methods to a subset of the Marmousi2 P-velocity model. Both strategies performed well and relatively good estimates of the monitor velocities and the time-lapse differences were obtained. For the estimated time-lapse differences, the double difference strategy gave the lowest errors.
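A linear-Gaussian toy comparing the sequential and double-difference strategies (a crude stand-in for the iterated extended Kalman filter on frequency-domain waveform data); the operator G, noise level and localized change are invented, not the Marmousi2 setup.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy linear problem d = G m + noise standing in for linearized waveform data.
n_d, n_m = 80, 40
G = rng.normal(size=(n_d, n_m)) / np.sqrt(n_d)
m_base = rng.normal(0.0, 1.0, n_m)
dm_true = np.zeros(n_m)
dm_true[15:25] = 0.3                                  # localized time-lapse change
sigma = 0.05
d_base = G @ m_base + sigma * rng.normal(size=n_d)
d_mon = G @ (m_base + dm_true) + sigma * rng.normal(size=n_d)

def gaussian_update(m_prior, C_prior, d, C_d):
    """Posterior mean and covariance of a linear-Gaussian inverse problem."""
    K = C_prior @ G.T @ np.linalg.inv(G @ C_prior @ G.T + C_d)
    return m_prior + K @ (d - G @ m_prior), (np.eye(n_m) - K @ G) @ C_prior

C_d = sigma ** 2 * np.eye(n_d)

# Sequential strategy: invert the baseline, use the result as the prior for the monitor.
mb, Cb = gaussian_update(np.zeros(n_m), np.eye(n_m), d_base, C_d)
mm, _ = gaussian_update(mb, Cb + 0.1 * np.eye(n_m), d_mon, C_d)   # inflated prior for changes
dm_seq = mm - mb

# Double-difference strategy: invert d_mon - d_base directly for the change (doubled noise).
dm_dd, _ = gaussian_update(np.zeros(n_m), 0.2 * np.eye(n_m), d_mon - d_base, 2.0 * C_d)

print("RMS error, sequential:", round(float(np.sqrt(np.mean((dm_seq - dm_true) ** 2))), 3),
      " double difference:", round(float(np.sqrt(np.mean((dm_dd - dm_true) ** 2))), 3))
```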

14.
Flood risk management strongly relies on inundation models for river basin zoning in flood-prone and risk-free areas. Floodplain zoning is significantly affected by the diverse and concurrent uncertainties that characterize the modelling chain used for producing inundation maps. In order to quantify the relative impact of the uncertainties linked to a lumped hydrological (rainfall–runoff) model and a FLO-2D hydraulic model, a Monte Carlo procedure is proposed in this work. The hydrological uncertainty is associated with the design rainfall estimation method, while the hydraulic model uncertainty is associated with roughness parameterization. This uncertainty analysis is tested on the case study of the Marta coastal catchment in Italy, by comparing the different frequency, extent and depth of inundation simulations associated with varying rainfall forcing and/or hydraulic model roughness realizations. The results suggest a significant predominance of the hydrological uncertainty with respect to the hydraulic one on the overall uncertainty associated with the simulated inundation maps.
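The one-source-at-a-time Monte Carlo comparison can be sketched as below, with a toy inundation_extent function standing in for the rainfall-runoff plus FLO-2D chain and invented distributions for design rainfall and roughness.

```python
import numpy as np

rng = np.random.default_rng(8)

def inundation_extent(design_rainfall, roughness):
    """Toy stand-in for the rainfall-runoff + hydraulic chain: flooded area [km2] grows with
    the design rainfall and, more weakly, with roughness. Illustration only."""
    peak_q = 0.8 * design_rainfall ** 1.2                 # crude rainfall -> peak discharge
    return 0.05 * peak_q ** 0.7 * (roughness / 0.04) ** 0.25

n = 5000
rain = np.clip(rng.normal(110.0, 25.0, n), 1.0, None)     # design rainfall uncertainty [mm]
manning = rng.uniform(0.03, 0.07, n)                      # roughness uncertainty

# Vary one source at a time (the other held at a central value), then both together.
a_rain_only = inundation_extent(rain, 0.05)
a_rough_only = inundation_extent(110.0, manning)
a_both = inundation_extent(rain, manning)

for name, a in [("rainfall only", a_rain_only), ("roughness only", a_rough_only),
                ("both sources", a_both)]:
    print(f"{name:14s} std of flooded area = {np.std(a):.2f} km2")
```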

15.
The goal of the presented research was the derivation of flood hazard maps using Monte Carlo simulation of flood propagation at an urban site in the UK, specifically an urban area of the city of Glasgow. A hydrodynamic model describing the propagation of flood waves, based on the De Saint Venant equations in two-dimensional form and capable of accounting for the topographic complexity of the area (preferential outflow paths, buildings, manholes, etc.) and for the prevailing imperviousness typical of urban areas, has been used to derive the hydrodynamic characteristics of flood events (i.e. water depths and flow velocities). Knowledge of the water depth distribution and of the current velocities derived from the propagation model, together with the topographic characteristics of the urban area from digital map data, allowed the production of hazard maps based on properly defined hazard indexes. These indexes are evaluated in a probabilistic framework to overcome the classical problem of a single deterministic prediction of flood extent for the design event and to introduce the concept of the likelihood of flooding at a given point as the sum of data uncertainty, model structural error and parameterization uncertainty.
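A hedged sketch of turning Monte Carlo depth and velocity fields into a probabilistic hazard map; the depth-velocity index d * (v + 0.5) is one commonly used formulation and may differ from the paper's hazard indexes, and all fields below are synthetic, not the Glasgow simulations.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical Monte Carlo output of the 2D propagation model: water depth [m] and flow
# velocity [m/s] per grid cell for each realization (synthetic placeholders).
n_real, ny, nx = 200, 60, 60
trend = np.linspace(0.0, 0.8, nx)[None, None, :]          # deeper water towards one side
depth = np.abs(rng.normal(0.4, 0.3, size=(n_real, ny, nx))) + trend
velocity = np.abs(rng.normal(0.6, 0.4, size=(n_real, ny, nx)))

# A commonly used hazard index for people is the depth-velocity product d * (v + 0.5);
# a cell is classed hazardous when the index exceeds a threshold.
hazard_index = depth * (velocity + 0.5)
hazardous = hazard_index > 0.7

# Probabilistic hazard map: the likelihood of exceeding the hazard threshold at each cell,
# folding data, model-structure and parameterization uncertainty into a single map.
p_hazard = hazardous.mean(axis=0)
print(f"P(hazard) ranges from {p_hazard.min():.2f} to {p_hazard.max():.2f} across the grid")
```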

16.
Predictive models for calculating sediment yield and discharge require accurate areal data. Such models may be unrealistic when using digitized data given the potential error involved in compiling and digitizing thematic polygon maps. The estimation of boundary variability for digitized polygon maps of a 0.34 km2 area of badlands in Dinosaur Provincial Park, Alberta shows the effects of positional errors introduced during mapping and digitizing processes. Polygon overlay of maps of surface features and slopes produced high frequencies of very small polygons and some unlikely combinations of slopes and surface features, and decreased reliability in areal measurements in the composite map. At an epsilon band width of 0.7 m, a reasonable estimate of boundary variability, 31.7 per cent of the resultant overlaid map can be considered unreliable.
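The epsilon-band calculation can be sketched with shapely: buffer the digitized boundary by the positional error and express the band as a fraction of the polygon area. The polygon below is a synthetic shape, not the Dinosaur Provincial Park mapping.

```python
import numpy as np
from shapely.geometry import Polygon

# Hypothetical digitized polygon (coordinates in metres), standing in for one mapped
# surface-feature polygon.
theta = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
radius = 40.0 + 8.0 * np.sin(3.0 * theta)
poly = Polygon(np.column_stack([radius * np.cos(theta), radius * np.sin(theta)]))

# Epsilon band: buffer the polygon boundary by the estimated positional error (0.7 m in the
# paper) and express the band area as a fraction of the polygon area.
eps = 0.7
band = poly.boundary.buffer(eps)
inner_fraction = band.intersection(poly).area / poly.area   # half of the band inside the boundary
total_band_fraction = band.area / poly.area                 # full band (both sides)

print(f"epsilon band covers {total_band_fraction:.1%} of the polygon area "
      f"({inner_fraction:.1%} inside the boundary)")
```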

17.
In the Norwegian North Sea, the Sleipner field produces gas with a high CO2 content. For environmental reasons, since 1996, more than 11 Mt of this carbon dioxide (CO2) have been injected into the Utsira Sand saline aquifer located above the hydrocarbon reservoir. A series of seven 3D seismic surveys were recorded to monitor the CO2 plume evolution. In this case study, time-lapse seismics have been shown to be successful in mapping the spread of CO2 over the past decade and in ensuring the integrity of the overburden. Stratigraphic inversion of seismic data is currently used in the petroleum industry for quantitative reservoir characterization and enhanced oil recovery; it may now also be used to evaluate the expansion of a CO2 plume in an underground reservoir. The aim of this study is to estimate the P-wave impedances via a Bayesian model-based stratigraphic inversion. We have focused our study on the 1994 vintage acquired before CO2 injection and the 2006 vintage carried out after a CO2 injection of 8.4 Mt. In spite of some difficulties due to the lack of time-lapse well log data in the area of interest, the full application of our inversion workflow allowed us to obtain, for the first time to our knowledge, 3D impedance cubes including the Utsira Sand. These results can be used to better characterize the spreading of CO2 in a reservoir. With the post-stack inversion workflow applied to CO2 storage, we point out the importance of the a priori model and the difficulty of obtaining coherent results between sequential inversions of different seismic vintages. The stacking velocity workflow that yields the migration model and the a priori model, specific to each vintage, can induce a slight inconsistency in the results.
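A minimal linear-Gaussian sketch of Bayesian post-stack impedance inversion on a single trace, assuming a first-difference reflectivity operator, a Ricker wavelet and an invented prior; it is not the study's Utsira workflow, only the generic model-based inversion idea.

```python
import numpy as np

rng = np.random.default_rng(10)

# d = G * ln(AI) + noise, with reflectivity ~ 0.5 * d(ln AI)/dt convolved with a wavelet.
n, dt, f0 = 120, 0.004, 30.0
true_lnai = np.log(np.where(np.arange(n) < 60, 5.6e6, 4.9e6))   # impedance drop below an interface

# Forward operator: half first-difference followed by convolution with a Ricker wavelet.
D = 0.5 * (np.diag(np.ones(n - 1), 1) - np.eye(n))
D[-1, :] = 0.0
t = (np.arange(81) - 40) * dt
wavelet = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
G = np.column_stack([np.convolve(D[:, j], wavelet, mode="same") for j in range(n)])

noise_std = 0.005
data = G @ true_lnai + noise_std * rng.standard_normal(n)

# A priori model (e.g. from well logs and stacking velocities) and its covariance.
m_prior = np.full(n, np.log(5.3e6))
C_m = 0.01 * np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 10.0)
C_d = noise_std ** 2 * np.eye(n)

# Gaussian posterior mean: m_prior + C_m G^T (G C_m G^T + C_d)^-1 (d - G m_prior).
K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d)
m_post = m_prior + K @ (data - G @ m_prior)
print("posterior impedance above/below the interface [10^6 kg m^-2 s^-1]:",
      np.round(np.exp(m_post[[30, 90]]) / 1e6, 2))
```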

18.
Flood modelling inputs used to create flood hazard maps are normally based on the assumption of data stationarity for flood frequency analysis. However, changes in the behaviour of climate systems can lead to nonstationarity in flood series. Here, we develop flood hazard maps for Ho Chi Minh City, Vietnam, under nonstationary conditions using extreme value analysis, a coupled 1D–2D model and high-resolution topographical data derived from LiDAR (Light Detection and Ranging) data. Our findings indicate that ENSO (El Niño Southern Oscillation) and PDO (Pacific Decadal Oscillation) influence the magnitude and frequency of extreme rainfall, while global sea-level rise causes nonstationarity in local sea levels, having an impact on flood risk. The detailed flood hazard maps show that areas of high flood potential are located along river banks, with 0.60 km2 of the study area being unsafe for people, vehicles and buildings (H5 zone) under a 100-year return period scenario.
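The nonstationary extreme value step can be sketched by letting the GEV location parameter depend linearly on a climate index and fitting by maximum likelihood; the rainfall series, covariate and coefficients below are synthetic assumptions, not the Ho Chi Minh City record.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(7)

# Hypothetical annual-maximum rainfall series [mm] with a climate-index covariate
# (e.g. a standardized ENSO/PDO index).
years = 60
covariate = rng.normal(0.0, 1.0, years)
rain = genextreme.rvs(c=-0.1, loc=120.0 + 15.0 * covariate, scale=30.0, random_state=1)

def neg_log_lik(params):
    """Nonstationary GEV: location mu = mu0 + mu1 * covariate; scale and shape constant."""
    mu0, mu1, log_sigma, shape = params
    return -np.sum(genextreme.logpdf(rain, c=shape,
                                     loc=mu0 + mu1 * covariate,
                                     scale=np.exp(log_sigma)))

fit = minimize(neg_log_lik, x0=[rain.mean(), 0.0, np.log(rain.std()), -0.1],
               method="Nelder-Mead")
mu0, mu1, log_sigma, shape = fit.x
print(f"location sensitivity to the index: {mu1:.1f} mm per unit index (true 15), "
      f"shape = {shape:.2f}")

# A 100-year design rainfall then depends on the covariate value assumed for the design year.
q100 = genextreme.ppf(0.99, c=shape, loc=mu0 + mu1 * 1.0, scale=np.exp(log_sigma))
print(f"100-year rainfall for index = +1: {q100:.0f} mm")
```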

19.
Airborne electromagnetic methods are most commonly used in mineral exploration. However, new developments, such as multifrequency capability and digital on-board field recording, as well as improvements in instrumentation resulting in high signal-to-noise ratios in recorded data, have made their application in geological mapping possible. A three-frequency airborne EM survey carried out over an area northwest of Timmins, Ontario, was interpreted in terms of thickness and resistivity of the layers of a two-layer earth section. Since both in-phase and quadrature components are measured, this provides six independent parameters at each point in space. Based on prior geological information and a preliminary interpretation of the field records, two two-layer models of the subsurface seemed to be appropriate for most of the survey area. An automatic computerized interpretation procedure was devised to interpret the field data at each point in terms of thickness and resistivity parameters of those two models. When the geology is more complex, the data do not fit the models and no interpretations are made. Two maps illustrating the variation of the resistivity and the thicknesses of the layers were constructed from the interpreted data. These maps agree with the known geological information about the distribution of glacial clay in the area. Areas where the layered models do not fit are known to be areas where the geology is complex with a large number of dykes and other lateral inhomogeneities. The study shows that multifrequency airborne EM surveys can be very useful in geological mapping over inaccessible terrain and can significantly help the mapping geologist where outcrops are scarce.

20.
A probabilistic characterization method for seismic lithofacies identification
The distribution of reservoir lithofacies is an important parameter in reservoir characterization, but lithofacies identification from seismic data usually carries considerable uncertainty. Traditional methods deliver only a single, deterministic facies distribution and cannot resolve the uncertainty of the inversion result, which increases the risk in reservoir evaluation. This paper introduces a probabilistic, multi-step inversion approach to seismic lithofacies identification: statistical relationships between the input and output quantities are established at each step, and the probabilistic information from all steps is then fused to build the conditional probability relation between seismic data and reservoir lithofacies, from which the probability of the facies distribution is inverted. Compared with traditional methods, the proposed approach characterizes the uncertainty of the geophysical response at every step of the identification workflow through these statistical relations, and fusing the probabilities of all steps provides a numerical simulation of how the uncertainty propagates. The inverted facies probabilities therefore reflect the uncertainty of seismic lithofacies identification objectively and accurately, supplying important reference information for reservoir evaluation and reservoir modelling. Applications to synthetic and field data verify the effectiveness of the method.
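A minimal sketch of fusing step-wise conditional probabilities into P(facies | seismic), assuming the workflow is discretized into classes and that facies depend on the seismic data only through an intermediate (impedance) class; all matrices below are illustrative.

```python
import numpy as np

# Step 1: P(impedance class | seismic class), e.g. from a calibrated inversion step and its
# scatter. Rows: 3 seismic classes, columns: 3 impedance classes.
p_imp_given_seis = np.array([[0.70, 0.25, 0.05],
                             [0.20, 0.60, 0.20],
                             [0.05, 0.25, 0.70]])

# Step 2: P(facies | impedance class), e.g. from well-log statistics.
# Rows: 3 impedance classes, columns: 2 facies (sand, shale).
p_fac_given_imp = np.array([[0.85, 0.15],
                            [0.50, 0.50],
                            [0.10, 0.90]])

# Fusing the steps: P(facies | seismic) = sum over impedance classes of the product of the
# two conditionals, i.e. a matrix product, which carries each step's uncertainty forward.
p_fac_given_seis = p_imp_given_seis @ p_fac_given_imp
print(np.round(p_fac_given_seis, 2))     # each row sums to 1: a facies probability per seismic class

# Applied trace by trace, this yields facies probability volumes instead of a single
# deterministic facies model.
seismic_classes = np.array([0, 0, 1, 2, 2, 1])           # a toy classified trace
facies_prob = p_fac_given_seis[seismic_classes]          # P(sand), P(shale) per sample
```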
