Similar Documents
20 similar documents found (search time: 46 ms)
1.
This paper summarizes the findings of a statistical analysis of the locations of metallic anomalies detected at the Pueblo Precision Bombing Range Number 2 in Otero County, Colorado, and at the Victorville Precision Bombing Range in San Bernardino County, California. The purpose of the study is to explore whether statistical properties of the pattern of anomaly locations can be used to discriminate areas likely to contain unexploded ordnance (UXO) left over from previous bombing practice from those unlikely to contain UXO. Techniques for discriminating areas with and without UXO are needed because historic records have left an incomplete account of previous military training activities, so that locations historically used for target practice are often unknown. This study differs from previous research on metallic anomaly data at former military training ranges in that it analyzes the spatial pattern of the discrete locations of the anomalies, rather than the average number of anomalies per unit area. The results indicate that differences in spatial pattern may be a distinguishing feature between areas that were used for target practice and those that are unlikely to contain UXO, even when a large number of ferrous rocks and other inert metallic anomalies are present. We found that at both of the former bombing ranges, the anomaly patterns in sample areas that are distant from all known bombing targets are consistent with a complete spatial randomness pattern, while those near the target areas fit a radially symmetric, bivariate Gaussian pattern. Furthermore, anomaly location patterns generated by surveys with airborne metal detectors have the same statistical properties as the patterns generated by surveys with on-ground detectors, even though the airborne systems detect only a subset of the anomalies found by the ground-based detectors. Thus, pattern information revealed by airborne surveys with metal detectors may be useful in identifying areas where careful searches for UXO are needed.
Jacqueline A. MacDonald
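As a rough, hypothetical illustration of the distinction drawn above, the sketch below (not the paper's code) simulates an off-target anomaly pattern under complete spatial randomness and a near-target pattern from a radially symmetric bivariate Gaussian, then compares them with a Clark-Evans nearest-neighbour index; the field size, point count and Gaussian spread are assumed values.

```python
# Minimal sketch: contrast a complete-spatial-randomness (CSR) anomaly pattern
# with a radially symmetric bivariate Gaussian pattern, the two models the
# study fits to off-target and near-target areas. All parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
side, n = 1000.0, 500                      # 1 km x 1 km sample area, 500 anomalies

# CSR: homogeneous Poisson process conditioned on n points
csr = rng.uniform(0.0, side, size=(n, 2))

# Near-target model: radially symmetric bivariate Gaussian about the target center
center, sigma = np.array([side / 2, side / 2]), 120.0
gauss = center + rng.normal(0.0, sigma, size=(n, 2))

def clark_evans(points, area):
    """Clark-Evans index: ~1 for CSR, <1 for clustered patterns."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    mean_nn = d.min(axis=1).mean()                 # observed mean nearest-neighbour distance
    expected = 0.5 / np.sqrt(len(points) / area)   # expectation under CSR
    return mean_nn / expected

for name, pts in [("off-target (CSR)", csr), ("near-target (Gaussian)", gauss)]:
    print(name, round(clark_evans(pts, side * side), 2))
```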

2.
Data collected along transects are becoming more common in environmental studies as indirect measurement devices, such as geophysical sensors, that can be attached to mobile platforms become more prevalent. Because exhaustive sampling is not always possible under constraints of time and costs, geostatistical interpolation techniques are used to estimate unknown values at unsampled locations from transect data. It is known that outlying observations can receive significantly greater ordinary kriging weights than centrally located observations when the data are contiguously aligned along a transect within a finite search window. Deutsch (1994) proposed a kriging algorithm, finite domain kriging, that uses a redundancy measure in place of the covariance function in the data-to-data kriging matrix to address the problem of overweighting the outlying observations. This paper compares the performances of two kriging techniques, ordinary kriging (OK) and finite domain kriging (FDK), in examining unexploded ordnance (UXO) densities by comparing prediction errors at unsampled locations. The impact of sampling design on object count prediction is also investigated using data collected from transects and at random locations. The Poisson process is used to model the spatial distribution of UXO for three 5000 × 5000 m fields: one does not contain any ordnance target (homogeneous field), while the other two have an ordnance target in the center of the site (isotropic and anisotropic fields). In general, for a given sampling transect width, the differences between OK and FDK in terms of the mean error and the mean square error are not significant regardless of the sampled area and the choice of the field. When 20% or more of the site is sampled, the estimation of object counts is unbiased on average for all three fields regardless of the choice of the transect width and the choice of the kriging algorithm. However, for non-homogeneous fields (isotropic and anisotropic fields), the mean error fluctuates considerably when a small number of transects are sampled. The difference between the transect sampling and the random sampling in terms of prediction errors becomes almost negligible if more than 20% of the site is sampled. Overall, FDK is no better than OK in terms of prediction performance when the transect sampling procedure is used.
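For readers unfamiliar with the baseline estimator in this comparison, the following is a minimal ordinary-kriging sketch in plain NumPy applied to synthetic Poisson anomaly counts along two transects; the covariance model, its parameters and the synthetic counts are illustrative assumptions, and FDK (which modifies only the data-to-data matrix) is not reproduced here.

```python
# Minimal ordinary-kriging (OK) sketch with an exponential covariance model.
# Variogram parameters and the synthetic Poisson counts are assumed values.
import numpy as np

def exp_cov(h, sill=1.0, rng_=500.0):
    """Exponential covariance model C(h) = sill * exp(-3h / range)."""
    return sill * np.exp(-3.0 * h / rng_)

def ordinary_krige(xy_data, z_data, xy_pred, sill=1.0, rng_=500.0):
    n = len(xy_data)
    d_dd = np.linalg.norm(xy_data[:, None] - xy_data[None, :], axis=-1)
    # OK system: covariances plus a Lagrange row/column for the unbiasedness constraint
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d_dd, sill, rng_)
    A[n, n] = 0.0
    d_dp = np.linalg.norm(xy_data[:, None] - xy_pred[None, :], axis=-1)
    b = np.vstack([exp_cov(d_dp, sill, rng_), np.ones(len(xy_pred))])
    w = np.linalg.solve(A, b)                # kriging weights + Lagrange multiplier
    return w[:n].T @ z_data                  # estimates at prediction locations

# Synthetic example: Poisson anomaly counts sampled along two N-S transects
rng = np.random.default_rng(1)
xs = np.concatenate([np.full(50, 1000.0), np.full(50, 3000.0)])
ys = np.tile(np.linspace(0, 5000, 50), 2)
xy_data = np.column_stack([xs, ys])
z_data = rng.poisson(5.0, size=len(xy_data)).astype(float)

xy_pred = np.array([[2000.0, 2500.0], [4500.0, 4500.0]])
print(ordinary_krige(xy_data, z_data, xy_pred))
```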

3.
Site characterization activities at potential unexploded ordnance (UXO) sites rely on sparse sampling collected as geophysical surveys along strip transects. From these samples, the locations of target areas, those regions on the site where the geophysical anomaly density is significantly above the background density, must be identified. A target area detection approach using a hidden Markov model (HMM) is developed here. HMMs use stationary transition probabilities from one state to another for steps between adjacent locations as well as the probability of any particular observation occurring given each possible underlying state. The approach developed here identifies the transition probabilities directly from the conceptual site model (CSM) created as part of the UXO site characterization process. A series of simulations examines the ability of the HMM approach to simultaneously determine the target area locations within each transect and to estimate the unknown anomaly intensity within the identified target area. The HMM results are compared to those obtained using a simpler target detection approach that considers the background anomaly density to be defined by a Poisson distribution and each location to be independent of any adjacent location. Results show that the HMM approach is capable of accurately identifying the target locations with limited false positive identifications when both the background and target anomaly intensities are known. The HMM approach is relatively robust to changes in the initial estimate of the target anomaly intensity and is capable of identifying target locations and the corresponding target anomaly intensity when this intensity is approximately 60% higher than the background intensity, at intensities that are representative of actual field sites. Application to data collected from a wide area assessment field site shows that the HMM approach identifies the area of the site with elevated anomaly intensity with few false positives. This field site application also shows that the HMM results are relatively robust to changes in the transect width.
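A minimal sketch of the idea above follows: a two-state HMM (background / target) with Poisson anomaly counts per transect segment, decoded with the Viterbi algorithm. The transition probabilities, which the paper derives from the conceptual site model, and the two intensities are simply assumed here for illustration.

```python
# Two-state HMM with Poisson emissions, decoded by Viterbi in the log domain.
# Intensities and transition probabilities are assumed, not taken from a CSM.
import numpy as np
from scipy.stats import poisson

def viterbi(counts, lam, trans, start):
    """Most likely state sequence for Poisson emissions (log domain)."""
    n_obs, n_states = len(counts), len(lam)
    log_e = poisson.logpmf(counts[:, None], lam[None, :])      # emission log-probs
    log_t, delta = np.log(trans), np.zeros((n_obs, n_states))
    psi = np.zeros((n_obs, n_states), dtype=int)
    delta[0] = np.log(start) + log_e[0]
    for t in range(1, n_obs):
        scores = delta[t - 1][:, None] + log_t                 # best predecessor per state
        psi[t], delta[t] = scores.argmax(0), scores.max(0) + log_e[t]
    path = np.empty(n_obs, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(n_obs - 2, -1, -1):                         # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path                                                 # 0 = background, 1 = target

lam = np.array([2.0, 3.2])            # background vs target intensity (~60% higher, assumed)
trans = np.array([[0.95, 0.05],       # stationary transition probabilities (assumed,
                  [0.10, 0.90]])      # in practice taken from the CSM)
rng = np.random.default_rng(2)
truth = np.r_[np.zeros(40, int), np.ones(20, int), np.zeros(40, int)]
counts = rng.poisson(lam[truth])
print(viterbi(counts, lam, trans, np.array([0.9, 0.1])))
```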

4.
We investigate the prediction abilities of different variants of kriging and different combinations of data in local geometric (GNSS/leveling-based) geoid modeling. In order to generate local geoid models, we used GNSS/leveling data and the EGM2008 geopotential model. EGM2008 was used in two ways. First, it served as a basic long-wavelength trend removed from the geoid undulation data to generate a residual field of geoid heights that was later modeled by kriging (remove-restore technique). Second, EGM2008-based undulations were used as a secondary variable (pseudo-observations) in a cokriging prediction procedure. In addition to the use of EGM2008, kriging-based local geometric geoid models were also generated from the raw undulation data alone. Kriging itself was used in two variants, ordinary kriging and universal kriging, in both the univariate and the bivariate (cokriging) case. The quality of the kriging-based prediction for all variants and all data combinations was investigated on one fixed validation dataset consisting of 86 points and three training datasets characterized by different sampling densities. Results of this study indicate that incorporating EGM2008 as a long-wavelength trend in the kriging prediction procedure outperforms the cokriging strategy based on incorporating EGM2008 as a secondary, spatially correlated variable.

5.
Throughout the world, millions of acres of potentially productive land are contaminated with unexploded ordnance due either to past conflicts or to military training activities. Low-level helicopter magnetometry (HeliMag) is currently being used to rapidly survey large areas and identify regions that are potentially clear of hazardous munitions. One configuration currently in use comprises seven cesium vapor magnetometers, horizontally spaced 1.5 m apart and mounted on a boom several meters in front of a Bell 206L helicopter. Magnetometer data are collected at 400 Hz at altitudes as low as 1.5 m above the ground along transects spaced 7 m apart. From these dense, high-resolution data, potential metallic targets as small as a 60 mm mortar are identified using manual and/or automatic target picking methods. The target picks are then used to estimate densities of potential contamination. Because 100% detection is generally not feasible, HeliMag is usually applied in a characterization rather than a clearance mode. We describe a HeliMag survey collected over a UXO-contaminated site at Yekau Lake, near Edmonton, Canada. The objective was to identify the location and extent of an 11.5-pound bomb target area at a former training range. The target density estimates derived from manual picks were strongly influenced by geology and clutter and did not reflect the underlying density of ordnance and ordnance-related clutter. By fitting a dipole model to each target pick and comparing it to the expected response of the target item, we could estimate the density of objects with a size and shape similar to an 11.5-pound bomb. This analysis clearly identified an area of elevated contamination in the same region where 11.5-pound bombs were found during ground reconnaissance. In summary, the new methodology significantly improves the interpretability of HeliMag data when used for UXO site assessment.
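A hedged sketch of the dipole-fitting step described above follows: a point magnetic dipole is fit by least squares to the total-field anomaly around a target pick, and the recovered moment can then be compared with the response expected for the munition of interest. The geometry, noise level, Earth-field direction and "bomb-like" moment below are assumed values, not the survey's.

```python
# Fit a point-dipole total-field anomaly model to synthetic magnetometer data.
# All numerical values are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7                                   # mu0 / (4*pi), SI units

def dipole_tf_anomaly(params, xy, z_sensor, b_hat):
    """Total-field anomaly (nT) of a point dipole buried at (x0, y0, depth)."""
    x0, y0, depth, mx, my, mz = params
    r = np.column_stack([xy[:, 0] - x0, xy[:, 1] - y0,
                         np.full(len(xy), z_sensor + depth)])
    rn = np.linalg.norm(r, axis=1, keepdims=True)
    rhat, m = r / rn, np.array([mx, my, mz])
    b = MU0_4PI * (3.0 * rhat * (rhat @ m)[:, None] - m) / rn**3
    return 1e9 * b @ b_hat                       # project on Earth-field direction, in nT

b_hat = np.array([0.0, 0.5, 0.866])              # assumed field inclination of about 60 degrees
rng = np.random.default_rng(3)
xy = np.column_stack([rng.uniform(-10, 10, 400), rng.uniform(-10, 10, 400)])
true = [1.0, -2.0, 1.5, 0.0, 0.2, 1.0]           # hypothetical buried dipole
obs = dipole_tf_anomaly(true, xy, 1.5, b_hat) + rng.normal(0, 0.5, 400)

fit = least_squares(lambda p: dipole_tf_anomaly(p, xy, 1.5, b_hat) - obs,
                    x0=[0, 0, 1.0, 0, 0, 0.5])
moment = np.linalg.norm(fit.x[3:])               # compare with the expected item response
print(fit.x.round(2), round(moment, 2))
```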

6.
Statistically defensible methods are presented for developing geophysical detector sampling plans and analyzing data for munitions response sites where unexploded ordnance (UXO) may exist. Detection methods for distinguishing areas of elevated anomaly density from the background density are shown. Additionally, methods are described that aid in the choice of transect pattern and spacing to assure, with a specified degree of confidence, that a target area (TA) of specific size, shape, and anomaly density will be identified using the detection methods. Methods for evaluating the sensitivity of designs to variation in certain parameters are also discussed. The methods presented have been incorporated into the Visual Sample Plan (VSP) software (freely available) and demonstrated at multiple sites in the United States. Application examples from actual transect designs and surveys from the previous two years are presented.
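As a minimal illustration of the design question above, the Monte Carlo sketch below estimates the probability that parallel transects of a given swath and spacing traverse an elliptical target area of specified size; VSP evaluates this kind of question analytically, and the target dimensions and spacings here are illustrative assumptions.

```python
# Monte Carlo estimate of the probability that parallel transects intersect a
# randomly placed, randomly oriented elliptical target area. Values are assumed.
import numpy as np

def traversal_probability(spacing, swath, semi_major, semi_minor, n_trials=20000, seed=4):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        xc = rng.uniform(0.0, spacing)                     # random target center between transects
        theta = rng.uniform(0.0, np.pi)                    # random target orientation
        # Half-extent of the rotated ellipse along the direction perpendicular to transects
        half_extent = np.sqrt((semi_major * np.cos(theta)) ** 2 +
                              (semi_minor * np.sin(theta)) ** 2)
        # Transect centerlines at x = 0, spacing, 2*spacing, ...; each covers +/- swath/2
        nearest = min(xc % spacing, spacing - xc % spacing)
        if nearest <= half_extent + swath / 2.0:
            hits += 1
    return hits / n_trials

for s in (25.0, 50.0, 100.0, 200.0):                        # transect spacings in meters
    print(s, traversal_probability(spacing=s, swath=2.0, semi_major=50.0, semi_minor=25.0))
```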

7.
In humid, well-vegetated areas, such as the northeastern US, runoff is most commonly generated from relatively small portions of the landscape becoming completely saturated; however, little is known about the spatial and temporal behavior of these saturated regions. Indicator kriging provides a way to use traditional water table data to quantify the probability of saturation and thereby evaluate predicted spatial distributions of runoff generation risk, especially for the new generation of water quality models incorporating saturation excess runoff theory. When spatial measurements of a variable are transformed to binary indicators (i.e., 1 if above a given threshold value and 0 if below) and the resulting indicator semivariogram is modeled, indicator kriging produces the probability that the measured variable exceeds the threshold value. With depth to water table as the variable and the threshold set near the soil surface, indicator kriging thus gives a quantified probability of saturation or, consistent with saturation excess runoff theory, of runoff generation risk. The probability of saturation for a 120 m × 180 m hillslope, based upon 43 measurements of depth to water table, is investigated with indicator semivariograms for six storm events. The indicator semivariograms show strong spatial structure in saturated regions under large antecedent rainfall conditions. The temporal structure of the data is used to generate interpolated (soft) data to supplement measured (hard) data. This improved the spatial structure of the indicator semivariograms for lower antecedent rainfall conditions. The probability of saturation evaluated through indicator kriging incorporating soft data showed, in this preliminary study, highly connected regions of saturation, as expected for the wet season (April through May) in the Catskill Mountain region of New York State. Supplementing hard data with soft data incorporates the physical hydrology of the hillslope and captures significant patterns not available when using hard data alone for indicator kriging. With the need for water quality models incorporating appropriate runoff generation risk estimates on the rise, this type of data will lay the groundwork for future model evaluation and development.
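The short sketch below illustrates the first two steps of the indicator approach described above: depth-to-water-table measurements are converted to binary saturation indicators and an empirical indicator semivariogram is computed. The coordinates, depths and the 0.3 m threshold are illustrative assumptions, not the study's data.

```python
# Indicator transform and empirical indicator semivariogram on synthetic wells.
import numpy as np

rng = np.random.default_rng(5)
xy = np.column_stack([rng.uniform(0, 180, 43), rng.uniform(0, 120, 43)])   # 43 wells
depth = rng.gamma(shape=2.0, scale=0.4, size=43)                           # depth to water table (m)

threshold = 0.3                                  # "near the soil surface" (assumed)
indicator = (depth <= threshold).astype(float)   # 1 = saturated, 0 = unsaturated

def empirical_indicator_semivariogram(xy, ind, lags):
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
    iu = np.triu_indices(len(xy), k=1)           # each pair counted once
    h, sq = d[iu], 0.5 * (ind[:, None] - ind[None, :])[iu] ** 2
    gamma = [sq[(h >= lo) & (h < hi)].mean() for lo, hi in zip(lags[:-1], lags[1:])]
    return np.array(gamma)

lags = np.arange(0, 100, 20.0)
print(empirical_indicator_semivariogram(xy, indicator, lags).round(3))
```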

8.
Lava flows from Mauna Loa volcano can travel the long distances from source vents to populated areas of east Hawaii only if heat-insulating supply conduits (lava channels and/or lava tubes) are constructed and maintained, so as to channelize the flow and prevent heat loss during transport. Lava is commonly directed into such conduits by horseshoe- or lyre-shaped spatter cones: loose accumulations of partially welded scoria formed around principal vents during periods of high fountaining. These conduit systems commonly develop fragile areas amenable to artificial disruption by explosives during typical eruptions. If these conduits can be broken or blocked, lava supply to the threatening flow fronts will be cut off or reduced. Explosives were first suggested as a means to divert lava flows threatening Hilo, Hawaii, during the eruption of 1881. They were first used in 1935, without significant success, when the Army Air Force bombed an active pahoehoe channel and tube system on Mauna Loa’s north flank. Channel walls of a Mauna Loa flow were also bombed in 1942, but again there were no significant effects. The locations of the 1935 and 1942 bomb impact areas were determined and are shown for the first time, and the bombing effects are documented. Three days after the 1942 bombing, the spatter cone surrounding the principal vent partially collapsed by natural processes and caused the main flow advancing on Hilo to cease movement. This suggested that spatter cones might be a suitable target for future lava diversion attempts. Because ordnance, tactics, and aircraft delivery systems have changed dramatically since 1942, the U.S. Air Force conducted extensive testing of large aerial bombs (to 900 kg) on prehistoric Mauna Loa lavas in 1975 and 1976, to evaluate the applicability of the new systems to lava diversion. Thirty-six bombs were dropped on lava tubes, channels, and a spatter cone in the tests, and it was verified that spatter cones are especially fragile. Bomb crater size (to 30 m diameter) was found to be inversely related to target rock density, with the largest craters produced in the least dense, weakest rock. Bomb fuze time delays of 0.05 sec caused maximum disruption effects for the high impact velocities employed (250 to 275 m/sec). Modern aerial bombing has a substantial probability of success for diversion of lava from most expected types of eruptions on Mauna Loa’s Northeast Rift Zone, if Hilo is threatened and if Air Force assistance is requested. The techniques discussed in this paper may be applicable to other areas of the world threatened by fluid lava flows in the future.

9.
We consider the problem of predicting the spatial field of particle-size curves (PSCs) from a sample observed at a finite set of locations within an alluvial aquifer near the city of Tübingen, Germany. We interpret PSCs as cumulative distribution functions and their derivatives as probability density functions. We thus (a) embed the available data into an infinite-dimensional Hilbert space of compositional functions endowed with the Aitchison geometry and (b) develop new geostatistical methods for the analysis of spatially dependent functional compositional data. This approach enables one to provide predictions at unsampled locations for these types of data, which are commonly available in hydrogeological applications, together with a quantification of the associated uncertainty. The proposed functional compositional kriging (FCK) predictor is tested on a one-dimensional application relying on a set of 60 PSCs collected along a 5-m-deep borehole at the test site. The quality of FCK predictions of PSCs is evaluated through leave-one-out cross-validation on the available data, smoothed by means of Bernstein polynomials. A comparison of estimates of hydraulic conductivity obtained via our FCK approach against those rendered by classical kriging of effective particle diameters (i.e., quantiles of the PSCs) is provided. Unlike traditional approaches, our method fully exploits the functional form of PSCs and enables one to project the complete information content embedded in the PSC to unsampled locations in the system.
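A tiny sketch of the compositional embedding mentioned above follows: a discretized particle-size density is mapped to centred log-ratio (clr) space, where ordinary geostatistical tools can operate, and mapped back afterwards. The particle-size fractions below are made-up values, not the Tübingen data.

```python
# Centred log-ratio (clr) transform of a composition and its inverse.
import numpy as np

def clr(p):
    """Centred log-ratio transform of a positive composition summing to 1."""
    lp = np.log(p)
    return lp - lp.mean()

def clr_inverse(y):
    e = np.exp(y)
    return e / e.sum()

psc_density = np.array([0.05, 0.15, 0.40, 0.30, 0.10])   # fractions per size class (assumed)
y = clr(psc_density)                                      # krige or average in this space
print(y.round(3), clr_inverse(y).round(3))                # round-trip back to a composition
```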

10.
The area located inside the São Sebastião volcanic crater, at the southeast end of Terceira Island (Azores), is characterized by an important amplification of ground motion with respect to the surrounding area, as clearly demonstrated by the spatial distribution of the damage that occurred during the Terceira earthquake (the strongest earthquake felt on the Island in recent decades: 01/01/1980, M = 7.2). Geological and geophysical studies have been conducted to characterize the volcanic crater and understand the different site effects that occurred in the village of São Sebastião. The complexity of the subsurface geology, with intercalations of compact basalt and soft pyroclastic deposits, is associated with extreme vertical and lateral velocity contrasts and poses a serious challenge to different geophysical characterization methods. The available qualitative model did not allow a complete understanding of the site effects. A new seismic campaign was therefore designed and acquired, and a single, geologically consistent geophysical model has been generated by integrating the existing and new data. The new campaign included two cross-line P-wave seismic refraction profiles, four short SH-wave seismic reflection profiles, and seven multichannel surface wave acquisitions. The integration and joint interpretation of geophysical and geological data allowed mutual validation and confirmation of the data processing steps. In particular, the combined use of refraction, reflection, and surface wave techniques made it possible to face the complexity of a geology that poses different challenges to each of the methods when used individually: velocity inversions, limited reflectivity, and lateral variations. It is shown how the integration of seismic data from different methods, in the framework of a geological model, allowed the geometrical and dynamic characterization of the site. Correlation with further borehole information then allowed the definition of a subsoil model for the crater, providing information that allowed a better understanding of the earthquake site effects in the São Sebastião village. The new near-surface geological model includes a lava layer within the soft infill materials of the crater. This new model matches closely with the damage distribution map and explains the spatial variation of building stock performance in the 1980 earthquake.

11.
Conditional bias-penalized kriging (CBPK)
Simple and ordinary kriging, or SK and OK, respectively, represent the best linear unbiased estimator in the unconditional sense in that they minimize the unconditional (on the unknown truth) error variance and are unbiased in the unconditional mean. However, because the above properties hold only in the unconditional sense, kriging estimates are generally subject to conditional biases that, depending on the application, may be unacceptably large. For example, when used for precipitation estimation using rain gauge data, kriging tends to significantly underestimate large precipitation and, albeit less consequentially, overestimate small precipitation. In this work, we describe an extremely simple extension to SK or OK, referred to herein as conditional bias-penalized kriging (CBPK), which minimizes conditional bias in addition to unconditional error variance. For comparative evaluation of CBPK, we carried out numerical experiments in which normal and lognormal random fields of varying spatial correlation scale and rain gauge network density are synthetically generated, and the kriging estimates are cross-validated. For generalization and potential application in other optimal estimation techniques, we also derive CBPK in the framework of classical optimal linear estimation theory.
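The following quick numerical illustration shows the conditional bias the abstract describes, not the CBPK estimator itself: a smoothing estimator (a Gaussian-kernel weighted average standing in for OK) is cross-validated on a correlated lognormal field, and its errors are grouped by the magnitude of the true value. All field parameters are assumed.

```python
# Demonstrate conditional bias of a smoothing estimator under cross-validation:
# large truths are underestimated, small truths overestimated.
import numpy as np

rng = np.random.default_rng(6)
n, scale = 400, 30.0
x = np.sort(rng.uniform(0, 1000, n))
# Correlated Gaussian field via an exponential covariance, then exponentiate
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / scale) + 1e-6 * np.eye(n)
z = np.exp(np.linalg.cholesky(cov) @ rng.standard_normal(n))      # lognormal "precipitation"

est = np.empty(n)
for i in range(n):                                                  # leave-one-out estimates
    w = np.exp(-0.5 * ((x - x[i]) / scale) ** 2)
    w[i] = 0.0
    est[i] = np.sum(w * z) / np.sum(w)

small, large = z < np.quantile(z, 0.2), z > np.quantile(z, 0.8)
print("small truths: mean error", round((est[small] - z[small]).mean(), 3))   # > 0: overestimation
print("large truths: mean error", round((est[large] - z[large]).mean(), 3))   # < 0: underestimation
```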

12.
In the geostatistical analysis of regionalized data, the practitioner may not be interested in mapping the unsampled values of the variable that has been monitored, but in assessing the risk that these values exceed or fall short of a regulatory threshold. This kind of concern is part of the more general problem of estimating a transfer function of the variable under study. In this paper, we focus on the multigaussian model, for which the regionalized variable can be represented (up to a nonlinear transformation) by a Gaussian random field. Two cases are analyzed, depending on whether the mean of this Gaussian field is considered known or not, which lead to the simple and ordinary multigaussian kriging estimators respectively. Although both of these estimators are theoretically unbiased, the latter may be preferred to the former for practical applications since it is robust to a misspecification of the mean value over the domain of interest and also to local fluctuations around this mean value. An advantage of multigaussian kriging over other nonlinear geostatistical methods such as indicator and disjunctive kriging is that it makes use of the multivariate distribution of the available data and does not produce order relation violations. The use of expansions into Hermite polynomials provides three additional results: first, an expression of the multigaussian kriging estimators in terms of series that can be calculated without numerical integration; second, an expression of the associated estimation variances; third, the derivation of a disjunctive-type estimator that minimizes the variance of the error when the mean is unknown.
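A compact sketch of the exceedance-probability workflow implied by the multigaussian model described above: data are normal-score transformed, simple kriging (with known mean 0) gives the conditional Gaussian mean and variance at an unsampled location, and the probability of exceeding a regulatory threshold follows from the normal CDF. The data values, locations and covariance model below are illustrative assumptions.

```python
# Normal-score transform + simple kriging + exceedance probability under the
# multigaussian model. All numerical inputs are assumed.
import numpy as np
from scipy.stats import norm, rankdata

def normal_score(z):
    """Rank-based transform of data to standard normal scores."""
    p = (rankdata(z) - 0.5) / len(z)
    return norm.ppf(p)

def simple_krige_gaussian(xy, y, x0, rng_=300.0):
    c = np.exp(-3.0 * np.linalg.norm(xy[:, None] - xy[None, :], axis=-1) / rng_)
    c0 = np.exp(-3.0 * np.linalg.norm(xy - x0, axis=1) / rng_)
    w = np.linalg.solve(c, c0)
    return w @ y, 1.0 - w @ c0                      # SK mean and variance (unit sill)

rng = np.random.default_rng(7)
xy = rng.uniform(0, 1000, size=(30, 2))
z = rng.lognormal(mean=1.0, sigma=0.8, size=30)     # e.g. a contaminant concentration
y = normal_score(z)

threshold_z = 5.0                                    # regulatory threshold on the raw scale (assumed)
threshold_y = np.interp(threshold_z, np.sort(z), np.sort(y))   # same threshold in Gaussian units

m, v = simple_krige_gaussian(xy, y, np.array([500.0, 500.0]))
print("P(Z > threshold) ~", round(1.0 - norm.cdf((threshold_y - m) / np.sqrt(v)), 3))
```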

13.
Snow is a critical storage component in the hydrologic cycle, but current measurement networks are sparse. In addition, the heterogeneity of snow requires surveying larger areas to measure the areal average. We presented snow measurements using GPS interferometric reflectometry (GPS‐IR). GPS‐IR measures a large area (~100 m2), and existing GPS installations around the world have the potential to expand existing snow measurement networks. GPS‐IR uses a standard, geodetic GPS installation to measure the snow surface via the reflected component of the signal. We reported GPS‐IR snow depth measurements made at Niwot Ridge, Colorado, from October 2009 through June 2010. This site is in a topographic saddle at 3500 m elevation with a peak snow depth of 1.7 m near the GPS antenna. GPS‐IR measurements are compared with biweekly snow surveys, a continuously operating scanning laser system and an airborne light detection and ranging (LIDAR) measurement. The GPS‐IR measurement of peak snowpack (1.36–1.76 m) matches manual measurements (0.95–1.7 m) and the scanning laser (1.16 m). GPS‐IR has an RMS error of 13 cm (bias = 10 cm) compared with the laser, although differences between the measurement locations make the comparison imprecise. Over the melt season, when the snowpack is more homogeneous, the difference between the GPS‐IR and the laser is reduced (RMS = 9 cm, bias = 6 cm). In other locations, the GPS and the LIDAR agree on which areas have more or less snow, but the GPS estimates more snow on the ground on tracks to the west (1.58 m) than the LIDAR (1.14 m). Copyright © 2011 John Wiley & Sons, Ltd.

14.
15.
Digital elevation models (DEMs) of river channel bathymetries are developed by interpolating elevations between data collected at discrete points or along transects. The accuracy of interpolated bathymetries depends on measurement error, the density and distribution of point data, and the interpolation method. Whereas point measurement errors can be minimized by selecting the most efficient equipment, the effect of data density and interpolation method on river bathymetry is relatively unknown. Thus, this study focuses on transect‐based collection methods and investigates the effects of transect location, the spacing between transects, and interpolation methods on the accuracy of interpolated bathymetry. This is accomplished by comparing four control bathymetries, generated from accurate, high-resolution, sub‐meter-scale data, to bathymetries interpolated from transect data extracted from the control bathymetries using two transect locating methods and four interpolation methods. The transect locating methods are a morphologically‐spaced and an equally‐spaced model. The four interpolation methods are ordinary kriging, Delaunay triangulation, and simple linear interpolation, which are applied in curvilinear coordinates (Delaunay triangulation is also applied in Cartesian coordinates), and natural neighbor interpolation, which is applied only in Cartesian coordinates. The bathymetric data were obtained from morphologically simple and complex reaches of a large (average bankfull width = 90 m) and a small (average bankfull width = 17 m) river. The accuracy of the developed DEMs is assessed using statistical analysis of the differences between the control and interpolated bathymetries and hydraulic parameters assessed from bankfull water surface elevations. Results indicate that DEM accuracy is not influenced by the choice of transect location method (with the same averaged cross‐section spacing) or by a specific interpolation method, but rather by the coordinate system in which the interpolation method is applied and the spacing between transects. They also show negligible differences between the mean depths and surface areas calculated from bathymetries with dense or coarse spacing. Copyright © 2016 John Wiley & Sons, Ltd.

16.
Light Detection and Ranging (LIDAR) scanning provides precise information about the current state of relief, even for densely forested areas. Ground surface point clouds were used to create raster digital elevation models (DEMs) of two study areas on the Lago-Naki Plateau (Krasnodar krai and the Republic of Adygea, Russia). Karst depression patterns on the plateau were examined using the LIDAR-based DEMs and aerial photographs. Various analysis methods were used to examine the karst relief with the help of the open-source software SAGA GIS (rasterizing LIDAR data and calculating morphometric indices) and ImageJ (Fourier transformation and two-dimensional bandpass filter functions). The current karst relief patterns were explored by methods of the mathematical morphology of landscapes.

17.
Spatial prediction of river channel topography by kriging
Topographic information is fundamental to geomorphic inquiry, and spatial prediction of bed elevation from irregular survey data is an important component of many reach‐scale studies. Kriging is a geostatistical technique for obtaining these predictions along with measures of their reliability, and this paper outlines a specialized framework intended for application to river channels. Our modular approach includes an algorithm for transforming the coordinates of data and prediction locations to a channel‐centered coordinate system, several different methods of representing the trend component of topographic variation, and search strategies that incorporate geomorphic information to determine which survey data are used to make a prediction at a specific location. For example, a relationship between curvature and the lateral position of maximum depth can be used to include cross‐sectional asymmetry in a two‐dimensional trend surface model, and topographic breaklines can be used to restrict which data are retained in a local neighborhood around each prediction location. Using survey data from a restored gravel‐bed river, we demonstrate how transformation to the channel‐centered coordinate system facilitates interpretation of the variogram, a statistical model of reach‐scale spatial structure used in kriging, and how the choice of a trend model affects the variogram of the residuals from that trend. Similarly, we show how decomposing kriging predictions into their trend and residual components can yield useful information on channel morphology. Cross‐validation analyses involving different data configurations and kriging variants indicate that kriging is quite robust and that survey density is the primary control on the accuracy of bed elevation predictions. The root mean‐square error of these predictions is directly proportional to the spacing between surveyed cross‐sections, even in a reconfigured channel with a relatively simple morphology; sophisticated methods of spatial prediction are no substitute for field data. Copyright © 2007 John Wiley & Sons, Ltd.
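A minimal sketch of the coordinate-transformation step described above follows: survey points in Cartesian (x, y) are converted to channel-centered coordinates (s = distance along the centerline, n = signed offset from it) by projecting each point onto a densely sampled centerline. The centerline here is an assumed sine curve, not a surveyed one.

```python
# Convert Cartesian survey points to channel-centered (s, n) coordinates by
# projecting onto an ordered polyline centerline. Centerline and points assumed.
import numpy as np

def to_channel_coords(points, centerline):
    """Return (s, n) for each point given an ordered polyline centerline."""
    seg = np.diff(centerline, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    s_start = np.concatenate([[0.0], np.cumsum(seg_len)])[:-1]     # arc length at segment start
    out = np.empty((len(points), 2))
    for i, p in enumerate(points):
        v = p - centerline[:-1]
        t = np.clip(np.einsum('ij,ij->i', v, seg) / seg_len**2, 0.0, 1.0)
        proj = centerline[:-1] + t[:, None] * seg
        d = np.linalg.norm(p - proj, axis=1)
        k = d.argmin()                                              # nearest segment
        cross = seg[k, 0] * (p - proj[k])[1] - seg[k, 1] * (p - proj[k])[0]
        out[i] = [s_start[k] + t[k] * seg_len[k], np.sign(cross) * d[k]]
    return out

x = np.linspace(0, 300, 301)
centerline = np.column_stack([x, 20.0 * np.sin(x / 40.0)])          # assumed channel centerline
survey = np.array([[50.0, 25.0], [150.0, -5.0], [250.0, 10.0]])
print(to_channel_coords(survey, centerline).round(2))
```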

18.
Interpolations of groundwater table elevation in dissected uplands
Chung, J.W. and Rogers, J.D., 2012. Ground Water, 50 (4), 598–607.
The variable elevation of the groundwater table in the St. Louis area was estimated using multiple linear regression (MLR), ordinary kriging, and cokriging as part of a regional program seeking to assess liquefaction potential. Surface water features were used to determine the minimum water table for MLR and to supplement the principal variables for ordinary kriging and cokriging. By evaluating the known depth to the water and the minimum water table elevation, the MLR analysis approximates the groundwater elevation for a contiguous hydrologic system. Ordinary kriging and cokriging estimate values in unsampled areas by calculating the spatial relationships between the unsampled and sampled locations. In this study, ordinary kriging did not incorporate topographic variations as an independent variable, while cokriging included topography as a supporting covariable. Cross-validation suggests that cokriging provides a more reliable estimate at known data points, with less uncertainty, than the other methods. Profiles extending through the dissected uplands terrain suggest that: (1) the groundwater table generated by MLR mimics the ground surface and elicits an exaggerated interpolation of groundwater elevation; (2) the groundwater table estimated by ordinary kriging tends to ignore local topography and exhibits oversmoothing of the actual undulations in the water table; and (3) cokriging appears to give the most realistic water surface, which rises and falls in proportion to the overlying topography. The authors concluded that cokriging provided the most realistic estimate of the groundwater surface, which is the key variable in assessing soil liquefaction potential in unconsolidated sediments.

19.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed for use in enhancing wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility to allow for more efficient expenditure of time, labor and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms, making possible data fusion of estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. The results presented here, based on data acquired during ESTCP’s Wide-Area Assessment Pilot Program, successfully demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct survey-site-wide probability density maps that are well suited for input to higher-level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Susan L. Rose-Pehrsson

20.

This paper compares the performance of three geostatistical algorithms that integrate elevation as an auxiliary variable: kriging with external drift (KED); kriging combined with regression, called regression kriging (RK) or kriging after detrending; and co-kriging (CK). These three methods differ in the way in which the secondary information is introduced into the prediction procedure. They are applied to improve the prediction of the monthly average rainfall observations measured at 106 climatic stations in Tunisia, over an area of 164 150 km², using elevation as the auxiliary variable. The experimental sample semivariograms, residual semivariograms and cross-variograms are constructed and fitted to estimate the rainfall levels and the estimation variance at the nodes of a square grid of 20 km × 20 km resolution and to develop corresponding contour maps. Contour diagrams for KED and RK were similar and exhibited a pattern corresponding more closely to local topographic features when (a) the network is sparse and (b) the rainfall–elevation correlation is poor, while CK showed a smooth zonal pattern. Smaller prediction variances are obtained for the RK algorithm. The cross-validation showed that CK gave a better RMSE than KED or RK.
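A compact sketch of the regression-kriging (RK) variant compared above follows: rainfall is regressed on station elevation, the regression residuals are kriged, and trend plus kriged residual gives the prediction at a new site. The station data, variogram model and its parameters are assumed, not the Tunisian dataset.

```python
# Regression kriging (kriging after detrending) on synthetic rainfall/elevation data.
import numpy as np

def exp_cov(h, sill, rng_):
    return sill * np.exp(-3.0 * h / rng_)

rng = np.random.default_rng(8)
xy = rng.uniform(0, 500, size=(106, 2))                 # km, station coordinates (assumed)
elev = rng.uniform(0, 1500, size=106)                   # m
rain = 20.0 + 0.03 * elev + rng.normal(0, 5, 106)       # synthetic monthly rainfall (mm)

# 1) Trend: ordinary least squares of rainfall on elevation
X = np.column_stack([np.ones(106), elev])
beta, *_ = np.linalg.lstsq(X, rain, rcond=None)
resid = rain - X @ beta

# 2) Simple kriging of the residuals (mean 0 by construction)
sill, rng_km = resid.var(), 150.0
C = exp_cov(np.linalg.norm(xy[:, None] - xy[None, :], axis=-1), sill, rng_km)
C += 1e-6 * np.eye(106)                                 # numerical jitter

x_new, elev_new = np.array([250.0, 250.0]), 800.0
c0 = exp_cov(np.linalg.norm(xy - x_new, axis=1), sill, rng_km)
w = np.linalg.solve(C, c0)

# 3) Restore: trend at the new site plus kriged residual
pred = np.array([1.0, elev_new]) @ beta + w @ resid
print(round(pred, 1), "mm")
```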

Editor D. Koutsoyiannis; Associate editor C. Onof

Citation Feki, H., Slimani, M., and Cudennec, C., 2012. Incorporating elevation in rainfall interpolation in Tunisia using geostatistical methods. Hydrological Sciences Journal, 57 (7), 1294–1314.
