Similar Literature
Found 20 similar documents (search time: 31 ms).
1.
In urban areas, groundwater is subjected to many different stresses. One of these, high-rate groundwater withdrawal, contributes to the formation of hydrodynamic, hydrogeochemical, and thermal anomalies within the developed aquifer and to the intensification of karstification and suffosion. (Suffosion refers to undermining through removal of sediment by the mechanical and corrosive action of groundwater.) Many different investigative methods, or combinations of methods, are used to delineate karst zones. Many of these techniques are labor-intensive, expensive, and inefficient, and are not applicable in urban areas. Under these conditions, a helium survey may be applied effectively as a viable alternative to the frequently used techniques. An example of this method for delineation of karst zones in Carboniferous carbonate rocks in an urban area is discussed, and the helium survey data are compared with hydrogeochemical, thermodynamic, and tritium survey data.

2.
Leveling geochemical data between map sheets
Geochemical surveys are frequently assembled into larger, regional compilations. In some cases a boundary shift in the values for one or more elements may be observed at the join of adjacent surveys. This indicates that data for the affected elements are not consistent between the surveys. Where the same sampling medium has been used, the shift may be due to different crews or organizations that varied in their sampling techniques. Most commonly, however, the shift is due to imperfect calibration of the analytical method used for samples from the different surveys; for example, there may have been a lack of proper analytical standardization between survey programs. To carry out leveling, bands are established on either side of the boundary between two surveys that show a shift. It is desirable that the bands match closely in geology and physiography. A quantitative method is presented to estimate the optimum width for these bands. Quantiles of the data within each band are calculated, the quantile pairs are plotted in X-Y space, and a line is fitted to express the relationship between the pairs of quantiles. The equation of this line is used to correct the shift between the two surveys. This method is tested on data for Mo in stream sediments, and pH of stream water, from two National Geochemical Reconnaissance Surveys in British Columbia.
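A minimal Python sketch of the quantile-based leveling step described above, assuming two one-dimensional arrays holding the same element measured in the boundary bands of the two surveys (the array names, the number of quantiles, and the use of an ordinary least-squares line are illustrative assumptions, not details taken from the paper):

    import numpy as np

    def level_survey(band_a, band_b, n_quantiles=99):
        """Fit a linear correction mapping survey B values onto the scale of survey A.

        band_a, band_b: 1-D arrays of the same element measured in the boundary
        bands of the two surveys. Returns (slope, intercept) so that
        corrected_b = slope * b + intercept.
        """
        probs = np.linspace(0.01, 0.99, n_quantiles)
        qa = np.quantile(band_a, probs)   # quantiles of the reference survey
        qb = np.quantile(band_b, probs)   # matching quantiles of the survey to be leveled
        slope, intercept = np.polyfit(qb, qa, 1)   # line through the paired quantiles
        return slope, intercept

    # Example: estimate the correction from the two bands, then apply it to survey B.
    rng = np.random.default_rng(0)
    band_a = rng.lognormal(mean=1.0, sigma=0.5, size=500)
    band_b = 0.8 * rng.lognormal(mean=1.0, sigma=0.5, size=400) + 0.3   # shifted survey
    slope, intercept = level_survey(band_a, band_b)
    leveled_b = slope * band_b + intercept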

3.
Landslide prediction is complex and involves many factors: geotechnical, geological, topographical, and even meteorological. This work presents a data-mining methodology for predicting rainfall-induced landslide occurrences in the city of Rio de Janeiro. Landslide and rainfall records from 1998 to 2001 were obtained from field technical reports and from 30 automatic rain gauges, respectively. Data were also collected on soil parameters, including urban areas, forest, and vulnerability, among others, totaling 46 soil variables. All of this information was loaded into a Geographic Information System. Clustering (dendrogram and k-means) and statistical (principal component analysis and correlation) techniques were used to regionalize the rain data and to select the rain gauges used as input to artificial neural networks, which replaced the missing rainfall values. The landslide volume variable also had missing values, which were filled in by the k-nearest-neighbor method. After data preparation, several models were built to predict landslides and rainfall using data-mining techniques, and the performance of the resulting models is analyzed.
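A minimal sketch of k-nearest-neighbor imputation such as might be used for the landslide volume variable, assuming a numeric matrix in which only the target column contains missing values (the toy data, the number of neighbors, and the use of Euclidean distance are illustrative assumptions):

    import numpy as np

    def knn_impute(X, target_col, k=3):
        """Fill NaNs in X[:, target_col] with the mean of the k nearest complete rows.

        Distances are computed on the remaining columns, which are assumed complete.
        """
        X = np.asarray(X, dtype=float).copy()
        features = np.delete(X, target_col, axis=1)
        missing = np.isnan(X[:, target_col])
        complete = ~missing
        for i in np.where(missing)[0]:
            d = np.linalg.norm(features[complete] - features[i], axis=1)
            nearest = np.argsort(d)[:k]                       # k closest complete rows
            X[i, target_col] = X[complete][nearest, target_col].mean()
        return X

    # Toy example: impute the last column (a stand-in for landslide volume).
    data = [[1.00, 2.00, 10.0],
            [1.10, 2.10, 11.0],
            [5.00, 6.00, 50.0],
            [1.05, 2.05, np.nan]]
    print(knn_impute(data, target_col=2, k=2))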

4.
This paper documents and investigates an important source of inaccuracy when paleoecological equations calibrated on modern biological data are applied downcore: fossil assemblages for which there are no modern analogs. Algebraic experiments with five calibration techniques are used to evaluate the sensitivity of the methods with respect to no-analog conditions. The five techniques are: species regression; principal-components regression [e.g., Imbrie, J., and Kipp, N. G. (1971). In “The Late Cenozoic Ages,” 71–181]; distance-index regression [Hecht, A. D. (1973). Micropaleontology 19, 68–77]; diversity-index regression [Williams, D. F., and Johnson, W. C. (1975). Quaternary Research 5, 237–250]; and the weighted-average method [Jones, J. I. (1964). Unpublished Ph.D. thesis, Univ. of Wisconsin]. The experiments indicate that the four regression techniques extrapolate under no-analog conditions, yielding erroneous estimates. The weighted-average technique, however, does not extrapolate under no-analog conditions and consequently is more accurate than the other techniques. Methods for recognizing no-analog conditions downcore are discussed, and ways to minimize inaccuracy are suggested. Using several equations based on different calibration techniques is recommended: divergent estimates suggest that no-analog conditions occur and that the estimates are unreliable. The value determined by the weighted-average technique, however, may well be the most accurate.
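A minimal sketch of a weighted-average estimate of an environmental variable from a fossil assemblage, assuming species optima already calibrated from modern samples (the optima and abundances below are illustrative assumptions). Because the result is an abundance-weighted mean of calibrated optima, it cannot extrapolate beyond their range, which is why the method behaves better under no-analog conditions:

    import numpy as np

    def weighted_average_estimate(abundances, optima):
        """Estimate an environmental variable as the abundance-weighted mean of species optima."""
        abundances = np.asarray(abundances, dtype=float)
        optima = np.asarray(optima, dtype=float)
        return np.sum(abundances * optima) / np.sum(abundances)

    # Toy example: three species with temperature optima (degrees C) from a modern calibration set.
    optima = [8.0, 15.0, 22.0]
    fossil_assemblage = [0.6, 0.3, 0.1]   # relative abundances in a downcore sample
    print(weighted_average_estimate(fossil_assemblage, optima))   # -> 11.5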

5.
Two methods are presented whereby finite-strain data may be determined from naturally occurring irregular strain markers (polygons) which are of unknown pre-deformation shape and distribution, without assumptions as to the orientation of the finite-strain ellipse. The first method describes “construction” of ellipses within the polygons, these ellipses providing the basis for analysis by already developed techniques. The second method is a simple extension of Wellman's method, which graphically establishes a strain ellipse from angle and line data.

6.
Three trend surface techniques are used to evaluate the nature and distribution of the surface sediments of a rock glacier. Trend surface analysis, vector trend analysis, and most predictable surface (MPS) mapping suggest that long-term rock glacier creep causes the highly variable debris cover to display crude sorting and orientation patterns. Poorly sorted and randomly oriented surface materials may be the result of insufficient time for slowly evolving rock glacier sediment distributions to become established, and/or may mark sites of glacial erosion or deposition.
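A minimal sketch of the simplest of these ideas, a first-order (planar) trend surface fitted to scattered point observations by least squares; the coordinates, the measured variable, and the noise level are illustrative assumptions:

    import numpy as np

    def fit_trend_surface(x, y, z):
        """Fit the planar trend z = a + b*x + c*y by ordinary least squares; return (a, b, c)."""
        A = np.column_stack([np.ones_like(x), x, y])   # design matrix for a first-order surface
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        return coeffs

    # Toy example: a sediment property measured at scattered points on a rock glacier surface.
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 100, 50)
    y = rng.uniform(0, 100, 50)
    z = 2.0 + 0.05 * x - 0.02 * y + rng.normal(0, 0.5, 50)   # noisy planar trend
    a, b, c = fit_trend_surface(x, y, z)
    residuals = z - (a + b * x + c * y)   # local departures from the regional trend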

7.
Seismic measurements may be used in geostatistical techniques for estimation and simulation of petrophysical properties such as porosity. The good correlation between seismic and rock properties provides a basis for these techniques, and seismic data have a wide spatial coverage not available in log or core data. However, each seismic measurement has a characteristic response function determined by the source-receiver geometry and signal bandwidth. The image response of the seismic measurement gives a filtered version of the true velocity image; therefore, the seismic image cannot reflect exactly the true seismic velocity at all scales of spatial heterogeneity present in the Earth. The seismic response function can be approximated conveniently in the spatial spectral domain using the Born approximation. How the seismic image response affects the estimation of variograms and spatial scales, and its impact on geostatistical results, is the focus of this paper. Limitations of view angles and signal bandwidth not only smooth the seismic image, increasing the variogram range, but can also introduce anisotropic spatial structures into the image. The seismic data are enhanced by better characterizing and quantifying these attributes. As an exercise, examples of seismically assisted cokriging and cosimulation of porosity between wells are presented.
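A minimal sketch of an experimental (empirical) semivariogram, the statistic whose range and apparent anisotropy the paper argues can be distorted by the seismic image response; the synthetic data, lag bins, and omnidirectional treatment are illustrative assumptions:

    import numpy as np

    def empirical_semivariogram(coords, values, lag_edges):
        """Return gamma(h) = 0.5 * mean[(z_i - z_j)^2] for point pairs binned by separation distance."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        half_sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)          # count each pair once
        d, half_sq = d[iu], half_sq[iu]
        gamma = []
        for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
            mask = (d >= lo) & (d < hi)
            gamma.append(half_sq[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)

    # Toy example: porosity-like values at random 2-D locations.
    rng = np.random.default_rng(2)
    coords = rng.uniform(0, 1000, size=(200, 2))
    values = rng.normal(0.20, 0.03, size=200)
    print(empirical_semivariogram(coords, values, np.linspace(0, 500, 11)))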

8.
There is an increasing use of analytical macro-beam techniques (such as portable XRF, PXRF) for geochemical measurements, as a result of their convenience and relatively low cost per measurement. Reference materials (RMs) are essential for validation, and sometimes calibration, of beam measurements, just as they are for the traditional analytical techniques that use bulk powders. RMs are typically supplied with data sheets that tabulate uncertainties in the reference values by element, for which purpose they also specify a minimum recommended mass of material to be used in the chemical analysis. This minimum mass may not be achievable using analytical beam techniques. In this study, the mass of the test portion interrogated by a handheld PXRF within pellets made from three silicate RMs (SdAR L2, M2 and H1) was estimated using a theoretical approach. It was found to vary from 0.001 to 0.3 g for an 8 mm beam size and 0.0001 to 0.045 g for a 3 mm beam. These test portion masses are mostly well below the recommended minimum mass for these particular RMs (0.2 g), but were found to increase as a function of atomic number (as might be expected). The uncertainties caused by heterogeneity (UHET) in PXRF measurements of the three RMs were experimentally estimated using two different beam diameters for eighteen elements. The elements showing the highest levels of heterogeneity (UHET > 5%) seem generally to be those usually associated with either an accessory mineral (e.g., Zr in zircon, As in pyrite) or a low test portion mass (associated with low atomic number). When the beam size was changed from nominally 8 to 3 mm, the uncertainty caused by heterogeneity was seen to increase for most elements by an average ratio of 2.2. These values of UHET were used to calculate revised uncertainties of the reference values that would be appropriate for measurements made using a PXRF with these beam sizes. The methods used here to estimate UHET in PXRF measurements have a potential application to other analytical beam techniques.

9.
10.
Simulation experiments have been conducted to examine the potential usefulness of R-mode and Q-mode factor methods in the analysis and interpretation of geochemical data. The R-mode factor analysis experiment consisted of constructing a factor model, using the model to generate a correlation matrix, and attempting to recover the model by R-mode techniques. The techniques were successful in determining the number of factors in the model, but the factor loadings could not be estimated even approximately on the basis of mathematical procedures alone. Q-mode factor methods were successful in recovering all of the properties of a model used to generate hypothetical chemical data on olivine samples, but it was necessary to use a correction previously regarded as unimportant. Publication approved by Director, U.S. Geological Survey.
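A minimal sketch of the R-mode experiment described above: build a correlation matrix from a known loading model, then check that an eigendecomposition recovers the number of factors but not the loadings themselves (the loading matrix is an illustrative assumption):

    import numpy as np

    # Hypothetical model: five variables driven by two underlying factors.
    loadings = np.array([[0.9, 0.1],
                         [0.8, 0.2],
                         [0.1, 0.9],
                         [0.2, 0.8],
                         [0.7, 0.3]])

    # Model-implied correlation matrix (unit diagonal imposed).
    R = loadings @ loadings.T
    np.fill_diagonal(R, 1.0)

    # Two eigenvalues dominate, so the number of factors is recoverable; the
    # eigenvectors, however, are only determined up to a rotation, so the original
    # loadings cannot be reproduced by the mathematics alone.
    eigvals = np.linalg.eigvalsh(R)[::-1]
    print(np.round(eigvals, 3))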

11.
Geostatistical estimation techniques were customized to allow forecasting of production figures at the Silver Bell uranium mine (Uravan District). Surface drill hole data were used to provide a block model of kriged estimators of average uranium grades. Figures for recoverable ore grade and the ore-waste ratio are then deduced from regression curves previously obtained from underground information and production data. Cross-validations of the entire model were performed and gave positive results.

12.
Rosenblueth's point-estimate method has become widely used in geotechnical practice for reliability calculations. Although the point-estimate method is a powerful and simple way to evaluate the moments of functions of random variables, it is limited by the need to make 2^n evaluations when there are n random variables. Modifications of the method reduce this to 2n evaluations by using points on the diameters of a hypersphere instead of at the corners of the inscribed hypercube. However, these techniques force the coordinates of the evaluation points farther from the means of the variables; for a bounded variable, the points may easily fall outside the domain of definition of the variable. The problem can be avoided by using other techniques for some special cases or by reducing the number of random variables that must be considered. Copyright © 2002 John Wiley & Sons, Ltd.
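A minimal sketch of the classical two-point estimate for uncorrelated variables with symmetric distributions: evaluate the function at the 2^n combinations of mean plus or minus one standard deviation and weight them equally (the example performance function and input moments are illustrative assumptions):

    import itertools
    import numpy as np

    def point_estimate(func, means, stds):
        """Approximate the mean and variance of func(X) from 2**n equally weighted corner points."""
        means, stds = np.asarray(means, float), np.asarray(stds, float)
        evals = []
        for signs in itertools.product([-1.0, 1.0], repeat=len(means)):   # 2**n evaluations
            evals.append(func(means + np.array(signs) * stds))
        evals = np.array(evals)
        return evals.mean(), evals.var()

    # Toy example: a factor-of-safety-like function of cohesion and friction angle.
    def fs(x):
        cohesion, phi_deg = x
        return cohesion * np.tan(np.radians(phi_deg)) / 50.0

    mean_fs, var_fs = point_estimate(fs, means=[100.0, 30.0], stds=[20.0, 3.0])
    print(mean_fs, var_fs)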

13.
There is no single method available for estimating the seismic risk in a given area, and as a result most studies are based on some statistical model. If we denote by Z the random variable that measures the maximum magnitude of earthquakes per unit time, the seismic risk for a value m is the probability that this value will be exceeded in the next time unit, that is, R(m) = P(Z > m). Several approximations can be made by fitting different theoretical distributions to the function R, assuming different distributions for the magnitude of earthquakes. A related way to treat this problem is to consider the difference between the times of occurrence of consecutive earthquakes, or inter-event times. The hazard function, or failure rate function, of this variable measures the instantaneous risk of occurrence of a new earthquake, supposing that the last earthquake happened at time 0. In this paper, we consider the estimation of the variable that measures the inter-event time and apply nonparametric techniques; that is, we do not assume any theoretical distribution. Moreover, because the stochastic process associated with this variable can sometimes be non-stationary, we condition each time on the previous ones. We then work with a multidimensional estimation and treat each multidimensional variable as a functional datum. Functional data analysis deals with data consisting of curves or multidimensional variables. Nonparametric estimation can be applied to functional data to describe the behavior of seismic zones and their associated instantaneous risk. The estimation techniques are applied to two different regions and data catalogues: California and southern Spain.
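A minimal sketch of the two nonparametric quantities introduced above: the empirical exceedance probability R(m) = P(Z > m) from a magnitude catalogue, and a simple histogram-based hazard (failure-rate) estimate for inter-event times (the synthetic catalogue and the binning are illustrative assumptions; the paper uses more sophisticated functional, conditional estimators):

    import numpy as np

    def exceedance_probability(magnitudes, m):
        """Empirical estimate of R(m) = P(Z > m)."""
        return np.mean(np.asarray(magnitudes, float) > m)

    def hazard_rate(inter_event_times, bins=10):
        """Histogram-based hazard estimate h(t) = f(t) / S(t) for inter-event times."""
        t = np.asarray(inter_event_times, float)
        counts, edges = np.histogram(t, bins=bins)
        density = counts / (t.size * np.diff(edges))              # density within each bin
        surv_left = 1.0 - np.concatenate(([0], np.cumsum(counts)[:-1])) / t.size
        return edges[:-1], density / np.where(surv_left > 0, surv_left, np.nan)

    # Toy catalogue: Gutenberg-Richter-like magnitudes and exponential inter-event times.
    rng = np.random.default_rng(3)
    magnitudes = 3.0 + rng.exponential(scale=0.5, size=1000)
    waiting_days = rng.exponential(scale=30.0, size=1000)
    print(exceedance_probability(magnitudes, 5.0))
    print(hazard_rate(waiting_days))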

14.
The Bayesian Maximum Entropy (BME) method of spatial analysis and mapping provides definite rules for incorporating prior information, hard and soft data into the mapping process. It has certain unique features that make it a loyal guardian of plausible reasoning under conditions of uncertainty. BME is a general approach that does not make any assumptions regarding the linearity of the estimator, the normality of the underlying probability laws, or the homogeneity of the spatial distribution. By capitalizing on various sources of information and data, BME introduces an epistemological framework that produces predictive maps that are more accurate and in many cases computationally more efficient than those derived by traditional techniques. In fact, kriging techniques can be derived as special cases of the BME approach, under restrictive assumptions regarding the prior information and the data available. BME is a more rigorous approach than indicator kriging for incorporating soft data. The BME formulation, in fact, applies in a spatial or a spatiotemporal domain and its extension to the case of block and vector random fields is straightforward. New theoretical results are presented and numerical examples are discussed, which use the BME approach to account for important sources of knowledge in a systematic manner. BME can be useful in practical situations in which prior information can be used to compensate for the limited amount of measurements available (e.g., preliminary or feasibility study levels) or soft data are available that can be combined with hard data to improve mapping significantly. BME may be then viewed as an effort towards the development of a more general framework of spatial/temporal analysis and mapping, which includes traditional geostatistics as its limiting case, and it also provides the means to derive novel results that could not be obtained by traditional geostatistics.

15.
Exploration geochemistry is viewed in a resource appraisal framework and the various general methods are discussed in terms of their applicability at different stages of the appraisal exercise. The direct nature of geochemical exploration is emphasized, and the various types of data that the surveys yield are discussed together with their modes of interpretation. It is shown how the data may be simply reduced to a probability form which allows data from many sources to be utilized. The limitations of exploration geochemistry in resource appraisal are also discussed, so that unrealistic expectations are not fostered and geochemistry is placed correctly, and complementarily, with the other geoscience techniques of resource appraisal. This paper was presented at the International Geological Correlation Program (IGCP) Project 98 meeting entitled “Standards for Computer Applications in Resource Studies,” held at Loen, Norway, September 27–October 1, 1976.

16.
We present a second-order analytic solution [in terms of a heterogeneous log-transmissivity Y(r) = ln T(r)] for the hydraulic head field in a finite 2D confined heterogeneous aquifer under steady radial flow conditions assuming fixed head boundary conditions at the well and at a circular exterior boundary. The solution may be used to obtain the gradient used in calculation of solute transport to a well in a heterogeneous transmissivity field. The solution, obtained using perturbation methods coupled with Green's function techniques, leads us to postulate a more general form of the head for arbitrarily large-variance fields and may be used to obtain moment relations between the log-transmissivity and head under convergent flow conditions when Y(r) is expressed as a random space function. We present expressions for the mean head field when the log-transmissivity is Gaussian and conditioned on the transmissivity value at the well for an arbitrary ln T covariance. Finally, we look at the effect of parameter variations on the mean head behavior and present numerical simulations verifying the second-order mean head expressions.

17.
When factor analysis is used in geochemistry, it may be useful for factors to be transformed by rotations in order to be identified either with the end members of a mixing model (Miesch, 1976a) or with known chemical equilibria. This requires that the formula for recomputing data from the factors can be written in a factored manner, which is generally not the case in correspondence analysis. The present paper shows that this becomes possible with data having constant row sums. As an example, the method is tested on the lavas of Paricutin Volcano, already examined using an extended Q-mode factor analysis (Miesch, 1979). Recomputation of the data after projection gives similar results for both methods. Moreover, the fact that correspondence analysis provides centered factors makes it well suited to the study of chemical reactions leading to constant mass transformations.

18.
Bayesian Modeling and Inference for Geometrically Anisotropic Spatial Data
A geometrically anisotropic spatial process can be viewed as being a linear transformation of an isotropic spatial process. Customary semivariogram estimation techniques often involve ad hoc selection of the linear transformation to reduce the region to isotropy and then fitting a valid parametric semivariogram to the data under the transformed coordinates. We propose a Bayesian methodology which simultaneously estimates the linear transformation and the other semivariogram parameters. In addition, the Bayesian paradigm allows full inference for any characteristic of the geometrically anisotropic model rather than merely providing a point estimate. Our work is motivated by a dataset of scallop catches in the Atlantic Ocean in 1990 and also in 1993. The 1990 data provide useful prior information about the nature of the anisotropy of the process. Exploratory data analysis (EDA) techniques such as directional empirical semivariograms and the rose diagram are widely used by practitioners. We recommend a suitable contour plot to detect departures from isotropy. We then present a fully Bayesian analysis of the 1993 scallop data, demonstrating the range of inferential possibilities.
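A minimal sketch of the linear-transformation view of geometric anisotropy: rotate the coordinates to the anisotropy axes and stretch the minor axis by the anisotropy ratio, after which an isotropic semivariogram applies (the angle and ratio below are fixed, illustrative assumptions; the paper instead estimates them jointly with the other semivariogram parameters in a Bayesian framework):

    import numpy as np

    def isotropize(coords, angle_deg, ratio):
        """Map 2-D coordinates of a geometrically anisotropic field into isotropic space.

        angle_deg: direction of maximum spatial continuity (degrees from the x-axis).
        ratio:     anisotropy ratio (minor range / major range), with 0 < ratio <= 1.
        """
        a = np.radians(angle_deg)
        rotate = np.array([[np.cos(a), np.sin(a)],
                           [-np.sin(a), np.cos(a)]])     # rotate into the anisotropy axes
        stretch = np.diag([1.0, 1.0 / ratio])            # stretch the minor axis
        return coords @ rotate.T @ stretch.T

    # Toy example: separation distances computed after the transformation are isotropic.
    pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    iso = isotropize(pts, angle_deg=30.0, ratio=0.5)
    print(np.linalg.norm(iso[1] - iso[0]), np.linalg.norm(iso[2] - iso[0]))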

19.
Geomicrobial and geochemical studies were carried out in Argentina (Patagonia, Chubut Province) on four Au and polymetallic sulfide vein-type deposits. A horizon soils were analyzed for Bacillus reacting to lecithin [Bacillus L.(+)], Au and 12 additional elements. In two of the four sampling sites, exhibiting known and relatively simple mineralized structures, Bacillus L.(+) populations are clearly related to Au, As, Pb, Zn, Cu-sulfide mineralization. In areas containing more complex mineralized structures, the spatial relationship between Bacillus L.(+) and metals in the A horizon is more difficult to interpret. Results of a factor analysis performed on all analytical data (n = 130) suggest a partial relationship between Bacillus L.(+) and Au-As-Y pedochemical associations located above known Au mineralization. Bacillus L.(+) was first analyzed in Argentina in December 1994 and re-analyzed in Belgium five to seven months later. Most of the Bacillus contents (85%) of the Belgian tests are higher than those determined in Argentina. The present results and data of a previous study in Mexico (Melchior et al., 1994a; Melchior et al., 1994b) suggest that this may be the result of temperature variations during sample storage between periods of microbial analysis. From a strictly analytical point of view, the geomicrobial method is not an accurate, reproducible technique. However, Bacillus L.(+) can be used as a microbiological indicator of Au and polymetallic mineralization in a reconnaissance-level regional survey. At a local scale, this microbiological tool should be combined with classical exploration techniques such as soil geochemistry. It is recommended that the collection of all A horizon samples (for microbial study) should be accompanied by B or C horizon soils (for potential geochemical study, after prioritizing targets) so that a second field sampling program does not have to be undertaken.

20.
Stochastic simulation techniques which do not depend on a back-transform step to reproduce a prior marginal cumulative distribution function (cdf) may lead to deviations from that distribution which are deemed unacceptable. This paper presents an algorithm to post-process simulated realizations, or any spatial distribution, to reproduce the target cdf in the case of continuous variables, or target proportions in the case of categorical variables, while honoring the conditioning data. Validations conducted for both continuous and categorical cases show that, by adjusting the value of a correction level parameter, the target cdf or proportions can be well reproduced without significant modification of the spatial correlation patterns of the original simulated realizations.
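A minimal sketch of the simplest version of such post-processing for a continuous variable: rank-transform the simulated values and map them onto quantiles of the target distribution, which forces the marginal histogram to match while preserving rank order (the arrays are illustrative assumptions; the published algorithm additionally honors conditioning data and uses a correction level parameter to limit the adjustment):

    import numpy as np

    def match_target_cdf(simulated, target_samples):
        """Replace each simulated value by the target quantile of the same rank."""
        simulated = np.asarray(simulated, float)
        ranks = simulated.argsort().argsort()                 # 0 .. n-1 rank of each value
        probs = (ranks + 0.5) / simulated.size
        return np.quantile(np.asarray(target_samples, float), probs)

    # Toy example: a realization whose histogram has drifted from the target distribution.
    rng = np.random.default_rng(4)
    realization = rng.normal(0.3, 1.2, size=1000)     # simulated values with a biased marginal
    target = rng.lognormal(0.0, 0.5, size=5000)       # samples defining the target cdf
    corrected = match_target_cdf(realization, target)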

