Similar articles (20 found)
1.
Novel techniques and existing knowledge from chemical research that could be applied to understanding processes at the ocean bottom are by and large analytical, but not entirely. Microelectrodes developed both for the study of electron transfer at interfaces and for application in medical research could be readily modified to investigate gradients at the sediment-water interface. The body of knowledge assembled for, and derived from, electron transfer research should be a valuable resource for understanding the mechanisms of redox reactions that occur in the ocean. Chemiluminescent methods for measuring metals in seawater would become much more generally applicable if additional luminescing compounds that chelate metals with great specificity could be identified or synthesized. Collaboration with analytical chemists might enable the extension of this method to a wide variety of marine analytical problems. Recent advances in optical detector technology should catalyze the adaptation of chemiluminescent methods to in situ analysis.
A variety of separation techniques developed for chemical research could be applied to the problems of separating both dissolved and particulate organic matter from their natural matrices. If this can be accomplished, it will remove a major barrier to the characterization of organic matter in the ocean. A potential approach for determining the biochemical character of this material after separation is degradation using bacterial enzymes, followed by identification of subunits of the polymers by techniques such as 13C-NMR.
The relatively recent application of GC-MS and high-pressure liquid chromatography to marine organic analysis has already produced more data than is interpretable by marine organic chemists. Chemometric and statistical methods developed by chemists to maximize data interpretation could be used to interpret these large data sets and to plan future experimental approaches.
High analytical precision and extremely low detection limits are prerequisites for solving many of the problems associated with hydrothermal circulation and paleoceanography. Examples of emerging analytical methodology for improving these are echelle spectrography coupled with highly sensitive charge-transfer device detectors, which enables determination of elemental ratios at high precision, and ICP mass spectrometry, which achieves very low detection limits for some elements.

2.
Knowledge of the natural background content of metals is important, but can be difficult to establish because the concentrations of substances dissolved in ground waters vary considerably in time and space. The main objective of this paper is to assess the natural background of five selected elements: As, Al, Cd, Pb and Hg. Each of these elements, with the exception of Al, is included in the minimum list of pollutants and their indicators for which the EU Member States should establish threshold values (Daughter Groundwater Directive). The database of the Czech Hydrometeorological Institute, which contains analyses obtained by regular monitoring of groundwater quality at six-month intervals, was used as the source of information. This system incorporates ca. 450 monitoring sites which provide information about water in all the types of rocks penetrated by the individual boreholes. Because of the low concentrations of certain elements (Hg, Pb and Cd in particular), a significant number of analytical results lie below the quantification limit of the analytical methods used. Conventional statistical methods were therefore not applicable, and alternative procedures were used: the Kaplan–Meier procedure, within the NADA module, for statistical analysis of data sets containing values below the quantification limit. The concentrations of the monitored elements that can be considered natural background are suggested to be the values of the third quartile, i.e. the value that is greater than or equal to 75% of the analytical results in the assessed dataset. The remaining 25% of analytical results, which exceed the proposed limit, can be considered anomalies, which may be natural or anthropogenic. Based on the statistical analysis of the data, specific values for the natural background content of these elements in ground waters within particular lithologies have been proposed. These can be considered the natural background values that apply across the whole territory of the Czech Republic.
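The quartile estimation on left-censored data described above can be sketched with the Kaplan–Meier "flip" trick (a minimal numpy illustration of the general approach, not the NADA implementation; the data and names are invented):

```python
import numpy as np

def km_left_censored_quantile(values, censored, q=0.75):
    """Kaplan-Meier estimate of the q-quantile for left-censored data
    (concentrations reported as '< quantification limit').  Uses the
    standard flip trick: left-censored x becomes right-censored M - x."""
    x = np.asarray(values, float)
    cens = np.asarray(censored, bool)   # True -> below quantification limit
    M = x.max() + 1.0
    t = M - x                           # flipped pseudo "survival times"
    order = np.argsort(t)
    t, cens = t[order], cens[order]
    n = len(t)
    at_risk = n - np.arange(n)          # observations still at risk
    # a detected (uncensored) value is an 'event' on the flipped scale
    step = np.where(~cens, 1.0 - 1.0 / at_risk, 1.0)
    surv = np.cumprod(step)             # S(t) on the flipped scale
    # F_X(x) = S(M - x); the q-quantile of X is M - (first t with S(t) <= q)
    idx = np.argmax(surv <= q)
    return M - t[idx]
```

With no censoring the estimate reduces to the ordinary empirical upper quartile; with censored values it uses only the information that they lie below the limit, instead of substituting an arbitrary fraction of the limit.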

3.
This research presents a detailed landslide susceptibility mapping study using binary logistic regression, analytical hierarchy process, and statistical index models, together with an assessment of their performance. The study area covers the north of the Tehran metropolitan area, Iran. In the first stage, a landslide inventory map with a total of 528 landslide locations was compiled from various sources such as aerial photographs, satellite images, and field surveys. The inventory was then randomly split into a training dataset of 70% (370 landslide locations) for building the models, with the remaining 30% (158 landslide locations) used for validation. Twelve landslide conditioning factors were considered: slope degree, slope aspect, altitude, plan curvature, normalized difference vegetation index, land use, lithology, distance from rivers, distance from roads, distance from faults, stream power index, and slope length. Landslide susceptibility maps were then produced using the binary logistic regression (BLR), analytical hierarchy process (AHP), and statistical index (SI) models in ArcGIS. The validation dataset, which was not used in the modeling process, was used to validate the susceptibility maps by means of receiver operating characteristic curves and frequency ratio plots. The validation results showed that the area under the curve (AUC) for the three models varies from 0.7570 to 0.8520 (AUC_AHP = 75.70%, AUC_SI = 80.37%, AUC_BLR = 85.20%). A plot of the frequency ratio for the four landslide susceptibility classes of the three models also supported these results. It is therefore concluded that the binary logistic regression model showed reasonably good accuracy in predicting landslide susceptibility in the study area. The results also showed that the statistical index model can be used as a simple tool for assessing landslide susceptibility when a sufficient amount of data is available.
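The AUC used for validation can be computed directly from the susceptibility scores of landslide and non-landslide cells; the following is a generic sketch based on the Mann–Whitney formulation of AUC (not the authors' workflow):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen landslide cell receives a
    higher susceptibility score than a randomly chosen stable cell."""
    s = np.asarray(scores, float)
    y = np.asarray(labels, bool)
    pos, neg = s[y], s[~y]
    # compare every positive against every negative; ties count 1/2
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to a map with no predictive skill, 1.0 to perfect separation of landslide and stable locations.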

4.
Parr and Boyd (2002) used colorimetric analysis in combination with geophysical and geochemical techniques to estimate firing temperatures for archaeological daub from an Iron Age site in Thailand. They suggest that the daub was fired at high temperatures and, therefore, is indicative of kiln utilization and increased industrialization during that period in Thailand. They argue that the adoption of a multimethod analytical approach in which the combination of data derived from ICP‐MS, X‐ray diffraction, and magnetic susceptibility analyses of daub samples, coupled with microscopic and macroscopic examination of samples, enhances the accuracy of their interpretations. While they should be commended for attempting to substantiate their claims using many geophysical and geochemical techniques, their arguments are flawed by the misapplication of the techniques described and/or over‐interpretation of the data generated by such techniques. Therefore, Parr and Boyd's (2002:285) point about methodology (“that the combined interpretation of independent measures provides a better estimate of the original firing temperatures of the archaeological material than has hitherto been possible”) is made redundant by the lack of scientific rigor applied to the independent measures used for this study. © 2003 Wiley Periodicals, Inc.

5.
Application of single-pipe high-pressure rotary jet grouting to sand-layer reinforcement and water-sealing works
李文华  张瑞琼 《江苏地质》2004,28(2):103-106
To address problems in sand-layer reinforcement and water-sealing works that other techniques have difficulty solving, single-pipe high-pressure rotary jet grouting is proposed, and its feasibility is demonstrated through several engineering case studies.

6.
Round 23 of the GeoPT international proficiency testing scheme included the ferromanganese nodule powder FeMn‐1 which was distributed as an additional sample (23A). The aim of this initiative was to assess overall analytical performance for such a challenging oxide matrix with a view to the possible certification of such a material in accordance with ISO Guide requirements. To investigate inter‐method discrepancies, precision data and the method means for the most frequently used analytical methods (XRF, ICP‐MS and ICP‐AES) and sample preparation techniques were calculated and then compared using statistical tests of equivalence. For most major elements, XRF and ICP‐AES data dominated and these were found to give equivalent results. In contrast, for most trace elements significant discrepancies were detected between data obtained by different analytical methods. Possible causes are discussed with a view to attributing their origin to calibration strategy, sensitivity or interferences. It is assumed that the unusual oxide matrix generated unexpected interferences and thus method bias. Discrepancies observed between data from different analytical methods provide valuable information for the participating analysts, helping them to avoid systematic errors and thus minimising bias. They also suggest actions necessary to improve results for any future certification of such a material.

7.
Detecting analytical bias is a valuable step during method validation and, in the mining industry, fundamental when validating the geological databases used for resource estimation, which in turn affects the cost of future investments in new or expanding projects. This paper details frequently used techniques for doing this, with their theoretical background, advantages and shortcomings, providing deeper insight into how to choose a method for a particular application. This is done through a review of the specialised literature and through practical applications: some of the most commonly used methods are compared using Monte Carlo simulations and tested in real applications (analytical methods used in Brazilian prospects for copper, gold and iron). Laboratory specialists will thereby gain a better understanding of the statistical tools available for this purpose and be able to provide their customers with consistent assurance of chemical analytical results. It is shown that some commonly used statistical methods are not applicable to this type of comparison, and that in some cases one must resort to more complicated models; these are, however, easily implemented computationally, and the mathematical details are given.
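One simple form of bias detection, a paired comparison of two methods assaying the same samples, can be sketched with a Monte Carlo (sign-flip randomization) test; this is a generic illustration of the idea, not the authors' implementation:

```python
import numpy as np

def paired_bias_test(lab_a, lab_b, n_sim=10000, seed=0):
    """Randomization (sign-flip) test for constant bias between two
    analytical methods on the same samples.  Under H0 (no bias) each
    paired difference is symmetric about zero, so its sign can be
    flipped at random; the p-value is the fraction of simulated mean
    differences at least as extreme as the observed one."""
    d = np.asarray(lab_a, float) - np.asarray(lab_b, float)
    obs = abs(d.mean())
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=(n_sim, d.size))
    sim = np.abs((signs * d).mean(axis=1))
    return (sim >= obs).mean()          # two-sided Monte Carlo p-value
```

A constant relative (rather than absolute) bias can be tested the same way on log-transformed assay values.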

8.
As new analytical techniques are brought to sourcing studies and researchers compile data into multi‐laboratory databases, systematic evaluation is essential. The importance of precision and accuracy is clear, but Shackley (2005) also calls for “archaeological accuracy.” Hughes (1998) offered a framework that considers precision and accuracy alongside the concepts of reliability and validity. These four concepts can serve as a foundation for evaluating archaeological sourcing data and procedures, but adoption of Hughes’ framework has been nearly nonexistent. Unfortunately, Hughes’ formulations of reliability and validity are somewhat at odds with their conventional definitions, hindering his framework. Furthermore, the concept of precision has become outdated in analytical circles, and superfluous terms (e.g., replicability) have emerged in the archaeological literature. Here I consider the basis of Hughes’ framework and how its four components, when applied consistently by the sourcing community, can best be used to evaluate analytical data and techniques for sourcing.

9.
River, rain and spring water samples from a region covered in “Shirasu” ignimbrite were collected on Kyushu Island, Japan. The analytical results were subjected to multivariate statistical analysis and stoichiometric calculation to understand the geographical distribution of chemical components in the water and to extract the underlying geochemical factors. The multivariate statistical analysis showed that the river-water chemistry is only slightly influenced by hot springs or polluted waters, but is strongly controlled by weathering of the ignimbrite. On the basis of the stoichiometric calculation based on water–rock interaction, the water chemistry was successfully estimated by the simple equation \([\mathrm{Si}] = 2[\mathrm{Na^{+}}] + [\mathrm{Mg^{2+}}]\) in the upstream area, complemented by \([\mathrm{Si}] = [\mathrm{Na^{+}}] - 3[\mathrm{K^{+}}] + [\mathrm{Mg^{2+}}] - 2[\mathrm{Ca^{2+}}]\) in the downstream area.
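As a purely numerical illustration of the two relations (the concentrations below are invented for the example, not the paper's data):

```python
# Hypothetical molar concentrations (mmol/L), for illustration only.
na, mg = 0.30, 0.10
si_upstream = 2 * na + mg                 # [Si] = 2[Na+] + [Mg2+]

k, ca = 0.05, 0.08
si_downstream = na - 3 * k + mg - 2 * ca  # [Si] = [Na+] - 3[K+] + [Mg2+] - 2[Ca2+]
print(si_upstream, si_downstream)
```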

10.
Sample collection methods and data standardization techniques accounting for marsh physiography were tested on mobile aquatic fauna in two intertidal pocket salt marshes in Sarah’s Creek, a tributary of the York River in Virginia. Fish and blue crab populations were described and compared as numbers per cubic meter of total marsh volume and numbers per square meter of total marsh area. These methods increase the accuracy of analyses of sampled populations by relying on fewer assumptions than traditional random sampling methods. Depending upon the season and the species compared, statistical differences were observed between dimension-adjusted data within the same sampled populations. Our data suggest that accurate population profiles can be determined if collection methods and data adjustments are based on the ecology, behavior, and life-history stage of the target species.

11.
Progress in geochemical research is greatly influenced by developments in analytical technology and in this paper, the development of geoanalytical techniques over the last fifty years is reviewed, based in part on a previous study (Potts et al. 1993). From an evaluation of trends in techniques used for the bulk analysis of silicate rocks during recent years, the important future role of XRF and ICP-MS is apparent. However, it is concluded that the techniques that will be the most important in influencing progress in geochemical research in the future will be those based on microbeam analytical techniques. These techniques are increasingly capable of making the full spectrum of analytical measurements, traditionally undertaken on bulk samples, but on a microprobe scale to spatial resolutions that currently vary from sub-μm to about 50 μm. One way of evaluating future developments is to ask what might happen to various categories of techniques if key parameters such as sensitivity and detection limits were improved by, say, two orders of magnitude. Some suggestions are made describing the possible consequences of such an enhancement in analytical performance.

12.
There is increasing use of analytical macro‐beam techniques (such as portable XRF, PXRF) for geochemical measurements, as a result of their convenience and relatively low cost per measurement. Reference materials (RMs) are essential for validation, and sometimes calibration, of beam measurements, just as they are for the traditional analytical techniques that use bulk powders. RMs are typically supplied with data sheets that tabulate uncertainties in the reference values by element, and for this purpose they also specify a minimum recommended mass of material to be used in the chemical analysis. This minimum mass may not be achievable with analytical beam techniques. In this study, the mass of the test portion interrogated by a handheld PXRF within pellets made from three silicate RMs (SdAR L2, M2 and H1) was estimated using a theoretical approach. It was found to vary from 0.001 to 0.3 g for an 8 mm beam and from 0.0001 to 0.045 g for a 3 mm beam. These test portion masses are mostly well below the recommended minimum mass for these particular RMs (0.2 g), but were found to increase as a function of atomic number, as might be expected. The uncertainties caused by heterogeneity (UHET) in PXRF measurements of the three RMs were estimated experimentally for eighteen elements using the two beam diameters. The elements showing the highest levels of heterogeneity (UHET > 5%) are generally those associated either with an accessory mineral (e.g., Zr in zircon, As in pyrite) or with a low test portion mass (associated with low atomic number). When the beam size was reduced from nominally 8 to 3 mm, the uncertainty caused by heterogeneity increased for most elements, by an average ratio of 2.2. These values of UHET were used to calculate revised uncertainties of the reference values appropriate for measurements made using a PXRF with these beam sizes. The methods used here to estimate UHET in PXRF measurements are potentially applicable to other analytical beam techniques.
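A back-of-envelope check of that average ratio, under the assumption (ours, not stated in the study) that the sampled mass scales with beam area at a fixed penetration depth and that the relative heterogeneity variance scales inversely with mass:

```python
import math

# Shrinking the beam from 8 mm to 3 mm reduces the sampled mass by the
# area ratio; if var(HET) ~ 1/mass, u(HET) should grow by sqrt(mass ratio).
d1, d2 = 8.0, 3.0
mass_ratio = (d1 / d2) ** 2            # area (hence mass) ratio, about 7.1
predicted = math.sqrt(mass_ratio)      # = d1/d2, about 2.67
observed = 2.2                         # average ratio reported in the study
print(round(predicted, 2), observed)
```

The predicted 2.67 is of the same order as the observed 2.2, so a mass-controlled heterogeneity effect is at least plausible under these assumptions.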

13.
Four silicate glasses were prepared by the fusion of about 1 kg of powder each of a basalt, syenite, soil and andesite to provide reference materials of natural composition for microanalytical work. These glasses are referred to as ‘Chinese Geological Standard Glasses’ (CGSG) ‐1, ‐2, ‐4 and ‐5. Micro- and bulk analyses indicated that the glasses are well homogenised with respect to major and trace elements. Some siderophile/chalcophile elements (e.g., Sn, Pt, Pb) may be heterogeneously distributed in CGSG‐5. This paper provides the first analytical data for the CGSG reference glasses, obtained using a variety of analytical techniques (wet chemistry, XRF, EPMA, ICP‐AES, ICP‐MS, LA‐ICP‐MS) in nine laboratories. Most data agree within the uncertainty limits of the analytical techniques used. Discrepancies exist in the data for some siderophile/chalcophile elements, mainly because of possible heterogeneity of these elements in the glasses and/or analytical problems. From the analytical data, preliminary reference and information values for fifty‐five elements were calculated. The analytical uncertainties (2 relative standard errors, RSE) were estimated to be between about 1% and 20%.

14.
The spatial filtering techniques used for the analysis and interpretation of exploration geochemical data, either to define regional distribution patterns or to outline anomalous areas, are in most cases based on non-robust statistical methods, and their performance is heavily influenced by the outliers that commonly exist in such data. This study describes a number of filtering techniques motivated by developments in exploratory data analysis (EDA) and robust statistics: the median filter (MF) and the adaptive trimmed mean filter (ATM) for smoothing regional geochemical data to reduce spurious variation, and two new filters, the fence filter (FF) and the notch filter (NF), developed to define geochemical anomalies. The application of these spatial filtering techniques is illustrated with Zn data from approximately 3100 stream sediment samples taken in a regional geochemical survey over 25,000 km² of the western margin of the São Francisco Basin, Brazil. Regional distribution patterns for Zn obtained by the MF and ATM filters are clearly related to known stratigraphic units. Anomaly filtering using the FF and NF has delineated most known base metal and gold occurrences, as well as a number of anomalies located in geologically favourable environments but unrelated to any known mineralization. The two anomaly filters have, for the most part, defined the same anomalies in the study area, but only the NF highlights the anomaly associated with the important Morro Agudo Pb-Zn deposit, which is too subtle to be immediately apparent in the unprocessed data.
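A moving median filter of the kind used for the regional maps can be sketched in a few lines (a generic gridded illustration, not the authors' code):

```python
import numpy as np

def median_filter(grid, size=3):
    """Moving-window median filter for a gridded geochemical variable.
    Robust smoother: a single outlier in the window cannot drag the
    output, unlike a moving mean.  Edges use the shrunken window."""
    g = np.asarray(grid, float)
    r = size // 2
    out = np.empty_like(g)
    for i in range(g.shape[0]):
        for j in range(g.shape[1]):
            win = g[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
            out[i, j] = np.median(win)
    return out
```

The adaptive trimmed mean works on the same moving-window principle, but averages the window after discarding a data-dependent fraction of its extreme values.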

15.
Application of multidimensional scaling in mineral prediction
Mineral prediction often involves qualitative variables. To analyse and use such variables, qualitatively described geological features must be converted into numerically coded variables, which is the task of multidimensional scaling (MDS). The author introduces the metric Torgerson method, the semi-metric quantification theory of Hayashi, and non-metric multidimensional scaling, and presents examples of their application in mineral prediction.
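Torgerson's metric method mentioned above can be sketched as follows (a generic numpy illustration; in practice the distance matrix would come from dissimilarities between qualitative geological descriptions):

```python
import numpy as np

def torgerson_mds(D, k=2):
    """Classical (metric) multidimensional scaling, Torgerson's method:
    double-centre the squared distance matrix and embed via the top-k
    eigenvectors, turning pairwise dissimilarities into numeric
    coordinates usable in prediction models."""
    D = np.asarray(D, float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # Gram matrix of the configuration
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the k largest
    L = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * L                     # n x k coordinates
```

For truly Euclidean distances the pairwise distances between the recovered coordinates reproduce the input matrix exactly, up to rotation and reflection.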

16.
Research progress in statistical downscaling of future regional climate change scenarios
Because most current coupled atmosphere–ocean general circulation models (AOGCMs) still have low spatial resolution, they can hardly produce reasonable projections of climate change at regional scales, and downscaling methods are widely used to compensate for this deficiency. Three commonly used approaches are briefly introduced: dynamical downscaling, statistical downscaling, and combined statistical–dynamical downscaling. The theory and application of statistical downscaling are then reviewed systematically, including its basic assumptions, its strengths and weaknesses, and the three statistical downscaling methods in common use. The general procedure for projecting future climate scenarios with statistical downscaling is described, together with the use of variance inflation in statistical downscaling. The importance of comparative studies between statistical and dynamical downscaling is emphasized, and it is pointed out that combined statistical–dynamical downscaling will be an important direction for the development of downscaling techniques.
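The regression-plus-variance-inflation core of statistical downscaling can be sketched with synthetic data (an illustration of the general idea only; all values and names are invented):

```python
import numpy as np

# Regress a local station variable on a large-scale predictor (e.g. an
# AOGCM grid-box value), then inflate the variance of the regression
# output to match the observations, since regression alone
# under-represents local variability.
rng = np.random.default_rng(1)
predictor = rng.normal(size=200)                      # large-scale field
local = 2.0 * predictor + rng.normal(0.5, 1.0, 200)   # station observations

A = np.vstack([predictor, np.ones_like(predictor)]).T
coef, *_ = np.linalg.lstsq(A, local, rcond=None)      # slope and intercept
fitted = A @ coef
# variance inflation: rescale anomalies so the downscaled series has the
# same standard deviation as the observed local series
inflated = fitted.mean() + (fitted - fitted.mean()) * (local.std() / fitted.std())
```

In a real application the fitted relation would be applied to the AOGCM's future-scenario predictor fields rather than to the calibration period.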

17.
Eight silicate glasses were prepared by directly fusing and stirring 50-100 g each of basalt, andesite, komatiite, peridotite, rhyolite and quartz-diorite. These are referred to as MPI-DING glasses and were made to provide reference materials of natural composition for geochemical, in-situ microanalytical work. Results from various analytical techniques indicate that individual glass fragments are well homogenised with respect to major and trace elements at the μm to mm scale. Heterogeneities due to quench crystallisation of olivine have been observed in small, limited areas of the two komatiitic glasses. In order to obtain concentration values for as many elements as possible, the glasses were analysed by a variety of bulk and microanalytical methods in a number of laboratories. From the analytical data, preliminary reference values for more than sixty elements were calculated; the analytical uncertainties of most elements are estimated to be between 1% and 10%.

18.
Joint Consistent Mapping of High-Dimensional Geochemical Surveys
Geochemical surveys often contain several tens of components, obtained from different horizons and with different analytical techniques. These are used either to obtain elemental concentration maps or to explore links between the variables. The first task involves interpolation, the second principal component analysis (PCA) or a related technique. Interpolation of all geochemical variables (in wt% or ppm) should guarantee consistent results: at any location, all variables must be positive and sum to 100%. This is not ensured by any conventional geostatistical technique. Moreover, the maps should ideally preserve any links present in the data. PCA also presents some problems, derived from the spatial dependence between the observations and the compositional nature of the data. Log-ratio geostatistical techniques offer a consistent solution to all these problems. Variation-variograms are introduced to capture the spatial dependence structure: these are direct variograms of all possible log ratios of two components. They can be modeled with a function analogous to the linear model of coregionalization (LMC), where each spatial structure has an associated variation matrix describing the links between the components. Eigenvalue decompositions of these matrices provide a PCA of that particular spatial scale. The whole data set can then be interpolated by cokriging. Factorial cokriging can also be used to map a certain spatial structure, optionally projected onto those principal components (PCs) of that structure with a relevant contribution to the spatial variability. If only one PC is used for a certain structure, the resulting maps represent the spatial variability of a geochemical link between the variables. These procedures and their advantages are illustrated with the horizon C Kola data set, with 25 components and 605 samples covering most of the Kola Peninsula (Finland, Norway, Russia).
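The log-ratio preprocessing that reconciles PCA with the constant-sum constraint can be sketched as follows (a generic illustration of the centred log-ratio transform plus PCA, not the authors' variation-variogram machinery):

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform: removes the constant-sum constraint
    of compositional data (parts of a whole, e.g. wt% or ppm) before
    applying standard multivariate tools such as PCA."""
    L = np.log(np.asarray(X, float))
    return L - L.mean(axis=1, keepdims=True)

def pca_scores(Z, k=2):
    """PCA scores via eigendecomposition of the covariance of clr data."""
    Zc = Z - Z.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(Zc, rowvar=False))
    order = np.argsort(w)[::-1][:k]          # k largest eigenvalues
    return Zc @ V[:, order]
```

Each row of the clr-transformed matrix sums to zero, which is precisely why ordinary covariance-based PCA becomes meaningful for closed compositions.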

19.
Stationarity Scores on Training Images for Multipoint Geostatistics
This research introduces a novel method to assess the validity of training images used as input for multipoint geostatistics, also called multiple-point simulation (MPS). MPS is a family of spatial statistical interpolation algorithms used to generate conditional simulations of property fields such as geological facies. They are able to honor absolute “hard” constraints (e.g., borehole data) as well as “soft” constraints (e.g., probability fields derived from seismic data, and rotation and scale). These algorithms require 2D or 3D training images or analogs whose textures represent a spatial arrangement of geological properties presumed to be similar to that of the target volume to be modeled. The current generation of MPS algorithms requires statistically valid training images as input. In this context, “statistical validity” includes a requirement of stationarity, so that an average template pattern can be derived from the training image. This research focuses on a practical method to assess the stationarity requirements of MPS algorithms: that the statistical density or probability distribution of the quantity shown on the image does not change spatially, and that the image shows repetitive shapes whose orientation and scale are spatially constant. The method employs image-processing techniques based on measures of stationarity of the category distribution, the directional (orientation) property field, and the scale property field of the images. It was successfully tested on a set of two-dimensional images representing geological features, and its predictions were compared with actual realizations of MPS algorithms. An extension of the algorithms to 3D images is also proposed. As MPS algorithms are being used increasingly in hydrocarbon reservoir modeling, the methods described should facilitate screening and selection of input training images.

20.

Slopes are among the most critical geo-structures in geotechnical and mining engineering. Predicting the stability or instability of a slope, and classifying it accordingly, helps to mitigate risk and improve design by maximizing safety. Computational techniques have begun to outperform the analytical and statistical models traditionally used for predicting slope stability. To reduce the uncertainty and ambiguity of previously used models, researchers have recently applied novel techniques to slope stability classification (SSC): random forest, gradient boosting machine, extreme gradient boosting, boosted trees, and classification and regression trees. These computational algorithms are employed in this paper on 221 input datasets taken from the literature, and the slopes are classified accordingly using the models mentioned. The relation between the inputs, namely height (H), slope angle (β), cohesion (c), pore water pressure ratio (ru), unit weight (γ) and angle of internal friction (φ), and the output, slope stability, is established, and slopes are categorized according to their failure and stability. A performance analysis is then carried out to compare the different models and show readers and researchers which model fits the study best.



Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号