New Earth observation missions and technologies are delivering large amounts of data. Processing these data requires developing and evaluating novel dimensionality reduction approaches that identify the most informative features for classification and regression tasks. Here we present an exhaustive evaluation of the Guided Regularized Random Forest (GRRF), a feature selection method based on Random Forest. GRRF requires neither fixing a priori the number of features to be selected nor setting a threshold on feature importance. Moreover, the use of regularization ensures that the features selected by GRRF are non-redundant and representative. Our experiments, based on various kinds of remote sensing images, show that the features selected by GRRF provide results similar to those obtained when using all the available features. The comparison between GRRF and standard Random Forest features, however, shows substantial differences: in classification, the mean overall accuracy increases by almost 6%, and in regression, the RMSE decreases by almost 2%. These results demonstrate the potential of GRRF for remote sensing image classification and regression, especially in the context of increasingly large geodatabases that challenge the application of traditional methods.
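The guided regularization at the heart of GRRF can be sketched as a per-feature penalty on split gain, steered by the normalized importances from a preliminary Random Forest. The sketch below is illustrative only, not the authors' implementation; the function names, the mixing parameter gamma and the base penalty lam0 are assumptions about the standard GRRF formulation:

```python
def guided_penalties(importances, gamma=0.5, lam0=1.0):
    """Per-feature regularization coefficients for GRRF.

    lambda_i = (1 - gamma) * lam0 + gamma * imp_i, where imp_i is the
    feature's importance from a preliminary Random Forest, normalized by
    the maximum importance. gamma = 0 recovers a uniform penalty
    (plain Regularized Random Forest); gamma = 1 is fully guided.
    """
    top = max(importances)
    return [(1.0 - gamma) * lam0 + gamma * (imp / top) for imp in importances]


def regularized_gain(gain, lam_i, already_selected):
    """Split gain used during tree growth: features not yet in the
    selected set only enter if their penalized gain still wins."""
    return gain if already_selected else lam_i * gain
```

Because unused features pay a penalty while already-selected ones do not, redundant features rarely beat an equally informative feature that is already in the set, which is what keeps the selection compact without a preset feature count or importance threshold.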
Univariate and multivariate stress release models are fitted to historical earthquake data from North China. It is shown that a better fit is obtained by treating separately the Eastern part of the region, including the North China Plain and Bohai Sea, and the Western part of the region, including the Ordos Plateau and its Eastern boundary. A further improvement is obtained by fitting the large events (M ≥ 7.6) and the smaller events in the Western region with different stress release models. The comparisons are made by computing the likelihoods of the fitted models and discounting the number of parameters used via Akaike's AIC criterion. The models are used to develop long-term risk scenarios for the East and West regions.
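The AIC comparison described above amounts to penalizing each fitted model's log-likelihood by its parameter count, AIC = 2k − 2 ln L, and preferring the lowest value. The log-likelihoods and parameter counts below are hypothetical, purely to illustrate the comparison:

```python
def aic(log_likelihood, n_params):
    """Akaike's information criterion: AIC = 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood


# Hypothetical fits: model name -> (maximized log-likelihood, number of parameters)
models = {
    "single_region": (-512.4, 3),
    "east_west_split": (-498.7, 8),
}
best = min(models, key=lambda name: aic(*models[name]))
```

Here the split model wins despite its extra parameters because its likelihood gain more than pays the 2k penalty; with a smaller gain, AIC would favor the simpler single-region model.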
The stratiform Cu–Co ore mineralisation in the Katangan Copperbelt consists of dispersed sulphides and sulphides in nodules
and lenses, which are often pseudomorphs after evaporites. Two types of pseudomorphs can be distinguished in the nodules and
lenses. In type 1 examples, dolomite precipitated first and was subsequently replaced by Cu–Co sulphides and authigenic quartz,
whereas in type 2 examples, authigenic quartz and Cu–Co sulphides precipitated prior to dolomite and are coarse-grained. The
sulphur isotopic composition of the copper–cobalt sulphides in the type 1 pseudomorphs is between −10.3 and 3.1‰ relative
to the Vienna Canyon Diablo Troilite, indicating that the sulphide component was derived from bacterial sulphate reduction
(BSR). The generation of HCO3− during this process caused the precipitation of dolomite and the replacement of anhydrite by dolomite. A second product of BSR is the generation
of H2S, resulting in the precipitation of Cu–Co sulphides from the mineralising fluids. Initial sulphide precipitation occurred
along the rim of the pseudomorphs and continued towards the core. Precipitation of authigenic quartz was most likely induced
by a pH decrease during sulphide precipitation. Fluid inclusion data from quartz indicate the presence of a high-salinity
(8–18 eq. wt.% NaCl) fluid, possibly derived from evaporated seawater which migrated through the deep subsurface. 87Sr/86Sr ratios of dolomite in type 1 nodules range between 0.71012 and 0.73576, significantly more radiogenic than the strontium
isotopic composition of Neoproterozoic marine carbonates (87Sr/86Sr = 0.7056–0.7087). This suggests intense interaction with siliciclastic sedimentary rocks and/or the granitic basement.
The low carbon isotopic composition of the dolomite in the pseudomorphs (between −7.02 and −9.93‰ relative to the Vienna Pee Dee Belemnite,
V-PDB) compared to the host rock dolomite (between −4.90 and +1.31‰ V-PDB) resulted from the oxidation of organic matter during BSR.
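The sulphur and carbon isotope values quoted above use the standard δ notation in per mil relative to a reference material (V-CDT for sulphur, V-PDB for carbon). A minimal sketch of the conversion, with made-up ratios purely for illustration:

```python
def delta_permil(r_sample, r_standard):
    """Delta notation in per mil: delta = (R_sample / R_standard - 1) * 1000,
    where R is the heavy/light isotope ratio (e.g. 34S/32S or 13C/12C)."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

Negative δ34S values like those of the type 1 sulphides thus mean the sample is depleted in 34S relative to the standard, the fingerprint of kinetic fractionation during bacterial sulphate reduction.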
The aim of this paper is to discuss a number of issues related to the use of spatial information for landslide susceptibility, hazard, and vulnerability assessment. The paper centers on the types of spatial data needed for each of these components and the methods for obtaining them. A number of concepts are illustrated using an extensive spatial data set for the city of Tegucigalpa in Honduras. The paper is intended to supplement the information given in the “Guidelines for Landslide Susceptibility, Hazard and Risk Zoning for Land Use Planning” by the Joint ISSMGE, ISRM and IAEG Technical Committee on Landslides and Engineered Slopes (JTC-1). The last few decades have seen very rapid development in the application of digital tools such as Geographic Information Systems, Digital Image Processing, Digital Photogrammetry and Global Positioning Systems. Landslide inventory databases are becoming available in more countries, and several are now also accessible through the internet. A comprehensive landslide inventory is a prerequisite for quantifying both landslide hazard and risk. With respect to the environmental factors used in landslide hazard assessment, there is a tendency to utilize data layers that are easily obtainable from Digital Elevation Models and satellite imagery, whereas less emphasis is placed on data layers that require detailed field investigations. A review is given of the trends in collecting spatial information on environmental factors, with a focus on Digital Elevation Models, geology and soils, geomorphology, land use and elements at risk.
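Terrain derivatives such as slope, among the layers "easily obtainable from Digital Elevation Models" mentioned above, reduce to finite differences on the elevation grid. A minimal sketch, using a two-point central difference rather than a full Horn kernel and an assumed 30 m cell size:

```python
import math


def slope_deg(dem, i, j, cell=30.0):
    """Slope in degrees at interior cell (i, j) of a DEM stored as a list
    of rows, via central differences; cell is the grid spacing in metres."""
    dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2.0 * cell)
    dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2.0 * cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))
```

For example, a plane whose elevation rises 30 m per 30 m cell in one direction yields a 45° slope at every interior cell.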
This paper is dedicated to a review of the methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructure and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies that limit their practical applicability. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, ultimately, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. Attempts to compensate for these problems by the systematic use of expert elicitation have, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructure without validation. Because the assessment of the technical as well as the financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unresolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and, in general, robust design basis for applications such as the design of critical infrastructure, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in safety factors, which are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis: they are related to seismic sources discovered by geological, geomorphological, geodetic and seismological investigations or derived from historical records. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
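The "extended data analysis providing the frequency of seismic events" mentioned above typically rests on a magnitude-recurrence relation such as Gutenberg–Richter, log10 N(≥m) = a − b·m. A hedged sketch with illustrative a and b values (both are region-specific in practice and must be fitted to the local catalogue):

```python
def gr_annual_rate(m, a=4.0, b=1.0):
    """Gutenberg-Richter recurrence: annual rate of events with
    magnitude >= m, from log10 N(>=m) = a - b*m. The a and b values
    here are placeholders, not fitted constants."""
    return 10.0 ** (a - b * m)


def return_period(m, a=4.0, b=1.0):
    """Mean return period in years of events with magnitude >= m."""
    return 1.0 / gr_annual_rate(m, a, b)
```

Coupling a scenario's magnitude to such a fitted rate is what turns a purely deterministic design scenario into a frequency-annotated input for risk-informed decision making.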