Acta Geochimica - Isotopic signatures are a powerful tool to discriminate methane (CH4) source types and constrain regional- and global-scale CH4 budgets. Peatlands on the Qinghai-Tibetan Plateau are...
In the numerical simulation of groundwater flow, uncertainties often affect the precision of the simulation results. Stochastic and statistical approaches, such as the Monte Carlo method, the Neumann expansion method and the Taylor series expansion, are commonly employed to estimate uncertainty in the final output. Based on the first-order interval perturbation method, a combination of the interval and perturbation methods is proposed as a viable alternative and compared with the well-known equal interval continuous sampling method (EICSM). The approach was implemented in the GFModel program (an unsaturated-saturated groundwater flow simulation model). The study examines scenarios for three distinct sets of interval parameters: the hydraulic conductivities of six equal parts of the aquifer, the boundary head conditions, and several other hydrogeological parameters (e.g. specific storativity and well extraction rates). The results show that the relative errors of deviation of the groundwater head extrema (RDGE) in the late stage of simulation are controlled within approximately ±5% when the changing rate of the hydrogeological parameter is no more than 0.2. In terms of the groundwater head extrema themselves, the relative errors can be controlled within ±1.5%. The relative errors of the groundwater head variation are likewise within approximately ±5% when the changing rate is no more than 0.2. The proposed method is applicable to unsteady-state confined water flow systems.
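The first-order interval perturbation idea can be sketched on a toy 1-D confined-flow model. Everything below (the analytic head function `head`, its parameter values, and the sampling reference) is an illustrative assumption for exposition, not the GFModel implementation:

```python
import numpy as np

def head(K, q=0.5, x=100.0, b=10.0, h0=50.0):
    """Analytic head for steady 1-D confined Darcy flow: h = h0 - q*x/(K*b)."""
    return h0 - q * x / (K * b)

def interval_perturbation_bounds(K0, rate, eps=1e-6):
    """First-order interval perturbation: h(K0 ± dK) ≈ h(K0) ± |dh/dK| * dK."""
    dK = rate * K0                                           # half-width of the interval parameter
    dhdK = (head(K0 + eps) - head(K0 - eps)) / (2.0 * eps)   # central-difference sensitivity
    half_width = abs(dhdK) * dK
    return head(K0) - half_width, head(K0) + half_width

def sampling_bounds(K0, rate, n=201):
    """Equal-interval continuous sampling over [K0(1-rate), K0(1+rate)] as a reference."""
    Ks = np.linspace(K0 * (1.0 - rate), K0 * (1.0 + rate), n)
    hs = head(Ks)
    return hs.min(), hs.max()
```

For a changing rate of 0.2, the perturbation bounds track the sampled head extremes to within a few percent, mirroring the magnitude of the relative errors reported above.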
Radiogenic isotopic dating and Lu–Hf isotopic analysis by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) of the Wude basalt in Yunnan Province, part of the Emeishan large igneous province (ELIP), yielded the timing of formation and of a post-eruption tectonothermal event. Whole-rock lithogeochemistry and element mapping of the basaltic rocks were further re-evaluated to provide insights into crustal contamination and the formation of the ELIP. A zircon U–Pb age of 251.3±2.0 Ma for the Wude basalt records the youngest volcanic eruption event and is consistent with the 251-263 Ma age span for the emplacement of the ELIP. These zircons have εHf(t) values ranging from −7.3 to +2.2, identical to those of magmatic zircons from the intrusive rocks of the ELIP, suggesting that crust-mantle interaction occurred during magmatic emplacement, or that crust-mantle mixing existed in the deep source region prior to deep melting. An apatite U–Pb age of 53.6±3.4 Ma records an early Eocene magmatic overprint by a regional tectonothermal event corresponding to the Indian–Eurasian plate collision. Negative Nb, Ta, Ti and P anomalies in the Emeishan basalt may reflect crustal contamination. The uneven distribution of Nb/La and Th/Ta values throughout the ELIP supports a mantle plume origin. Therefore, the ELIP formed as a result of a mantle plume and was later overprinted by a regional tectonothermal event attributed to the Indian–Eurasian plate collision during the early Eocene.
The selection of a suitable discretization method (DM) to discretize spatially continuous variables (SCVs) is critical in ML-based natural hazard susceptibility assessment. However, few studies have considered the influence of the selected DM or how to efficiently select a suitable DM for each SCV. These issues are addressed in this study. The information loss rate (ILR), an index based on information entropy, appears usable for selecting the optimal DM for each SCV. However, the ILR fails to reflect the actual influence of discretization because it only considers the total amount of information by which the discretized variable departs from the original SCV. To address this issue, we propose a new index, the information change rate (ICR), which quantifies the amount of information changed by discretization at each cell, enabling identification of the optimal DM. We develop a case study with Random Forest (training/testing ratio of 7:3) to assess flood susceptibility in Wanan County, China. Area-under-the-curve-based and susceptibility-map-based approaches were used to compare the ILR and ICR. The results show that the ICR-based optimal DMs are more rational than the ILR-based ones in both cases. Moreover, the ILR values are unnaturally small (<1%), whereas the ICR values are far more in line with general recognition (usually 10%-30%). These results demonstrate the superiority of the ICR. This study fills existing research gaps and improves ML-based natural hazard susceptibility assessments.
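The contrast between an aggregate entropy index and a per-cell index can be sketched as follows. The formulas here are illustrative assumptions (a fine reference binning stands in for the original SCV, and per-cell surprisal stands in for the per-cell information content); the paper's exact definitions of ILR and ICR may differ:

```python
import numpy as np

def shannon_entropy(labels):
    """Shannon entropy (bits) of a discrete label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def _discretize(values, n_edges):
    """Equal-interval labelling of a continuous variable."""
    return np.digitize(values, np.linspace(values.min(), values.max(), n_edges))

def information_loss_rate(values, n_classes, n_ref_bins=256):
    """ILR sketch: relative drop in *total* entropy after discretization."""
    ref = _discretize(values, n_ref_bins)              # fine binning ~ original SCV
    disc = _discretize(values, n_classes + 1)[...] if False else np.digitize(
        values, np.linspace(values.min(), values.max(), n_classes + 1)[1:-1])
    h_ref, h_disc = shannon_entropy(ref), shannon_entropy(disc)
    return (h_ref - h_disc) / h_ref

def information_change_rate(values, n_classes, n_ref_bins=256):
    """ICR sketch: mean relative change in *per-cell* information content."""
    ref = _discretize(values, n_ref_bins)
    disc = np.digitize(values, np.linspace(values.min(), values.max(), n_classes + 1)[1:-1])

    def surprisal(labels):
        uniq, counts = np.unique(labels, return_counts=True)
        p = dict(zip(uniq, counts / counts.sum()))
        return np.array([-np.log2(p[l]) for l in labels])  # -log2 p per cell

    i_ref, i_disc = surprisal(ref), surprisal(disc)
    return np.mean(np.abs(i_ref - i_disc) / np.maximum(i_ref, 1e-12))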
Machine learning algorithms are an important means of performing landslide susceptibility assessments, but most studies use GIS-based classification methods to conduct susceptibility zonation. This study presents a machine learning approach based on the C5.0 decision tree (DT) model and the K-means clustering algorithm to produce a regional landslide susceptibility map. Yanchang County, a typical landslide-prone area in northwestern China, was taken as the area of interest to illustrate the proposed application procedure. A landslide inventory containing 82 landslides was prepared and randomly partitioned into two subsets: training data (70% of landslide pixels) and validation data (30% of landslide pixels). Fourteen landslide influencing factors were included in the input dataset and used to calculate the landslide occurrence probability with the C5.0 decision tree model. Susceptibility zonation was implemented according to cut-off values calculated by the K-means clustering algorithm. Model performance validation showed that the AUC (area under the receiver operating characteristic (ROC) curve) of the proposed model was the highest, reaching 0.88, compared with traditional models (support vector machine (SVM) = 0.85, Bayesian network (BN) = 0.81, frequency ratio (FR) = 0.75, weight of evidence (WOE) = 0.76). The landslide frequency ratio and frequency density of the high susceptibility zones were 6.76 and 0.88/km2, respectively, much higher than those of the low susceptibility zones. The top 20% interval of landslide occurrence probability contained 89% of the historical landslides while accounting for only 10.3% of the total area. Our results indicate that the high susceptibility zones were more tightly focused without containing more "stable" pixels. Therefore, the obtained susceptibility map is suitable for application to landslide risk management practice.
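The K-means cut-off step can be sketched independently of the classifier. Below is a minimal pure-NumPy version with assumed details (quantile initialization, midpoint cut-offs, a synthetic probability field in place of the study's C5.0 occurrence probabilities):

```python
import numpy as np

def kmeans_1d(x, k, iters=100):
    """Lloyd's algorithm in 1-D with quantile initialization; returns sorted centres."""
    centres = np.quantile(x, (np.arange(k) + 0.5) / k)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        new = np.array([x[labels == j].mean() if np.any(labels == j) else centres[j]
                        for j in range(k)])
        if np.allclose(new, centres):
            break
        centres = new
    return np.sort(centres)

def zonation_cutoffs(prob, k=4):
    """Cut-off values between susceptibility zones: midpoints of adjacent centres."""
    c = kmeans_1d(prob, k)
    return (c[:-1] + c[1:]) / 2.0

def assign_zones(prob, cutoffs):
    """Zone 0 (low) ... k-1 (high) for each probability value."""
    return np.searchsorted(cutoffs, prob)
```

Applied to a map of predicted landslide occurrence probabilities, the k-1 cut-offs partition pixels into k susceptibility classes without hand-picked thresholds.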