Similar Articles
20 similar articles found.
1.
Kriging with external drift for functional data for air quality monitoring   Cited by: 3 (self-citations: 2, external: 1)
Functional data featuring a spatial dependence structure occur in many environmental sciences when curves are observed, for example, along time or along depth. Recently, some methods allowing for the prediction of a curve at an unmonitored site have been developed. However, the existing methods do not allow exogenous variables to be included in the model, for example variables that bring meteorological information into the modeling of air pollutant concentrations. In order to introduce exogenous variables, potentially observed as curves as well, we propose to extend the so-called kriging with external drift (or regression kriging) to the case of functional data by means of a three-step procedure involving functional modeling of the trend and spatial interpolation of functional residuals. A cross-validation analysis allows us to choose the smoothing parameters and a preferable kriging predictor for the functional residuals. Our case study considers daily PM10 concentrations measured from October 2005 to March 2006 by the monitoring network of the Piemonte region (Italy), with the trend defined by meteorological time-varying covariates and orographical constant-in-time variables. The performance of the proposed methodology is evaluated by predicting PM10 concentration curves at 10 validation sites, as well as on realistic simulated datasets with a larger number of spatial sites. In this application the proposed methodology represents an alternative to spatio-temporal modeling, but it can be applied more generally to spatially dependent functional data whose domain is not a time interval.
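The three-step procedure described above can be sketched for the scalar (non-functional) case. Everything here (function names, the exponential covariance, the parameter values) is an illustrative assumption, not the authors' code:

```python
import numpy as np

def regression_kriging(coords, X, y, coords_new, X_new, range_=1.0, sill=1.0):
    """Three-step regression kriging (kriging with external drift), scalar case:
    (1) fit a linear trend on exogenous covariates, (2) krige the residuals
    under an assumed exponential covariance, (3) add trend and residual parts."""
    A = np.column_stack([np.ones(len(X)), X])        # drift design matrix
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)     # step 1: trend coefficients
    resid = y - A @ beta
    cov = lambda h: sill * np.exp(-h / range_)       # exponential covariance model
    D = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    c0 = cov(np.linalg.norm(coords_new[:, None] - coords[None, :], axis=-1))
    w = np.linalg.solve(cov(D) + 1e-10 * np.eye(len(y)), c0.T)  # step 2: kriging weights
    A_new = np.column_stack([np.ones(len(X_new)), X_new])
    return A_new @ beta + w.T @ resid                # step 3: recombine
```

In the functional setting, the same trend-plus-residual decomposition is applied to curves rather than scalars, with the smoothing parameters and the residual kriging predictor chosen by cross-validation.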

2.
We consider the problem of predicting the spatial field of particle-size curves (PSCs) from a sample observed at a finite set of locations within an alluvial aquifer near the city of Tübingen, Germany. We interpret PSCs as cumulative distribution functions and their derivatives as probability density functions. We thus (a) embed the available data into an infinite-dimensional Hilbert Space of compositional functions endowed with the Aitchison geometry and (b) develop new geostatistical methods for the analysis of spatially dependent functional compositional data. This approach enables one to provide predictions at unsampled locations for these types of data, which are commonly available in hydrogeological applications, together with a quantification of the associated uncertainty. The proposed functional compositional kriging (FCK) predictor is tested on a one-dimensional application relying on a set of 60 PSCs collected along a 5-m deep borehole at the test site. The quality of FCK predictions of PSCs is evaluated through leave-one-out cross-validation on the available data, smoothed by means of Bernstein Polynomials. A comparison of estimates of hydraulic conductivity obtained via our FCK approach against those rendered by classical kriging of effective particle diameters (i.e., quantiles of the PSCs) is provided. Unlike traditional approaches, our method fully exploits the functional form of PSCs and enables one to project the complete information content embedded in the PSC to unsampled locations in the system.
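One standard way to move between compositions and an unconstrained Euclidean space under the Aitchison geometry is the centered log-ratio (clr) transform. The sketch below is a generic finite-dimensional illustration; the paper works with functional densities, which this discretized version only approximates:

```python
import numpy as np

def clr(p, eps=1e-12):
    """Centered log-ratio transform: maps a composition (non-negative parts
    summing to 1, e.g. a discretized particle-size density) into an
    unconstrained space where ordinary geostatistics can be applied."""
    p = np.clip(np.asarray(p, float), eps, None)
    lp = np.log(p)
    return lp - lp.mean(axis=-1, keepdims=True)

def clr_inv(z):
    """Inverse clr: back to the simplex via exponentiation and closure."""
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))  # numerically stabilized
    return e / e.sum(axis=-1, keepdims=True)
```

Kriging is then performed on the clr coordinates, and predictions are mapped back to valid compositions with the inverse transform.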

3.
This paper proposes methods to detect outliers in functional data sets. The task of identifying atypical curves is carried out using the recently proposed kernelized functional spatial depth (KFSD). KFSD is a local depth that can be used to order the curves of a sample from most to least central, and since outliers are usually among the least central curves, we present a probabilistic result which allows one to select a threshold value for KFSD such that curves with depth values lower than the threshold are flagged as outliers. Based on this result, we propose three new outlier detection procedures. The results of a simulation study show that our proposals generally outperform a battery of competitors. We apply our procedures to a real data set consisting of daily curves of emission levels of nitrogen oxides (NO\(_{x}\)), since it is of interest to identify abnormal NO\(_{x}\) levels in order to take the necessary environmental policy actions.

4.
Many sedimentary basins throughout the world exhibit areas with abnormal pore-fluid pressures (higher or lower than normal, or hydrostatic, pressure). Predicting pore pressure and other parameters (depth, extension, magnitude, etc.) in such areas is a challenging task. The compressional acoustic (sonic) log (DT) is often used as a predictor because it responds to changes in porosity or compaction produced by abnormal pore-fluid pressures. Unfortunately, the sonic log is not commonly recorded in most oil and/or gas wells. We propose using an artificial neural network to synthesize sonic logs by identifying the mathematical dependency between DT and commonly available logs, such as the normalized gamma ray (GR) and deep resistivity (REID) logs. The artificial neural network process can be divided into three steps: (1) supervised training of the neural network; (2) confirmation and validation of the model by blind-testing the results in wells that contain both the predictor (GR, REID) and the target (DT) values used in the supervised training; and (3) applying the predictive model to all wells containing the required predictor data and verifying the accuracy of the synthetic DT data by comparing the back-predicted synthetic predictor curves (GRNN, REIDNN) to the recorded predictor curves used in training (GR, REID). Artificial neural networks offer significant advantages over traditional deterministic methods. They do not require a precise mathematical model equation that describes the dependency between the predictor values and the target values and, unlike linear regression techniques, neural network methods do not overpredict mean values and thereby preserve the original data variability. One of their most important advantages is that their predictions can be validated and confirmed through back-prediction of the input data. This procedure was applied to predict the presence of overpressured zones in the Anadarko Basin, Oklahoma. The results are promising and encouraging.
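The supervised-training step can be illustrated with a tiny numpy network standing in for the ANN, and synthetic data standing in for the GR/REID/DT logs. The target function, network size, and learning rate are all hypothetical choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the well logs: two predictors (think GR and REID)
# and one target (think DT); the true mapping is an arbitrary smooth function.
n = 200
X = rng.uniform(-1.0, 1.0, (n, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

# One hidden layer with tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)             # hidden activations
    err = (H @ W2 + b2).ravel() - y      # residuals
    gW2 = H.T @ err[:, None] / n         # gradients of the half-MSE loss
    gb2 = np.array([err.mean()])
    dH = (err[:, None] @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ dH / n
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y) ** 2)
```

The back-prediction validation described in step (3) would amount to training further networks in the reverse direction (DT back to GR and REID) and comparing their outputs with the recorded logs.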

5.
Practically all records of eddy-covariance flux measurements are affected by gaps, caused for a variety of reasons. In this work, we propose analog period (AP) methods for gap-filling, and test them for filling gaps of latent heat flux at five AmeriFlux sites. The essence of the methods is to look for periods in the record that bear a strong resemblance, in the variable to be filled, to the periods immediately before and after the gap. Similarity between periods is gauged by the coefficient of determination, and the search for similar periods and their ranking is straightforward. The methods are developed in a univariate version (that uses only the latent heat flux data series itself) and several multivariate ones, which incorporate sensible heat flux, ground heat flux and net radiation data. For each set of independent variables used for gap-filling, the methods are tested against an existing gap-filling procedure with similar data requirements. Thus, the univariate version is tested against the mean diurnal variation method, and the multivariate versions are tested against corresponding simple and multiple linear regression techniques that use energy-budget data, and in one case the evaporative fraction as well. In our tests, the proposed univariate version performs better than the mean diurnal variation method, and the multivariate versions perform better than simple/multiple linear regression methods. REddyProc, a widely used software package, was also tested as a basis for comparison. In general, the proposed methods (in univariate and multivariate versions) and simple/multiple linear regressions performed better than REddyProc. For the datasets analysed, gap filling via the evaporative fraction method performed poorly.

6.
The estimation of velocity and depth is an important stage in seismic data processing and interpretation. We present a method for velocity-depth model estimation from unstacked data. This method is formulated as an iterative algorithm producing a model which maximizes some measure of coherency computed along traveltimes generated by tracing rays through the model. In the model the interfaces are represented as cubic splines and it is assumed that the velocity in each layer is constant. The inversion includes the determination of the velocities in all the layers and the location of the spline knots. The process input consists of unstacked seismic data and an initial velocity-depth model. This model is often based on nearby well information and an interpretation of the stacked section. Inversion is performed iteratively layer after layer; during each iteration synthetic traveltime curves are calculated for the interface under consideration. A functional characterizing the main correlation properties of the wavefield is then formed along the synthetic arrival times. It is assumed that the functional reaches a maximum value when the synthetic arrival time curves match the arrival times of the events on the field gathers. The maximum value of the functional is obtained by an efficient non-linear programming algorithm. The present inversion algorithm has the advantages that it does not require event picking on the unstacked data and is not based on fitting hyperbolic approximations to the arrival times. The method has been successfully applied to both synthetic and field data.

7.
The method of temporal moments is an efficient approach for analyzing breakthrough curves (BTCs). By matching the moments of the BTCs computed through parametric transfer-function models or one-dimensional transport models to those of the data, one can estimate the parameters characterizing the transfer function or apparent transport parameters. The classical method of moments presumes infinite duration. However, the measurement of BTCs is usually terminated prematurely, before the concentration has reached zero. Unless this truncation of the BTCs has been taken into account, the estimates of the parameters may be in error. Truncated measured BTCs are sometimes extrapolated assuming exponential decay. In this study, we use the concept of moments of the truncated impulse–response function [Jawitz JW. Moments of truncated continuous univariate distributions. Adv Water Res 2004;27:269–81] in the analysis of truncated BTCs corresponding to the commonly encountered step and step-pulse injection modes. The method is straightforward, based on the relation, which we derive, between truncated moments of the impulse–response function and the measured BTC. It is practical to apply and does not require the extrapolation of the measured BTC. The method is also accurate. In a numerical study we discuss how short a step-pulse injection may be so that we can approximate it as instantaneous. Finally, we apply the method to the analysis of a field-scale tracer test.
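The truncation bias that motivates the method can be illustrated numerically: computing the mean arrival time of a synthetic BTC from its first two temporal moments, once on the full record and once on a prematurely terminated one. This sketch only shows the bias; the paper's truncated-moment relations are what correct for it:

```python
import numpy as np

def trap(y, x):
    """Trapezoidal quadrature, kept explicit for clarity."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def mean_arrival(t, c):
    """Mean arrival time = first temporal moment / zeroth temporal moment."""
    return trap(t * c, t) / trap(c, t)

t = np.linspace(0.0, 50.0, 5001)
c = np.exp(-((t - 10.0) ** 2) / (2.0 * 2.0 ** 2))  # synthetic Gaussian BTC

full_mean = mean_arrival(t, c)                 # complete record: close to 10
cut = t <= 12.0
trunc_mean = mean_arrival(t[cut], c[cut])      # record cut at t = 12: biased low
```

Because the late tail is missing, the naive moments of the truncated record systematically underestimate the mean arrival time (and, more severely, the higher moments).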

8.
It is well known that the results of determining earthquake parameters depend to a large extent on the data processing algorithms and the velocity models of the seismic wave propagation medium used in solving hypocenter problems. In 1992, V.Yu. Burmin developed a hypocentric algorithm that minimizes the functional of distances between the points corresponding to the theoretical and observed travel times of seismic waves from an earthquake source to recording stations. The determination of the coordinates of earthquake hypocenters in this case is much more stable than with the commonly used minimization of the functional of discrepancies in the seismic wave arrival times at a station. Using this algorithm and a refined velocity model of the medium, V.Yu. Burmin and L.A. Shumlyanskaya reinterpreted the earthquake parameters for the Crimea–Black Sea region. The most important result of this reinterpretation was the conclusion that deep earthquakes, with source depths of more than 60 km, occur in the region. This result contradicts the conventional beliefs about the seismicity of the region and therefore aroused strong criticism from experts directly involved in compiling the existing catalogs of regional earthquakes. These comments and criticisms are presented by V.E. Kulchitsky and coauthors in a work published in this issue of the journal. In the present paper, we analyze the comments in detail and respond. In particular, we show that the previously used methods of seismic data processing made it highly unlikely by default that deep earthquakes would appear in the results. As an example, we refer to the use of travel-time curves for depths down to 35 km. It is clear that deep earthquakes could not have been found with this approach.

9.
In this paper, an efficient pattern recognition method for functional data is introduced. The proposed method is based on a reproducing kernel Hilbert space (RKHS), random projection and the K-means algorithm. First, the infinite-dimensional data are projected onto an RKHS; then they are projected iteratively onto spaces of increasing dimension via random projection. The K-means algorithm is applied to the projected data, and its solution is used to initialize K-means on the projected data in the next space. We implement the proposed algorithm on some simulated and climatological datasets and compare the obtained results with those achieved by K-means clustering using a single random projection and by classical K-means. The proposed algorithm presents better results based on mean square distance (MSD) and the Rand index, as expected. Furthermore, a new kernel based on a wavelet function is used that gives a suitable reconstruction of curves, and the results are satisfactory.
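A minimal sketch of the iterated random-projection scheme, with plain numpy K-means and synthetic two-group "curves". The RKHS embedding and the wavelet kernel are omitted, and the dimensions, seeds, and data are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def lloyd(X, centers, iters=50):
    """Plain Lloyd's K-means; `centers` lets each stage warm-start from the
    solution found in the previous (lower-dimensional) random projection."""
    k = len(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Two groups of noisy discretized "curves" sampled at 100 points
grid = np.linspace(0.0, 3.0, 100)
X = np.vstack([np.sin(grid) + rng.normal(0, 0.1, (30, 100)),
               np.cos(grid) + rng.normal(0, 0.1, (30, 100))])

labels = None
for d in (5, 20, 60):                                # increasing dimensions
    P = rng.normal(0.0, 1.0 / np.sqrt(d), (100, d))  # Gaussian random projection
    Z = X @ P
    if labels is None:
        centers = Z[rng.choice(len(Z), 2, replace=False)]  # first stage: random init
    else:
        centers = np.array([Z[labels == j].mean(0) for j in range(2)])  # warm start
    labels, _ = lloyd(Z, centers)
```

Each stage clusters in a cheap low-dimensional projection and passes its labels forward, so the expensive higher-dimensional runs start near a good solution.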

10.
In this paper, known simplified methods for the assessment of soil liquefaction are summarized and their discrepancies are examined. Using the Chi-Chi earthquake data as well as other reported data, a set of three critical cyclic strength curves was obtained by minimizing the number of misclassified points. The functional forms of these three curves are an exponential function, a hyperbola, and a cubic polynomial. A lower bound critical cyclic strength curve is then established. This curve may have important applications in practice for liquefaction-related designs. Through this case study, it was found that a minimum cyclic strength CSRlim may exist at a very low value of (N1)60. An upper limit (N1)60upp also exists beyond which liquefaction may not occur. Furthermore, current simplified methods seem suitable only for a limited range of N values and fines content, and may fail for general applications. The lower bound curve proposed in this paper may provide an alternative approach for improvement. Since the explicit functional forms and the statistical indices are available, the statistically regressed curves have the benefit that they may be used directly to conduct hazard analysis and to evaluate the uncertainty within the critical cyclic strength curves.

11.
Bivariate distributions have recently been employed in hydrologic frequency analysis to analyze the joint probabilistic characteristics of multivariate storm events. This study aims to derive practical procedures for applying the bivariate distribution to estimate design rainfalls corresponding to the desired return periods. Using the Gumbel mixed model, this study constructed rainfall-frequency curves at sample stations in Korea, which provide joint relationships between the amount, duration, and frequency of storm events. Based on comparisons and analyses of the rainfall-frequency curves derived from univariate and bivariate storm frequency analyses, this study found that conditional frequency analysis provides more appropriate estimates of design rainfalls, as it represents the natural relationship between storm properties more accurately than conventional univariate storm frequency analysis.
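The Gumbel mixed model referred to above is commonly written with Gumbel (EV1) marginals and a dependence parameter theta in [0, 1]. A sketch of the joint CDF and the "OR-case" joint return period, with all parameter values hypothetical:

```python
import math

def gumbel_cdf(x, mu, beta):
    """Gumbel (EV1) marginal CDF, a common choice for annual maxima."""
    return math.exp(-math.exp(-(x - mu) / beta))

def gumbel_mixed_cdf(x, y, margx, margy, theta):
    """Joint CDF of the Gumbel mixed model as commonly written in the
    bivariate storm-frequency literature; theta = 0 gives independence."""
    Fx, Fy = gumbel_cdf(x, *margx), gumbel_cdf(y, *margy)
    return Fx * Fy * math.exp(-theta / (1.0 / math.log(Fx) + 1.0 / math.log(Fy)))

def joint_return_period(x, y, margx, margy, theta):
    """Return period of the event {X > x or Y > y}."""
    return 1.0 / (1.0 - gumbel_mixed_cdf(x, y, margx, margy, theta))
```

Conditional design-rainfall estimates follow by fixing one variable (e.g. duration) and working with the conditional distribution implied by this joint CDF.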

12.
A new wavelet-based estimation methodology, in the context of spatial functional regression, is proposed to discriminate between small-scale and large-scale variability of spatially correlated functional data, defined by depth-dependent curves. Specifically, the discrete wavelet transform of the data is computed in space and depth to reduce dimensionality. Moment-based regression estimation is applied to approximate the scaling coefficients of the functional response, while its wavelet coefficients are estimated in a Bayesian regression framework. Both regression approaches are implemented from the empirical versions of the scaling and wavelet auto-covariance and cross-covariance operators, which characterize the correlation structure of the spatial functional response. Weather stations on ocean islands are highly spatially concentrated; the proposed estimation methodology overcomes the difficulties that this poses for estimating the ocean temperature field at different depths from long records of ocean temperature measurements at these stations. Data are collected from The World-Wide Ocean Optics Database. The performance of the presented approach is tested in terms of 10-fold cross-validation and residual spatial and depth correlation analysis. Additionally, an application to soil science, predicting electrical conductivity profiles, is also considered in order to compare this approach with previous related ones in the statistical analysis of spatially correlated depth curves.
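The dimension-reduction step can be illustrated with the simplest wavelet, the Haar DWT, whose one-level analysis and synthesis take a few lines of numpy. The paper's actual wavelet choice and the Bayesian estimation step are not reproduced here:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: scaling (coarse) and wavelet (detail)
    coefficients; the length of x must be even."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # scaling coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # wavelet coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar DWT level (exact reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x
```

The scaling coefficients capture the large-scale variability and the wavelet coefficients the small-scale variability, which is exactly the split the regression methodology exploits.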

13.
Environmental pollution affects parasite populations and communities, both directly and through effects on intermediate and final hosts. In this work, we present a comparative study on the structure and composition of metazoan parasite communities in the bogue, Boops boops, from two localities on the Galician coast (Spain) affected by the Prestige oil-spill (POS). We focus on the distribution of both individual parasite species and larger functional groupings, using both univariate and multivariate analyses. Our results indicate directional trends in community composition that might be related to the disturbance of the natural coastal communities off Galicia by the Prestige oil-spill. Endoparasite communities in B. boops reflected a notable post-spill change in the composition and abundance of the benthic fauna in the localities studied, probably due to organic enrichment after the POS.

14.
核磁共振T2分布评价岩石孔径分布的改进方法   总被引:49,自引:7,他引:49       下载免费PDF全文
Both the nuclear magnetic resonance (NMR) T2 distribution of core samples and capillary pressure analysis data reflect, to some extent, the pore structure of rock, and theoretical analysis shows that the two data sets are correlated. The key to studying rock pore-size distribution with the NMR T2 distribution is to construct a reliable capillary pressure curve from the T2 distribution on the basis of this correlation. However, capillary pressure curves previously constructed from the fully water-saturated T2 distribution match measured capillary pressure curves poorly. In fact, the presence of a film-bound water component causes the pore space reflected by the T2 distribution to differ from that reflected by the capillary pressure curve. This paper proposes an improved method: after the contribution of film-bound water to the T2 distribution is removed, the capillary pressure curve is constructed from the free-water T2 distribution. The method was applied to 24 core samples, and the capillary pressure curves constructed from the free-water T2 distributions, together with the corresponding pore-throat radius distributions, were compared with semi-permeable plate capillary pressure measurements. The results show that the improved method markedly increases the accuracy with which capillary pressure curves can be constructed, thereby providing reliable theoretical and methodological support for studying pore structure with NMR T2 distributions.
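The construction step can be sketched under the usual proportionality between relaxation time and pore-throat radius, so that Pc = C / T2 with C a lumped conversion constant. The constant, and the prior removal of the film-bound water component, are assumptions of this illustration rather than the paper's calibrated procedure:

```python
import numpy as np

def t2_to_pc(T2, amp, C=1.0):
    """Map a (free-water) T2 distribution to a capillary pressure curve.
    Assumes pore-throat radius r is proportional to T2, hence Pc = C / T2,
    where C lumps surface relaxivity and interfacial-tension terms.
    Returns (Pc, wetting-phase saturation), ordered from large pores
    (low Pc) to small pores (high Pc)."""
    order = np.argsort(T2)[::-1]                    # largest pores first
    T2s = np.asarray(T2, float)[order]
    amps = np.asarray(amp, float)[order]
    pc = C / T2s
    s_wetting = 1.0 - np.cumsum(amps) / amps.sum()  # water left after invasion
    return pc, s_wetting
```

As Pc rises, the non-wetting phase invades progressively smaller pores, so the cumulative amplitude of the T2 distribution traces out the drainage curve.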

15.
The underground escaping-gas earthquake information system (GGEIS) comprises subsystems for observation-data management, text-information management, utility tools, and system maintenance. It can automatically generate observation databases from station information, and supports cross-management of the databases and data queries. This paper describes the system's overall design, database design, functional subsystems, and main technical features.

16.
This paper investigates the potential of Spartan spatial random fields (SSRFs) in real-time mapping applications. The data set that we study focuses on the distribution of daily gamma dose rates over part of Germany. Our goal is to determine a Spartan spatial model from the data, and then use it to generate “predictive” maps of the radioactivity. In the SSRF framework, the spatial dependence is determined from sample functions that focus on short-range correlations. A recently formulated SSRF predictor is used to derive isolevel contour maps of the dose rates. The SSRF predictor is explicit and requires fewer user adjustments than classical geostatistical methods. These features present clear advantages for an automatic mapping system. The performance of the SSRF predictor is evaluated by means of various cross-validation measures; the values of the performance measures are similar to those obtained by classical geostatistical methods. Application of the SSRF method to data that simulate a radioactivity release scenario is also discussed. Hot spots are detected and removed using a heuristic method. The extreme values that appear in the path of the simulated plume are not captured by the currently used Spartan spatial model. Modeling of the processes leading to extreme values could enhance the predictive capabilities of the spatial model by incorporating physical information.

17.
For sediment yield estimation, intermittent measurements of suspended sediment concentration (SSC) have to be interpolated to derive a continuous sedigraph. Traditionally, sediment rating curves (SRCs) based on univariate linear regression of discharge and SSC (or the logarithms thereof) are used, but alternative approaches (e.g. fuzzy logic, artificial neural networks, etc.) exist. This paper presents a comparison of the applicability of traditional SRCs, generalized linear models (GLMs) and non-parametric regression using Random Forests (RF) and Quantile Regression Forests (QRF) applied to a dataset of SSC obtained for four subcatchments (0.08, 41, 145 and 445 km²) in the Central Spanish Pyrenees. The observed SSCs are highly variable and range over six orders of magnitude. For these data, traditional SRCs performed inadequately due to the over-simplification of relating SSC solely to discharge. Instead, the multitude of acting processes required more flexibility to model these nonlinear relationships. Thus, alternative advanced machine learning techniques that have been successfully applied in other disciplines were tested. GLMs provide the option of including other relevant process variables (e.g. rainfall intensities and temporal information) but require the selection of the most appropriate predictors. For the given datasets, the investigated variable selection methods produced inconsistent results. All proposed GLMs showed an inferior performance, whereas RF and QRF proved to be very robust and performed favourably for reproducing sediment dynamics. QRF additionally provides estimates of the accuracy of the predictions and thus allows the assessment of uncertainties in the estimated sediment yield that is not commonly found in other methods. The capabilities of RF and QRF concerning the interpretation of predictor effects are also outlined. Copyright © 2008 John Wiley & Sons, Ltd.
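The traditional SRC that the alternatives are compared against is a univariate log-log regression, which can be sketched as:

```python
import numpy as np

def fit_src(Q, SSC):
    """Traditional sediment rating curve: log(SSC) = a + b*log(Q), fitted
    by ordinary least squares (equivalently, SSC = exp(a) * Q**b)."""
    b, a = np.polyfit(np.log(Q), np.log(SSC), 1)
    return a, b

def predict_src(Q, a, b):
    """Back-transform the fitted log-log line to concentration units."""
    return np.exp(a + b * np.log(Q))
```

The RF/QRF alternatives replace this single power law with an ensemble of regression trees over several predictors (discharge, rainfall intensity, temporal covariates), which is what gives them the flexibility, and QRF the uncertainty bounds, described above.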

18.
The localized normal-score ensemble Kalman filter (NS-EnKF) coupled with covariance inflation is used to characterize the spatial variability of a channelized bimodal hydraulic conductivity field, for which the only existing prior information about conductivity is its univariate marginal distribution. We demonstrate that we can retrieve the main patterns of the reference field by assimilating a sufficient number of piezometric observations using the NS-EnKF. The possibility of characterizing the conductivity spatial variability using only piezometric head data shows the importance of accounting for these data in inverse modeling.
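The analysis step underlying the filter can be sketched in its plain stochastic-EnKF form. The normal-score transform, localization, and covariance inflation used in the paper are deliberately omitted, and the toy problem below is an arbitrary assumption:

```python
import numpy as np

def enkf_analysis(ens, obs, H, sigma_obs, rng):
    """Stochastic EnKF analysis step. `ens` is (n_state, n_members); `H` is a
    linear observation operator; each member assimilates a perturbed copy of
    the observations."""
    n = ens.shape[1]
    Hx = H @ ens
    Xp = ens - ens.mean(axis=1, keepdims=True)    # state anomalies
    Yp = Hx - Hx.mean(axis=1, keepdims=True)      # observation-space anomalies
    R = (sigma_obs ** 2) * np.eye(len(obs))
    K = (Xp @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (n - 1) * R)  # Kalman gain
    obs_pert = obs[:, None] + rng.normal(0.0, sigma_obs, (len(obs), n))
    return ens + K @ (obs_pert - Hx)

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, (1, 2000))           # N(0, 1) prior, 2000 members
post = enkf_analysis(prior, np.array([1.0]), np.eye(1), 0.1, rng)
```

The normal-score variant applies this same update after transforming each conductivity value to a standard Gaussian via its empirical marginal, which is what lets the filter handle the bimodal field.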

19.
The Wenchuan Earthquake, with a magnitude of Ms 8.0, struck the Sichuan province of China on May 12, 2008, mainly affecting the area along the Longmenshan fault. In total, 420 three-component acceleration records were obtained by the China Strong Motion Networks Centre during this seismic event, among which over 50 records exceeded 100 gal. In the present study, we collected 48 near-fault acceleration records to derive strong ground motion parameters in terms of the peak ground acceleration, peak ground velocity, peak spectral acceleration (5% damping ratio) and spectrum intensity (5% damping ratio). We determined the building collapse ratios (CRs) for 20 targeted districts based on data acquired from both the China Earthquake Administration and the Chinese Academy of Sciences, where the CRs combined the data for all building types. Fragility curves were established between the CRs and the corresponding ground motion parameters, based on which damage criteria were proposed. In particular, we derived the fragility curves for brick-concrete structures and frame structures; these curves indicate how structural type influences the damage sustained. In addition, we developed a method for estimating building damage classifications. If we assume that buildings are built according to the improved Seismic Fortification Criterion in the revised “Code for Seismic Design of Buildings”, the predicted CRs for the 20 targeted districts would be significantly lower than the damage they actually sustained, which illustrates the validity of both the method and the revised code.
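Fragility curves of the kind described are conventionally parameterized as lognormal CDFs of the ground-motion intensity measure. A sketch of that standard form (whether the paper uses exactly this parameterization is an assumption; the parameter values would come from fitting the observed collapse ratios):

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility curve: probability that a damage state (e.g.
    collapse) is reached at intensity measure `im` (e.g. PGA), given the
    median capacity `theta` and lognormal standard deviation `beta`."""
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))
```

Fitting separate (theta, beta) pairs per structural type (brick-concrete vs. frame) yields the type-specific curves, and evaluating them at a design ground motion gives the predicted collapse ratio for a district.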

20.
Anomalous enhancement of seismic activity before strong earthquakes in North China   Cited by: 2 (self-citations: 0, external: 2)
Lü Xiaojian and Liu Puxiong. 《地震》 (Earthquake), 2000, 20(3): 43–47
Taking the accelerated rise of the strain-release curve as an indicator of anomalous enhancement and as a medium-term index for earthquake prediction, a spatial scan of strain-release curves was carried out for the North China region, and the predictive performance of this method was comprehensively and systematically tested. The results show that the method has good predictive efficacy. Related issues are discussed in the paper, and prediction rules are given.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号