Related Articles
1.

Kernel Principal Component Analysis (KPCA) is an efficient multivariate statistical technique used for nonlinear process monitoring. Nevertheless, conventional KPCA suffers from high computational complexity when dealing with large samples. In this paper, a new kernel method based on a novel reduced Rank-KPCA is developed to make up for the drawbacks of KPCA. The basic idea of the proposed approach is first to construct a reduced Rank-KPCA model that properly describes the system behavior in normal operating conditions from a large amount of training data, and then to monitor the system on-line. The principle of the proposed reduced Rank-KPCA is to eliminate the dependencies of variables in the feature space and to retain a reduced data set from the original one. The proposed monitoring method is applied to fault detection in a numerical example, a Continuous Stirred Tank Reactor, and the AIRLOR air quality-monitoring network, and is compared with conventional KPCA and Moving Window KPCA methods.

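Below is a minimal Python sketch of the kind of kernel-PCA monitoring scheme the abstract describes. It is a generic KPCA monitor with a Hotelling T² statistic, not the authors' Reduced Rank-KPCA: the rank-reduction step is only imitated by subsampling the training set, and the data, `gamma`, and the empirical control limit are illustrative assumptions.

```python
# Generic KPCA process monitor (sketch); not the paper's Reduced Rank-KPCA.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))                      # assumed normal operating data
X_test = np.vstack([rng.normal(size=(50, 4)),
                    rng.normal(loc=3.0, size=(50, 4))])  # second half simulates a fault

# "Reduced" training set: a subset cuts the kernel matrix size, mimicking rank reduction
idx = rng.choice(len(X_train), size=100, replace=False)
X_red = X_train[idx]

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit(X_red)
var = kpca.transform(X_red).var(axis=0)                  # score variances per component

def t2(X):
    """Hotelling T^2 statistic in the retained KPCA subspace."""
    T = kpca.transform(X)
    return np.sum(T ** 2 / var, axis=1)

limit = np.quantile(t2(X_red), 0.99)                     # empirical 99% control limit
alarms = t2(X_test) > limit
print(f"alarm rate, normal half: {alarms[:50].mean():.2f}, faulty half: {alarms[50:].mean():.2f}")
```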

2.
To address the shortcomings of traditional hyperspectral data classification methods, such as low classification accuracy and insufficient use of spatial information, a hyperspectral image classification algorithm based on Gabor spatial texture features, nonparametric weighted spectral features, and sparse representation classification is proposed, abbreviated GNWSF-SRC. The proposed GNWSF-SRC method first fuses the Gabor spatial features and nonparametric weighted spectral features of the hyperspectral data to better describe the hyperspectral image, then performs sparse representation on the result, and finally obtains the classification result by comparing reconstruction errors. Two typical hyperspectral data sets were processed with the proposed method under different training-set proportions; theoretical analysis and simulation results show that, compared with traditional classification methods, the proposed algorithm improves classification accuracy and the Kappa coefficient, achieving better classification performance.
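A hedged Python sketch of the sparse-representation-classification (SRC) step only: a test feature vector is coded over a dictionary of training samples and assigned to the class with the smallest reconstruction residual. The Gabor/nonparametric-weighted feature fusion is assumed to have produced the feature vectors already, and orthogonal matching pursuit stands in for whatever sparse solver the paper uses.

```python
# SRC by class-wise reconstruction residuals (sketch).
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def src_predict(D, labels, y, n_nonzero=10):
    """D: (d, N) dictionary whose columns are training feature vectors;
    labels: (N,) class of each column; y: (d,) test feature vector."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False).fit(D, y)
    w = omp.coef_
    residuals = {c: np.linalg.norm(y - D[:, labels == c] @ w[labels == c])
                 for c in np.unique(labels)}              # class-wise reconstruction error
    return min(residuals, key=residuals.get)

# toy usage with two synthetic spectral classes
rng = np.random.default_rng(1)
D = np.hstack([rng.normal(0, 1, (30, 20)), rng.normal(4, 1, (30, 20))])
labels = np.array([0] * 20 + [1] * 20)
print(src_predict(D, labels, rng.normal(4, 1, 30)))       # expected: 1
```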

3.
Stochastic ground motion simulation techniques are becoming increasingly popular because enhanced computational power enables direct simulation of complex response quantities. The Priestley process assumption is the most general approach for stochastic modeling of earthquake ground motion. However, a framework for multicomponent ground motion simulation using the general Priestley process assumption is not available. Multicomponent motions are useful especially when the correlation structure between them significantly influences the response. The present study proposes a framework for frequency-dependent principal component analysis (PCA), which facilitates Priestley process-based simulation of multicomponent ground motions. The study focuses only on the frequency-dependent PCA part, and the results show high dependency of the principal components/directions on the frequency bands of the signals. The present work also advocates that frequency-dependent PCA should be preferred to conventional PCA, as the former can address the issues related to the frequency-independent uniform modulation associated with the latter.
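A hedged Python sketch of the frequency-dependent PCA idea: band-pass the motion components, then take the leading eigenvector of the covariance within each band, so the principal direction is allowed to change with frequency. The two-component synthetic signal, filter design, and band edges are illustrative assumptions, not the paper's formulation.

```python
# Frequency-dependent PCA of a two-component signal (sketch).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
# two components whose correlation structure changes with frequency
low = np.sin(2 * np.pi * 1.0 * t)
high = np.sin(2 * np.pi * 10.0 * t)
X = np.column_stack([low + 0.3 * high, low - 0.3 * high]) + 0.1 * rng.normal(size=(len(t), 2))

for f1, f2 in [(0.5, 2.0), (8.0, 12.0)]:
    b, a = butter(4, [f1 / (fs / 2), f2 / (fs / 2)], btype="band")
    Xb = filtfilt(b, a, X, axis=0)
    w, V = np.linalg.eigh(np.cov(Xb.T))       # eigenvalues ascending; last column dominates
    print(f"{f1}-{f2} Hz principal direction: {V[:, -1].round(2)}")
# the low band gives roughly [0.71, 0.71]; the high band roughly [0.71, -0.71]
```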

4.
In the robust adaptive Kalman filtering algorithm, computing the robust equivalent weight matrix and the adaptive factor requires redundant and reliable observations. In dynamic deformation monitoring, however, the filter observations are usually only three-dimensional coordinates contaminated by strong noise and gross errors. This paper therefore studies and improves the computation of the adaptive factor and the robust equivalent weight matrix in the algorithm, and then applies the improved filter to GPS dynamic monitoring data from a highway slope. The results show that robust adaptive Kalman filtering can effectively resist the influence of abnormal observations in dynamic deformation monitoring.
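A hedged Python sketch of one robust/adaptive Kalman measurement update for coordinate observations: an IGG-style equivalent weight downweights outlying standardized innovations, and an innovation-based adaptive factor inflates the predicted covariance. The constants `k0`, `k1` and the exact weight/factor formulas are common textbook choices, not the paper's improved scheme; `R` is assumed diagonal.

```python
# One robust/adaptive Kalman measurement update (sketch, diagonal R assumed).
import numpy as np

def robust_adaptive_update(x_pred, P_pred, z, H, R, k0=1.5, k1=4.5):
    v = z - H @ x_pred                                    # innovation
    S = H @ P_pred @ H.T + R
    v_std = np.abs(v) / np.sqrt(np.diag(S))               # standardized innovation
    # IGG-style equivalent weights: 1 inside k0, tapered toward 0 beyond k1
    w = np.where(v_std <= k0, 1.0,
                 np.where(v_std <= k1,
                          k0 / np.maximum(v_std, 1e-12) * ((k1 - v_std) / (k1 - k0)) ** 2,
                          1e-8))
    R_eq = R / w                                          # inflate R for suspect observations
    r = float(v @ np.linalg.solve(S, v)) / len(v)         # innovation consistency ratio
    P_a = P_pred * max(r, 1.0)                            # adaptive inflation of prediction
    K = P_a @ H.T @ np.linalg.inv(H @ P_a @ H.T + R_eq)
    x = x_pred + K @ v
    P = (np.eye(len(x_pred)) - K @ H) @ P_a
    return x, P

# a 5.0 gross error in the third coordinate is effectively rejected
x, P = robust_adaptive_update(np.zeros(3), np.eye(3),
                              np.array([0.1, -0.2, 5.0]), np.eye(3), 0.04 * np.eye(3))
print(x.round(3))
```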

5.

Environmental flow requirements of estuaries have been ignored in the past, mostly because of the lack of long-term monitoring data or understanding of the responses to changes in freshwater inflow. In some cases, it was incorrectly assumed that the minimum flows determined for rivers would protect downstream processes; in others, the omission of environmental water requirement studies for estuaries was a result of the sectoral management of water resources or a lack of applicable legislation. Three countries, namely Australia, South Africa and the USA, have developed methods for estuaries through practical applications and a learning-by-doing approach. Recent methods take a holistic and adaptive standpoint and are presented as frameworks that include a number of steps and have elements of risk assessment and adaptive management. Most approaches are data rich and emphasize long-term monitoring. This review showed that, although methods are available, implementation is slow and will require strong governance structures, stakeholder participation, monitoring and feedback in an adaptive management cycle.
Editor Z.W. Kundzewicz; Guest editor M. Acreman

6.
The advent of 2D hydraulic modelling has improved our understanding of flood hydraulics, thresholds, and dynamic effects on floodplain geomorphology and riparian vegetation at the morphological-unit scale. Hydraulic concepts of bed shear stress, stream power maxima, and energy (cumulative stream power) have been used to characterize floods and define their geomorphic effectiveness. These hydraulic concepts were developed in the context of reach-averaged, 1D hydraulic analyses, but their application to 2D model results is problematic due to differences in the treatment of energy losses in 1D and 2D analyses. Here we present methods for estimating total and boundary resistance from 2D modelling of an extreme flood on a subtropical river. Hydraulic model results are correlated with observations of the flood impacts on floodplain geomorphology and the riparian vegetation to identify thresholds and compute variants of flood energy. Comparison of LiDAR data in 2011 and 2014 shows that the 2011 flood produced 2–4 m of erosion on floodplain bars that were previously forested or grass-covered. Deposition on flood levees, dunes, and chute bars was up to 3.4 m thick. Various hydraulic metrics were trialled as candidates for thresholds of vegetation disturbance. The accuracy of thresholds using metrics extracted at the flood peak (i.e. boundary resistance and stream power maxima) was similar to that using energy as a threshold. Disturbance to forest and grass on vegetated bars was associated with stream powers of >834 W/m2 and unit flows of >26 m2/s, respectively. Correlation of the hydraulic metrics with erosion and deposition depths showed no substantial improvement in using flood energy compared to metrics extracted at the flood peak for describing erosion and deposition. The extent of vegetation disturbances and morphological adjustments was limited for this extreme flood, and further 2D studies are needed to compare disturbance thresholds across different environments.
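Since the abstract's disturbance thresholds are simple functions of 2D model output, a short Python sketch shows how such thresholds might be mapped over depth and velocity rasters. The bed-shear and unit-stream-power formulas (tau = rho*g*h*S, omega = tau*V, q = h*V) are standard; the example grids and the uniform energy slope are assumptions.

```python
# Mapping stream-power and unit-flow disturbance thresholds (sketch).
import numpy as np

rho, g = 1000.0, 9.81                            # water density (kg/m^3), gravity (m/s^2)
depth = np.array([[2.0, 4.5], [6.0, 1.0]])       # flow depth h (m) from a 2D model (toy)
speed = np.array([[1.5, 3.0], [4.0, 0.8]])       # depth-averaged velocity V (m/s) (toy)
slope = 0.002                                    # energy slope S, assumed uniform here

tau = rho * g * depth * slope                    # bed shear stress tau = rho*g*h*S (Pa)
omega = tau * speed                              # unit stream power omega = tau*V (W/m^2)
q = depth * speed                                # unit discharge q = h*V (m^2/s)

forest_disturbed = omega > 834.0                 # forest threshold from the abstract
grass_disturbed = q > 26.0                       # grass (unit-flow) threshold from the abstract
print(omega.round(1), forest_disturbed, grass_disturbed, sep="\n")
```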

7.
Analytical models prepared from field drawings do not generally provide results that match experimental results. The error may be due to uncertainties in material properties, member sizes, and errors in the modelling process. It is important to improve analytical models using experimentally obtained data. For the past several years, data obtained from ambient vibration testing have been successfully used in many cases to update and match the dynamic behavior of analytical models with real structures. This paper presents a comparison between artificial neural network (ANN) and eigensensitivity-based model updating of an existing multi-story building. A simple spring-mass analytical model, developed from the structural drawings of the building, is considered, and the corresponding spring stiffness and lumped mass of all floors are chosen as updating parameters. The advantages and disadvantages of these updating methods are discussed. The advantage is that both methods ensure a physically meaningful model which can be further employed in determining structural response and in health monitoring.
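A hedged Python sketch of the eigensensitivity half of such an updating exercise, on a 2-DOF spring-mass (shear building) model: story stiffnesses are iterated so that the model's eigenvalues match "measured" ones, using the classical sensitivity d(lambda_i)/d(k_j) = phi_i^T (dK/dk_j) phi_i for mass-normalized modes. All numbers are illustrative, and the ANN alternative is not reproduced.

```python
# Eigensensitivity-based stiffness updating for a 2-DOF shear model (sketch).
import numpy as np
from scipy.linalg import eigh

M = np.diag([2.0e3, 2.0e3])                              # lumped floor masses (kg)

def K_of(k):                                             # shear-building stiffness matrix
    k1, k2 = k
    return np.array([[k1 + k2, -k2], [-k2, k2]])

dK = [np.array([[1.0, 0.0], [0.0, 0.0]]),                # dK/dk1
      np.array([[1.0, -1.0], [-1.0, 1.0]])]              # dK/dk2

lam_meas = eigh(K_of([1.2e6, 0.8e6]), M, eigvals_only=True)  # "measured" eigenvalues (toy)

k = np.array([1.0e6, 1.0e6])                             # initial drawing-based stiffnesses
for _ in range(10):
    lam, phi = eigh(K_of(k), M)                          # phi is mass-normalized
    S = np.array([[phi[:, i] @ dK[j] @ phi[:, i] for j in range(2)] for i in range(2)])
    k = k + np.linalg.solve(S, lam_meas - lam)           # Gauss-Newton step on eigenvalues
print(k.round(0))                                        # converges to ~[1.2e6, 0.8e6]
```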

8.
Groundwater is a vital water supply worldwide for people and nature. However, species and ecosystems that depend on groundwater for some or all of their water needs, known as groundwater dependent ecosystems (GDEs), are increasingly threatened worldwide by growing human water demands. Over the past two decades, the protection and management of GDEs have been incorporated into several water management policy initiatives worldwide, including jurisdictions within Australia, the European Union, South Africa, and the United States. Among these, Australia has implemented the most comprehensive framework to manage and protect GDEs through its water policy initiatives. Using a science-based approach, Australia has made good progress at reducing uncertainty when selecting management thresholds for GDEs in its water management plans. This has been achieved by incorporating appropriate metrics for GDEs into water monitoring programs so that information gathered over time can inform management decisions. This adaptive management approach is also accompanied by the application of the "Precautionary Principle" in cases where insufficient information on GDEs exists. Additionally, the integration of risk assessment into Australia's approach has enabled water managers to prioritize the most valuable and vulnerable ecological assets necessary to manage GDEs under Australia's national sustainable water management legislation. The purpose of this paper is to: (1) compare existing global policy initiatives for the protection and management of GDEs; (2) synthesize Australia's adaptive management approach to GDEs in its state water plans; and (3) highlight opportunities and challenges of applying Australia's approach to managing GDEs under other water management policies worldwide.

9.
This paper presents an extensive simulation tool based on a Cellular Automata (CA) system that models fundamental seismic characteristics of a region. The CA-based dynamic model consists of cells carrying charges and is used to simulate the earthquake process. The simulation tool has remarkably accelerated the response of the model by incorporating principles of High Performance Computing (HPC). Extensive parallel-computing features have been applied, improving its processing effectiveness. The tool implements an enhanced (or hyper-) 2-dimensional version of the proposed CA model. Regional characteristics that depend on the seismic background of the area under study are assigned to the model through a user-friendly software environment. The model is evaluated with real data corresponding to a circular region around Skyros Island, Greece, for different time periods, for example a 45-year period (1901–1945). The enhanced 2-dimensional version of the model incorporates all principal characteristics of the 2-dimensional one, also including groups of CA cells that interact with other cells located at a considerable distance, in an attempt to simulate long-range interaction. The advanced simulation tool has been thoroughly evaluated. Several measurements have been made for different critical states, as well as for various cascade (earthquake) sizes, cell activities and neighbourhood sizes. Simulation results qualitatively approach the Gutenberg–Richter (GR) scaling law and reveal fundamental characteristics of the system.
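A hedged Python sketch in the same spirit as the paper's cell-charge CA, using an Olami–Feder–Christensen-style rule: cells accumulate charge, topple to von Neumann neighbours above a threshold, and cascade sizes are tallied. It illustrates the kind of Gutenberg–Richter-like frequency-size scaling the paper evaluates, not the paper's exact model, parameters, or long-range coupling.

```python
# OFC-style cellular automaton producing power-law-like cascade sizes (sketch).
import numpy as np

rng = np.random.default_rng(3)
N, alpha, thresh = 64, 0.2, 1.0              # grid size, transfer ratio, toppling threshold
grid = rng.uniform(0, thresh, (N, N))
sizes = []

for _ in range(20000):
    i, j = np.unravel_index(grid.argmax(), grid.shape)
    grid += thresh - grid[i, j]              # uniform drive until the next failure
    active, size = [(i, j)], 0
    while active:
        i, j = active.pop()
        if grid[i, j] < thresh:
            continue
        load, grid[i, j], size = grid[i, j], 0.0, size + 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:
                grid[ni, nj] += alpha * load # transfer a fraction of the released charge
                active.append((ni, nj))
    sizes.append(size)

# log-binned frequency-size counts; a roughly power-law tail echoes the GR law
hist, _ = np.histogram(sizes, bins=np.logspace(0, 4, 15))
print(hist)
```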

10.
In this work, a fully nonparametric geostatistical approach to estimate threshold exceeding probabilities is proposed. To estimate the large-scale variability (spatial trend) of the process, the nonparametric local linear regression estimator, with the bandwidth selected by a method that takes the spatial dependence into account, is used. A bias-corrected nonparametric estimator of the variogram, obtained from the nonparametric residuals, is proposed to estimate the small-scale variability. Finally, a bootstrap algorithm is designed to estimate the unconditional probabilities of exceeding a threshold value at any location. The behavior of this approach is evaluated through simulation and with an application to a real data set.
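A hedged Python sketch of the two nonparametric ingredients: a local linear estimate of the spatial trend and an empirical (not yet bias-corrected) semivariogram of the resulting residuals. The Gaussian kernel, the fixed bandwidth `h`, and the synthetic data are assumptions; the paper selects the bandwidth accounting for spatial dependence and adds a bootstrap on top.

```python
# Local linear spatial trend + residual semivariogram (sketch).
import numpy as np

rng = np.random.default_rng(4)
s = rng.uniform(0, 10, (200, 2))                          # sampling locations
z = np.sin(s[:, 0] / 3) + 0.1 * s[:, 1] + 0.2 * rng.normal(size=200)

def local_linear(s0, s, z, h=1.5):
    """Local linear regression estimate of the spatial trend at s0."""
    d = s - s0
    w = np.exp(-0.5 * (np.linalg.norm(d, axis=1) / h) ** 2)   # Gaussian kernel weights
    sw = np.sqrt(w)
    X = np.column_stack([np.ones(len(s)), d])
    beta, *_ = np.linalg.lstsq(X * sw[:, None], z * sw, rcond=None)
    return beta[0]                                        # fitted trend value at s0

resid = z - np.array([local_linear(si, s, z) for si in s])

# empirical semivariogram of the residuals in distance bins of width 0.5
lags = np.linspace(0.5, 5.0, 10)
dists = np.linalg.norm(s[:, None] - s[None, :], axis=2)
sq = 0.5 * (resid[:, None] - resid[None, :]) ** 2
gamma = [sq[(dists > a - 0.25) & (dists <= a + 0.25)].mean() for a in lags]
print(np.round(gamma, 3))
```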

11.
A new approach for streamflow simulation using nonparametric methods was described in a recent publication (Sharma et al. 1997). Nonparametric methods have the advantage that they avoid the issue of selecting a probability distribution and can represent nonlinear features, such as asymmetry and bimodality, that were hitherto difficult to represent in the probability structure of hydrologic variables such as streamflow and precipitation. The nonparametric method used was kernel density estimation, which requires the selection of bandwidth (smoothing) parameters. This study documents some of the tests that were conducted to evaluate the performance of bandwidth estimation methods for kernel density estimation. Issues related to the selection of optimal smoothing parameters for kernel density estimation with small samples (200 or fewer data points) are examined. Both reference to a Gaussian density and data-based specifications are applied to estimate bandwidths for samples from bivariate normal mixture densities. The three data-based methods studied are Maximum Likelihood Cross Validation (MLCV), Least Square Cross Validation (LSCV) and Biased Cross Validation (BCV2). Modifications for estimating optimal local bandwidths using MLCV and LSCV are also examined. We found that the use of local bandwidths does not necessarily improve the density estimate with small samples. Of the global bandwidth estimators compared, we found that MLCV and LSCV are better because they show lower variability and higher accuracy, while Biased Cross Validation suffers from multiple optimal bandwidths for samples from strongly bimodal densities. These results, of particular interest in stochastic hydrology where small samples are common, may have importance in other applications of nonparametric density estimation methods with similar sample sizes and distribution shapes.
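As a concrete illustration of one of the three selectors, here is a hedged Python sketch of maximum-likelihood cross-validation (MLCV) for a univariate Gaussian kernel density estimate: the bandwidth maximizes the leave-one-out log-likelihood. The bivariate mixture setting and the LSCV/BCV2 comparisons of the paper are not reproduced.

```python
# MLCV bandwidth selection for a Gaussian KDE (sketch).
import numpy as np

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(3, 0.5, 100)])  # bimodal sample

def mlcv_score(x, h):
    """Leave-one-out log-likelihood of a Gaussian KDE with bandwidth h."""
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(K, 0.0)                   # leave each point out of its own estimate
    f = K.sum(axis=1) / (len(x) - 1)
    return np.log(f).sum()

grid = np.linspace(0.05, 1.5, 60)
h_opt = grid[np.argmax([mlcv_score(x, h) for h in grid])]
print(f"MLCV bandwidth: {h_opt:.3f}")
```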

12.
There are two basic approaches for estimating flood quantiles: parametric and nonparametric methods. In this study, comparisons of parametric and nonparametric models for the annual maximum flood data of the Goan gauging station in Korea were performed based on Monte Carlo simulation. In order to consider uncertainties that can arise from model and data errors, kernel density estimation for fitting the sampling distributions was chosen to determine safety factors (SFs) that depend on the probability model used to fit the real data. The relative biases of the Sheather and Jones plug-in (SJ) are the smallest in most cases among the seven bandwidth selectors applied. The relative root mean square errors (RRMSEs) of the Gumbel (GUM) are smaller than those of any other models regardless of the parent models considered. When the Weibull-2 is assumed as a parent model, the RRMSEs of kernel density estimation are relatively small, while for other parent models they are much bigger than those of parametric methods. However, the RRMSEs of kernel density estimation within the interpolation range are much smaller than those for the extrapolation range in comparison with those of parametric methods. Among the applied distributions, the GUM model has the smallest SFs for all parent models, and the general extreme value model has the largest values for all parent models considered.
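A hedged Python sketch of the core comparison: a parametric (Gumbel, fitted by the method of moments) versus a nonparametric (Gaussian KDE) estimate of the 100-year quantile from an annual-maximum series. The synthetic record, Scott's-rule bandwidth, and grid-inverted KDE CDF are assumptions; the paper's Monte Carlo safety-factor machinery is not reproduced.

```python
# Parametric (Gumbel) vs nonparametric (KDE) flood quantile (sketch).
import numpy as np
from scipy.stats import gumbel_r, gaussian_kde

rng = np.random.default_rng(6)
amax = gumbel_r.rvs(loc=500, scale=150, size=40, random_state=rng)  # synthetic annual maxima

T = 100.0                                        # return period (years)
p = 1 - 1 / T                                    # non-exceedance probability

# Gumbel by method of moments
scale = np.sqrt(6) * amax.std(ddof=1) / np.pi
loc = amax.mean() - 0.5772 * scale               # Euler-Mascheroni constant
q_gum = gumbel_r.ppf(p, loc=loc, scale=scale)

# KDE quantile via the inverse of the estimated CDF on a fine grid
kde = gaussian_kde(amax)                         # Scott's rule bandwidth by default
xs = np.linspace(amax.min() - 300, amax.max() + 600, 2000)
cdf = np.cumsum(kde(xs)); cdf /= cdf[-1]
q_kde = np.interp(p, cdf, xs)

print(f"Q100 Gumbel: {q_gum:.0f}, Q100 KDE: {q_kde:.0f}")
```

The gap between the two estimates at return periods much longer than the record mirrors the abstract's finding that kernel density estimation does relatively poorly in the extrapolation range.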

13.
Recent climate change represents one of the most serious anthropogenic threats to lake ecosystems in Canada. As meteorological and hydrological conditions are altered by climate change, so too are the physical, chemical and biological properties of lakes. The ability to quantify the impact of climate change on the physical properties of lakes represents an integral step in estimating future chemical and biological change. To that end, we have used the dynamic reservoir simulation model, a one-dimensional vertical heat transfer and mixing model, to hindcast and compare lake temperature-depth profiles against 30 years of long-term monitoring data in Harp Lake, Ontario. These temperature profiles were used to calculate annual (June–September) thermal stability values from 1979 to 2009. Comparisons between measured and modelled lake water temperature and thermal stability over three decades showed strong correlation (r2 > 0.9). However, despite significant increases in modelled thermal stability over the 30 year record, we found no significant change in the timing of the onset, breakdown or duration of thermal stratification. Our data suggest that increased air temperature and decreased wind are the primary drivers of enhanced stability in Harp Lake since 1979. The high predictive ability of the Harp Lake dynamic reservoir simulation model suggests that its use as a tool in future lake management projects is appropriate. Copyright © 2013 John Wiley & Sons, Ltd.

14.
In this study, multivariate statistical methods including principal component analysis (PCA)/factor analysis (FA) and cluster analysis (CA) were applied to surface water quality data sets obtained from the Huaihe River segment of Bengbu (HRSB), generated during two years (2011–2012) of monitoring of 19 parameters at 7 sampling sites. The PCA results for the 7 sampling sites revealed that the first four components explained 94.89% of the total variance in the HRSB data sets. The principal components (factors) obtained from FA indicated that water quality variations were mainly related to heavy metals (Pb, Mn, Zn and Fe) and organic-related parameters (COD, PI and DO). The results revealed that the major causes of water quality deterioration were the inflow of industrial, domestic and agricultural effluents into the Huaihe River. Three significant groups of sampling locations were detected on the basis of similarity of their water quality: (sites 2, 3 and 4), (sites 1 and 5) and (sites 6 and 7). These methods are thus believed to be valuable in helping water resources managers understand the complex nature of water quality issues and determine priorities for improving water quality.
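A hedged Python sketch of the PCA/CA workflow on a sites-by-parameters matrix: standardize, extract components and loadings, then group sites by Ward hierarchical clustering. The data are synthetic stand-ins for the HRSB records.

```python
# PCA + hierarchical clustering on a water-quality matrix (sketch).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
X = rng.normal(size=(7, 19))                 # 7 sites x 19 parameters (synthetic)
X[1:4] += 2.0                                # make sites 2-4 resemble each other

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=4).fit(Z)
print("explained variance:", pca.explained_variance_ratio_.round(2))
print("PC1 loadings:", pca.components_[0].round(2))   # parameters driving the variation

groups = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
print("site groups:", groups)                # analogous to the paper's three site groups
```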

15.
Keith Beven was amongst the first to propose and demonstrate a combination of conceptual rainfall–runoff modelling and stochastically generated rainfall data in what is known as the ‘continuous simulation’ approach for flood frequency analysis. The motivations included the potential to establish better links with physical processes and to avoid restrictive assumptions inherent in existing methods applied in design flood studies. Subsequently, attempts have been made to establish continuous simulation as a routine method for flood frequency analysis, particularly in the UK. The approach has not been adopted universally, but numerous studies have benefitted from applications of continuous simulation methods. This paper asks whether industry has yet realized the vision of the pioneering research by Beven and others. It reviews the generic methodology and illustrates applications of the original vision for a more physically realistic approach to flood frequency analysis through a set of practical case studies, highlighting why continuous simulation was useful and appropriate in each case. The case studies illustrate how continuous simulation has helped to offer users of flood frequency analysis more confidence about model results by avoiding (or exposing) bad assumptions relating to catchment heterogeneity, inappropriateness of assumptions made in (UK) industry‐standard design event flood estimation methods, and the representation of engineered or natural dynamic controls on flood flows. By implementing the vision for physically realistic analysis of flood frequency through continuous simulation, each of these examples illustrates how more relevant and improved information was provided for flood risk decision‐making than would have been possible using standard methods. They further demonstrate that integrating engineered infrastructure into flood frequency analysis and assessment of environmental change are also significant motivations for adopting the continuous simulation approach in practice. Copyright © 2016 John Wiley & Sons, Ltd.
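A hedged Python toy of the continuous-simulation chain the review describes: stochastic daily rainfall feeds a one-parameter linear-reservoir rainfall-runoff model, annual maxima are extracted, and an empirical frequency curve is read off. Real applications use calibrated hydrological models and far richer stochastic rainfall generators.

```python
# Toy continuous simulation for flood frequency analysis (sketch).
import numpy as np

rng = np.random.default_rng(8)
years, k = 1000, 0.2                          # simulated years, reservoir constant (1/day)
rain = rng.exponential(4.0, size=years * 365) * (rng.random(years * 365) < 0.3)  # mm/day

q, storage = np.empty(rain.size), 0.0
for i, r in enumerate(rain):                  # linear reservoir: outflow = k * storage
    storage += r
    q[i] = k * storage
    storage -= q[i]

amax = np.sort(q.reshape(years, 365).max(axis=1))   # annual maximum series, ascending
T = (years + 1) / (years - np.arange(years))        # Weibull plotting-position return periods
for target in (2, 10, 100):
    print(f"Q{target}: {np.interp(target, T, amax):.1f}")
```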

16.
Micropurge sampling of ground water wells has been suggested as a possible replacement for traditional purge and sample methods. To compare methods, duplicate ground water samples were collected at two field sites using traditional and micropurge methods. Samples were analyzed for selected organic and inorganic constituents, and the results were compared statistically. Analysis of the data using the nonparametric sign test indicates that, within a 95 percent confidence interval, there was no significant difference between the two methods for the site contaminants and the majority of analytes. These analytical results were supported by visual observations with the colloidal borescope, which demonstrated impacts on the flow system in the well when using traditional sampling methods. Under selected circumstances, the results suggest replacing traditional sampling with micropurging based on reliability, cost, and waste minimization.
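A hedged Python sketch of the nonparametric sign test referred to in the abstract: under the null hypothesis of no difference between methods, the sign of each paired difference is a fair coin flip, so the exact two-sided p-value follows a binomial. The concentrations below are illustrative, not the study's data.

```python
# Exact two-sided sign test on paired sampling results (sketch).
from math import comb

trad = [12.1, 8.4, 15.0, 9.9, 11.2, 7.8, 13.5, 10.1]   # traditional purge-and-sample
micro = [11.8, 8.6, 14.7, 9.9, 11.0, 7.5, 13.1, 10.4]  # micropurge duplicates

diffs = [t - m for t, m in zip(trad, micro) if t != m]  # ties are dropped
n, k = len(diffs), sum(d > 0 for d in diffs)

tail = min(k, n - k)                                    # two-sided exact binomial p-value
p = min(1.0, sum(comb(n, i) for i in range(tail + 1)) / 2 ** (n - 1))
print(f"n={n}, positives={k}, p={p:.3f}")               # p > 0.05: no significant difference
```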

17.
The dynamic identification of a historical masonry palace located in Benevento (Italy) has been carried out. The case study is representative of many buildings located in historic Italian centres. Since the building has been instrumented by the Department of Civil Protection with a permanent dynamic monitoring system, some of the recorded data, acquired in various operating conditions, have been analysed with basic instruments of Operational Modal Analysis in order to identify the main eigenfrequencies and vibration modes of the structure. The experimental results have been compared with the numerical outcomes provided by a detailed three-dimensional Finite Element (FE) model of the building in which Soil–Structure Interaction (SSI) has been taken into account. The comparison of experimental and numerical frequencies and vibration modes of the palace evidenced the role exerted by the subsoil in the dynamic response of the building.
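A hedged Python sketch of the most basic operational-modal-analysis step on such monitoring records: estimate the power spectral density of an ambient acceleration channel with Welch's method and pick spectral peaks as candidate eigenfrequencies. The signal is synthetic; mode shapes, stabilization checks, and SSI effects are beyond this snippet.

```python
# Welch PSD + peak picking for candidate eigenfrequencies (sketch).
import numpy as np
from scipy.signal import welch, find_peaks

fs = 100.0
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(9)
# synthetic ambient response with modes near 2.1 and 5.6 Hz, buried in noise
acc = (np.sin(2 * np.pi * 2.1 * t) + 0.5 * np.sin(2 * np.pi * 5.6 * t)
       + 0.8 * rng.normal(size=t.size))

f, Pxx = welch(acc, fs=fs, nperseg=4096)
peaks, _ = find_peaks(Pxx, prominence=0.1 * Pxx.max())
print("candidate eigenfrequencies (Hz):", f[peaks].round(2))
```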

18.
Mode-superposition has been extensively used in computing the dynamic response of complex structures. Two versions of mode-superposition, namely the mode-displacement method and the mode-acceleration method, have been employed. The present paper summarizes the results of a systematic study comparing the accuracy of the mode-displacement and mode-acceleration methods when applied to structures with various levels of damping or various excitation frequencies. The paper also discusses several details concerning the implementation of the mode-acceleration method.
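For orientation, the two truncated expansions are usually written as follows for a damped system M u'' + C u' + K u = F(t), with mass-normalized mode shapes phi_i, modal coordinates q_i(t), damping ratios zeta_i, and natural frequencies omega_i. These are the standard textbook forms and may differ from the paper's notation:

```latex
u(t) \approx \sum_{i=1}^{m} \phi_i \, q_i(t)
\quad \text{(mode-displacement)}

u(t) \approx K^{-1} F(t) \;-\; \sum_{i=1}^{m} \phi_i \,
      \frac{\ddot{q}_i(t) + 2 \zeta_i \omega_i \dot{q}_i(t)}{\omega_i^{2}}
\quad \text{(mode-acceleration)}
```

The second form retains the quasi-static contribution of all truncated modes through the K^{-1}F(t) term, which is why it is typically more accurate when the excitation is slow relative to the truncated modes.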

19.
Ground water monitoring networks can provide vital information for sustainable water resources management. This involves the measurement of ground water level, solute concentration, or both; this article deals with the former. It optimizes the network distribution of piezometers or data sampling wells to effectively monitor ground water levels under an irrigation region while retaining adequate overall measurement accuracy. This article presents a structured process for applying principal component analysis (PCA) to optimize a ground water monitoring network in an irrigation area of Australia. The PCA functions distributed with the MATLAB package were used to determine the relative contributions of individual piezometers in capturing the spatiotemporal variation of ground water levels. A kriging gridding interpolation algorithm was used to render the data surface presentations and determine spatial differences in piezometric surfaces using different numbers of data sets. The results show that the overall difference in ground water level between the original piezometer network and the optimized network after the PCA process is less than 20%, while the total number of piezometers in the optimized network is reduced by 63%, which will save time and cost in monitoring ground water levels in the irrigation area.
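A hedged Python sketch of one way to score and prune piezometers with PCA: wells are ranked by the leading-component variance they carry, and only the top fraction is retained. The synthetic water-level matrix, the communality-style score, and the retention fraction are assumptions; the paper's MATLAB/kriging comparison of surfaces is not redone.

```python
# PCA-based reduction of a piezometer network (sketch).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
n_t, n_w = 120, 30                            # monthly records x piezometers (synthetic)
common = np.cumsum(rng.normal(size=(n_t, 3)), axis=0)    # shared regional signals
levels = common @ rng.normal(size=(3, n_w)) + 0.3 * rng.normal(size=(n_t, n_w))

pca = PCA(n_components=3).fit(levels)
# contribution of each well: variance-weighted squared loadings (communality-style)
score = (pca.explained_variance_[:, None] * pca.components_ ** 2).sum(axis=0)
keep = np.argsort(score)[::-1][: int(0.37 * n_w)]        # keep ~37% (cf. the 63% cut)
print("retained piezometers:", np.sort(keep))
```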
