Similar references
1.
The statistical analysis of compositional data is of fundamental importance to practitioners in general and to chemists in particular. The existing methodology is principally due to Aitchison, who effectively uses two transformations, a ratio followed by the logarithmic, to create a useful, coherent theory that in principle allows the plethora of normal-based multivariate techniques to be used on the transformed data. This paper suggests that the well-known class of Box-Cox transformations can be employed in place of the logarithmic to significantly improve the existing methodology. This is supported in part by showing that one of the most basic problems that Aitchison managed to overcome, namely the specification of an interpretable covariance structure for compositional data, can be resolved, or nearly resolved, once the ratio transformation has been applied. Hence the resolution is not directly dependent on the logarithmic transformation. It is then verified that access to the general Box-Cox family will allow a more accurate use of the normal-based multivariate techniques, simply because better fits to normality can be achieved. Finally, maximum likelihood estimation and some associated asymptotics are employed to construct confidence intervals for ratios of the true, unknown compositional constituents. Heretofore this had not been done even in the context of the logarithmic transformation. Applications to real data are presented.
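The ratio-then-transform scheme is easy to sketch. The snippet below is a minimal illustration, not the paper's estimator: it forms ratios against the last component and applies a Box-Cox transform, which reduces to Aitchison's additive log-ratio when the power parameter is zero. The example composition is invented.

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform; reduces to the log transform as lam -> 0."""
    y = np.asarray(y, dtype=float)
    if abs(lam) < 1e-12:
        return np.log(y)
    return (y ** lam - 1.0) / lam

# A compositional vector (parts summing to 1); take ratios with respect to
# the last part, then Box-Cox in place of the pure logarithm.
x = np.array([0.6, 0.3, 0.1])
ratios = x[:-1] / x[-1]
z_log = box_cox(ratios, 0.0)   # Aitchison's additive log-ratio coordinates
z_bc = box_cox(ratios, 0.5)    # a Box-Cox alternative (lam = 0.5)
```

Choosing the power parameter by maximum likelihood, as the abstract describes, would replace the fixed `0.5` above with an estimated value.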

2.
Care is required in multicomponent analysis if misleading results are to be avoided. The problem of ill-conditioned calibration matrices is of primary concern. This type of numerical instability manifests itself as spectral overlap of the calibration spectra. Depending on the degree of spectral overlap, the sample concentration estimates can be severely affected. A practical statistical procedure is discussed which tests for the presence of spectral overlap among the pure-component spectra and simultaneously assesses the degree to which the concentration estimates may be degraded. Guidelines are developed to ascertain how much departure from spectral orthogonality is acceptable.
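A standard numerical diagnostic for this kind of ill-conditioning (not the paper's specific statistical test) is the condition number of the calibration matrix; the two pure-component spectra below are invented for illustration.

```python
import numpy as np

# Two pure-component spectra sampled at 5 wavelengths; the second is a
# slightly shifted copy of the first, so the columns overlap strongly.
K = np.array([[1.0, 0.90],
              [0.8, 0.85],
              [0.5, 0.60],
              [0.2, 0.30],
              [0.1, 0.10]])

cond = np.linalg.cond(K)  # ratio of largest to smallest singular value
# Orthogonal (non-overlapping) spectra would give cond == 1; large values
# mean small measurement errors are amplified in the concentration estimates.
```

The condition number bounds the relative error amplification in the least-squares concentration estimates, which is why spectral overlap degrades them.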

3.
Each eigenvector of the dispersion matrix [X]^T[X] was shown to be a partial predictor of the original data matrix [X], the sum of the predictions from the individual principal components being equal to the expectation of [X]. By comparing the distributions of the members of two neighbouring predicted matrices, [X](1...i) and [X](1...i+1) (i.e. the sums of the first i and i+1 individual predictions respectively), it was shown that they should be indistinguishable provided that i is equal to or greater than the effective rank of [X], and significantly different otherwise. This was confirmed by analysing the visible absorption spectra of methyl orange and methyl red solutions as well as the Raman spectra of Na2SO4 and MgSO4 solutions. On the grounds of these findings, a non-parametric goodness-of-fit test for assessing the effective rank of [X] was proposed which proved to be comparatively conservative and more robust than most currently used tests.
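The idea that the rank-i prediction stops changing once i reaches the effective rank can be checked directly with a truncated SVD; the sketch below uses a synthetic noiseless two-component mixture matrix rather than the spectra analysed in the paper.

```python
import numpy as np

def pca_partial_prediction(X, i):
    """Sum of the first i principal-component predictions of X (rank-i SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :i] * s[:i] @ Vt[:i, :]

# A noiseless two-component mixture: effective rank exactly 2.
rng = np.random.default_rng(0)
C = rng.random((6, 2))   # concentrations of two components in 6 samples
S = rng.random((2, 5))   # two pure spectra at 5 wavelengths
X = C @ S

X2 = pca_partial_prediction(X, 2)
X3 = pca_partial_prediction(X, 3)
# At and beyond the effective rank, neighbouring predictions coincide.
```

With noisy data the neighbouring predictions differ slightly at every i, which is why the paper compares their distributions with a statistical goodness-of-fit test rather than exact equality.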

4.
The direct trilinear decomposition method (DTDM) is an algorithm for performing quantitative curve resolution of three-dimensional data that follow the so-called trilinear model, e.g. chromatography-spectroscopy or emission-excitation fluorescence. Under certain conditions complex eigenvalues and eigenvectors emerge when the generalized eigenproblem is solved in DTDM. Previous publications never treated those cases. In this paper we show how similarity transformations can be used to eliminate the imaginary part of the complex eigenvalues and eigenvectors, thereby increasing the usefulness of DTDM in practical applications. The similarity transformation technique was first used by our laboratory to solve the similar problem in the generalized rank annihilation method (GRAM). Because unique elution profiles and spectra can be derived by using data matrices from three or more samples simultaneously, DTDM with similarity transformations is more efficient than GRAM in the case where there are many samples to be investigated.

5.
王强  么枕生 《地理学报》1990,45(3):363-372
This paper derives theoretically the choice of the maximum lag M in spectral analysis, and establishes, both theoretically and by practical computation, the relationship between sample length and the hidden periods contained in the sample. Provided the sampling is stationary, the sample need not be especially long: a length only slightly exceeding the hidden period is sufficient. Spectral analysis is by no means all-powerful, however. If the sample contains only two oscillations, they can be resolved by spectral analysis only when their periods differ by an integer multiple or more; otherwise the hidden periods will be distorted. In practical applications it is therefore necessary to apply some filtering and to remove trends. These results should be a useful reference for researchers in meteorology, geography and neighbouring disciplines.
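The resolution constraint discussed above, that two oscillations can only be separated when the record is long enough relative to their frequency difference, can be illustrated with a plain periodogram; the frequencies and record length below are invented.

```python
import numpy as np

# Two sinusoids are resolvable in the periodogram only if the record length
# exceeds the reciprocal of their frequency difference (the Rayleigh limit).
n = 200
t = np.arange(n)
f1, f2 = 0.10, 0.12            # cycles per sample
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

spec = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(n)
# Frequency resolution ~ 1/n = 0.005 < |f2 - f1| = 0.02: two distinct peaks.
```

Halving the record to n = 50 would put the two frequencies within one resolution cell, and the periodogram would show a single distorted peak, which is the failure mode the abstract warns about.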

6.
Research on texture analysis methods for remote-sensing imagery
Image texture is important information in remote-sensing imagery; combining the original spectral information with texture information can improve classification accuracy. As an important means of improving image classification accuracy, texture analysis must be applied appropriately and effectively, since different texture analysis methods improve classification accuracy to different degrees. Current approaches to image texture analysis fall into three classes: statistical methods, structural methods and spectral methods. The research progress and applications of these three classes of methods are reviewed, and a comparative study of the three modes of image texture analysis is given.
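As a concrete instance of the statistical class of methods, the sketch below computes a tiny gray-level co-occurrence matrix and one Haralick-style texture feature; the image, pixel offset and quantisation level are invented for illustration.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix for a single pixel offset (dx, dy),
    normalised to a probability distribution over gray-level pairs."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

# A 4x4 image quantised to 4 gray levels, with four uniform blocks.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
P = glcm(img)
i, j = np.indices(P.shape)
contrast = ((i - j) ** 2 * P).sum()  # Haralick contrast texture feature
```

In classification practice, features like this are computed in a moving window and appended to the spectral bands as extra channels.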

7.
Continuous digitalized signals such as spectra, electrophoregrams or chromatograms generally have a large number of data points and contain redundant information. It is therefore troublesome to perform discriminant analysis without any preliminary selection of variables. A procedure for the application of canonical discriminant analysis (CDA) on this kind of data is studied. CDA can be presented as a succession of two principal component analyses (PCAs). The first is performed directly on the raw data and gives PC scores. The second is applied on the gravity centres of each qualitative group assessed on the normalized PC scores. A stepwise procedure for selection of the relevant PC scores is presented. The method has been tested on an illustrative collection of 165 size-exclusion high-performance (SE-HPLC) chromatograms of proteins of wheat belonging to 55 genotypes and grown in three locations. The discrimination of the growing locations was performed using seven to nine PC scores and gave more than 86% accurate classifications of the samples both in the training sets and the verification sets. The genotypes were also rather well identified, with more than 85% of the samples correctly classified. The studied method gives a way of assessing relevant mathematical distances between digitalized signals according to qualitative knowledge of the samples.
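The two-PCA construction can be sketched in a few lines; this is a bare-bones reading of the scheme (without the stepwise score selection), using synthetic two-group data in place of the chromatograms.

```python
import numpy as np

def cda_via_two_pcas(X, groups, k):
    """Canonical variates via two successive PCAs: the first on the raw
    (centred) data, the second on the group centroids of the normalised
    PC scores.  k = number of retained PC scores."""
    # PCA 1: on the raw data; dividing by the singular values normalises
    # each PC score column to unit norm.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T / s[:k]
    # PCA 2: on the gravity centres of each qualitative group.
    cents = np.array([scores[groups == g].mean(axis=0)
                      for g in np.unique(groups)])
    _, _, Wt = np.linalg.svd(cents - cents.mean(axis=0), full_matrices=False)
    return scores @ Wt.T  # canonical variates

# Two well-separated synthetic groups of 10 samples each, 3 variables.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (10, 3)),
               rng.normal(3.0, 0.1, (10, 3))])
groups = np.array([0] * 10 + [1] * 10)
cv = cda_via_two_pcas(X, groups, k=2)
```

The first canonical variate separates the two groups, which is what makes the construction usable for classifying new signals by distance to group centres.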

8.
A city can be topologically represented as a connectivity graph, consisting of nodes representing individual spaces and links where the corresponding spaces intersect. It turns out in the space syntax literature that some defined topological metrics can capture human movement rates in individual spaces. In other words, the topological metrics are significantly correlated to human movement rates, and individual spaces can be ranked by the metrics for predicting human movement. However, this correlation has never been well justified. In this paper, we study the same issue by applying the weighted PageRank algorithm to the connectivity graph or space–space topology for ranking the individual spaces, and find surprisingly that: (1) the PageRank scores are better correlated to human movement rates than the space syntax metrics, and (2) the underlying space–space topology demonstrates small-world and scale-free properties. The findings provide a novel justification as to why space syntax, or topological analysis in general, can be used to predict human movement. We further conjecture that this kind of analysis is no more than predicting a drunkard's walk on a small-world and scale-free network.
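Weighted PageRank itself is straightforward to run on such a graph; the following is a generic power-iteration sketch on an invented four-space connectivity graph, not the authors' code.

```python
import numpy as np

def weighted_pagerank(W, d=0.85, tol=1e-12):
    """Weighted PageRank scores for a non-negative weight matrix W,
    where W[i, j] is the weight of the link from space i to space j."""
    n = W.shape[0]
    out = W.sum(axis=1, keepdims=True)
    # Row-stochastic transition matrix; dangling nodes jump uniformly.
    T = np.where(out > 0, W / np.where(out > 0, out, 1), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (r @ T)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# A toy connectivity graph of 4 spaces: space 0 is a hub linked to the
# other three (symmetric links, unit weights).
W = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], float)
r = weighted_pagerank(W)
# The well-connected hub space collects the highest score, mirroring the
# intuition that such spaces attract more movement.
```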

9.
Summary. Consider a sequence of, say, 10 to 20 vector observations in three-dimensional space. It is suspected that a few subsets of consecutive observations are made up of collinear points. The purpose of this paper is to construct a statistically based algorithm to find such linear segments and to assess their accuracy. A similar assessment is made for coplanar sets of points.
This algorithm is applied here to palaeomagnetic data and is claimed to be superior to previous methods of palaeomagnetic analysis in terms of completeness and balance of analysis, treatment of measurement errors and other sources of scatter, criteria for identification of linear and planar sets of points, and statistical rigour. Stability spectra, with statistically based confidence limits, are obtained as a by-product.
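The geometric core of detecting collinear or coplanar subsets, setting aside the paper's statistical machinery, is that the scatter matrix of mean-centred points loses rank; a minimal sketch with invented points:

```python
import numpy as np

def scatter_eigenvalues(pts):
    """Eigenvalues (ascending) of the scatter matrix of mean-centred points.
    For collinear points the two smallest are ~0; for coplanar points the
    smallest is ~0.  (The geometric idea only, not the paper's test.)"""
    X = pts - pts.mean(axis=0)
    return np.sort(np.linalg.eigvalsh(X.T @ X))

line = np.array([[0, 0, 0], [1, 1, 2], [2, 2, 4], [3, 3, 6]], float)
plane = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
ev_line = scatter_eigenvalues(line)
ev_plane = scatter_eigenvalues(plane)
```

A statistical version compares the small eigenvalues against the measurement-error level instead of against zero, which is what allows accuracy to be assessed.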

10.
Information theory makes it possible to judge and evaluate methods and results in chemical analysis. The obtained information can be expressed in different ways. One way is to define information as the decrease of uncertainty after analysis. Conditional probabilities are therefore considered when evaluating the information provided by qualitative analyses. However, the use of other information measures, such as the information gain, is often preferable. In multicomponent analysis the translation of information from signals to the amounts of the analytes has been investigated, along with the relevance of individual components. Information theory can also be applied to find the optimum experimental conditions. The evaluation of the properties of analytical methods by information theory has been proposed.
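The "decrease of uncertainty" definition is simply a difference of Shannon entropies; the toy qualitative test below, with invented prior and posterior probabilities, makes the arithmetic concrete.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Uncertainty about which of four analytes is present, before and after a
# qualitative test that rules out two of them (probabilities invented).
prior = [0.25, 0.25, 0.25, 0.25]      # 2 bits of uncertainty
posterior = [0.5, 0.5, 0.0, 0.0]      # 1 bit of uncertainty remains
gain = entropy(prior) - entropy(posterior)  # information provided: 1 bit
```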

11.
This report analyses the atmospheric electric field observations made at Zhongshan Station. It describes the basic observation programme and the mean characteristics of the surface atmospheric electric field, including spectral features on various time scales, seasonal variations, and the typical electric field signatures under different weather conditions.

12.
When full 3-D modelling is too costly or cumbersome, computations of 3-D elastic wave propagation in laterally heterogeneous, multilayered 2-D geological structures may enhance considerably our ability to predict strong ground motion for seismological and engineering purposes. Towards this goal, we extend the method based on the combination of the thin-layer finite-element and boundary-element methods (TLFE-BEM) and calculate windowed f-k spectra of the 3-D wavefield. The windowed f-k spectra are spatially localized spectra from which the local properties of the wavefield can be extracted. The TLFE-BEM is particularly suited for calculating the complete wavefield where surface waves are dominant in multilayered media. The computations are performed in the frequency domain, providing the f-k spectra directly. From the results for the 3-D wavefield excited by a point source in a 2-D multilayered, sloped structure, it can be said that the phase velocity of the fundamental-mode Rayleigh wave in a laterally heterogeneous multilayered medium, estimated from the windowed f-k spectra, varies with the location of the point source. For the model calculated in this article, the phase velocity varies between the value for the flat layered structure of the thick-layer side and that for the structure just under the centre of the window. The exact subsurface structure just under the centre of an array in a laterally heterogeneous medium cannot be obtained if we use f-k spectral analysis assuming a flat layered structure.

13.
A general framework for manipulating spectra as functions in traditional multivariate methods such as PCA and PLS is described. The functional representation is very convenient for compression, ensuring smoothness and continuity. There are two fundamentally different types of representations: (a) by functions and (b) by function coefficients. The use of coefficients is the most practical way of analysis.
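The coefficient representation amounts to projecting each digitised spectrum onto a smooth basis and keeping only the coefficients; the sketch below uses a Chebyshev basis and an invented Gaussian "peak" purely for illustration.

```python
import numpy as np

# Represent a digitised spectrum by coefficients in a smooth basis
# (Chebyshev polynomials here; the basis and degree are illustrative).
x = np.linspace(-1, 1, 200)
spectrum = np.exp(-(x - 0.2) ** 2 / 0.05)  # one smooth synthetic "peak"

coeffs = np.polynomial.chebyshev.chebfit(x, spectrum, deg=20)
recon = np.polynomial.chebyshev.chebval(x, coeffs)
err = np.max(np.abs(recon - spectrum))
# 200 data points compressed to 21 coefficients with small error; the
# coefficient vectors can then be fed to PCA or PLS in place of the raw data.
```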

14.
This paper presents a review of the rapid developments that have taken place over the last few years in the searching of databases of three-dimensional (3D) molecules. The geometric arrangement of the atoms in a 3D molecule is described by an interatomic distance matrix. This is a form of labelled graph that can thus be searched using the subgraph-isomorphism algorithms that are widely used for searching databases of two-dimensional (2D) molecules. Several in-house and commercial systems have been developed for 3D database searching that are based on such techniques. These systems are reviewed and their effectiveness demonstrated by examples of their use in the discovery of novel, biologically active molecules. Current systems represent a molecule by one or a small number of low-energy conformations and there is hence much interest in the development of representational techniques and searching algorithms that account for the full set of geometric arrangements that can be adopted by a flexible molecule.

15.
Summary. Teleseismic P and S arrival times to North American stations are obtained from the ISC bulletins for the 10-yr period 1964–73, and relative travel-time delays are calculated with respect to standard tables. Station anomalies as well as variations of the delays with azimuth and epicentral distance from station are analysed, and the location of the velocity anomalies responsible for them is discussed. Inversion of the P delays to infer upper mantle velocity structure down to a depth of 700 km is obtained using three-dimensional blocks, as proposed by Aki, Christofferson & Husebye. Three layers can be resolved in this depth range. It is found that the heterogeneities responsible for the travel-time delays are primarily located in the first 250 km of the upper mantle, and that they correlate with surface features. Significant heterogeneities subsist to depths of at least 700 km and their broad scale pattern also correlates with the surface features: in the third layer (500 to 700 km depth) there is an increase of velocity from the West to the East of the United States, while the second layer (250 to 450 km depth) exhibits a reversed pattern. A tentative interpretation of these deeper anomalies is made, as being due mainly to topography of the major upper mantle discontinuities, near 400 and 650 km depth.

16.
Dong Jiang, Xiaohuan Yang, Naibin Wang, Honghui Liu 《地理学报(英文版)》2001,11(1):86-90
Automated cartographic generalization has been an intensive research topic in cartography for decades. Some problems associated with this topic could be resolved to a certain extent using fractal analysis and fractal dimension. This paper investigates the fundamental theories and operational methods of generalization. Among others, methods of calculating fractal dimensions of curves and even complicated 3-dimensional geographic objects are explained. Fractal dimensions can be used as an objective criterion both for scaling natural geographic objects and for economical computer storage. More importantly, generalization algorithms based on fractal dimensions can be performed automatically.
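For a planar curve, a box-counting estimate of the fractal dimension takes only a few lines; the sketch below is a generic box-counting routine (not the paper's specific method) and recovers a dimension near 1 for a straight line.

```python
import numpy as np

def box_count_dimension(points, scales):
    """Box-counting estimate of the fractal dimension of a point set in the
    unit square: slope of log N(s) against log(1/s), where N(s) is the
    number of boxes of side s occupied by the points."""
    counts = []
    for s in scales:
        boxes = np.unique(np.floor(points / s), axis=0)
        counts.append(len(boxes))
    slope = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)[0]
    return slope

# A densely sampled straight line should give a dimension close to 1;
# a wiggly coastline-like curve would give a value between 1 and 2.
t = np.linspace(0, 1, 10000)
line = np.column_stack([t, t])
d = box_count_dimension(line, scales=[1/4, 1/8, 1/16, 1/32])
```

In a generalization pipeline, such a dimension can serve as the objective criterion the abstract mentions, e.g. for deciding how aggressively a curve may be simplified at a given map scale.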

17.
Summary. For the case of a well completely penetrating a confined aquifer of infinite extent, an analysis is made of the flow processes in the well-aquifer system due to periodic pressure variations like earth tides and barometric tides. It is shown that the earth tides lead to well fluctuations with a negative phase difference (phase lag), while the barometric tides cause well fluctuations with a positive phase difference (phase advance).
In both cases the amplitudes and phases of the well fluctuations depend on frequency and are therefore different for the individual components of the tidal spectra. These frequency characteristics need to be studied more carefully than is the case now. A schematic model of a device is described that could be used not only for this purpose, but also for the determination of the specific storage S_s and the conductivity k. The method would be closely related to the well-known slug or bail tests.

18.
Vertical distribution of meteorological elements and vertical agricultural and forest belts in the mountains of eastern Liaoning
毕伯钧 《地理研究》1982,1(3):55-65
Based on two observation campaigns of the vertical gradients of mountain meteorological elements, this paper gives a preliminary account of their vertical distribution in the mountains of eastern Liaoning. The mountain climate is characterised chiefly by the changes in temperature, precipitation, humidity and sunshine with increasing elevation, which, together with the influence of slope aspect and gradient, produce a complex three-dimensional climate and a corresponding vertically zoned agricultural structure. Using heat and moisture indices, the paper determines the planting limits of crops and cropping systems, the altitudinal limit for tussah (oak silkworm) rearing, and the elevations suitable for high-mountain seed-potato production and for ginseng cultivation, and finally delimits the agricultural and forest belts.

19.
Land-use change in urbanizing areas can significantly alter the hydrology of a watershed and can have serious impacts on wetland water balances, downstream flooding, and groundwater recharge. Most currently available models used in determining the hydrologic impacts of urbanization are not well suited to long-term hydrologic analysis or are too complex and data intensive for widespread practical application. The Long-Term Hydrologic Impact Assessment (LTHIA) model run on a Geographic Information System (GIS) is a relatively simple, user-friendly model that uses the Curve Number method to estimate changes in surface runoff between different stages of development. Application of the model to a large, rapidly urbanizing watershed near Indianapolis, Indiana, suggests that average annual runoff depths increased by more than 60% from 1973 to 1991, with even greater increases for some individual sub-basins. These results are consistent with runoff changes estimated from historical stream flow data in the watershed. A sensitivity analysis to determine minimum data requirements shows that a precipitation record length of 15 years or more is required to produce consistent results with LTHIA and that the highest possible resolution land-use and soils data should be used. The LTHIA model is now available on the Internet at http://www.ecn.purdue.edu/runoff. [Key words: hydrology, urbanization, modeling, GIS.]
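The Curve Number method at the heart of LTHIA is a closed-form formula; the sketch below implements the standard SCS-CN runoff equation, with illustrative CN values that are invented rather than taken from the study.

```python
def scs_runoff(P, CN):
    """SCS Curve Number direct runoff (inches) for storm rainfall P (inches):
    Q = (P - 0.2*S)^2 / (P + 0.8*S), with potential maximum retention
    S = 1000/CN - 10 and initial abstraction Ia = 0.2*S (Q = 0 if P <= Ia)."""
    S = 1000.0 / CN - 10.0
    Ia = 0.2 * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# Urbanisation raises the curve number, so the same 3-inch storm yields
# substantially more runoff after development (CN values illustrative).
q_pre = scs_runoff(3.0, CN=70)    # e.g. pre-development pasture
q_post = scs_runoff(3.0, CN=90)   # e.g. post-development urban cover
```

Summing such per-storm estimates over a long daily precipitation record is what turns the formula into a long-term average annual runoff comparison between land-use stages.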

20.
Summary. In the maximum entropy spectrum analysis (MESA) of geomagnetic data, the required order can be estimated by the zero-frequency half-power width (ZHW) minimum method proposed in this paper. The spectra of the data for 1876–1975 at Zo-se showed a period of 40 yr, in addition to the well-known periods. The amplitudes of the variations were estimated with parabolic-sinusoidal series fitting.

