Similar Articles
20 similar articles found.
1.
Airborne LiDAR (light detection and ranging) data are now commonly regarded as the most accurate source of elevation data for medium-scale topographical modelling applications. However, quoted LiDAR elevation error may not necessarily represent the actual errors occurring across all surfaces, potentially impacting the reliability of derived predictions in Geographical Information Systems (GIS). The extent to which LiDAR elevation error varies in association with land cover, vegetation class and LiDAR data source is quantified relative to dual-frequency global positioning system survey data captured in a 400-ha area in Ireland, where four separate classes of LiDAR point data overlap. Quoted elevation errors are found to correspond closely with the minimum requirement recommended by the American Society of Photogrammetry and Remote Sensing for the definition of 95% error in urban areas only. Global elevation errors are found to be up to 5 times the quoted error, and errors within vegetation areas are found to be even larger, with errors in individual vegetation classes reaching up to 15 times the quoted error. Furthermore, a strong skew is noted in vegetated areas within all the LiDAR data sets tested, pushing errors in some cases to more than 25 times the quoted error. The skew observed suggests that an assumption of a normal error distribution is inappropriate in vegetated areas. The physical parameters that were found to affect elevation error most fundamentally were canopy depth, canopy density and granularity. Other factors observed to affect the degree to which actual errors deviate from quoted error included the primary use for which the data were acquired and the processing applied by data suppliers to meet these requirements.
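The comparison above turns on how quoted error relates to the actual per-class residuals. A minimal sketch (on hypothetical LiDAR-minus-GPS residuals) of the summary statistics involved: RMSE, the empirical 95% error, the 95% error implied by a normality assumption (1.96 × RMSE), and skewness:

```python
import statistics

def error_stats(residuals):
    """Summarize LiDAR-minus-GPS elevation residuals for one land-cover class."""
    n = len(residuals)
    mean = statistics.fmean(residuals)
    rmse = (sum(r * r for r in residuals) / n) ** 0.5
    # 95th-percentile absolute error, read directly from the data
    abs_sorted = sorted(abs(r) for r in residuals)
    p95_empirical = abs_sorted[min(n - 1, int(0.95 * n))]
    # 95% error under a normality assumption: 1.96 * RMSE
    p95_normal = 1.96 * rmse
    stdev = statistics.stdev(residuals)
    skew = sum(((r - mean) / stdev) ** 3 for r in residuals) / n
    return {"rmse": rmse, "p95_empirical": p95_empirical,
            "p95_normal": p95_normal, "skew": skew}
```

With a strongly right-skewed residual set, the empirical 95% error can exceed the normal-theory value by a wide margin, which is the effect the study reports for vegetated areas.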

2.
It is shown that the presence of 31-35 commonly measured volatile organic compounds (VOCs) in ground water can be detected with small error rates by using screening methods which analyze for a subset of such VOCs. A study of selected data sets indicates that analytical determinations of only two to eight VOCs will suffice to detect 95% of all VOC hits. It is also shown that a serially optimal algorithm for selecting the VOCs for screening is very nearly as accurate as a globally optimal algorithm and much easier to implement. These conclusions are supported by empirical evidence from two drinking-water data sets and one hazardous waste site data set. Additional research areas are also outlined.
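A "serially optimal" selection of this kind is, in effect, a greedy set cover: repeatedly pick the analyte that flags the most still-undetected samples. A minimal sketch on hypothetical sample data (the VOC names are illustrative, not from the study):

```python
def greedy_screen(samples, target=0.95):
    """Serially optimal (greedy) choice of analytes to screen for.

    samples: list of sets, each the VOCs detected in one ground-water sample.
    Returns a greedy subset of VOCs whose presence flags at least
    `target` of all samples that contain any VOC ("hits").
    """
    hits = [s for s in samples if s]           # samples with at least one VOC
    uncovered = list(range(len(hits)))
    chosen = []
    while len(uncovered) > (1 - target) * len(hits):
        # pick the VOC that flags the most still-uncovered hit samples
        best = max({v for i in uncovered for v in hits[i]},
                   key=lambda v: sum(v in hits[i] for i in uncovered))
        chosen.append(best)
        uncovered = [i for i in uncovered if best not in hits[i]]
    return chosen
```

A globally optimal subset would require searching all combinations; the greedy pass is near-optimal here and far simpler, which matches the abstract's conclusion.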

3.
4.
Many of the data sets analyzed by physical geographers are compositional in nature: they have row vectors that add to one (or 100%). These unit-sum constrained data sets should not be analyzed by standard multivariate statistical methods. Significant differences were found in the log-ratio mean vectors of the hydraulic exponents (which are unit-sum constrained) for two classes of streams: those with cohesive, non-vertical banks, and those with one firm and one loose bank. Compositional discriminant function analysis of bank stability on the basis of hydraulic geometry had a success rate of 88%, making routinely archived measurements of stream width, cross-sectional area, mean velocity, and discharge a readily available data base for predicting the stability of stream reaches. [Key words: geomorphology, hydraulic geometry, discriminant function, statistics.]
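The log-ratio treatment of unit-sum data can be illustrated with the centred log-ratio (clr) transform, a standard device (after Aitchison) for moving a composition such as the hydraulic exponents (which sum to 1) into unconstrained coordinates where means and discriminant functions are legitimate. A minimal sketch:

```python
import math

def clr(composition):
    """Centred log-ratio transform of a unit-sum composition.

    Each positive part is divided by the geometric mean of all parts and
    logged, mapping the constrained vector into coordinates where
    standard multivariate statistics can be applied.
    """
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]
```

The transformed vector sums to zero by construction, and ordinary multivariate methods (e.g. linear discriminant analysis of bank-stability classes) can then be run on the clr coordinates.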

5.
The continually increasing size of geospatial data sets poses a computational challenge when conducting interactive visual analytics using conventional desktop-based visualization tools. In recent decades, improvements in parallel visualization using state-of-the-art computing techniques have significantly enhanced our capacity to analyse massive geospatial data sets. However, only a few strategies have been developed to maximize the utilization of parallel computing resources to support interactive visualization. In particular, an efficient visualization intensity prediction component is lacking from most existing parallel visualization frameworks. In this study, we propose a data-driven view-dependent visualization intensity prediction method, which can dynamically predict the visualization intensity based on the distribution patterns of spatio-temporal data. The predicted results are used to schedule the allocation of visualization tasks. We integrated this strategy with a parallel visualization system deployed in a compute unified device architecture (CUDA)-enabled graphical processing units (GPUs) cloud. To evaluate the flexibility of this strategy, we performed experiments using dust storm data sets produced from a regional climate model. The results of the experiments showed that the proposed method yields stable and accurate prediction results with acceptable computational overheads under different types of interactive visualization operations. The results also showed that our strategy improves the overall visualization efficiency by incorporating intensity-based scheduling.

6.
Matrix factorization is one of the most popular methods in recommendation systems. However, it faces two challenges related to the check-in data in point of interest (POI) recommendation: data scarcity and implicit feedback. To solve these problems, we propose a Feature-Space Separated Factorization Model (FSS-FM) in this paper. The model represents the POI feature spaces as separate slices, each of which represents a type of feature. Thus, spatial and temporal information and other contexts can be easily added to compensate for scarce data. Moreover, two commonly used objective functions for the factorization model, the weighted least squares and pairwise ranking functions, are combined to construct a hybrid optimization function. Extensive experiments are conducted on two real-life data sets, Gowalla and Foursquare, and the results are compared with those of baseline methods to evaluate the model. The results suggest that the FSS-FM performs better than state-of-the-art methods in terms of precision and recall on both data sets. The model with separate feature spaces can improve the performance of recommendation. The inclusion of spatial and temporal contexts further improves performance, and the spatial context is more influential than the temporal context. In addition, the capacity of hybrid optimization in improving POI recommendation is demonstrated.
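The hybrid objective described, a confidence-weighted least-squares term plus a pairwise (BPR-style) ranking term, can be sketched as below. All names and the exact weighting are hypothetical; the actual FSS-FM also involves the separated feature-space slices and regularization, which are omitted here:

```python
import math

def hybrid_loss(U, V, visits, alpha=0.5, neg=None):
    """Hybrid factorization objective (sketch; names hypothetical).

    U, V: user and POI latent vectors (dicts of lists).
    visits: dict (user, poi) -> check-in count (implicit feedback).
    neg: dict user -> sampled unvisited POIs for the ranking term.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # confidence-weighted least squares over observed check-ins
    wls = sum((1 + c) * (1.0 - dot(U[u], V[p])) ** 2
              for (u, p), c in visits.items())
    # pairwise ranking: push each visited POI above a sampled unvisited one
    rank = 0.0
    for (u, p), _ in visits.items():
        for q in (neg or {}).get(u, []):
            margin = dot(U[u], V[p]) - dot(U[u], V[q])
            rank += -math.log(1.0 / (1.0 + math.exp(-margin)))  # -log sigmoid
    return alpha * wls + (1 - alpha) * rank
```

Minimizing a blend of the two terms lets the pointwise loss anchor observed check-in confidences while the pairwise loss handles the implicit-feedback ranking, which is the motivation the abstract gives for combining them.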

7.
Choropleth mapping provides a simple but effective visual presentation of geographical data. Traditional choropleth mapping methods assume that data to be displayed are certain. This may not be true for many real-world problems. For example, attributes generated based on surveys may contain sampling and non-sampling error, and results generated using statistical inferences often come with a certain level of uncertainty. In recent years, several studies have incorporated uncertain geographical attributes into choropleth mapping with a primary focus on identifying the most homogeneous classes. However, no studies have yet accounted for the possibility that an areal unit might be placed in a wrong class due to data uncertainty. This paper addresses this issue by proposing a robustness measure and incorporating it into the optimal design of choropleth maps. In particular, this study proposes a discretization method to solve the new optimization problem along with a novel theoretical bound to evaluate solution quality. The new approach is applied to map the American Community Survey data. Test results suggest a tradeoff between within-class homogeneity and robustness. The study provides an important perspective on addressing data uncertainty in choropleth map design and offers a new approach for spatial analysts and decision-makers to incorporate robustness into the mapmaking process.

8.
Digital elevation and remote sensing data sets contain different, yet complementary, information related to geomorphological features. Digital elevation models (DEMs) represent the topography, or land form, whereas remote sensing data record the reflectance/emittance, or spectral, characteristics of surfaces. Computer analysis of integrated digital data sets can be exploited for geomorphological classification using automated methods developed in the remote sensing community. In the present study, geomorphological classification in a moderate- to high-relief area dominated by slope processes in southwest Yukon Territory, Canada, is performed with a combined set of geomorphometric and spectral variables in a linear discriminant analysis. An automated method was developed to find the boundaries of geomorphological objects and to extract the objects as groups of aggregated pixels. The geomorphological objects selected are slope units, with the boundaries being breaks of slope on two-dimensional downslope profiles. Each slope unit is described by variables summarizing the shape, topographic, and spectral characteristics of the aggregated group of pixels. Overall discrimination accuracy of 90% is achieved for the aggregated slope units in ten classes.

9.
This paper describes methods of data acquisition and organization for building a 3D city model. The data used to build the model are divided into five types, for which a spatial database, an attribute database, and a legend database are established, and the relationships among the three database tables are given. In the implementation, an object-oriented approach is used to organize the various types of data, and the assembly relationships among the classes are presented.

10.
The Digital Elevation Model that has been derived from the February 2000 Shuttle Radar Topography Mission (SRTM) has been one of the most important publicly available new spatial data sets in recent years. However, the ‘finished’ grade version of the data (also referred to as Version 2) still contains data voids (some 836,000 km2)—and other anomalies—that prevent immediate use in many applications. These voids can be filled using a range of interpolation algorithms in conjunction with other sources of elevation data, but there is little guidance on the most appropriate void‐filling method. This paper describes: (i) a method to fill voids using a variety of interpolators, (ii) a method to determine the most appropriate void‐filling algorithms using a classification of the voids based on their size and a typology of their surrounding terrain; and (iii) the classification of the most appropriate algorithm for each of the 3,339,913 voids in the SRTM data. Based on a sample of 1304 artificial but realistic voids across six terrain types and eight void size classes, we found that the choice of void‐filling algorithm is dependent on both the size and terrain type of the void. Contrary to some previous findings, the best methods can be generalised as: kriging or inverse distance weighting interpolation for small and medium size voids in relatively flat low‐lying areas; spline interpolation for small and medium‐sized voids in high‐altitude and dissected terrain; triangular irregular network or inverse distance weighting interpolation for large voids in very flat areas, and an advanced spline method (ANUDEM) for large voids in other terrains.
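Inverse distance weighting, one of the interpolators recommended above for small and medium voids in flat terrain, can be sketched as below. This is a brute-force toy illustration: operational SRTM void filling would restrict the search to a neighbourhood, bring in ancillary elevation data, and choose the interpolator per void class.

```python
def idw_fill(grid, nodata=None, power=2.0):
    """Fill void cells in an elevation grid by inverse distance weighting.

    Each nodata cell is estimated from every valid cell, with weights
    proportional to 1 / distance**power.
    """
    valid = [(r, c, z) for r, row in enumerate(grid)
             for c, z in enumerate(row) if z is not nodata]
    out = [row[:] for row in grid]
    for r, row in enumerate(grid):
        for c, z in enumerate(row):
            if z is nodata:
                wsum = zsum = 0.0
                for vr, vc, vz in valid:
                    d2 = (r - vr) ** 2 + (c - vc) ** 2
                    w = 1.0 / d2 ** (power / 2)
                    wsum += w
                    zsum += w * vz
                out[r][c] = zsum / wsum
    return out
```

Because IDW is a weighted average of the surrounding cells, filled values always stay within the range of the valid data, which suits flat low-lying terrain but smooths away ridges and channels; that is why the study assigns splines and ANUDEM to dissected terrain instead.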

11.
Auroral morphology provides distinctive, intuitive, and recognizable features for studying solar-terrestrial physical processes, and a reasonable classification is particularly important for studying the relationships between auroral phenomena and magnetospheric dynamic processes. The choice of classification scheme is one of the main criticisms of supervised auroral classification studies: the manual labelling required by supervised experiments is extremely labour-intensive, and labelling accuracy cannot be guaranteed. More importantly, a high classification accuracy only shows that the automatic classification agrees with human perception; supervised results cannot verify the correctness of the classification scheme itself. Whether the existing scheme reflects the true partition of the auroral data space, and whether a more reasonable scheme exists, are questions worth exploring. To address this, clustering algorithms were introduced, on the basis of an existing representation method for all-sky auroral images, to explore the structure of the auroral feature space, and nine cluster-validity functions were used to select a suitable number of clusters for the auroral data. Experiments on 6,000 auroral images randomly selected from the all-sky auroral data observed at the Arctic Yellow River Station during 2003-2004 show that partitions into two and four classes are the most appropriate. The two-class partition can be regarded as well-separated auroral types, and since the distribution curves of the two classes show a pre-noon/post-noon bimodal pattern, one of these classes may be arc aurora. For the four-class case, although no single typical image can represent each class by visual inspection, the cluster-derived auroral types each have their own temporal distribution characteristics. This result demonstrates, from an unsupervised perspective, that auroral types are morphologically separable.
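One widely used cluster-validity function of the kind applied here is the silhouette width: for each sample, the mean distance to its own cluster is compared with the mean distance to the nearest other cluster, and the number of clusters maximizing the mean silhouette is preferred. A minimal sketch on 1-D feature values (real all-sky image representations would be high-dimensional, and the study used nine different validity functions, not this one alone):

```python
def silhouette(points, labels):
    """Mean silhouette width of a clustering of 1-D feature values."""
    def mean_dist(p, members):
        return sum(abs(p - q) for q in members) / len(members)
    score = 0.0
    for i, p in enumerate(points):
        own = [q for j, q in enumerate(points)
               if labels[j] == labels[i] and j != i]
        a = mean_dist(p, own) if own else 0.0      # cohesion
        b = min(mean_dist(p, [q for j, q in enumerate(points)
                              if labels[j] == k])   # separation
                for k in set(labels) if k != labels[i])
        score += (b - a) / max(a, b) if max(a, b) > 0 else 0.0
    return score / len(points)
```

Scores near 1 indicate compact, well-separated clusters; comparing the score across candidate numbers of clusters is how a validity function singles out, say, two- and four-class partitions as the most appropriate.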

12.
Summary. We construct a catalogue of all possible elementary point sources for static deformation in an elastic solid. The familiar double-couples, CLVDs, centres of compression and dilatation, etc., are all members of the complete catalogue. The sources are classified according to the rank of the seismic moment tensor, and according to the weight (or order) of the irreducible tensor representation of the 3-D rotation group. These sources can be classified as belonging to one of three general classes. The static excitation functions are calculated for an infinite, homogeneous, isotropic medium for all these sources. We show that, except for sources belonging to these three general classes, all other sources, which are numerous for the tensors of high rank, are null static sources: that is, sources that do not produce any static displacement outside of the source region. Due to the presence of null sources, an inversion of the static deformation data is non-unique. The expansion of the equivalent-force tensors and the stress glut tensors (or seismic moment tensors) into a set of the symmetric trace-free source tensors is proposed. The sources corresponding to seismic moment tensors of the second, third and fourth ranks are considered in more detail. We identify the third-rank sources with rotational dislocations or disclinations.

13.
Land-cover classification using only remotely sensed spectral features does not provide accurate information on all urban-fringe classes. Four texture measures used in conjunction with LANDSAT spectral features were empirically evaluated to determine their utility in Level II and III land-cover mapping. The contrast and high-frequency measures improved land-cover classification at the urban fringe. However, the decision to use texture measures should be weighed carefully because they yield only a small, yet important, increment in absolute classification accuracy and entail additional expense for data preprocessing.

14.
Seven methods designed to delineate homogeneous river segments, belonging to four families (tests of homogeneity, contrast enhancing, spatially constrained classification, and hidden Markov models), are compared, firstly on their principles, then on a case study, and on theoretical templates. These templates contain patterns found in the case study but not considered in the standard assumptions of statistical methods, such as gradients and curvilinear structures. The influence of data resolution, noise and weak satisfaction of the assumptions underlying the methods is investigated. The control of the number of reaches obtained in order to achieve meaningful comparisons is discussed. No method is found that outperforms all the others on all trials. However, the methods with sequential algorithms (keeping at order n + 1 all breakpoints found at order n) fail more often than those running complete optimisation at any order. The Hubert-Kehagias method and hidden Markov models are the most successful at identifying subpatterns encapsulated within the templates. Ergodic hidden Markov models are, moreover, liable to exhibit transition areas.

15.
Understanding and analysis of drivers of land-use and -cover change (LUCC) is a requisite to reduce and manage impacts and consequences of LUCC. The aim of the present study is to analyze drivers of LUCC in Southern Mexico, to see how these are used by different conceptual and methodological approaches for generating transition potential maps, and how this influences the effectiveness of producing reliable LUCC models. Spatial factors were tested for their relation to main LUCC processes, and their importance as drivers for the periods 1993–2002 and 2002–2007 was evaluated by hierarchical partitioning analysis and logistic regression models (RM). Tested variables included environmental and biophysical variables, location measures of infrastructure and of existing land use, fragmentation, and demographic and social variables. The most important factors show a marked persistence over time: deforestation is mainly driven by the distance to existing land uses; degradation and regeneration by the distance to existing disturbed forests. Nevertheless, the overall number of important factors decreases slightly for the second period. These drivers were used to produce transition potential maps calibrated with the 1993–2002 data by two different approaches: (1) weights of evidence (WoE), representing the probabilities of the dominant change processes, namely deforestation, forest degradation, and forest regeneration for temperate and tropical forests, and (2) logistic RM, showing the suitability regarding the different land-use and -cover (LUC) classes. Validation of the transition potential maps with the 2002–2007 data indicates a low precision with large differences between LUCC processes and methods. Areas of change evaluated by difference in potential showed that WoE produces transition potential maps that are more accurate for predicting LUCC than those produced with RM. Relative operating characteristic (ROC) statistics show that transition potential models based on RM usually predict areas of no change better, but the difference is rather small. The poor performance of maps based on RM could be attributed to their overly general representation of suitability for certain LUC classes when the goal is modeling complex LUCC and the LUC classes participate in several transitions. The application of a multimodel approach enables a better understanding of the relations of drivers to LUCC and supports the evaluation of model calibration based on spatial explanatory factors. This improved understanding of the capacity of LUCC models to produce accurate predictions is important for making better informed policy assessments and management recommendations to reduce deforestation.
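The weights-of-evidence calibration can be sketched for a single binary spatial factor: the positive weight is the log ratio of the factor's frequency among changed cells to its frequency among unchanged cells. The cell data below are a toy illustration, not from the study:

```python
import math

def weights_of_evidence(cells):
    """Weights of evidence for one binary spatial factor (sketch).

    cells: list of (factor_present, changed) booleans, one per pixel.
    Returns (W_plus, W_minus): added to the prior log-odds of change,
    they give the posterior log-odds where the factor is present/absent.
    """
    n11 = sum(f and d for f, d in cells)        # factor present, change
    n10 = sum(f and not d for f, d in cells)    # factor present, no change
    n01 = sum((not f) and d for f, d in cells)  # factor absent, change
    n00 = sum((not f) and (not d) for f, d in cells)
    d, nd = n11 + n01, n10 + n00                # totals: change / no change
    w_plus = math.log((n11 / d) / (n10 / nd))
    w_minus = math.log((n01 / d) / (n00 / nd))
    return w_plus, w_minus
```

A positive W+ (and negative W-) marks the factor as evidence for change; in a full WoE model the weights of several conditionally independent factors are summed per cell to build the transition potential map.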

16.
Topographic effects due to irregular surface terrain may prevent accurate interpretation of magnetotelluric (MT) data. Three-dimensional (3-D) topographic effects have been investigated for a trapezoidal hill model using an edge finite-element method. The 3-D topography generates significant MT anomalies, and has both galvanic and inductive effects in any polarization. This paper presents two different correction algorithms, which are applied to the impedance tensor and to both electric and magnetic fields, respectively, to reduce topographic effects on MT data. The correction procedures using a homogeneous background resistivity derived from a simple averaging method effectively decrease distortions caused by surface topography, and improve the quality of subsurface interpretation. Nonlinear least-squares inversion of topography-corrected data successfully recovers most of structures including a conductive or resistive dyke.

17.
Progress in drought monitoring methods based on remote sensing
周磊, 武建军, 张洁. 《地理科学》 2015, 35(5): 630-636
This paper summarizes the widely used meteorological drought monitoring models and the drought monitoring models based on remote sensing data, reviewing current remote sensing approaches in four categories: vegetation-condition monitoring methods, microwave soil-moisture monitoring methods, thermal-infrared remote sensing methods, and energy-balance-based remote sensing methods. The characteristics, applicable conditions, and remaining problems of the remote sensing-based methods are analysed in depth. By reviewing integrated drought monitoring models based on multi-source data, the future directions of drought monitoring methods are discussed, and it is pointed out that an integrated drought monitoring model fusing multi-source data is a new approach to solving complex drought monitoring problems.

18.
《Urban geography》2013,34(7):724-738
Determining an accurate depiction of population distribution for urban areas in order to develop an improved "denominator" is important for the calculation of higher-precision rates in GIS analyses, particularly when exploring the spatial dynamics of disease. Rather than using data aggregated by arbitrary administrative boundaries such as census tracts, we developed the Cadastral-Based Expert Dasymetric System (CEDS), an interpolation method using ancillary information to delineate areas of homogeneous values. This method uses cadastral data, land-use filters, modeling by expert system routines, and validation against various census enumeration units and other data. The CEDS method is presented through a case study of asthma hospitalizations in the borough of the Bronx in New York City, in relation to proximity buffers constructed around major sources of air pollution. The analysis using CEDS shows that asthma hospitalization risk due to proximity to pollution sources is greater than previously calculated using traditional disaggregation methods.

19.
Personal trajectory data are increasingly collected for a variety of academic and recreational pursuits. As access to location data widens and locations are linked to other information repositories, individuals become increasingly vulnerable to identification. The quality and precision of spatially linked attributes are essential to accurate analysis; yet, there is a trade-off between privacy and geographic data resolution. Obfuscation of point data, or masking, is a solution that aims to protect privacy and maximize preservation of spatial pattern. Trajectory data, with multiple locations recorded for an entity over time, is a strong personal identifier. This study explores the balance between privacy and spatial pattern resulting from two methods of obfuscation for personal GPS data: grid masking and random perturbation. These methods are applied to travel survey GPS data in the greater metropolitan regions of Chicago and Atlanta. The rate of pattern correlation between the original and masked data sets declines as the distance thresholds for masking increase. Grid masking at the 250-m threshold preserves route anonymity better than other methods and distance thresholds tested, but preserves spatial pattern least. This study also finds via linear regression that median trip speed and road density are significant predictors of trip anonymity.
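The two obfuscation methods compared, grid masking and random perturbation, can be sketched as below. Planar coordinates in metres are assumed; real travel-survey GPS data would need projection handling before masking:

```python
import math
import random

def random_perturb(point, radius):
    """Displace a point uniformly within a disc of `radius` metres."""
    r = radius * math.sqrt(random.random())   # sqrt gives uniform area density
    theta = random.uniform(0.0, 2.0 * math.pi)
    return (point[0] + r * math.cos(theta), point[1] + r * math.sin(theta))

def grid_mask(point, cell=250.0):
    """Snap a point to the centre of its grid cell (e.g. 250-m grid masking)."""
    return (math.floor(point[0] / cell) * cell + cell / 2,
            math.floor(point[1] / cell) * cell + cell / 2)
```

Grid masking maps every point in a cell to the same location, which is why it anonymizes routes well at the 250-m threshold, while the bounded random displacement keeps points closer to their true positions and so preserves spatial pattern better.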

20.
Chemometrics is defined as the application of mathematical and statistical methods to chemical systems. Systems theory is seen to be useful for organizing and categorizing the inputs to and outputs from chemical systems. Advances in measurement science in the 1950s and 1960s, particularly in analytical chemistry, created a need for a multivariate approach to data analysis. Early chemometrics emphasized the use of structure-finding methods for existing data sets. In many instances, data sets can be obtained from designed experiments. Such data sets are more likely to contain the desired information, and the data can usually be acquired at less cost. Renewed interest in statistical process control will provide many new, more robust data sets in the future.
