Similar Documents
 20 similar documents found; search time: 109 ms
1.
A patch simplification algorithm based on an area-balance constraint   Total citations: 1 (self: 0, others: 1)
Maintaining the area balance of each land-use class before and after generalization is a basic principle of land-use data generalization. Land-use patches are the smallest units for area statistics, and conventional line-simplification algorithms not only struggle to preserve the geographic characteristics of patches but also introduce uncertainty into area statistics. This paper proposes a simplification algorithm that keeps patch area exactly balanced before and after simplification. The algorithm provides methods for extracting the geographic and geometric feature points of patch groups and island patches, and simplifies the sub-arcs segmented by these feature points using two ways of deriving balance lines: straight-line area balance and parametric-curve area balance. The underlying principle is that the ratio of the area an arc cuts from its own envelope rectangle remains equal before and after generalization, and that the generalized arc must fall within the error buffer of the original sub-arc. Experiments show that the algorithm preserves the morphological characteristics of patches while keeping patch areas exactly equal before and after simplification.
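The core constraint in item 1 — that patch area must be exactly equal before and after simplification — can be verified with the shoelace formula. A minimal sketch of such a check (the function name is ours, not from the paper):

```python
def polygon_area(points):
    """Signed shoelace area of a closed ring
    (the last vertex need not repeat the first)."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        s += x1 * y2 - x2 * y1
    return s / 2.0
```

An area-balanced simplification would keep `polygon_area` of the original and simplified rings equal to within floating-point tolerance.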

2.
Addressing the problems of inaccurate boundary extraction and failure to automatically maintain topological consistency among patches during boundary-consistency correction, this paper proposes a new triangulation algorithm that automatically detects and corrects patches with inconsistent shared boundaries. Building on the max-min angle criterion and the illegal-edge criterion, a quadrilateral-based method is used for triangulation; skeleton lines within the buffer zones between patches are automatically extracted and smoothed to replace the shared boundaries, achieving boundary consistency. The algorithm and a Delaunay-triangulation algorithm were each applied to correct boundaries in land-use data from the Third National Land Survey. Experimental results show that the proposed algorithm not only solves the problem that the extension direction of skeleton lines in a Delaunay triangulation cannot be determined, but also generates new boundaries that approximately fit the natural curvature of the original boundaries while maintaining the topological consistency of patch boundaries.

3.
Guided by the theories, methods and techniques of earth-system science, geo-information science and modern cartography, this paper systematically studies and establishes a feature-oriented index system and knowledge rules for map generalization, with an applied case study. The research methods include geoscientific analysis and induction, map analysis, expert consultation, and GIS and remote-sensing spatial analysis, used to summarize, refine and build the index system and knowledge rules. The indices comprise three types — numerical, textual and graphic — grouped into database generalization (semantic generalization) and map-visualization generalization (graphic generalization). Horizontally, the knowledge rules consist of geometric, structural and procedural knowledge; vertically, they are organized and classified by descriptive knowledge of the geographic characteristics of features, rules for selecting operators, rules for selecting algorithms, generalization rules for specific geographic features, and generalization rules for regional mapping. In the knowledge base, the rules are organized by three variables — generalization condition, generalization action and generalization requirement (or level) — forming an internal system of knowledge rules in a three-dimensional coordinate relationship. The case study describes the generalization process and results for a transportation-network map of the Pearl River Delta economic zone.

4.
Land-use patch generalization is the key technology for multi-scale representation of land-use data. It mainly includes merging adjacent patches, aggregating disjoint but semantically similar patches, generalizing elongated patches, and simplifying feature boundaries. Patch generalization is achieved through four methods: (1) a proximity-analysis model is built with semantic similarity between features as the precondition while accounting for spatial topology, and the model retrieves each feature's nearest patch for merging; (2) polygons built from the nodes in the intersection of the buffers of disjoint features fill the bridging region to achieve aggregation; (3) the buffers of adjacent features partition an elongated region, which is then merged into surrounding features; (4) feature boundaries are simplified with the Douglas-Peucker algorithm. These methods provide adapted generalization algorithms for both ordinary and characteristic patches. Experimental results show that the models and algorithms preserve the area balance of each land class to the greatest extent, effectively control feature deformation, compress data reasonably, keep maps concise and attractive, and make automated map generalization simple and efficient.
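Step (4) of item 4 names the Douglas-Peucker algorithm for boundary-line simplification. A minimal recursive sketch (the function name and `tolerance` parameter are ours, not from the paper):

```python
def douglas_peucker(points, tolerance):
    """Recursively simplify a polyline: keep the interior point farthest
    from the anchor-floater chord if its distance exceeds tolerance."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5
    best_i, best_d = 0, 0.0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        if norm == 0:  # degenerate chord: fall back to point distance
            d = ((px - x1) ** 2 + (py - y1) ** 2) ** 0.5
        else:          # perpendicular distance from the chord
            d = abs(dx * (py - y1) - dy * (px - x1)) / norm
        if d > best_d:
            best_i, best_d = i, d
    if best_d <= tolerance:
        return [points[0], points[-1]]   # drop all interior points
    left = douglas_peucker(points[:best_i + 1], tolerance)
    right = douglas_peucker(points[best_i:], tolerance)
    return left[:-1] + right             # avoid duplicating the split point
```

Near-collinear vertices are dropped while sharp features above the tolerance survive.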

5.
Li Song, Luo Xuqiang. Tropical Geography, 2015, 35(3): 386-392
To make the extraction of karst rocky-desertification information more objective, this paper proposes a patch-delineation method based on small watersheds. Taking western Guizhou as the study area, and using a Landsat image of 2013-06-16 and 30 m DEM data, it analyzes the application of small-watershed patch segmentation to rocky-desertification information extraction; land-use boundaries are also extracted from the same Landsat image and used as geographic units for extraction. On this basis a desertification stability index (DSI) is constructed to analyze how different geographic segmentation units affect the results. The results show that for both geographic units the proportion of moderate or lighter rocky desertification exceeds 80%, while the differences in extraction results for none-apparent, light, moderate, severe and intense rocky desertification are 66.5%, 26.8%, 72.4%, 75.4% and 55.9% respectively, and the stability index is only 0.15 — different segmentation units thus significantly affect the extraction. The small-watershed patch-segmentation method has clear geographic meaning and good stability, making it an ideal segmentation method for rocky-desertification information extraction.

6.
Research on a patch-partition decision model for automated mapping in the delineation of basic-farmland protection zones   Total citations: 3 (self: 0, others: 3)
Addressing the decision problem of patch partitioning required in automated mapping for basic-farmland protection-zone delineation, this paper applies dynamic-programming principles and related geometric knowledge to build a fixed-area partition decision model for cultivated-land patches under the influence of contiguity and location factors, and develops an automated mapping decision system for patch partitioning in protection-zone delineation. The model and system were tested on the delineation of a basic-farmland protection zone in an administrative village of Wanbailin District, Taiyuan, Shanxi Province. The results show that the model effectively meets the positioning and quantity requirements of patch partitioning in automated protection-zone mapping, providing technical support for automating the mapping of basic-farmland protection zones.

7.
A hierarchical classification method for multi-scale digital geomorphology   Total citations: 4 (self: 1, others: 3)
Drawing on the classification schemes and legend designs of published national geomorphological maps at various scales, this paper discusses a hierarchical classification method for China's 1:1,000,000 digital geomorphology. Using classification indices and a classification system based on geomorphological elements such as morphology, genesis, material and age to comprehensively reflect landform characteristics, it preliminarily establishes hierarchical classification methods for digital geomorphology at several national basic scales (1:4,000,000, 1:1,000,000, 1:500,000, 1:250,000 and 1:50,000), develops a data-organization scheme in which continuously distributed polygon patches represent morphogenetic types while discrete point, line and polygon patches together represent morphostructural types, and constructs a coding method for multi-scale digital geomorphological types. The study provides a methodological basis for compiling multi-scale geomorphological maps, as well as classification standards and technical support for the large-scale geomorphological-type census in the ongoing national geographic-conditions monitoring project.

8.
In the dynamic supervision of land and resources, tasks such as overlaying change-survey management information require judging how similar patches are. Land-management information is commonly overlaid spatially with land-use change-survey results and similarity is judged from the overlap area, which ignores the error caused by differences in patch shape. To solve this problem, this paper proposes a patch geometric-shape similarity measure based on F-histograms. First, two reference points are selected on the patch's principal axis, and the F-histogram between the patch and each reference point is computed; then the two histograms are concatenated, normalized and median-filtered to obtain a feature vector characterizing the patch's shape; finally, the distance between feature vectors expresses the dissimilarity of patches. Experimental results show that, compared with overlap-area and spatial-relation similarity measures, the F-histogram measure describes patch similarity more faithfully and reflects shape differences more clearly, improving the accuracy of patch-similarity measurement in land-use change surveys.

9.
Methods for obtaining the average slope of patches in land-use surveys   Total citations: 3 (self: 0, others: 3)
The average slope of a patch is an important parameter for judging whether land use is reasonable in the new round of land-use update surveys, but the commonly used methods for obtaining it all have shortcomings of varying degrees. This paper discusses the algorithmic principles of patch average slope and, taking the topographic map data and land-use update survey data of a county in Zhejiang Province as reference data, proposes interpolating a DEM from a TIN and then using ArcGIS to derive the average slope of each land-use patch. Verification shows that the relative error of the method is within 4%, with a mean relative error of 2.2%, meeting application accuracy requirements.
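Item 9 derives each patch's average slope from a DEM. The paper interpolates a DEM from a TIN and computes slope in ArcGIS; purely as an illustration of the underlying idea, a toy finite-difference version over a gridded DEM and a patch mask (all names are ours):

```python
import math

def cell_slope_deg(dem, i, j, cellsize):
    """Slope (degrees) at cell (i, j) from central differences,
    falling back to one-sided differences on the DEM edge."""
    rows, cols = len(dem), len(dem[0])
    i0, i1 = max(i - 1, 0), min(i + 1, rows - 1)
    j0, j1 = max(j - 1, 0), min(j + 1, cols - 1)
    dz_dy = (dem[i1][j] - dem[i0][j]) / ((i1 - i0) * cellsize)
    dz_dx = (dem[i][j1] - dem[i][j0]) / ((j1 - j0) * cellsize)
    return math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))

def patch_mean_slope(dem, mask, cellsize=1.0):
    """Average the per-cell slope over cells where the patch mask is truthy."""
    slopes = [cell_slope_deg(dem, i, j, cellsize)
              for i in range(len(dem)) for j in range(len(dem[0]))
              if mask[i][j]]
    return sum(slopes) / len(slopes)
```

On a plane rising one unit per cell, every cell reports 45 degrees, so the patch mean is 45 degrees.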

10.
Yao Guangbiao. Geography Teaching, 2011, (13): 59-60
The geographical mnemonic-rhyme method is a teaching method in which the geography teacher uses language to vividly summarize geographical phenomena, data, characteristics and laws so as to optimize teaching effectiveness. Teaching geography with mnemonic rhymes conforms both to psychological principles and to the memory characteristics of junior-secondary students; it can systematize, organize and condense broad and complex teaching content, concentrating the knowledge students need to master so that it is easier to understand and remember.

11.
This paper reports an investigation on the accuracy of grid-based routing algorithms used in hydrological models. A quantitative methodology has been developed for objective and data-independent assessment of errors generated from the algorithms that extract hydrological parameters from gridded DEM. The generic approach is to use artificial surfaces that can be described by a mathematical model, thus the ‘true’ output value can be pre-determined to avoid uncertainty caused by uncontrollable data errors. Four mathematical surfaces based on an ellipsoid (representing convex slopes), an inverse ellipsoid (representing concave slopes), saddle and plane were generated and the theoretical ‘true’ value of the Specific Catchment Area (SCA) at any given point on the surfaces could be computed using mathematical inference. Based on these models, tests were made on a number of algorithms for SCA computation. The actual output values from these algorithms on the convex, concave, saddle and plane surfaces were compared with the theoretical ‘true’ values, and the errors were then analysed statistically. The strengths and weaknesses of the selected algorithms are also discussed.
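Item 11 evaluates grid-based routing algorithms that accumulate contributing area over a DEM. As one concrete instance of that family (not necessarily one of the algorithms the paper tests), a single-flow D8 accumulation sketch, with all names ours:

```python
def d8_contributing_area(dem, cellsize=1.0):
    """Single-flow (D8) contributing area: each cell passes its whole
    accumulated area to the steepest-descent neighbour; cells are
    processed from high to low so upstream area arrives first."""
    rows, cols = len(dem), len(dem[0])
    area = [[cellsize * cellsize] * cols for _ in range(rows)]
    order = sorted(((dem[i][j], i, j)
                    for i in range(rows) for j in range(cols)),
                   reverse=True)
    for z, i, j in order:
        best, best_drop = None, 0.0
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == dj == 0:
                    continue
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dist = cellsize * (2 ** 0.5 if di and dj else 1.0)
                    drop = (z - dem[ni][nj]) / dist
                    if drop > best_drop:
                        best, best_drop = (ni, nj), drop
        if best:  # pass accumulated area downslope
            area[best[0]][best[1]] += area[i][j]
    return area
```

On an inclined plane each row drains straight downslope, so the accumulated area grows linearly along the flow direction; the paper's point is that such outputs can be checked against analytically known SCA values.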

12.
A survey of P2P-based spatial query-routing algorithms   Total citations: 1 (self: 0, others: 1)
This paper introduces the history and typical models of peer-to-peer networks and the basic concepts of routing. It lists classic P2P query-routing algorithms (Chord, CAN, Pastry, etc.) and related improved algorithms (Kademlia, Emergint, SmartBoa, etc.). Considering the characteristics of spatial data, it surveys P2P-based spatial query-routing algorithms, such as shared-interest-point algorithms and algorithms based on Chord content-access construction, super-objects, location replication and geometric-distance models, as well as spatial grid-partition, dimension-priority query and service-encapsulation algorithms. A good spatial routing algorithm can improve network performance and execution efficiency and advance the field of spatial information.

13.
Image fusion is the production of high-resolution images by combining the spatial details of a high-resolution image with the spectral features of a low-resolution one. Reports of various quality metrics to evaluate the spectral and spatial qualities of fused images have been published. However, metrics may lead to misinterpretation due to inherent limitations in their mathematical algorithms. Hence, the use of additional assessment techniques in quality evaluation is reasonable. The purpose of the study was to compare the performances of several advanced fusion algorithms in order to help users in their choice of an appropriate fusion algorithm. Four different datasets were fused using advanced fusion algorithms, namely UNB PanSharp, Hyperspherical Color Space, High-Pass Filtering, Ehlers, Subtractive, Wavelet Single Band, Gram-Schmidt, Flexible Pixel-Based, and Criteria-Based. The spectral and spatial qualities of the fused images were evaluated using various quantitative procedures to ensure comprehensive and reliable comparison. The results showed that the Flexible Pixel-Based and High-Pass Filtering algorithms were very successful with regard to spatial quality, whereas the Flexible Pixel-Based and Criteria-Based algorithms were very successful with regard to spectral quality. The authors conclude that the Flexible Pixel-Based algorithm can be used for applications that require high spectral and spatial quality.

14.
Leaf spectra of Acacia confusa were collected with an ASD portable field spectrometer in a BRDF measurement system, and the chlorophyll content of the observed leaves was measured with a UV2450 spectrophotometer. Red-edge position variables were derived from the spectral data with different algorithms and fitted against chlorophyll content to build models for estimating the chlorophyll content of Acacia confusa leaves. The results show that the red-edge position variables obtained by the various algorithms are feasible for building chlorophyll-estimation models; among them, …

15.
Leaf spectra of Acacia confusa were collected with an ASD portable field spectrometer in a BRDF measurement system, and the chlorophyll content of the observed leaves was measured with a UV2450 spectrophotometer. Red-edge position variables were derived from the spectral data with different algorithms and fitted against chlorophyll content to build models for estimating the chlorophyll content of Acacia confusa leaves. The results show that the red-edge position variables obtained by the various algorithms are feasible for building chlorophyll-estimation models; among them, the fifth-order polynomial fitting algorithm achieved the highest accuracy, followed by the first-derivative method applied after smoothing.

16.
Applicability analysis of LS-factor values from different algorithms for the Wohushan Reservoir area   Total citations: 1 (self: 1, others: 0)
Hu Gang, Song Hui, Shi Xingjun, Zhang Xuliang, Fang Haiyan. Scientia Geographica Sinica, 2015, 35(11): 1482-1488
Building on previous research into LS-factor algorithms and using McCool's LS-factor reference values, this paper evaluates the applicability of LS algorithms. The study shows that, except for the composite algorithm and the Remortel revised algorithm, the LS-factor values of all other algorithms are smaller than the reference values. The optimal algorithm for the study area is the revised algorithm based on Remortel's iterative computation; next come the composite algorithm combining Liu Baoyuan's steep-slope formula with Remortel's improved L-exponent iteration, and the Remortel version-4 AML program algorithm; then the Böhner algorithm. The Moore and Desmet algorithms are not recommended for this area because of their relatively poor correlation with the reference values and relatively large RMSE.

17.
As one of the core functions of GIS, spatial analysis is moving toward massive data volumes and increasingly complex analysis processes; existing serial algorithms can no longer meet demands on computational efficiency and performance, and parallel spatial-analysis algorithms are attracting growing attention as an effective solution. After briefly introducing spatial-analysis methods and parallel-computing technology, this paper reviews the research progress of parallel spatial-analysis algorithms from the two aspects of vector and raster algorithms, comments on their development directions and outstanding problems given the particular characteristics of spatial data, and discusses the opportunities and challenges facing the design of parallel spatial-analysis algorithms against the backdrop of rapidly advancing computer hardware and software.

18.
ABSTRACT

Six routing algorithms, describing how flow (and water-borne material) will be routed over Digital Elevation Models, are described and compared. The performance of these algorithms is determined based on both the calculation of the contributing area and the prediction of ephemeral gullies. Three groups of routing algorithms could be identified. Both from a statistical and a spatial viewpoint these groups produce significantly different results, with a major distinction between single flow and multiple flow algorithms. Single flow algorithms cannot accommodate divergent flow and are very sensitive to small errors. Therefore, they are not acceptable for hillslopes. The flux decomposition algorithm, proposed here, seems to be preferable to other multiple flow algorithms as it is mathematically straightforward, needs only up to two neighbours and yields more realistic results for drainage lines. The implications of the routing algorithms on the prediction of ephemeral gullies seem to be somewhat counterintuitive: the single flow algorithms that, at first sight, seem to mimic the process of overland flow, do not yield optimal prediction results.

19.
ABSTRACT

The amount of spatiotemporal data collected by gadgets is rapidly growing, resulting in increasing costs to transfer, process and store it. In an attempt to minimize these costs several algorithms were proposed to reduce the trajectory size. However, choosing the right algorithm depends on a careful analysis of the application scenario. Therefore, this paper evaluates seven general-purpose lossy compression algorithms in terms of structural aspects and performance characteristics, regarding four transportation modes: Bike, Bus, Car and Walk. The lossy compression algorithms evaluated are: Douglas-Peucker (DP), Opening-Window (OW), Dead-Reckoning (DR), Top-Down Time-Ratio (TS), Opening-Window Time-Ratio (OS), STTrace (ST) and SQUISH (SQ). Pareto-efficiency analysis pointed out that there is no best algorithm for all assessed characteristics; rather, DP introduced the least error and best preserved length, OW best preserved speed, ST best preserved acceleration, and DR required the least execution time. Another important finding is that algorithms whose metrics do not keep time information performed quite well even on time-dependent characteristics such as speed and acceleration. Finally, DR had the most suitable performance in general, being among the three best algorithms in four of the five assessed performance characteristics.
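Among the algorithms item 19 evaluates, Opening-Window (OW) grows a window from an anchor point and closes it when any buffered point deviates too far from the anchor-to-floater chord. A minimal sketch of that idea (deviation simplified to perpendicular distance; names are ours):

```python
def opening_window(points, tolerance):
    """Opening-Window compression: extend the floater until some point
    inside the window deviates from the anchor->floater chord by more
    than tolerance, then keep the point before the floater and restart."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    anchor, floater = 0, 2
    while floater < len(points):
        ax, ay = points[anchor]
        fx, fy = points[floater]
        dx, dy = fx - ax, fy - ay
        norm = (dx * dx + dy * dy) ** 0.5 or 1.0
        # largest perpendicular deviation of the windowed points
        err = max(abs(dx * (py - ay) - dy * (px - ax)) / norm
                  for px, py in points[anchor + 1:floater])
        if err > tolerance:
            kept.append(points[floater - 1])   # close the window here
            anchor = floater - 1
            floater = anchor + 2
        else:
            floater += 1
    kept.append(points[-1])
    return kept
```

Unlike Douglas-Peucker, OW works online: it can emit points as the trajectory streams in, which is why the paper groups it with opening-window variants such as OS.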

20.
In integration of road maps modeled as road vector data, the main task is matching pairs of objects that represent, in different maps, the same segment of a real-world road. In an ad hoc integration, the matching is done for a specific need and, thus, is performed in real time, where only a limited preprocessing is possible. Usually, ad hoc integration is performed as part of some interaction with a user and, hence, the matching algorithm is required to complete its task in time that is short enough for human users to provide feedback to the application, that is, in no more than a few seconds. Such interaction is typical of services on the World Wide Web and to applications in car-navigation systems or in handheld devices.

Several algorithms were proposed in the past for matching road vector data; however, these algorithms are not efficient enough for ad hoc integration. This article presents algorithms for ad hoc integration of maps in which roads are represented as polylines. The main novelty of these algorithms is in using only the locations of the endpoints of the polylines rather than trying to match whole lines. The efficiency of the algorithms is shown both analytically and experimentally. In particular, these algorithms do not require the existence of a spatial index, and they are more efficient than an alternative approach based on using a grid index. Extensive experiments using various maps of three different cities show that our approach to matching road networks is efficient and accurate (i.e., it provides high recall and precision).
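The key idea of item 20 is matching polylines by the locations of their endpoints only. A greedy toy version of that matching criterion (the paper's actual algorithms are more efficient and index-free; `eps` and all other names here are ours):

```python
def match_polylines(map_a, map_b, eps):
    """Greedy ad hoc matching: two polylines match when both endpoint
    pairs (in either orientation) lie within eps of each other."""
    def close(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps * eps

    matches, used = [], set()
    for ia, a in enumerate(map_a):
        a0, a1 = a[0], a[-1]
        for ib, b in enumerate(map_b):
            if ib in used:
                continue
            b0, b1 = b[0], b[-1]
            if (close(a0, b0) and close(a1, b1)) or \
               (close(a0, b1) and close(a1, b0)):
                matches.append((ia, ib))
                used.add(ib)
                break
    return matches
```

This quadratic scan only illustrates the endpoint criterion; the article's contribution is doing such matching fast enough for interactive, ad hoc integration without a spatial index.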

