Similar Documents
20 similar documents found (search time: 46 ms)
1.
This article presents an area‐preservation approach for polygonal boundary simplification by the use of structured total least squares adjustment with constraints (STLSC), with the aim of maintaining the area of the original polygons after the simplification. Traditionally, a simplified line is represented by critical points selected from the original one. However, this study focuses on maintaining the areas of the polygons in the process of simplification of polygonal boundaries. The proposed method is therefore a supplement to the existing line simplification methods, and it improves the quality of the simplification of polygonal boundaries in terms of positional and area errors. Based on the sub‐divisions of the original polyline, using the critical points detected from the polyline by line simplification methods, the framework of the proposed method includes three main components: (1) establishment of the straight‐line‐segment fitting model based on both the critical and intermediate points on the sub‐polyline; (2) introduction of both area and end‐point constraints to reduce the geometric distortions due to the line simplification; and (3) derivation of the solution of boundary simplification by the use of STLSC. An empirical example was conducted to test the applicability of the proposed method. The results showed that: (1) by imposing the linear fitting model on both the critical and intermediate points on the sub‐polylines in the proposed STLSC method, the positional differences between the original points and the simplified line are approximately normally distributed; and (2) by introducing both end‐point and area constraints in the proposed STLSC method, the areas of the simplified polygons are the same as those of the original ones at different scales, and the two neighboring fitted lines are connected to each other at the optimized position.
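The area constraint at the heart of this approach can be illustrated with the shoelace formula, the standard way to compute a polygon's area from its vertices. The sketch below only demonstrates the area check that the constraint preserves, not the STLSC adjustment itself; the coordinates are made up for illustration.

```python
def shoelace_area(points):
    """Signed area of a closed polygon given as [(x, y), ...] vertices."""
    n = len(points)
    area = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the ring
        area += x1 * y2 - x2 * y1
    return area / 2.0

# A unit square, and a version with an extra collinear vertex on one edge;
# an area-preserving simplification must leave this value unchanged.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
square_extra = [(0, 0), (0.5, 0), (1, 0), (1, 1), (0, 1)]
assert shoelace_area(square) == shoelace_area(square_extra) == 1.0
```

In the paper's setting, this area is held fixed as a hard constraint while the fitted segments are adjusted.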

2.
张春森  张会  郭丙轩  彭哲 《测绘学报》2020,49(3):334-342
To address the serious loss of global geometric feature information in the quadric error metric (QEM) mesh simplification algorithm, a structure-aware mesh simplification algorithm for urban 3D model reconstruction is proposed. Taking account of the planar structural features present in urban imagery, the algorithm uses proxy planes as a global feature constraint, so that global structural features are better preserved during simplification, benefiting subsequent operations such as level-of-detail (LOD) modelling and mesh optimization speed-up. Using initial triangular mesh models generated from oblique photogrammetry imagery as test data, the proposed algorithm was applied to mesh simplification and compared with the QEM algorithm. The results show that the proposed algorithm outperforms QEM in both simplification accuracy and efficiency.

3.
This paper describes three aspects of uncertainty in geographical information systems (GIS) and remote sensing. First, the positional uncertainty of an area object in a GIS is discussed as a function of positional uncertainties of line segments and boundary line features. Second, the thematic uncertainty of a classified remote sensing image is described using the probability vectors from a maximum likelihood classification. Third, the "S-band" model is used to quantify uncertainties after combining GIS and remote sensing data.

4.
A river curve simplification method considering three-dimensional morphological features (cited 1 time)
Since conventional curve simplification methods, when applied to river curves, have difficulty accounting for the three-dimensional characteristics and topological structure of river features, a river curve simplification method considering 3D morphological features is proposed. The method selects points on the river curve according to their 3D characteristics and thereby simplifies the curve. Building on the 3D Douglas-Peucker (3D D-P) algorithm, a 3D point-queueing method is proposed: the discrete point set of the river curve is queued according to the points' 3D characteristics, and the queue is built through three steps, namely initial queueing, "3-in-1" queue merging, and adjustment of constraint point positions. Points are then deleted from the tail of the queue in proportion to the compression ratio to obtain the generalized point set, and the simplified river curve is reconstructed by restoring the selected points to their original order along the curve. Experimental results show that the method both preserves the 3D morphological features of rivers to the greatest extent and maintains topological consistency between river curves.
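The 3D Douglas-Peucker algorithm that this method builds on can be sketched as follows. This is a generic textbook implementation operating on the perpendicular distance to the chord in 3D, not the authors' point-queueing extension; the tolerance semantics are assumed.

```python
import math

def _dist_point_segment(p, a, b):
    """Euclidean distance from 3D point p to the segment a-b."""
    ax, ay, az = a; bx, by, bz = b; px, py, pz = p
    dx, dy, dz = bx - ax, by - ay, bz - az
    seg2 = dx * dx + dy * dy + dz * dz
    if seg2 == 0.0:                       # degenerate segment
        return math.dist(p, a)
    # clamp the projection parameter so the foot stays on the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy + (pz - az) * dz) / seg2))
    q = (ax + t * dx, ay + t * dy, az + t * dz)
    return math.dist(p, q)

def douglas_peucker_3d(points, tol):
    """Recursive 3D Douglas-Peucker simplification with tolerance tol."""
    if len(points) < 3:
        return list(points)
    # find the point farthest from the chord between the two end points
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _dist_point_segment(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:                       # whole span is within tolerance
        return [points[0], points[-1]]
    left = douglas_peucker_3d(points[:idx + 1], tol)
    right = douglas_peucker_3d(points[idx:], tol)
    return left[:-1] + right              # avoid duplicating the split point
```

A near-straight 3D polyline collapses to its two end points, while a vertex deviating more than the tolerance is retained.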

5.
Straight lines, circular curves, and polynomial curves are used, respectively, to fit spatial curve entities, and the model error between the fitted curve and the true curve is estimated. A comprehensive error-band model for curves is then established that incorporates both the model error and the positional error in the normal direction. A worked example demonstrates that the comprehensive error-band model incorporating model error better reflects the positional uncertainty of circular curves.

6.
Integrated modelling of curve positional and model errors based on polyline approximation (cited 1 time)
孙彤  童小华 《测绘工程》2010,19(3):26-30
Since curves in GIS are commonly approximated by a series of polylines, this paper studies a curve uncertainty model that accounts for the combined influence of the model error introduced by polyline approximation and the random positional error of measured points. The segmentation criteria for curve fitting are analysed; it is proposed that the model error introduced by polyline approximation can be described by the perpendicular distance from the polyline model to the true curve, and an integrated quantitative error model for curves is established that combines model uncertainty with positional uncertainty based on the law of error propagation.

7.
A new method of cartographic line simplification is presented. Regular hexagonal tessellations are used to sample lines for simplification, where hexagon width, reflecting sampling fidelity, is varied in proportion to target scale and drawing resolution. Tesserae constitute loci at which new sets of vertices are defined by vertex clustering quantization, and these vertices are used to compose simplified lines retaining only visually resolvable detail at target scale. Hexagon scaling is informed by the Nyquist–Shannon sampling theorem. The hexagonal quantization algorithm is also compared to an implementation of the Li–Openshaw raster-vector algorithm, which undertakes a similar process using square raster cells. Lines produced by either algorithm using like tessera widths are compared for fidelity to the original line in two ways: Hausdorff distances to the original lines are statistically analyzed, and simplified lines are presented against input lines for visual inspection. Results show that hexagonal quantization offers advantages over square tessellations for vertex clustering line simplification in that simplified lines are significantly less displaced from input lines. Visual inspection suggests lines produced by hexagonal quantization retain informative geographical shapes for greater differences in scale than do those produced by quantization in square cells. This study yields a scale-specific cartographic line simplification algorithm, following Li and Openshaw's natural principle, which is readily applicable to cartographic linework. Open-source Java code implementing the hexagonal quantization algorithm is available online.
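The hexagonal vertex-clustering step can be sketched with standard axial-coordinate hexagon indexing (cube rounding), a common construction for pointy-top hexagonal grids. The clustering rule below, which replaces each run of consecutive vertices falling in the same hexagon with their mean point, is an assumption for illustration, not the paper's exact quantization.

```python
import math

def _hex_round(qf, rf):
    # cube-coordinate rounding: axial -> cube -> round -> axial
    sf = -qf - rf
    q, r, s = round(qf), round(rf), round(sf)
    dq, dr, ds = abs(q - qf), abs(r - rf), abs(s - sf)
    if dq > dr and dq > ds:
        q = -r - s
    elif dr > ds:
        r = -q - s
    return q, r

def _point_to_hex(x, y, size):
    # axial coordinates of a pointy-top hexagon; 'size' is the circumradius
    qf = (math.sqrt(3) / 3 * x - y / 3) / size
    rf = (2 / 3 * y) / size
    return _hex_round(qf, rf)

def hex_cluster_simplify(line, size):
    """Collapse each run of consecutive vertices that fall in the same
    hexagonal tessera to a single vertex (the run's mean point)."""
    if not line:
        return []
    out, run, cur = [], [], None
    for p in line:
        h = _point_to_hex(p[0], p[1], size)
        if h != cur and run:              # tessera changed: flush the run
            out.append((sum(q[0] for q in run) / len(run),
                        sum(q[1] for q in run) / len(run)))
            run = []
        run.append(p)
        cur = h
    out.append((sum(q[0] for q in run) / len(run),
                sum(q[1] for q in run) / len(run)))
    return out
```

With a wide tessera, tightly spaced vertices merge; with a tessera much smaller than the vertex spacing, every vertex survives.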

8.
The article is composed of two sections. In the first section, the authors describe the application of minimum line dimensions which are dependent on line shape, width and the operational scale of the map. The proposed solutions are based on Euclidean metric space, for which the minimum dimensions of Saliszczew's elementary triangle were adapted. (The elementary triangle is a model, standard triangle of least dimensions securing the recognizability of a line; its dimensions depend on the scale of the map and the width of the line representing it. In the simplification process, triangles with sides formed by sections of an arbitrary line, and bases completing those sides, are compared with the lengths of the shorter side and the base of the elementary triangle.) The second section describes an application of minimum line dimensions for verifying and assessing generalized data. The authors also propose a method for determining drawing line resolution to evaluate the accuracy of simplification algorithms. Using the proposed method, well-known simplification algorithms were compared on the basis of qualitative and quantitative evaluation. Moreover, in connection with the methods for assessing the accuracy of simplified data, the authors have extended these solutions to the rejected data. This procedure has allowed the identification of map areas where graphic conflicts occur.

9.
The assessment of positional uncertainty in line and area features is often based on uncertainty in the coordinates of their elementary vertices which are assumed to be connected by straight lines. Such an approach disregards uncertainty caused by sampling and approximation of a curvilinear feature by a sequence of straight line segments. In this article, a method is proposed that also allows for the latter type of uncertainty by modelling random rectangular deviations from the conventional straight line segments. Using the model on a dense network of sub‐vertices, the contribution of uncertainty due to approximation is emphasised; the sampling effect can be assessed by applying it on a small set of randomly inserted sub‐vertices. A case study demonstrates a feasible way of parameterisation based on assumptions of joint normal distributions for positional errors of the vertices and the rectangular deviations and a uniform distribution of missed sub‐vertices along line segments. Depending on the magnitudes of the different sources of uncertainty, not accounting for potential deviations from straight line segments may drastically underestimate the positional uncertainty of line features.

10.
Application of entropy theory to determining point positional uncertainty indices (cited 3 times)
The limitations of traditional point positional uncertainty indices are analysed. Based on joint entropy and the maximum entropy theorem from information theory, an entropy-based uncertainty index for n-dimensional random points, together with a unified formula for the probability of falling within it, is derived. The entropy error ellipse and entropy error ellipsoid are proposed as positional uncertainty measures for point features in 2D and 3D GIS, respectively. The proposed entropy indices are uniquely determined and unaffected by the subjective choice of confidence level, making them suitable for measuring the positional uncertainty of points with unknown distributions.
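The entropy indices described here rest on the differential entropy of a Gaussian, H = ½ ln((2πe)^n det Σ), a standard information-theoretic result. A minimal sketch for the 2D case follows; the determinant is hard-coded for a 2×2 covariance matrix, and the illustration that correlation lowers the entropy (and hence shrinks the entropy error ellipse) is my own example, not taken from the paper.

```python
import math

def gaussian_entropy_2d(cov):
    """Differential entropy (in nats) of a 2-D Gaussian with covariance
    matrix cov, via H = 0.5 * ln((2*pi*e)^2 * det(cov))."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    return 0.5 * math.log((2 * math.pi * math.e) ** 2 * det)

# Independent unit-variance coordinates: H = ln(2*pi*e) ≈ 2.8379 nats.
h_indep = gaussian_entropy_2d([[1.0, 0.0], [0.0, 1.0]])
# Correlation reduces det(cov) and therefore the entropy index.
h_corr = gaussian_entropy_2d([[1.0, 0.8], [0.8, 1.0]])
assert h_corr < h_indep
```

Because H depends only on the covariance matrix, the resulting index is fixed once the error model is, independent of any chosen confidence level, which is the property the abstract emphasises.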

11.
An error-entropy model for 3D spatial straight lines in GIS (cited 1 time)
From the perspective of information entropy, an error-entropy model for three-dimensional spatial straight lines is proposed. The model consists of a cylinder, whose radius is the planar error entropy perpendicular to the line, together with error spheres at the two end points, and is a fully determined model for measuring the uncertainty of spatial line features. Theoretical analysis and experiments show that the proposed model performs well.

12.
In this paper, a method is proposed to detect corresponding point pairs between polygon object pairs using a string matching method based on a confidence region model of line segments. The optimal point edit sequence to convert the contour of a target object into that of a reference object was found by the string matching method, which minimizes the total error cost, and the corresponding point pairs were derived from the edit sequence. Because a significant portion of the apparent positional discrepancies between corresponding objects is caused by spatial uncertainty, confidence region models of line segments are used in the matching process, and the proposed method therefore obtained a high F-measure for finding matching pairs. We applied this method to built-up area polygon objects in a cadastral map and a topographical map. Despite their different mapping and representation rules and spatial uncertainties, the proposed method with a confidence level of 0.95 produced a matching result with an F-measure of 0.894.
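The string-matching step can be sketched as a generic dynamic-programming alignment of two vertex sequences, with Euclidean distance as the substitution cost and a fixed gap penalty standing in for the paper's confidence-region error cost. Both the cost model and the `gap_cost` parameter are assumptions for illustration.

```python
import math

def match_contours(a, b, gap_cost=1.0):
    """Align two vertex sequences a and b (lists of (x, y) points) by
    minimum-cost edit sequence; return matched index pairs (i, j)."""
    n, m = len(a), len(b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap_cost            # deleting i vertices of a
    for j in range(1, m + 1):
        D[0][j] = j * gap_cost            # inserting j vertices of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = D[i - 1][j - 1] + math.dist(a[i - 1], b[j - 1])
            D[i][j] = min(sub, D[i - 1][j] + gap_cost, D[i][j - 1] + gap_cost)
    # backtrace the optimal edit sequence to recover corresponding pairs
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        sub = D[i - 1][j - 1] + math.dist(a[i - 1], b[j - 1])
        if D[i][j] == sub:                # match/substitution step
            pairs.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif D[i][j] == D[i - 1][j] + gap_cost:
            i -= 1                        # deletion from a
        else:
            j -= 1                        # insertion from b
    return pairs[::-1]
```

Two slightly displaced contours align vertex-for-vertex, mirroring how small positional discrepancies are absorbed by the error cost rather than breaking the correspondence.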

13.
Current bend-based line simplification methods handle sequences of small consecutive bends poorly. To address this, a simplification method based on bend triplets is proposed. The method first partitions consecutive bends into bend triplets; it then applies different simplification operations according to the combination type of each triplet; finally, an iterative rule is designed so that simplification is repeated until all bends satisfy the simplification threshold, thereby achieving interleaved simplification of consecutive bends. Experiments show that the method effectively preserves the morphological characteristics of bends and the hierarchical consistency between results at different simplification thresholds.

14.
Spatial data uncertainty can directly affect the quality of digital products and GIS-based decision making. On the basis of the characteristics of randomicity of positional data and fuzziness of attribute data, taking entropy as a measure, the stochastic entropy model of positional data uncertainty and fuzzy entropy model of attribute data uncertainty are proposed. As both randomicity and fuzziness usually simultaneously exist in linear segments, their omnibus effects are also investigated and quantified. A novel uncertainty measure, general entropy, is presented. The general entropy can be used as a uniform measure to quantify the total uncertainty caused by stochastic uncertainty and fuzzy uncertainty in GIS.

16.
First, the uncertainty error band of a single polyline segment based on the εσ model is studied, and analytical expressions for the error-band boundary are derived. Then, through worked examples covering both open and closed polylines, these boundary expressions are used to program and plot visualizations of the positional uncertainty of random polylines. Theoretical analysis and the visualizations show that, at the common end point of two adjacent segments, the radius of the right error semicircle of the preceding segment and the radius of the left error semicircle of the following segment are not necessarily equal, which must be taken into account in practical analysis.

18.
The positional accuracy of terrestrial laser scanning points is strongly affected by the laser footprint, within which the laser point exhibits uncertainty; accurately describing this uncertainty is essential for evaluating point positional accuracy. The error-entropy model is introduced for this purpose: using the probability density function of the laser point within the footprint, the information entropy of the laser point position is derived, and the error entropy is obtained from the relationship between error entropy and information entropy. By analysing the relationship between error entropy and footprint area, the mean error entropy of the point-cloud footprint is obtained, allowing mean error entropy to be applied to point-cloud uncertainty evaluation. Using point clouds acquired at different scanning intervals, the feasibility of the mean-entropy model for footprint-based point-cloud accuracy evaluation is analysed, achieving an accurate evaluation of point-cloud uncertainty within the footprint.

19.
The Cartographic Journal, 2013, 50(3): 221-233
Abstract

Cartographic generalization aims at simplifying the representation of data to suit the scale and purpose of the map. This paper deals with a method that implements the whole graphic generalization process (roughly defined as the operators simplification, smoothing, exaggeration and displacement) called simultaneous graphic generalization. This method is based on constraints, i.e. requirements that should be fulfilled in the generalization process. The constraints strive to make the map readable while preserving the characteristics of the data, which implies that all constraints cannot be completely satisfied. This study was concentrated on finding the optimal compromise between the constraints in simultaneous graphic generalization by setting weights for the constraints. Four strategies for determining the weights are described and their advantages and disadvantages are discussed. The discussion is based on the following assumptions: the constraints are independent, and the weights are only dependent on constraint type and object type. A comparison of the strategies reveals that the strategy constraint violation is the most promising. One advantage with this strategy is that it is related to the quality requirements of the map, and another advantage is that it provides a numerical measure for quality assessment. The paper concludes with a case study of the constraint violation strategy, in which visualization of the numerical quality measure is used. The case study shows that the constraint violation strategy gives a sound compromise between the constraints.

20.
Line generalisation by repeated elimination of points (cited 1 time)
Abstract

This paper presents a new approach to line generalisation which uses the concept of 'effective area' for progressive simplification of a line by point elimination. Two coastlines are used to compare the performance of this, with that of the widely used Douglas-Peucker, algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cut-off values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation.
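The 'effective area' criterion (Visvalingam-Whyatt) can be sketched as below: each interior vertex is weighted by the area of the triangle it forms with its neighbours, and the least significant vertex is eliminated first. This naive sketch rescans all triangles after each elimination; a practical implementation would maintain a priority queue and recompute only the two affected neighbours.

```python
def effective_area(a, b, c):
    """Area of the triangle formed by vertex b and its neighbours a and c."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def simplify_by_area(line, min_area):
    """Repeatedly eliminate the interior vertex with the smallest effective
    area until every remaining vertex's effective area reaches min_area."""
    pts = list(line)
    while len(pts) > 2:
        areas = [effective_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        smallest = min(range(len(areas)), key=areas.__getitem__)
        if areas[smallest] >= min_area:
            break                          # all remaining detail is significant
        del pts[smallest + 1]              # drop the least significant vertex
    return pts

# A nearly flat wiggle is eliminated while a pronounced peak survives:
line = [(0, 0), (1, 0.01), (2, 0), (3, 1), (4, 0)]
assert simplify_by_area(line, 0.1) == [(0, 0), (2, 0), (3, 1), (4, 0)]
```

Raising `min_area` (the cut-off value mentioned in the abstract) yields progressively more caricatural generalisations from the same ranking of vertices.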


Copyright©北京勤云科技发展有限公司  京ICP备09084417号