Registration and seamless fusion of point cloud from LiDAR and oblique photography
Cite this article: RUAN Minghao, LU Yonghua, LIU Yuxian, TANG Shengjun, WANG Weixi, ZHANG Yuqi, LI You. Registration and seamless fusion of point cloud from LiDAR and oblique photography[J]. Bulletin of Surveying and Mapping, 2022, 0(4): 6-10. DOI: 10.13474/j.cnki.11-2246.2022.0101
Authors: RUAN Minghao  LU Yonghua  LIU Yuxian  TANG Shengjun  WANG Weixi  ZHANG Yuqi  LI You
Affiliation: 1. Shenzhen Investigation and Research Institute Co., Ltd., Shenzhen 518026, Guangdong, China; 2. School of Architecture and Urban Planning, Shenzhen University & Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources, Shenzhen 518060, Guangdong, China; 3. School of Geomatics, Liaoning Technical University, Fuxin 123000, Liaoning, China
Funding: National Natural Science Foundation of China (Nos. 41801392, 41901329, 41971354, 41971341)
Abstract: Laser scanning and oblique photogrammetry are the traditional, widely used approaches to urban spatial modeling, and each has its own strengths and weaknesses; effectively fusing data from these different sources to build 3D city models remains a difficult problem. This paper therefore first proposes a method for the precise registration and seamless fusion of laser and oblique point clouds, which overcomes the inconsistencies in scale and accuracy between point clouds from different sources and accurately computes the spatial transformation between them. On this basis, redundant data are then removed and the multi-source point clouds are fused seamlessly. Finally, data captured in three real-world scenes are used to verify the effectiveness and accuracy of the proposed method. The experimental results show that the fused data obtained with this method offer better completeness and accuracy than data acquired by any single platform.

Keywords: point cloud registration  change detection  3D point cloud  3D transformation  fusion
Received: 2021-07-14
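
The abstract above turns on accurately computing the spatial transformation between point clouds that disagree in scale and accuracy. As a minimal, self-contained sketch of that step, the Python code below estimates a seven-parameter similarity transform (scale, rotation, translation) from already-matched point pairs with the closed-form Umeyama solution; the function name, the synthetic test data, and the choice of this particular estimator are illustrative assumptions, not the authors' published algorithm.

import numpy as np

def estimate_similarity_transform(src, dst):
    """Estimate scale s, rotation R and translation t with dst ≈ s*R@src + t
    from N matched 3D points (closed-form solution of Umeyama, 1991).
    src, dst: (N, 3) arrays of corresponding points from the two clouds."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean

    # Cross-covariance between the centred clouds; its SVD yields the rotation.
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))   # guard against a reflection solution
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt

    # Scale from the variance ratio, translation from the centroids.
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src
    t = dst_mean - s * R @ src_mean
    return s, R, t

# Synthetic check: a cloud scaled by 1.5, rotated 30° about Z and shifted.
rng = np.random.default_rng(0)
pts = rng.random((100, 3))
a = np.deg2rad(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
pts_dst = 1.5 * pts @ R_true.T + np.array([2.0, -1.0, 0.5])
s, R, t = estimate_similarity_transform(pts, pts_dst)
print(s, t)   # expected: s ≈ 1.5, t ≈ [2.0, -1.0, 0.5]

In practice the matched pairs would come from control points or feature correspondences between the two clouds; the estimator itself is agnostic to how they are obtained.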

Registration and seamless fusion of point cloud from LiDAR and oblique photography
RUAN Minghao,LU Yonghua,LIU Yuxian,TANG Shengjun,WANG Weixi,ZHANG Yuqi,LI You. Registration and seamless fusion of point cloud from LiDAR and oblique photography[J]. Bulletin of Surveying and Mapping, 2022, 0(4): 6-10. DOI: 10.13474/j.cnki.11-2246.2022.0101
Authors:RUAN Minghao  LU Yonghua  LIU Yuxian  TANG Shengjun  WANG Weixi  ZHANG Yuqi  LI You
Affiliation:1. Shenzhen Investigation and Research Institute Co., Ltd., Shenzhen 518026, China;2. School of Architecture and Urban Planning, Research Institute for Smart Cities, Shenzhen University & Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources, Shenzhen 518060, China;3. School of Geomatics, Liaoning Technical University, Fuxin 123000, China
Abstract: Laser point clouds and oblique photogrammetry, the traditional and widely used approaches to urban space modeling, each have their own advantages and disadvantages, and effectively integrating data from different sources for the construction of 3D city models remains a major difficulty in city modeling. This paper therefore proposes a precise registration and seamless fusion method for laser and oblique point clouds. The method overcomes the inconsistency in scale and accuracy among point clouds from different sources and accurately computes the spatial transformation between them; on this basis, data redundancy is eliminated and the seamless integration of the multi-source point clouds is completed. Finally, data acquired from three real scenes are used to verify the effectiveness and accuracy of the proposed method. The experimental results show that the fused data obtained by this method have better completeness and accuracy than data from a single platform.
Keywords: point cloud registration  change detection  3D point cloud  3D transformation  fusion
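
For the fusion stage described in the abstract, one simple way to picture the removal of redundant data is to transform the photogrammetric cloud into the LiDAR frame and keep only those points that add coverage the LiDAR cloud lacks. The sketch below (Python, SciPy) does this with a nearest-neighbour distance test; the function name, the 0.05 m default radius and the distance-threshold criterion are illustrative assumptions standing in for, not reproducing, the paper's own redundancy-elimination step.

import numpy as np
from scipy.spatial import cKDTree

def fuse_point_clouds(lidar_pts, photo_pts, s, R, t, radius=0.05):
    """Map the photogrammetric cloud into the LiDAR frame with the similarity
    parameters (s, R, t), drop its points lying within `radius` metres of an
    existing LiDAR point (treated as redundant), and return the merged cloud."""
    photo_in_lidar = s * photo_pts @ R.T + t        # apply s*R*x + t row-wise
    dist, _ = cKDTree(lidar_pts).query(photo_in_lidar, k=1)
    keep = dist > radius                            # keep only complementary points
    return np.vstack([lidar_pts, photo_in_lidar[keep]])

# Tiny synthetic example with an identity transform.
rng = np.random.default_rng(1)
lidar = rng.random((1000, 3))
photo = rng.random((400, 3))
fused = fuse_point_clouds(lidar, photo, 1.0, np.eye(3), np.zeros(3))
print(lidar.shape[0], photo.shape[0], fused.shape[0])

A suitable radius would depend on the registration accuracy and point density of each platform.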