
Detailed Record

Author (Chinese): 簡光宏
Author (English): Chien, Kuang-Hung
Title (Chinese): 結合光達與彩色攝影機之即時定位系統
Title (English): Real-time Localization from LiDAR and RGB Cues
Advisor (Chinese): 陳煥宗
Advisor (English): Chen, Hwann-Tzong
Committee Members (Chinese): 鄭嘉珉、王書凡、陳鼎介
Committee Members (English): Cheng, Chia-Ming; Wang, Shu-Fan; Chen, Ding-Jie
Degree: Master's
Institution: National Tsing Hua University
Department: Institute of Information Systems and Applications
Student ID: 105065522
Year of Publication (ROC calendar): 107 (2018)
Graduation Academic Year: 107
Language: Chinese
Number of Pages: 24
Keywords (Chinese): 光達、即時定位、車輛即時定位、光達即時定位
Keywords (English): LiDAR; Real-time Localization; Real-time Vehicle Localization; Real-time Localization using LiDAR
Usage statistics:
  • Recommendations: 0
  • Views: 366
  • Rating: *****
  • Downloads: 13
  • Bookmarks: 0
Abstract (Chinese):
This thesis proposes a real-time vehicle localization system that combines a LiDAR sensor and a monocular RGB camera with simultaneous localization and mapping (SLAM). Current satellite positioning can only be used in navigation systems that reference static maps; it cannot handle dynamic environments in real time, and its signal strength is easily affected by the surroundings, which leads to inaccurate positioning. In the proposed system, we therefore use the monocular RGB camera together with depth maps converted from the LiDAR data to build a point cloud of the surroundings via SLAM. This point cloud is then matched against a pre-built point cloud map annotated with accurate GPS coordinates to find the position in the map closest to the vehicle. As the vehicle moves, the matching and correction are repeated, yielding the vehicle's accurate position and trajectory.
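The abstract above mentions converting LiDAR returns into depth maps registered to the RGB image before running SLAM (Sections 3.1-3.2 in the table of contents). The thesis itself contains no code, so the following is only a minimal sketch of that projection step under assumed conventions: the intrinsic matrix K, the LiDAR-to-camera extrinsic T_cam_lidar, and the function name project_lidar_to_depth are illustrative, not taken from the author's implementation.

```python
import numpy as np

def project_lidar_to_depth(points_lidar, T_cam_lidar, K, image_size):
    """Project LiDAR points into the camera frame to form a sparse depth map.

    points_lidar : (N, 3) LiDAR points in the sensor frame.
    T_cam_lidar  : (4, 4) extrinsic transform from the LiDAR frame to the
                   camera frame (obtained from camera-LiDAR calibration).
    K            : (3, 3) camera intrinsic matrix.
    image_size   : (height, width) of the RGB image.
    """
    h, w = image_size

    # Transform points into the camera coordinate frame.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Perspective projection with the intrinsics.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Rasterize into a sparse depth map, keeping the nearest depth per pixel.
    depth = np.full((h, w), np.inf, dtype=np.float32)
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    for ui, vi, zi in zip(u[valid], v[valid], pts_cam[valid, 2]):
        if zi < depth[vi, ui]:
            depth[vi, ui] = zi
    depth[np.isinf(depth)] = 0.0  # 0 marks pixels with no LiDAR return
    return depth
```

The resulting depth map is sparse; a real pipeline would densify or filter it before handing it to an RGB-D SLAM front end, a step omitted here.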
Abstract (English):
This thesis describes a real-time vehicle localization system that uses LiDAR and RGB cues to achieve simultaneous localization and mapping (SLAM). Existing localization methods that rely merely on GPS information are more suitable for navigation systems with static maps. Such navigation systems can neither adapt to dynamic scenarios nor benefit from the abundant visual cues in the surroundings. In our system, we use SLAM to construct a point cloud from the visual cues acquired by a monocular RGB camera and the corresponding LiDAR information. The system compares the currently reconstructed point cloud with the pre-acquired, GPS-aware point cloud in the database, and finally infers the accurate position and odometry of the vehicle via continuous adjustment as the vehicle moves forward.
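Both abstracts also describe matching the SLAM-built point cloud against a pre-built, GPS-tagged point cloud map and refining the estimate as the vehicle moves. The thesis does not name the registration method used for this comparison, so the sketch below substitutes point-to-point ICP from the Open3D library purely as a stand-in; the map format, distance threshold, and function name are assumptions for illustration only.

```python
import numpy as np
import open3d as o3d

def localize_against_map(local_pts, map_pts, init_pose=np.eye(4), max_dist=1.0):
    """Align the SLAM point cloud to the GPS-tagged reference map with ICP.

    local_pts : (N, 3) points reconstructed by SLAM around the vehicle.
    map_pts   : (M, 3) points of the pre-built map whose frame carries
                known GPS coordinates.
    init_pose : initial guess of the local-to-map transform (e.g., the
                previous estimate propagated by odometry).
    Returns the refined 4x4 transform placing the vehicle in the map frame.
    """
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(local_pts))
    dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(map_pts))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_dist, init_pose,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation

# As the vehicle moves, each new local cloud is matched starting from the
# previous pose, so the estimate is corrected continuously:
# pose = localize_against_map(new_local_pts, map_pts, init_pose=pose)
```

Once the local cloud is aligned to the map frame, the vehicle's GPS coordinates follow from the known transformation between the map frame and geodetic coordinates, which is the subject of Section 3.3.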
Table of Contents:
摘要 (Abstract in Chinese) .................................... 2
Abstract ...................................................... 3
Acknowledgements .............................................. 4
1 Introduction ................................................ 8
2 Related Work ............................................... 10
2.1 SLAM-based Localization Systems .......................... 10
2.2 LiDAR-based Localization Systems ......................... 11
3 Proposed Method ............................................ 12
3.1 Camera and LiDAR Calibration ............................. 12
3.2 Depth Map Generation ..................................... 14
3.3 Transformation between Position and GPS Coordinates ...... 15
4 Experiments ................................................ 17
4.1 Experimental Setup ....................................... 17
4.2 Experimental Data Collection ............................. 17
4.3 Camera and LiDAR Calibration ............................. 18
4.4 Depth Map Generation ..................................... 19
4.5 Verification of SLAM Results and the Camera-Trajectory-to-GPS Transformation ... 19
5 Conclusion and Future Work ................................. 23
5.1 Depth Maps ............................................... 23
5.2 Deep-Learning-Based Recognition of Colored Point Clouds .. 23
Bibliography ................................................. 24
 
 
 
 