
Detailed Record

Author (Chinese): 島津達則
Author (English): Shimazu Tatsunori
Title (Chinese): 適用於混沌光達系統之每秒10幀即時深度圖FPGA系統設計與實作
Title (English): A 10 Frames/Sec Real-Time Depth Map FPGA System for Chaos LiDAR Systems
Advisor (Chinese): 黃元豪
Advisor (English): Huang, Yuan-Hao
Committee members (Chinese): 蔡佩芸、陳坤志、沈中安
Committee members (English): Tsai, Pei-Yun; Chen, Kun-Chih; Shen, Chung-An
Degree: Master's
Institution: National Tsing Hua University
Department: Department of Electrical Engineering
Student ID: 108061568
Year of publication (ROC): 111 (2022)
Academic year of graduation: 110
Language: English
Number of pages: 57
Keywords (Chinese): 混沌光達、即時、現場可程式化邏輯閘陣列
Keywords (English): chaos LiDAR, real-time, FPGA
Light detection and ranging (LiDAR) is a technique that measures the distance or depth of a target. The distance is calculated by estimating the time a light signal takes to travel from the device to the target. Because the light source used in this system is a chaotic laser, which is a noise-like signal, the system is called a chaos LiDAR system.
Chaos LiDAR can be applied in various fields, including depth sensing, autonomous driving, 3D modeling, augmented reality, and virtual reality. To ensure the reliability and validity of the chaos LiDAR system, a real-time implementation is imperative. This thesis presents a real-time chaos LiDAR implementation on a field-programmable gate array (FPGA) and a personal computer (PC), together with the challenges encountered during the research, such as latency between the optical instruments and cross-clock-domain signal processing, the solutions to these problems, a graphical user interface, the demonstration results, and a comparison with existing products. With a throughput of 110 thousand pixels per second, the chaos LiDAR system delivers 10 frames per second of 100x100-point depth maps with millimeter-level accuracy.
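The correlation-based time-of-flight principle summarized in the abstract can be sketched in a few lines of Python. This is an illustrative model only, not the thesis's FPGA implementation: the sample rate, signal length, delay, and noise level are assumed values, and the chaotic laser is modeled as Gaussian noise purely because it is noise-like.

```python
import numpy as np

# Sketch of chaos-LiDAR time-of-flight estimation: cross-correlate a
# noise-like reference with the delayed echo, then convert the lag at
# the correlation peak into a distance.
C = 3.0e8   # speed of light (m/s)
FS = 10e9   # assumed ADC sample rate: 10 GS/s (illustrative, not from the thesis)

rng = np.random.default_rng(0)
reference = rng.standard_normal(4096)          # chaotic light modeled as noise

true_delay = 200                               # assumed round-trip delay, in samples
echo = np.roll(reference, true_delay)          # delayed copy of the reference
echo += 0.5 * rng.standard_normal(echo.size)   # additive receiver noise

# The peak of the cross-correlation gives the round-trip delay in samples.
corr = np.correlate(echo, reference, mode="full")
lag = int(np.argmax(corr)) - (reference.size - 1)

# Halve the round-trip time to get the one-way distance.
distance = C * (lag / FS) / 2.0
print(f"estimated delay = {lag} samples, distance = {distance:.3f} m")  # ≈ 3.000 m
```

Because the reference is noise-like, its autocorrelation is sharply peaked, which is what allows the delay (and hence the depth) to be resolved unambiguously; the thesis's interpolation step then refines the peak location to sub-sample precision.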
1 Introduction
1.1 Chaos LiDAR
1.2 Research Motivation
1.3 Organization of This Thesis

2 Depth Map Sensing of Chaos LiDAR
2.1 Time-of-Flight (ToF) Technology
2.2 Correlation of Reference and Target Signals
2.3 Interpolation Algorithm
2.3.1 Interpolation Complexity Reduction

3 Implementation of Real-Time Chaos LiDAR System
3.1 Experiment Devices
3.2 Chaos LiDAR System Diagram
3.3 Hardware Block Diagram of Chaos LiDAR System
3.4 Chaos LiDAR Control Flow
3.5 Hardware Design of Depth-Sensing Algorithm
3.6 FPGA Placement Problem
3.7 Software and Graphical User Interface (GUI)

4 Demonstrations of System and Results

5 Conclusion and Future Work
5.1 Conclusion
5.2 Future Work

References