
Detailed Record

Author (Chinese): 林郁庭
Author (English): Lin, Yu-Ting
Title (Chinese): 基於YOLO之雷達目標偵測演算法
Title (English): YOLO-CFAR: a Novel CFAR Target Detection Method Based on YOLO
Advisor (Chinese): 鍾偉和
Advisor (English): Chung, Wei-Ho
Committee Members (Chinese): 張佑榕、吳仁銘、翁詠祿
Committee Members (English): Chang, Ronald Y.; Wu, Jen-Ming; Ueng, Yeong-Luh
Degree: Master's
University: National Tsing Hua University
Department: Institute of Communications Engineering
Student ID: 108064545
Publication Year (ROC): 110 (2021)
Graduation Academic Year: 109
Language: Chinese
Number of Pages: 41
Keywords (Chinese): 恆虛警率、目標偵測、深度學習、物件偵測、YOLO、動態範圍壓縮
Keywords (English): Constant False Alarm Rate, Target Detection, Deep Learning, Object Detection, YOLO, Dynamic Range Compression
Constant False Alarm Rate (CFAR) detection is a common target detection algorithm in radar systems. However, the detection performance of conventional CFAR detectors degrades markedly in nonhomogeneous scenarios, such as multiple-target scenarios and clutter scenarios. Although the deep learning (DL) based CFAR detector (DL-CFAR) improves detection performance in multiple-target scenarios, it still cannot overcome the poor detection performance in clutter scenarios. The performance loss of both conventional CFAR and DL-CFAR usually stems from inaccurate estimation of the noise level. To improve CFAR detection performance, this work proposes a new idea: treat the range-Doppler map (RD map) as an image and detect targets with a deep learning object detection model. Removing the noise estimation step reduces the possibility of error propagation and improves detector performance. Because the object detection model adopted in this work is YOLO (You Only Look Once), the proposed algorithm is named YOLO-CFAR.
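To make the noise-level estimation step that conventional detectors rely on concrete, here is a minimal cell-averaging CFAR sketch in Python; it is not the thesis's implementation, and the window sizes, threshold factor, and toy rd_map are illustrative assumptions.

    import numpy as np

    def ca_cfar_cell(rd_map, r, d, guard=2, train=4, alpha=5.0):
        """Cell-averaging CFAR decision for the RD-map cell at (r, d).

        The noise level is estimated by averaging the training cells around
        the cell under test (excluding a guard band); in multiple-target or
        clutter scenarios these training cells may contain interference,
        which is what degrades conventional CFAR.
        """
        half = guard + train
        window = rd_map[max(r - half, 0): r + half + 1,
                        max(d - half, 0): d + half + 1].copy()
        # Exclude the cell under test and its guard band from the estimate.
        gr0 = max(r - guard, 0) - max(r - half, 0)
        gd0 = max(d - guard, 0) - max(d - half, 0)
        window[gr0: gr0 + 2 * guard + 1, gd0: gd0 + 2 * guard + 1] = np.nan
        noise_level = np.nanmean(window)   # estimated noise power
        threshold = alpha * noise_level    # threshold factor; a real detector sets alpha from the desired false-alarm rate
        return rd_map[r, d] > threshold    # True => declare a target

    # Toy usage: a 64x64 exponential-noise map with one strong synthetic target.
    rd_map = np.random.exponential(scale=1.0, size=(64, 64))
    rd_map[32, 20] += 30.0
    print(ca_cfar_cell(rd_map, 32, 20))  # likely True
    print(ca_cfar_cell(rd_map, 10, 10))  # likely False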
This work therefore proposes a target detection algorithm built on a deep learning object detection model. In addition to introducing the YOLO model, we apply dynamic range compression (DRC) to pre-process the data and add a deep neural network (DNN) to further improve the detection performance of YOLO-CFAR in multiple-target scenarios. Simulation results show that, besides performing well in homogeneous scenarios, the proposed method clearly outperforms existing deep learning and conventional algorithms in nonhomogeneous scenarios, and its detection speed reaches real time.
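As a rough illustration of the pre-processing idea, the sketch below applies a simple logarithmic compression, one common way to realize dynamic range compression, to an RD map and rescales it to an 8-bit grayscale image suitable for an image-based detector; the exact DRC used in the thesis is not reproduced here, and drc_to_image is a hypothetical helper.

    import numpy as np

    def drc_to_image(rd_map, eps=1e-12):
        """Compress the RD map's dynamic range and map it to an 8-bit image.

        Radar returns span many orders of magnitude; a log-style compression
        keeps weak targets visible after quantisation to 8-bit pixel values,
        which is what an image-based detector such as YOLO consumes.
        """
        compressed = np.log10(rd_map + eps)               # log-domain compression
        lo, hi = compressed.min(), compressed.max()
        normalized = (compressed - lo) / (hi - lo + eps)  # scale to [0, 1]
        return (normalized * 255).astype(np.uint8)        # 8-bit grayscale image

    # Toy usage: strong and weak targets on exponential noise.
    rd_map = np.random.exponential(1.0, size=(128, 128))
    rd_map[40, 40] += 1e4   # strong target
    rd_map[90, 70] += 50.0  # weak target
    image = drc_to_image(rd_map)
    print(image.dtype, image.shape, image[40, 40], image[90, 70])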
Constant False Alarm Rate (CFAR) detection is a common target detection algorithm in radar systems. However, nonhomogeneous scenarios, such as multiple-target and clutter scenarios, dramatically degrade CFAR detection performance because of erroneous noise-level estimation. To improve CFAR target detection performance in nonhomogeneous scenarios, we propose a novel CFAR target detection method based on the deep learning model You Only Look Once (YOLO), called YOLO-CFAR. The proposed scheme does not need to estimate the noise level; instead, it uses a deep learning object detection model to detect targets directly in the RD map. This reduces the possibility of error propagation caused by inaccurate noise-level estimation and thus yields better CFAR target detection performance.
In this thesis, we not only introduce YOLO into CFAR target detection but also use dynamic range compression (DRC) to pre-process the input data and add a deep neural network (DNN) to further improve the performance of YOLO-CFAR. Simulation results demonstrate that YOLO-CFAR outperforms other CFAR schemes, especially in nonhomogeneous scenarios; furthermore, YOLO-CFAR achieves real-time detection.
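To show how such a detection pipeline could be composed, the following sketch chains DRC pre-processing, an image-based detector, and a DNN refinement stage; YoloLikeDetector and refine_with_dnn are stand-in stubs so the example runs end to end, not the trained models described in the thesis.

    import numpy as np

    def compress_to_image(rd_map, eps=1e-12):
        # Log compression as a simple stand-in for the DRC step.
        c = np.log10(rd_map + eps)
        c = (c - c.min()) / (c.max() - c.min() + eps)
        return (c * 255).astype(np.uint8)

    class YoloLikeDetector:
        # Stand-in for a trained YOLO-style detector over RD-map images.
        # A real detector would run a convolutional network and output bounding
        # boxes; this stub simply thresholds bright pixels so the pipeline runs.
        def detect(self, image):
            rows, cols = np.where(image > 200)
            return list(zip(rows.tolist(), cols.tolist()))  # candidate (range, Doppler) cells

    def refine_with_dnn(rd_map, candidates):
        # Stand-in for the DNN refinement stage mentioned in the abstract.
        return candidates

    def yolo_cfar_pipeline(rd_map, detector):
        image = compress_to_image(rd_map)            # 1. dynamic range compression
        candidates = detector.detect(image)          # 2. image-based target detection
        return refine_with_dnn(rd_map, candidates)   # 3. DNN refinement

    # Toy usage: one synthetic target on exponential noise.
    rd_map = np.random.exponential(1.0, size=(64, 64))
    rd_map[12, 50] += 1e3
    print(yolo_cfar_pipeline(rd_map, YoloLikeDetector()))  # expected to include (12, 50)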
Chinese Abstract ... i
Abstract ... ii
Acknowledgements ... iii
Table of Contents ... iv
List of Figures ... vii
Chapter 1  Introduction ... 1
  1.1  Research Background and Motivation ... 1
  1.2  Thesis Organization ... 3
Chapter 2  Related Background and System Model ... 4
  2.1  Related Background ... 4
    2.1.1  Frequency-Modulated Continuous-Wave (FMCW) Radar ... 4
    2.1.2  Range-Doppler Map ... 5
  2.2  System Model ... 6
Chapter 3  Constant False Alarm Rate ... 8
  3.1  CFAR Concept ... 8
  3.2  CFAR Detectors ... 9
    3.2.1  Cell-Averaging CFAR (CA-CFAR) Detector ... 11
    3.2.2  Greatest-Of CFAR (GO-CFAR) Detector ... 12
    3.2.3  Smallest-Of CFAR (SO-CFAR) Detector ... 14
    3.2.4  Ordered-Statistic CFAR (OS-CFAR) Detector ... 15
    3.2.5  Deep-Learning CFAR (DL-CFAR) Detector ... 16
Chapter 4  The Proposed YOLO-CFAR Algorithm ... 17
  4.1  Overview of YOLO-CFAR ... 17
  4.2  Dynamic Range Compression ... 19
  4.3  Introduction to YOLO ... 21
    4.3.1  Deep-Learning-Based Object Detection ... 21
    4.3.2  YOLO Techniques ... 22
    4.3.3  Simplified YOLO Model ... 26
  4.4  Deep Neural Network ... 27
Chapter 5  Simulation Results and Analysis ... 28
  5.1  Simulation Environment and Comparison Schemes ... 28
  5.2  Training Data Generation and Settings ... 28
  5.3  Simulation Results ... 28
    5.3.1  Single-Target Scenario ... 29
    5.3.2  Multiple-Target Scenario ... 31
    5.3.3  Clutter Scenario ... 33
    5.3.4  Comparison With and Without the DNN ... 36
    5.3.5  Comparison Between Dynamic Range Compression and Truncation ... 37
Chapter 6  Conclusion ... 38
References ... 39
[1] M. A. Richards, J. A. Scheer, and W. A. Holm, "Constant false alarm rate detectors," in Principles of Modern Radar, Raleigh, NC: SciTech Publishing, 2010, pp. 589–620.
[2] A. Jalil, H. Yousaf, and M. I. Baig, "Analysis of CFAR techniques," in Proc. 13th Int. Bhurban Conf. Appl. Sci. Technol. (IBCAST), Jan. 2016, pp. 654–659.
[3] V. G. Hansen and J. H. Sawyers, "Detectability loss due to greatest of selection in a cell-averaging CFAR," IEEE Trans. Aerosp. Electron. Syst., vol. AES-16, no. 1, pp. 115–118, Jan. 1980.
[4] G. V. Trunk, "Range resolution of targets using automatic detectors," IEEE Trans. Aerosp. Electron. Syst., vol. AES-14, no. 5, pp. 750–755, Sept. 1978.
[5] H. Rohling, "Radar CFAR thresholding in clutter and multiple target situations," IEEE Trans. Aerosp. Electron. Syst., vol. AES-19, no. 4, pp. 608–621, Jul. 1983.
[6] J. T. Rickard and G. M. Dillard, "Adaptive detection algorithms for multiple-target situations," IEEE Trans. Aerosp. Electron. Syst., vol. AES-13, no. 4, pp. 338–343, Jul. 1977.
[7] C. Lin, Y. Lin, Y. Bai, W. Chung, T. Lee, and H. Huttunen, "DL-CFAR: A novel CFAR target detection method based on deep learning," in Proc. IEEE 90th Veh. Technol. Conf. (VTC2019-Fall), Honolulu, HI, USA, Sept. 2019.
[8] J. Redmon and A. Farhadi, "YOLOv3: An incremental improvement," Apr. 2018, arXiv:1804.02767. [Online]. Available: https://arxiv.org/abs/1804.02767
[9] A. G. Stove, "Linear FMCW radar techniques," IEE Proc. F Radar Signal Process., vol. 139, no. 5, pp. 343–350, Oct. 1992.
[10] J. Fink and F. K. Jondral, "Comparison of OFDM radar and chirp sequence radar," in Proc. 16th Int. Radar Symp., 2015, pp. 315–320.
[11] A. Wojtkiewicz, J. Misiurewicz, M. Nalecz, K. Jedrzejewski, and K. Kulpa, "Two-dimensional signal processing in FMCW radars," in Proc. 20th KKTOiUE, Warszawa, Poland, 1996, pp. 475–480.
[12] M. Kronauge and H. Rohling, "New chirp sequence radar waveform," IEEE Trans. Aerosp. Electron. Syst., vol. 50, no. 4, pp. 2870–2877, Oct. 2014.
[13] D. Giannoulis, M. Massberg, and J. D. Reiss, "Parameter automation in a dynamic range compressor," J. Audio Eng. Soc., vol. 61, no. 10, Oct. 2013.
[14] Z.-Q. Zhao, P. Zheng, S.-T. Xu, and X. Wu, "Object detection with deep learning: A review," IEEE Trans. Neural Netw. Learn. Syst., vol. 30, no. 11, pp. 3212–3232, Nov. 2019.
[15] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: Unified, real-time object detection," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), Jun. 2016, pp. 779–788.
[16] J. Redmon and A. Farhadi, "YOLO9000: Better, faster, stronger," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), Jul. 2017, pp. 6517–6525.
[17] S. Ren, K. He, R. Girshick, and J. Sun, "Faster R-CNN: Towards real-time object detection with region proposal networks," IEEE Trans. Pattern Anal. Mach. Intell., vol. 39, no. 6, pp. 1137–1149, Jun. 2017.
[18] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), Jun. 2016, pp. 770–778.
[19] T.-Y. Lin, P. Dollár, R. Girshick, K. He, B. Hariharan, and S. Belongie, "Feature pyramid networks for object detection," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2017, pp. 936–944.
[20] Y. L. Sit and T. Zwick, "Automotive MIMO OFDM radar: Subcarrier allocation techniques for multiple-user access and DOA estimation," in Proc. 11th Eur. Radar Conf., Oct. 2014, pp. 153–156.
 
 
 
 