
Detailed Record

Author (Chinese): 蕭弘珉
Author (English): Hsiao, Hung-Min
Title (Chinese): 基於UIU-Net之CFAR方法於多目標和雜波環境中偵測雷達目標
Title (English): UIU-Net-Based CFAR Approach for Radar Target Detection in Multi-Target and Cluttered Environments
Advisor (Chinese): 鍾偉和
Advisor (English): Chung, Wei-Ho
Committee Members (Chinese): 劉光浩、張大中
Committee Members (English): Liu, Kuang-Hao; Chang, Dah-Chung
Degree: Master's
Institution: National Tsing Hua University
Department: Institute of Communications Engineering
Student ID: 111064517
Publication Year (ROC calendar): 113 (2024)
Graduation Academic Year: 112
Language: Chinese
Pages: 49
Keywords (Chinese): 恆定虛警率、目標偵測、UIU-Net、深度學習、調頻連續波雷達、距離速度圖
Keywords (English): CFAR, target detection, UIU-Net, deep learning, FMCW radar, range-Doppler map
Statistics:
  • Recommendations: 0
  • Views: 6
  • Rating: *****
  • Downloads: 0
  • Bookmarks: 0
Abstract (Chinese, translated):
This study proposes a deep-learning-based detection method for the target detection problem in range-Doppler maps (RD maps) of FMCW radar. In cluttered and multi-target environments, traditional methods often suffer from masking effects, which degrade detection accuracy. To address this, the study builds on the UIU-Net semantic segmentation model, whose multi-scale feature extraction and interactive-cross attention mechanism effectively improve detection accuracy.

Experimental results show that in multi-target and cluttered environments at varying signal-to-clutter ratios (SCR), UIU-Net outperforms traditional CFAR methods. At SCRs of -5 and -10, UIU-Net maintains a detection rate above 0.9 at a false alarm rate of 10^-4, demonstrating excellent performance under low-SCR conditions. In scenarios with 2 to 4 targets, UIU-Net keeps the detection rate above 0.85 while still maintaining a false alarm rate of 10^-4. Moreover, compared with the YOLO-CFAR method, UIU-Net's multi-scale feature extraction yields higher accuracy and stability in multi-target detection. Notably, on real-world Carrada data, UIU-Net also performs significantly better than common CFAR methods.
Abstract (English):
This study addresses the target detection problem in Range-Doppler Maps (RD maps) for FMCW radar by proposing a deep learning-based detection method. Traditional methods often suffer from masking effects in cluttered and multi-target environments, leading to reduced detection accuracy. To address this issue, this research utilizes the UIU-Net semantic segmentation model, which employs multi-scale feature extraction and interactive-cross attention mechanisms to effectively enhance detection accuracy.
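For context on the classical baseline the thesis compares against, the following is a minimal sketch of 2-D cell-averaging CFAR (CA-CFAR) applied to a range-Doppler power map. The window sizes, the threshold factor derived under an exponential-noise assumption, and the toy data are illustrative assumptions, not the thesis's actual settings:

```python
import numpy as np

def ca_cfar_2d(rd_map, guard=2, train=4, pfa=1e-4):
    """Cell-averaging CFAR over a 2-D range-Doppler power map.

    For each cell under test (CUT), the noise level is estimated by
    averaging the training cells surrounding a guard band, and the CUT
    is declared a detection if it exceeds alpha * noise_estimate.
    """
    n_train = (2 * (guard + train) + 1) ** 2 - (2 * guard + 1) ** 2
    # Threshold multiplier for the desired false-alarm probability,
    # assuming exponentially distributed noise power.
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)

    rows, cols = rd_map.shape
    k = guard + train
    detections = np.zeros_like(rd_map, dtype=bool)
    for r in range(k, rows - k):
        for c in range(k, cols - k):
            window = rd_map[r - k:r + k + 1, c - k:c + k + 1]
            guard_block = rd_map[r - guard:r + guard + 1, c - guard:c + guard + 1]
            # Training cells = full window minus guard block (which contains the CUT).
            noise = (window.sum() - guard_block.sum()) / n_train
            detections[r, c] = rd_map[r, c] > alpha * noise
    return detections

# Toy example: one strong target embedded in exponential noise.
rng = np.random.default_rng(0)
rd = rng.exponential(1.0, size=(64, 64))
rd[32, 32] += 200.0
hits = ca_cfar_2d(rd)
print(hits[32, 32])  # the injected target is detected
```

The masking effect mentioned above arises here because a second strong target falling inside another target's training window inflates the noise estimate and raises the threshold, which is what the learned detector is meant to avoid.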

The experimental results demonstrate that UIU-Net outperforms traditional CFAR methods in cluttered environments with multiple targets and varying Signal-to-Clutter Ratios (SCR). At SCRs of -5 and -10, UIU-Net maintains a Detection Rate above 0.9 at a False Alarm Rate of 10^-4, indicating its excellent performance under low-SCR conditions. In scenarios with 2 to 4 targets, UIU-Net maintains a Detection Rate above 0.85 while also sustaining a False Alarm Rate of 10^-4. Additionally, compared to the YOLO-CFAR method, UIU-Net exhibits higher accuracy and stability in multi-target detection due to its multi-scale feature extraction capability. It is noteworthy that UIU-Net also demonstrates significantly better performance than common CFAR methods on real-world Carrada data.
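The detection-rate and false-alarm-rate operating points quoted above can be read as per-cell statistics over the RD map. The sketch below shows one plausible way to compute both from a predicted binary detection mask and a ground-truth target mask; the function name and toy masks are illustrative, not taken from the thesis:

```python
import numpy as np

def detection_and_false_alarm_rate(pred, truth):
    """Per-cell detection rate and false-alarm rate on an RD map.

    pred, truth: boolean arrays of the same shape; True marks a
    declared detection / a true target cell respectively.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    n_targets = truth.sum()          # target cells
    n_noise = (~truth).sum()         # noise-only cells
    pd = (pred & truth).sum() / n_targets if n_targets else 0.0
    pfa = (pred & ~truth).sum() / n_noise if n_noise else 0.0
    return pd, pfa

# Toy check: 3 of 4 target cells found, 1 false alarm among 96 noise cells.
truth = np.zeros((10, 10), dtype=bool)
truth[1, 1] = truth[2, 2] = truth[3, 3] = truth[4, 4] = True
pred = truth.copy()
pred[4, 4] = False      # one miss
pred[7, 7] = True       # one false alarm
pd, pfa = detection_and_false_alarm_rate(pred, truth)
print(pd, pfa)  # 0.75 and 1/96
```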
Abstract (Chinese) -------------------------------------------------- i
Abstract (English) ------------------------------------------------- ii
Acknowledgments -------------------------------------------------- iii
Table of Contents ------------------------------------------------ iii
Chapter 1  Introduction -------------------------------------------- 1
1.1 Research Background and Motivation ----------------------------- 1
1.2 Main Contributions --------------------------------------------- 4
1.3 Thesis Organization -------------------------------------------- 4
Chapter 2  Related Techniques and Background ----------------------- 5
2.1 FMCW Radar Channel Matrix with Multiple Targets ---------------- 5
2.2 Constant False Alarm Rate (CFAR) Detection --------------------- 8
2.2.1 Cell-Averaging CFAR (CA-CFAR) -------------------------------- 9
2.2.2 Greatest-Of CFAR (GO-CFAR) ----------------------------------- 9
2.2.3 Smallest-Of CFAR (SO-CFAR) ---------------------------------- 10
2.2.4 Order-Statistic CFAR (OS-CFAR) ------------------------------ 11
2.3 U-Net --------------------------------------------------------- 12
2.4 Attention ----------------------------------------------------- 14
2.4.1 Principles -------------------------------------------------- 14
2.4.2 Common Categories ------------------------------------------- 15
2.4.3 Common Attention Models ------------------------------------- 16
Chapter 3  Radar Target Range-Velocity Estimation Algorithm ------- 19
3.1 Problem Statement --------------------------------------------- 19
3.2 System Model -------------------------------------------------- 20
3.3 UIU-Net ------------------------------------------------------- 22
3.3.1 Overview of UIU-Net ----------------------------------------- 22
3.3.2 Resolution-Maintenance Deep Supervision Module -------------- 24
3.3.3 Interactive-Cross Attention Module -------------------------- 27
Chapter 4  Simulation Results and Analysis ------------------------ 31
4.1 The Carrada Dataset ------------------------------------------- 31
4.2 Evaluation Metrics -------------------------------------------- 33
4.3 Test Simulation Environments and Comparison Baselines --------- 33
4.4 Training Parameter Settings and Training/Test Data ------------ 34
4.5 Experimental Results and Analysis ----------------------------- 36
4.5.1 Real-World Environment -------------------------------------- 36
4.5.2 Single-Target Environment ----------------------------------- 37
4.5.3 Multi-Target Environment ------------------------------------ 39
4.5.4 Cluttered Environment --------------------------------------- 40
4.5.5 Execution Time ---------------------------------------------- 45
Chapter 5  Conclusion and Future Work ----------------------------- 46
References -------------------------------------------------------- 47
(Full text available for external access after 2029-07-30.)