Detailed Record

Author (Chinese): 高暐哲
Author (English): Kao, Wei-Tse
Title (Chinese): 為光流演算法設計的生物啟發的運動偵測模型
Title (English): A bio-inspired motion detection model for optical flow
Advisor (Chinese): 羅中泉
Advisor (English): Lo, Chung-Chuan
Committee members (Chinese): 謝志成、焦傳金、鄭桂忠
Committee members (English): Hsieh, Chih-Cheng; Chiao, Chuan-Chin; Tang, Kea-Tiong
Degree: Master's
Institution: National Tsing Hua University
Department: Institute of Systems Neuroscience
Student ID: 105080551
Year of publication (ROC calendar): 109 (2020)
Academic year of graduation: 108
Language: English
Number of pages: 28
Keywords (Chinese): 光流、移動偵測、Gabor濾波器
Keywords (English): optical flow, motion detection, Gabor filter
Abstract (Chinese): Optical flow estimation is an important field of computer vision. Owing to recent developments in object tracking and recognition applications, optical flow methods capable of real-time computation are in demand in engineering. Compared with artificial systems, biological visual systems detect object motion with high efficiency and low power consumption. In this thesis I propose an optical flow estimation method inspired by the fruit fly visual system: the spatial-temporal filter Reichardt model (STR model). Compared with conventional biological vision models, this model uses spatial filtering to extract object motion information over a wide range. Experiments demonstrate its reliability in real image processing. The algorithm is suited to parallel computing architectures and can be applied to robot control and navigation systems in the future.
Abstract (English): Optical flow estimation is an important task in computer vision. With the growth of object recognition and tracking applications, real-time optical flow computation has become an engineering requirement. Compared with artificial systems, the biological visual system is able to detect motion with high efficiency and low power consumption. In this research I propose a motion estimation algorithm inspired by the fly visual system, called the spatial-temporal filter Reichardt model (STR model). A spatial filter is applied to extract motion information over a wide spatial range. Experiments show that the method is reliable in real image processing. The algorithm is suited to parallel computing architectures and can be applied to robot control and navigation in the future.
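Neither abstract spells out the model's mechanics, so the following is a minimal illustrative sketch, in Python, of how a spatially filtered Reichardt-type correlator of the kind described above could be assembled from two consecutive grayscale frames. The function names (gabor_kernel, reichardt_response), the kernel parameters, the use of SciPy for convolution, and the two-frame blend standing in for the delay stage are all assumptions made for illustration; they are not the thesis's actual STR implementation.

```python
import numpy as np
from scipy.signal import convolve2d


def gabor_kernel(size=9, sigma=2.0, wavelength=4.0, theta=0.0):
    """Even (cosine) Gabor kernel used here as a spatial pre-filter."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)


def reichardt_response(prev, curr, alpha=0.5, shift=1, theta=0.0):
    """Spatially filtered Reichardt correlator (illustrative sketch only).

    prev, curr : two consecutive grayscale frames as 2-D float arrays.
    alpha      : blend factor approximating the low-pass delay stage.
    shift      : pixel separation between the two correlator arms.
    Returns an opponent motion signal along the filter orientation theta.
    """
    g = gabor_kernel(theta=theta)
    f_prev = convolve2d(prev, g, mode="same", boundary="symm")
    f_curr = convolve2d(curr, g, mode="same", boundary="symm")

    # Delayed (low-pass) signal, crudely approximated from the previous frame.
    delayed = alpha * f_prev + (1 - alpha) * f_curr

    # Correlate the delayed signal at each pixel with the undelayed signal at a
    # neighboring pixel, then subtract the mirror-symmetric term to obtain
    # direction selectivity (the classic Hassenstein-Reichardt opponent scheme).
    rightward = delayed * np.roll(f_curr, -shift, axis=1)
    leftward = np.roll(delayed, -shift, axis=1) * f_curr
    return rightward - leftward
```

A full estimator along the lines the abstract sketches would presumably run a bank of such detectors over several orientations and spatial scales and combine their opponent outputs into a flow field; the thesis itself defines the actual STR formulation.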
Abstract (Chinese)
Abstract (English)
Table of Contents
1 Introduction
2 Methods
2.1 Spatial-temporal filter Reichardt model
2.2 Optical flow algorithm: Lucas–Kanade method
2.3 Borst's model
2.4 Visual stimulus generation
3 Results
3.1 Spatial-temporal filter Reichardt model
3.2 General response properties of the motion detector and the comparison
3.3 Testing in real image sequences
3.4 Comparison of computation cost
4 Discussion
References
(Full text not available: release not authorized)