
Detailed Record

Author (Chinese): 謝妤榛
Author (English): Xie, Yu Zhen
Thesis Title (Chinese): 適用於無線低功耗嵌入式裝置上即時手勢辨識的字串匹配演算法
Thesis Title (English): String Matching Algorithm for Real-Time Hand Gesture Recognition on a Wireless Low-Power Embedded Device
Advisor (Chinese): 周百祥
Advisor (English): Chou, Pai Hsiang
Committee Members (Chinese): 周志遠, 蔡明哲
Committee Members (English): Chou, Chih Yuan; Tsai, Ming Jer
Degree: Master's
University: National Tsing Hua University (國立清華大學)
Department: Department of Computer Science (資訊工程學系)
Student ID: 103062533
Publication Year: 2016 (ROC year 105)
Graduation Academic Year: 104 (2015–2016)
Language: English
Number of Pages: 32
Keywords (Chinese): 手勢辨識, 字串匹配, 嵌入式裝置, 加速規
Keywords (English): Hand Gesture Recognition, String Matching, Embedded Device, Accelerometer
Statistics:
  • Recommendations: 0
  • Views: 431
  • Rating: *****
  • Downloads: 11
  • Bookmarks: 0
Abstract (Chinese, translated): This thesis proposes a real-time hand-gesture recognition algorithm designed to run on the low-power microprocessors used in wearable devices with wireless transmission capability. The algorithm maps motion segments to symbols, so that a gesture's acceleration and direction are expressed as strings; a string-matching algorithm can then be applied to recognize the gesture. Experimental results show that our algorithm recognizes the Graffiti alphabet and a set of user-defined gestures with a high recognition rate, while its complexity remains low enough for real-time execution and transmission on an 8-bit microprocessor.
Abstract (English): This thesis presents a real-time hand-gesture recognition algorithm suitable for implementation on low-power microcontroller units (MCUs) for wearable devices with wireless transmission capability. The algorithm performs string matching against a set of patterns that symbolize acceleration and motion directions. Experimental results show that our algorithm can recognize and transmit characters in the Graffiti alphabet and gestures with high accuracy while keeping complexity low enough to be feasible to run on an 8-bit microcontroller unit.
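The string-matching core described in the abstract (and detailed in Chapter 3 via the Levenshtein distance) can be sketched as follows. This is a minimal Python illustration, not the thesis's implementation (which targets an 8-bit MCU); the direction symbols U/D/L/R and the two template strings are hypothetical stand-ins for the thesis's actual codebook and gesture vocabulary.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute),
    keeping only two table rows to stay memory-frugal."""
    prev = list(range(len(b) + 1))            # distance from "" to b[:j]
    for i, ca in enumerate(a, 1):
        curr = [i]                            # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical symbolized templates: U(p), D(own), L(eft), R(ight).
TEMPLATES = {"up-down": "UD", "square": "RDLU"}

def classify(observed: str) -> str:
    """Return the gesture whose template string is closest in edit distance."""
    return min(TEMPLATES, key=lambda g: levenshtein(observed, TEMPLATES[g]))
```

Because the DP keeps only two rows of the table, memory grows with the template length rather than with both string lengths, which is the kind of trade-off that matters on an 8-bit microcontroller.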
Contents i
Acknowledgments v
1 Introduction 1
1.1 Motivation 1
1.2 Objectives 3
1.3 Approach 3
1.4 Contributions 4
1.5 Thesis Organization 4
2 Related Work 5
2.1 Hidden Markov Model 5
2.2 Dynamic Time Warping 5
2.3 String Matching 6
3 Recognition Method 7
3.1 Levenshtein Distance 7
3.2 Recognition Algorithm 8
3.2.1 Temporal Aggregation of Acceleration 9
3.2.2 Calculation of Trajectory Direction 10
3.2.3 Symbolization of Acceleration 10
3.2.4 Symbolization of Trajectory Direction 11
3.2.5 Spotting 12
3.2.6 Resolving Spotting Conflicts 14
3.3 Complexity Analysis 14
3.3.1 Time Complexity 16
3.3.2 Space Complexity 16
4 Evaluation 18
4.1 Gesture Vocabulary 18
4.1.1 Basic Gestures 18
4.1.2 Graffiti Alphabet 18
4.2 Experimental Setup 19
4.2.1 Wearable Platform 19
4.2.2 Host Device 19
4.2.3 Data Collection 20
4.3 Experimental Results 20
4.3.1 Recognition Accuracy 20
4.3.2 Run-Time Complexity 24
5 Conclusions and Future Work 27
5.1 Conclusions 27
5.2 Future Work 28
5.2.1 Gesture Segmentation Algorithm 28
5.2.2 Target Hardware Platform 28

List of Figures
3.1 The flowchart of our recognition algorithm 9
3.2 Illustration of a sliding window 9
3.3 Effect of the temporal aggregation 9
3.4 A trajectory direction in the two-dimensional Cartesian coordinate system 10
3.5 Symbolization of acceleration of a motion 11
3.6 The four unit vectors in the two-dimensional Cartesian coordinate system in the codebook 12
3.7 Symbolization of direction vectors of a motion 13
3.8 Motion of up-down gesture 14
3.9 Initialization of the three Levenshtein-distance tables for the up-down gesture 15
3.10 Determining each element in the y-axis Levenshtein-distance table 15
3.11 Determining each element in the direction Levenshtein-distance table 16
4.1 Motions of eight basic gestures 18
4.2 Motions of 26 letters and one space in the Graffiti alphabet 19
4.3 The top view of EcoBT Super 20
4.4 EcoBT Super with a button 21
4.5 The testing process of basic gestures 21
4.6 The recognition accuracy of basic gesture recognition 22
4.7 The recognition accuracy of handwritten Graffiti recognition 23
4.8 Confusion matrices for the basic gesture recognition (the average accuracy is 87.6%) 24
4.9 Confusion matrices for the user-dependent handwritten Graffiti alphabet recognition (the average accuracy is 82.5%) 25
4.10 The execution time of recognizing one basic gesture 26
4.11 The execution time of recognizing one letter in Graffiti alphabet 26