
Detailed Record

Author (Chinese): 李信恩
Author (English): Lee, Hsin-En
Title (Chinese): 使用步態信息辨識情緒狀態
Title (English): Employ Gait Information to Identify Emotional States
Advisor (Chinese): 李昀儒
Advisor (English): Lee, Yun-Ju
Committee members (Chinese): 王俊堯、黃柏鈞、温玉瑭
Committee members (English): Wang, Chun-Yao; Huang, Po-Chiun; Wen, Yu-Tang
Degree: Master's
University: National Tsing Hua University
Department: Department of Industrial Engineering and Engineering Management
Student ID: 107034571
Year of publication (ROC calendar): 110 (2021)
Graduating academic year: 109
Language: Chinese
Number of pages: 57
Keywords (Chinese): 步態、情緒辨識、心智負荷、機器學習
Keywords (English): gait; emotion recognition; mental workload; machine learning
Gait recognition can be used for identity authentication and real-time tracking of suspicious behavior. It can also be applied to human-computer interaction, enabling early detection of depressive and high-stress tendencies. Different emotions manifest differently in gait, for instance in walking speed, stride length, and the amplitude of joint swing. When a person is angry, walking speed and arm-swing speed increase; when a person is sad, walking speed and arm-swing speed decrease, and the forward tilt angle of the head increases. Emotion and gait analysis can therefore support early detection of depressive and high-stress tendencies.
In this study, 30 healthy participants aged 18 to 30 were recruited, and gait was collected under four emotional states: neutral, happy, sad, and high mental workload. Happiness and sadness were elicited through autobiographical recollection, and high mental workload was induced with a card-matching task. After emotion elicitation, participants walked back and forth along a laboratory walkway while a Vicon optical motion-capture system recorded their gait. Posture parameters (head, neck, shoulder, elbow, thorax, spine, ankle, and foot angles) and kinematic parameters (walking speed, stride length, and cadence) served as gait features. A support vector machine (SVM) was used as the classification model, trained with 10-fold cross-validation, with 70% of each participant's steps as the training set and 30% as the test set, to identify the four states: neutral, happy, sad, and high mental workload.
The results show that classifying emotions with a Gaussian radial basis function (RBF) kernel yielded the highest accuracy, reaching 88%. The head, neck, shoulder, elbow, thorax, spine, ankle, and foot angles, together with walking speed, stride length, and cadence, can serve as features for identifying high mental workload, under which stride length and walking speed increased significantly compared with the other emotions. Future research could include more steps recorded under different emotions to improve the model's stability and recognition accuracy.
Gait recognition can be used for personal identification and for tracking suspicious behavior. Different emotions result in different walking performances, for instance walking speed, stride length, and the swinging amplitude of joints. When people are angry, walking speed, stride length, and arm-swing speed increase; when people are sad, walking speed and arm-swing speed decrease, and the forward tilt angle of the head increases. Hence, gait analysis can also be applied to detect depressive tendencies and high-stress mental states early.
In the current study, 30 participants aged 18 to 30 were recruited to collect gait performances under neutral, happy, sad, and high mental workload states. Happiness and sadness were elicited through autobiographical recollection, and high mental workload was induced through concentration tasks. After emotion elicitation, participants walked back and forth along a laboratory walkway, and their gait was recorded with a Vicon optical tracking system. Posture features were the joint angles of the head, neck, shoulder, elbow, thorax, spine, ankle, and foot; kinematic features were velocity, cadence, and stride length. Neutral, happy, sad, and high mental workload states were identified with an SVM classifier trained with 10-fold cross-validation, using a 7:3 split between training and test sets.
The RBF-kernel SVM achieved the highest accuracy in emotion classification, up to 88%. In addition, the results revealed that the joint angles of the head, neck, shoulder, elbow, thorax, spine, ankle, and foot, together with velocity, cadence, and stride length, can identify high mental workload, under which velocity and stride length increased significantly compared with the other emotions. Future research will collect more subject data to enhance the robustness and classification accuracy of the model.
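As a rough illustration of the pipeline the abstract describes (an RBF-kernel SVM, 10-fold cross-validation, a 7:3 train-test split), the sketch below uses scikit-learn on synthetic stand-in data; the feature columns, parameter grid, and random data are assumptions for illustration, not the study's implementation or dataset.

```python
# Minimal sketch, assuming scikit-learn; synthetic data stands in for the
# recorded footsteps, so the resulting accuracy is not meaningful.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_steps = 400  # stand-in for the collected footsteps
# 11 columns stand in for the posture angles (head, neck, shoulder, elbow,
# thorax, spine, ankle, foot) and kinematics (velocity, cadence, stride length).
X = rng.normal(size=(n_steps, 11))
y = rng.integers(0, 4, size=n_steps)  # neutral, happy, sad, high workload

# 70% of steps for training, 30% for testing, as in the study.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# RBF-kernel SVM tuned with 10-fold cross-validation on the training set;
# the C/gamma grid here is an illustrative assumption.
model = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [1, 10], "svc__gamma": ["scale", 0.1]},
    cv=10)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

With real gait features in `X` and emotion labels in `y`, the same pipeline reproduces the study's setup; standardizing features before an RBF kernel matters because the kernel is distance-based.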
Abstract (Chinese) I
Abstract (English) II
Table of Contents III
List of Tables IV
List of Figures V
Chapter 1 Introduction 1
1.1. Research Background and Motivation 1
1.2. Research Purpose and Scope 2
1.3. Research Framework and Procedure 3
Chapter 2 Literature Review 4
2.1. Factors Affecting Gait 4
2.1.1. Basic Gait Characteristics 4
2.1.2. Physiological Factors 6
2.1.3. Psychological Factors 7
2.2. Definition of Emotion 10
2.2.1. Positive and Negative Affect Schedule (PANAS) 12
2.3. Definition of Mental Workload 15
2.3.1. NASA Task Load Index (NASA-TLX) 16
2.4. Analysis Methods for Gait Recognition 19
Chapter 3 Methods 22
3.1. Participants and Equipment 22
3.2. Experimental Method and Procedure 24
3.3. Data Analysis 25
Chapter 4 Results and Discussion 29
4.1. Questionnaire Results 29
4.1.1. Positive and Negative Affect Schedule (PANAS) 29
4.1.2. NASA-TLX 31
4.2. Classification Performance 33
4.2.1. Performance Metrics 34
4.2.2. Feature Importance 37
4.3. Performance After Feature Selection 41
4.4. Research Limitations 44
Chapter 5 Conclusions and Future Work 45
Appendix 1 46
Appendix 2 52
References 54
