
Detailed Record

Author (Chinese): 蔡福昇
Author (English): Tsai, Fu-Sheng
Title (Chinese): 開發計算與量化人類行為於個人文化維度與疼痛情緒之應用
Title (English): Toward Development of Computational Modeling and Quantifying Behaviors in Culture Dimension and Pain-Affect
Advisor (Chinese): 李祈均
Advisor (English): Lee, Chi-Chun
Committee Members (Chinese): 胡敏君, 黃元豪, 郭柏志, 郭立威, 曹昱
Committee Members (English): Hu, Min-Chun; Huang, Yuan-Hao; Kuo, Po-Chih; Kuo, Li-Wei; Tsao, Yu
Degree: Ph.D.
University: National Tsing Hua University (國立清華大學)
Department: Department of Electrical Engineering (電機工程學系)
Student ID: 104061536
Publication Year (ROC calendar): 110 (2021)
Graduation Academic Year: 109
Language: English
Pages: 124
Keywords (Chinese): 行為訊號處理、疼痛、情緒、文化權力距離、功能性磁振造影、韻律學、中心損失、多頭注意力、多任務學習
Keywords (English): Behavioral Signal Processing (BSP), Pain, Emotion, Culture power distance, fMRI, prosody, center-loss embedding, Multi-Head Attention, Multi-Task Learning
Chinese Abstract: Human behavior is a complex interplay whose expression is inherently multimodal. Behavioral cues such as speech, language, visual, and physiological signals offer ways to measure and model human behavior, and computational methods extract behavioral information and quantify it as digital signals for human-centered applications. In this dissertation, we present quantitative modeling in behavioral signal processing (BSP) applied to understanding cultural dimensions, pain, and emotion. Culture is a social construct that governs human thought and social behavior in real-life interaction; we use multimodal acoustic prosody and brain connectivity to automatically assess cultural power distance. Our proposed social condition-enhanced network achieves 96.2% binary classification accuracy, and the analyses reveal significant prosodic features and brain regions, demonstrating the feasibility of quantifying an individual's stance toward power status. Pain, in turn, is an emotional experience and unpleasant sensation arising from physical or mental harm. We propose an attribute-enhanced multi-head attention network that incorporates emotion, personal traits, and clinical parameters, modeling patients' vocal characteristics and facial expressions to measure pain and emotional expression. In recognizing patients' self-reported pain levels, we obtain 70.1% and 52.1% accuracy in binary and ternary classification, respectively, and show that emotional valence is highly negatively correlated with the self-reported pain level, more strongly than directly observed pain is. We further propose a framework jointly learned with emotional states, clinical parameters, and personal traits, achieving 58.64% (combining mild pain, age, gender, and pain site) and 46.88% (combining mild pain, gender, and pain site) on ternary valence and arousal classification tasks. Visualization analyses show that the multi-head attention mechanism attends to patients' affective states in a distributed, specialized manner, and analyses of clinical outcomes show that the attributes have both univariate and multivariate significance for medication and hospitalization decisions.
Human behavior is a complex interplay, and its expression is inherently multimodal. Behavioral cues such as speech, language, visual, and physiological signals offer the means to measure and model human behavior, and computational approaches provide the means to extract behavioral informatics and quantify them as digital signals for human-centered applications. In this dissertation, we describe applications in behavioral signal processing (BSP) for quantitatively modeling and understanding cultural dimensions, pain, and emotion. Culture is a social construct that dictates human thought and social behavior during real-life interaction. The first application automatically assesses cultural power distance using multimodal acoustic prosody and brain connectivity. Our proposed social condition-enhanced network achieves 96.2% binary recognition accuracy, and our analyses reveal significant prosodic features and brain regions, demonstrating their role in quantifying an individual's belief about power status. Pain, on the other hand, is defined as an emotional experience and unpleasant sensation that results from physical or mental damage. We propose an attribute-enhanced multi-head attention network that incorporates emotion, personal traits, and clinical parameters to measure pain-related emotion expression by modeling patients' vocal characteristics and facial expressions. In recognizing patients' self-reported pain levels, we obtain 70.1% and 52.1% accuracy in binary and ternary classification, respectively, and demonstrate that the rated valence state is highly negatively correlated with the self-reported pain level, more strongly than direct observer ratings of pain intensity are.
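The prosodic front end described above can be illustrated with a minimal sketch: collapsing an utterance's pitch (F0) contour into a few session-level dynamic statistics of the kind such classifiers typically consume. The function name and the particular statistics are illustrative assumptions, not the dissertation's actual feature set.

```python
import math

def prosodic_stats(f0_contour):
    """Summarize an F0 contour (in Hz, with 0 marking unvoiced frames)
    into simple dynamic prosodic statistics. Illustrative only."""
    voiced = [f for f in f0_contour if f > 0]
    if not voiced:
        return {"mean": 0.0, "std": 0.0, "range": 0.0, "slope": 0.0}
    mean = sum(voiced) / len(voiced)
    var = sum((f - mean) ** 2 for f in voiced) / len(voiced)
    # First-order deltas over voiced frames capture pitch dynamics.
    deltas = [b - a for a, b in zip(voiced, voiced[1:])]
    slope = sum(deltas) / len(deltas) if deltas else 0.0
    return {
        "mean": mean,
        "std": math.sqrt(var),
        "range": max(voiced) - min(voiced),
        "slope": slope,
    }
```

A vector of such statistics per utterance (here: mean, variability, range, and average delta) could then be fed to any downstream classifier of power-distance orientation.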
In addition, we present a computational framework that jointly learns emotional states, clinical parameters, and personal traits, achieving 58.64% (for the combination of mild pain, age, gender, and pain site) and 46.88% (for the combination of mild pain, gender, and pain site) in ternary arousal and valence classification tasks. Visualization analyses indicate that the multi-head attention models patients' affective states with distributed and diverse attentive effects. Furthermore, analyses of clinical outcomes demonstrate that the attributes have both univariate and multivariate significance for analgesic prescription and patient disposition.
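The attribute-enhanced multi-head attention idea above can be sketched in miniature: scaled dot-product attention runs once per head over a short feature sequence, after a per-patient attribute vector (e.g., coded gender or pain site) is appended to every frame. All dimensions, names, and the fusion-by-concatenation choice below are illustrative assumptions under this reading, not the dissertation's implementation.

```python
import math
import random

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def matmul(A, B):
    """Multiply an (n x k) matrix by a (k x m) matrix (lists of rows)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def attention(Q, K, V):
    """Scaled dot-product attention; returns outputs and weights."""
    d = len(K[0])
    scores = [[sum(q * k for q, k in zip(q_row, k_row)) / math.sqrt(d)
               for k_row in K] for q_row in Q]
    weights = [softmax(row) for row in scores]
    return matmul(weights, V), weights

def multi_head_attention(X, head_projs):
    """Run one attention head per (Wq, Wk, Wv) triple, then concatenate."""
    heads = []
    for Wq, Wk, Wv in head_projs:
        out, _ = attention(matmul(X, Wq), matmul(X, Wk), matmul(X, Wv))
        heads.append(out)
    return [[v for head in heads for v in head[t]] for t in range(len(X))]

def attribute_enhance(frames, attributes):
    """Append a per-patient attribute vector to every feature frame."""
    return [frame + attributes for frame in frames]

# Toy usage: 5 frames of 4-dim features, a 2-dim attribute vector,
# and 2 heads projecting the 6-dim enhanced frames down to 3 dims each.
random.seed(0)
frames = [[random.random() for _ in range(4)] for _ in range(5)]
attrs = [1.0, 0.0]  # e.g., a coded gender / pain-site pair (illustrative)
X = attribute_enhance(frames, attrs)
head_projs = [
    tuple([[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(6)]
          for _ in range(3))
    for _ in range(2)
]
fused = multi_head_attention(X, head_projs)  # 5 frames x (2 heads * 3 dims)
```

In a real model the projections would be learned and the fused per-frame vectors pooled before the emotion, pain-level, and clinical-outcome task heads; letting each head form its own attention distribution is what produces the distributed, diverse attentive effects noted in the visualization analysis.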
Chinese Abstract 1
Abstract 2
Acknowledgements 4
Contents 5
List of Figures 7
List of Tables 9
Culture Dimension 11
Introduction 12
1.1 Research Background 12
1.2 Related Work 15
1.3 Motivation 18
1.4 Contributions 22
1.5 Organization 24
Research Dataset 25
2.1 The Multimodal Social-Distance Dataset 25
Research Methodology 33
3.1 Acoustic Dynamic Prosodic Feature Extraction 33
3.2 Neural Connectivity Graph Embedding 36
3.3 Social Condition-Enhanced Network (SC-eN) 38
3.4 Multimodal Power Distance Classification 40
Experimental Setup and Results 41
4.1 Experiment I: Power Distance Recognition 41
Conclusion and Future Work 56
5.1 Conclusions 56
5.2 Future Work 58
Pain-Affect 60
Introduction 61
6.1 Research Background 61
6.2 Related Work 64
6.3 Motivation 66
6.4 Contributions 68
6.5 Organization 70
Research Database 71
7.1 The Multimodal Triage Pain-Level Database 71
Research Methodology 77
8.1 Attribute-Enhanced Multi-Head Attention Network 77
Experimental Setup and Results 82
9.1 Experiment I: Emotional States Analyses 82
9.2 Experiment II: Speech Emotion Recognition on Triage Pain Database 84
9.3 Experiment III: Multivariate Analysis: Attributes versus Outcomes 99
Conclusions and Future Work 109
10.1 Conclusions 109
10.2 Future Work 111
References 113


Related Theses

1. An Automated Scoring System for Couple Interaction Behavior Scales in Marital Therapy Based on Stacked Sparse Autoencoders Using Speech Features
2. Stroke Prediction Based on National Health Insurance Data Using Hadoop as a Fast Feature-Extraction Tool
3. A New Framework for Full-Time Emotion Recognition Built on Human Thin-Slice Emotion Perception
4. Building an Automatic Speech Scoring System for Principal Candidates Using Multi-Task and Multimodal Fusion Techniques
5. Analyzing Sample-Label Relationships with Multimodal Active Learning for an Automated Scoring System in Principal-Candidate Evaluation
6. Improving Speech Emotion Recognition by Integrating fMRI BOLD Signals
7. Improving Speech Emotion Recognition with Multi-Level Convolutional Neural Network Features from fMRI
8. Developing a Behavior-Measurement-Based Assessment System for Children with Autism Using an Embodied Conversational Interface
9. A Multimodal Continuous Emotion Recognition System and Its Application to Global Affect Recognition
10. Integrating Multi-Level Text Representations and Embedded Speech Attributes for Robust Automated Scoring of Principal-Candidate Speeches
11. Using Joint Factor Analysis to Study Temporal Effects in Brain MRI and Improve Emotion Recognition
12. An LSTM-Based Assessment System for Identifying Children with Autism from ADOS Interviews
13. Automatic Detection of Emergency Patients' Pain Levels with a Multimodal Model Combining CNN and LSTM Audio-Visual Features
14. Improving Automated Behavior Scoring in Marital Therapy with a Bidirectional LSTM Architecture Mixing Multi-Time-Granularity Text Modalities
15. Improving Emotion Recognition on a Chinese Theater Performance Database via Interaction Features from Performance Transcripts
 