[1] T. Driskell, J. E. Driskell, C. S. Burke, and E. Salas, “Team roles: A review and integration,” Small Group Research, vol. 48, no. 4, pp. 482–511, 2017.
[2] S.-L. Yeh, Y.-S. Lin, and C.-C. Lee, “An interaction-aware attention network for speech emotion recognition in spoken dialogs,” in 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6685–6689, IEEE, 2019.
[3] Y.-S. Lin and C.-C. Lee, “Deriving dyad-level interaction representation using interlocutors structural and expressive multimodal behavior features,” in INTERSPEECH, pp. 2366–2370, 2017.
[4] I. Naim, M. I. Tanveer, D. Gildea, and M. E. Hoque, “Automated prediction and analysis of job interview performance: The role of what you say and how you say it,” in 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), vol. 1, pp. 1–6, IEEE, 2015.
[5] T. Choudhury and S. Basu, “Modeling conversational dynamics as a mixed-memory Markov process,” Advances in Neural Information Processing Systems, vol. 17, 2004.
[6] U. Malik, J. Saunier, K. Funakoshi, and A. Pauchet, “Who speaks next? Turn change and next speaker prediction in multimodal multiparty interaction,” in 2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI), pp. 349–354, IEEE, 2020.
[7] D. Aneja, R. Hoegen, D. McDuff, and M. Czerwinski, “Understanding conversational and expressive style in a multimodal embodied conversational agent,” in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–10, 2021.
[8] R. Niewiadomski, S. J. Hyniewska, and C. Pelachaud, “Computational models of expressive behaviors for a virtual agent,” 2014.
[9] Y. Matsusaka, S. Fujie, and T. Kobayashi, “Modeling of conversational strategy for the robot participating in the group conversation,” in Seventh European Conference on Speech Communication and Technology, Citeseer, 2001.
[10] B.-H. Su, C.-M. Chang, Y.-S. Lin, and C.-C. Lee, “Improving speech emotion recognition using graph attentive bi-directional gated recurrent unit network,” in INTERSPEECH, pp. 506–510, 2020.
[11] C.-C. Lee, A. Katsamanis, M. P. Black, B. R. Baucom, A. Christensen, P. G. Georgiou, and S. S. Narayanan, “Computing vocal entrainment: A signal-derived PCA-based quantification scheme with application to affect analysis in married couple interactions,” Computer Speech & Language, vol. 28, no. 2, pp. 518–539, 2014.
[12] M. Nasir, B. Baucom, C. Bryan, S. Narayanan, and P. Georgiou, “Modeling vocal entrainment in conversational speech using deep unsupervised learning,” IEEE Transactions on Affective Computing, 2020.
[13] Z. Huang, W. Xu, and K. Yu, “Bidirectional LSTM-CRF models for sequence tagging,” arXiv preprint arXiv:1508.01991, 2015.
[14] G. Shang, A. J.-P. Tixier, M. Vazirgiannis, and J.-P. Lorré, “Speaker-change aware CRF for dialogue act classification,” arXiv preprint arXiv:2004.02913, 2020.
[15] S. Okada, Y. Ohtake, Y. I. Nakano, Y. Hayashi, H.-H. Huang, Y. Takase, and K. Nitta, “Estimating communication skills using dialogue acts and nonverbal features in multiple discussion datasets,” in Proceedings of the 18th ACM International Conference on Multimodal Interaction, pp. 169–176, 2016.
[16] R. Ishii, K. Otsuka, S. Kumano, R. Higashinaka, and J. Tomita, “Analyzing gaze behavior and dialogue act during turn-taking for estimating empathy skill level,” in Proceedings of the 20th ACM International Conference on Multimodal Interaction, pp. 31–39, 2018.
[17] R. Ishii, K. Otsuka, S. Kumano, R. Higashinaka, and J. Tomita, “Estimating interpersonal reactivity scores using gaze behavior and dialogue act during turn-changing,” in International Conference on Human-Computer Interaction, pp. 45–53, Springer, 2019.
[18] Y.-S. Lin, S. S.-F. Gau, and C.-C. Lee, “A multimodal interlocutor-modulated attentional BLSTM for classifying autism subgroups during clinical interviews,” IEEE Journal of Selected Topics in Signal Processing, vol. 14, no. 2, pp. 299–311, 2020.
[19] C. Antaki, R. Barnes, and I. Leudar, “Self-disclosure as a situated interactional practice,” British Journal of Social Psychology, vol. 44, no. 2, pp. 181–199, 2005.
[20] V. J. Derlega and J. H. Berg, Self-disclosure: Theory, Research, and Therapy. Plenum Press, 1987.
[21] P. C. Cozby, “Self-disclosure: A literature review,” Psychological Bulletin, vol. 79, no. 2, p. 73, 1973.
[22] S. M. Jourard and P. Lasakow, “Some factors in self-disclosure,” The Journal of Abnormal and Social Psychology, vol. 56, no. 1, p. 91, 1958.
[23] J.-P. Laurenceau, L. F. Barrett, and P. R. Pietromonaco, “Intimacy as an interpersonal process: The importance of self-disclosure, partner disclosure, and perceived partner responsiveness in interpersonal exchanges,” Journal of Personality and Social Psychology, vol. 74, no. 5, p. 1238, 1998.
[24] K. Dindia, M. Fitzpatrick, and D. Kenny, “Self-disclosure in spouse and stranger interaction: A social relations analysis,” Human Communication Research, vol. 23, pp. 388–412, 1997.
[25] N. L. Collins and L. C. Miller, “Self-disclosure and liking: A meta-analytic review,” Psychological Bulletin, vol. 116, no. 3, p. 457, 1994.
[26] J. G. Shapiro, H. H. Krauss, and C. B. Truax, “Therapeutic conditions and disclosure beyond the therapeutic encounter,” Journal of Counseling Psychology, vol. 16, no. 4, p. 290, 1969.
[27] B. A. Farber, “Patient self-disclosure: A review of the research,” Journal of Clinical Psychology, vol. 59, no. 5, pp. 589–600, 2003.
[28] A. E. Kelly, “Helping construct desirable identities: A self-presentational view of psychotherapy,” Psychological Bulletin, vol. 126, no. 4, p. 475, 2000.
[29] M. Worthy, A. L. Gary, and G. M. Kahn, “Self-disclosure as an exchange process,” Journal of Personality and Social Psychology, vol. 13, no. 1, p. 59, 1969.
[30] A. W. Gouldner, “The norm of reciprocity: A preliminary statement,” American Sociological Review, pp. 161–178, 1960.
[31] J. K. Burgoon, L. A. Stern, and L. Dillman, Interpersonal Adaptation: Dyadic Interaction Patterns. Cambridge University Press, 2007.
[32] S. L. Koole and W. Tschacher, “Synchrony in psychotherapy: A review and an integrative framework for the therapeutic alliance,” Frontiers in Psychology, vol. 7, p. 862, 2016.
[33] P. Mundy, M. Sigman, J. Ungerer, and T. Sherman, “Defining the social deficits of autism: The contribution of non-verbal communication measures,” Journal of Child Psychology and Psychiatry, vol. 27, no. 5, pp. 657–669, 1986.
[34] R. E. McEvoy, S. J. Rogers, and B. F. Pennington, “Executive function and social communication deficits in young autistic children,” Journal of Child Psychology and Psychiatry, vol. 34, no. 4, pp. 563–578, 1993.
[35] M. Turner, “Annotation: Repetitive behaviour in autism: A review of psychological research,” The Journal of Child Psychology and Psychiatry and Allied Disciplines, vol. 40, no. 6, pp. 839–849, 1999.
[36] K. A. Pelphrey, J. P. Morris, and G. McCarthy, “Neural basis of eye gaze processing deficits in autism,” Brain, vol. 128, no. 5, pp. 1038–1048, 2005.
[37] H. Tager-Flusberg, R. Paul, C. Lord, et al., “Language and communication in autism,” Handbook of Autism and Pervasive Developmental Disorders, vol. 1, pp. 335–364, 2005.
[38] C. Lord, M. Rutter, S. Goode, J. Heemsbergen, H. Jordan, L. Mawhood, and E. Schopler, “Autism diagnostic observation schedule: A standardized observation of communicative and social behavior,” Journal of Autism and Developmental Disorders, vol. 19, no. 2, pp. 185–212, 1989.
[39] E. K. Farran, A. Branson, and B. J. King, “Visual search for basic emotional expressions in autism; impaired processing of anger, fear and sadness, but a typical happy face advantage,” Research in Autism Spectrum Disorders, vol. 5, no. 1, pp. 455–462, 2011.
[40] L.-H. Quek, K. Sofronoff, J. Sheffield, A. White, and A. Kelly, “Co-occurring anger in young people with Asperger’s syndrome,” Journal of Clinical Psychology, vol. 68, no. 10, pp. 1142–1148, 2012.
[41] A. C. Samson, W. M. Wells, J. M. Phillips, A. Y. Hardan, and J. J. Gross, “Emotion regulation in autism spectrum disorder: Evidence from parent interviews and children’s daily diaries,” Journal of Child Psychology and Psychiatry, vol. 56, no. 8, pp. 903–913, 2015.
[42] B. P. Ho, J. Stephenson, and M. Carter, “Anger in children with autism spectrum disorder: Parent’s perspective,” International Journal of Special Education, vol. 27, no. 2, pp. 14–32, 2012.
[43] L. Capps, N. Yirmiya, and M. Sigman, “Understanding of simple and complex emotions in non-retarded children with autism,” Journal of Child Psychology and Psychiatry, vol. 33, no. 7, pp. 1169–1182, 1992.
[44] C. Rieffe, M. M. Terwogt, and K. Kotronopoulou, “Awareness of single and multiple emotions in high-functioning children with autism,” Journal of Autism and Developmental Disorders, vol. 37, no. 3, pp. 455–465, 2007.
[45] A. C. Laurent and E. Rubin, “Challenges in emotional regulation in Asperger syndrome and high-functioning autism,” Topics in Language Disorders, vol. 24, no. 4, pp. 286–297, 2004.
[46] C. Rieffe, P. Oosterveld, M. M. Terwogt, S. Mootz, E. Van Leeuwen, and L. Stockmann, “Emotion regulation and internalizing symptoms in children with autism spectrum disorders,” Autism, vol. 15, no. 6, pp. 655–670, 2011.
[47] A. C. Samson, A. Y. Hardan, I. A. Lee, J. M. Phillips, and J. J. Gross, “Maladaptive behavior in autism spectrum disorder: The role of emotion experience and emotion regulation,” Journal of Autism and Developmental Disorders, vol. 45, no. 11, pp. 3424–3432, 2015.
[48] D. Bone, C.-C. Lee, M. P. Black, M. E. Williams, S. Lee, P. Levitt, and S. Narayanan, “The psychologist as an interlocutor in autism spectrum disorder assessment: Insights from a study of spontaneous prosody,” Journal of Speech, Language, and Hearing Research, vol. 57, no. 4, pp. 1162–1177, 2014.
[49] D. Bone, S. Bishop, R. Gupta, S. Lee, and S. S. Narayanan, “Acoustic-prosodic and turn-taking features in interactions with children with neurodevelopmental disorders,” in INTERSPEECH, pp. 1185–1189, 2016.
[50] S. R. Leekam, S. J. Libby, L. Wing, J. Gould, and C. Taylor, “The Diagnostic Interview for Social and Communication Disorders: Algorithms for ICD-10 childhood autism and Wing and Gould autistic spectrum disorder,” Journal of Child Psychology and Psychiatry, vol. 43, no. 3, pp. 327–342, 2002.
[51] M. Mordre, B. Groholt, A. K. Knudsen, E. Sponheim, A. Mykletun, and A. M. Myhre, “Is long-term prognosis for pervasive developmental disorder not otherwise specified different from prognosis for autistic disorder? Findings from a 30-year follow-up study,” Journal of Autism and Developmental Disorders, vol. 42, no. 6, pp. 920–928, 2012.
[52] C. Lord, E. Petkova, V. Hus, W. Gan, F. Lu, D. M. Martin, O. Ousley, L. Guy, R. Bernier, J. Gerdts, et al., “A multisite study of the clinical diagnosis of different autism spectrum disorders,” Archives of General Psychiatry, vol. 69, no. 3, pp. 306–313, 2012.
[53] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders (DSM-5®). American Psychiatric Pub, 2013.
[54] U. Frith, “Emanuel Miller lecture: Confusions and controversies about Asperger syndrome,” Journal of Child Psychology and Psychiatry, vol. 45, no. 4, pp. 672–686, 2004.
[55] T. Bennett, P. Szatmari, S. Bryson, J. Volden, L. Zwaigenbaum, L. Vaccarella, E. Duku, and M. Boyle, “Differentiating autism and Asperger syndrome on the basis of language delay or impairment,” Journal of Autism and Developmental Disorders, vol. 38, no. 4, pp. 616–625, 2008.
[56] C. Ecker, W. Spooren, and D. Murphy, “Developing new pharmacotherapies for autism,” Journal of Internal Medicine, vol. 274, no. 4, pp. 308–320, 2013.
[57] S. Odom, K. Hume, B. Boyd, and A. Stabel, “Moving beyond the intensive behavior treatment versus eclectic dichotomy: Evidence-based and individualized programs for learners with ASD,” Behavior Modification, vol. 36, no. 3, pp. 270–297, 2012.
[58] L. Schreibman, “Intensive behavioral/psychoeducational treatments for autism: Research needs and future directions,” Journal of Autism and Developmental Disorders, vol. 30, no. 5, pp. 373–378, 2000.
[59] C.-P. Chen, S. S.-F. Gau, and C.-C. Lee, “Toward differential diagnosis of autism spectrum disorder using multimodal behavior descriptors and executive functions,” Computer Speech & Language, vol. 56, pp. 17–35, 2019.
[60] C.-P. Chen, X.-H. Tseng, S. S.-F. Gau, and C.-C. Lee, “Computing multimodal dyadic behaviors during spontaneous diagnosis interviews toward automatic categorization of autism spectrum disorder,” in Proc. Interspeech, 2017.
[61] Y.-S. Lin, S. S.-F. Gau, and C.-C. Lee, “An interlocutor-modulated attentional LSTM for differentiating between subgroups of autism spectrum disorder,” Proc. Interspeech 2018, pp. 2329–2333, 2018.
[62] C. Lord, M. Rutter, and A. Le Couteur, “Autism Diagnostic Interview-Revised: A revised version of a diagnostic interview for caregivers of individuals with possible pervasive developmental disorders,” Journal of Autism and Developmental Disorders, vol. 24, no. 5, pp. 659–685, 1994.
[63] M. P. Black, D. Bone, M. E. Williams, P. Gorrindo, P. Levitt, and S. Narayanan, “The USC CARE corpus: Child-psychologist interactions of children with autism spectrum disorders,” in Twelfth Annual Conference of the International Speech Communication Association, Citeseer, 2011.
[64] E. Billing, T. Belpaeme, H. Cai, H.-L. Cao, A. Ciocan, C. Costescu, D. David, R. Homewood, D. Hernandez Garcia, P. Gómez Esteban, et al., “The DREAM dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy,” PLoS ONE, vol. 15, no. 8, p. e0236939, 2020.
[65] T. Guha, Z. Yang, A. Ramakrishna, R. B. Grossman, D. Hedley, S. Lee, and S. S. Narayanan, “On quantifying facial expression-related atypicality of children with autism spectrum disorder,” in 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 803–807, IEEE, 2015.
[66] P. Boersma and D. Weenink, “Praat: A system for doing phonetics by computer [computer software],” The Netherlands: Institute of Phonetic Sciences, University of Amsterdam, 2003.
[67] L. Centelles, C. Assaiante, K. Etchegoyhen, M. Bouvard, and C. Schmitz, “Understanding social interaction in children with autism spectrum disorders: Does whole-body motion mean anything to them?,” L’Encéphale, vol. 38, no. 3, pp. 232–240, 2012.
[68] E. Milne, J. Swettenham, and R. Campbell, “Motion perception and autistic spectrum disorder: A review,” Current Psychology of Cognition, vol. 23, no. 1/2, p. 3, 2005.
[69] S. Tsermentseli, J. M. O’Brien, and J. V. Spencer, “Comparison of form and motion coherence processing in autistic spectrum disorders and dyslexia,” Journal of Autism and Developmental Disorders, vol. 38, no. 7, pp. 1201–1210, 2008.
[70] Z. Cao, T. Simon, S.-E. Wei, and Y. Sheikh, “Realtime multi-person 2D pose estimation using part affinity fields,” in CVPR, 2017.
[71] S.-E. Wei, V. Ramakrishna, T. Kanade, and Y. Sheikh, “Convolutional pose machines,” in CVPR, 2016.
[72] J. Sánchez, F. Perronnin, T. Mensink, and J. Verbeek, “Image classification with the Fisher vector: Theory and practice,” International Journal of Computer Vision, vol. 105, no. 3, pp. 222–245, 2013.
[73] H. Kaya, A. A. Karpov, and A. A. Salah, “Fisher vectors with cascaded normalization for paralinguistic analysis,” in Sixteenth Annual Conference of the International Speech Communication Association, 2015.
[74] S.-W. Hsiao, H.-C. Sun, M.-C. Hsieh, M.-H. Tsai, Y. Tsao, and C.-C. Lee, “Toward automating oral presentation scoring during principal certification program using audio-video low-level behavior profiles,” IEEE Transactions on Affective Computing, 2017.
[75] A. Graves and J. Schmidhuber, “Framewise phoneme classification with bidirectional LSTM and other neural network architectures,” Neural Networks, vol. 18, no. 5-6, pp. 602–610, 2005.
[76] S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
[77] D. Bahdanau, K. Cho, and Y. Bengio, “Neural machine translation by jointly learning to align and translate,” arXiv preprint arXiv:1409.0473, 2014.
[78] S. Sharma, R. Kiros, and R. Salakhutdinov, “Action recognition using visual attention,” arXiv preprint arXiv:1511.04119, 2015.
[79] S. Mirsamadi, E. Barsoum, and C. Zhang, “Automatic speech emotion recognition using recurrent neural networks with local attention,” in 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 2227–2231, IEEE, 2017.
[80] J. Gibson, D. Can, P. Georgiou, D. C. Atkins, and S. S. Narayanan, “Attention networks for modeling behaviors in addiction counseling,” in Proc. Interspeech, 2017.
[81] D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv preprint arXiv:1412.6980, 2014.
[82] A. Paszke, S. Gross, S. Chintala, G. Chanan, E. Yang, Z. DeVito, Z. Lin, A. Desmaison, L. Antiga, and A. Lerer, “Automatic differentiation in PyTorch,” in NIPS-W, 2017.
[83] D. B. Shalom, S. Mostofsky, R. Hazlett, M. Goldberg, R. Landa, Y. Faran, D. McLeod, and R. Hoehn-Saric, “Normal physiological emotions but differences in expression of conscious feelings in children with high-functioning autism,” Journal of Autism and Developmental Disorders, vol. 36, no. 3, pp. 395–400, 2006.
[84] A. C. Samson, O. Huber, and J. J. Gross, “Emotion regulation in Asperger’s syndrome and high-functioning autism,” Emotion, vol. 12, no. 4, p. 659, 2012.
[85] J. R. Hackman and C. G. Morris, “Group tasks, group interaction process, and group performance effectiveness: A review and proposed integration,” in Advances in Experimental Social Psychology, vol. 8, pp. 45–99, Elsevier, 1975.
[86] M. E. Gist, E. A. Locke, and M. S. Taylor, “Organizational behavior: Group structure, process, and effectiveness,” Journal of Management, vol. 13, no. 2, pp. 237–257, 1987.
[87] U. Avci and O. Aran, “Predicting the performance in decision-making tasks: From individual cues to group interaction,” IEEE Transactions on Multimedia, vol. 18, no. 4, pp. 643–658, 2016.
[88] U. Kubasova, G. Murray, and M. Braley, “Analyzing verbal and nonverbal features for predicting group performance,” arXiv preprint arXiv:1907.01369, 2019.
[89] G. Murray and C. Oertel, “Predicting group performance in task-based interaction,” in Proceedings of the 2018 International Conference on Multimodal Interaction, pp. 14–20, ACM, 2018.
[90] M. Nowak, J. Kim, N. W. Kim, and C. Nass, “Social visualization and negotiation: Effects of feedback configuration and status,” in Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, pp. 1081–1090, 2012.
[91] J. M. DiMicco, K. J. Hollenbach, A. Pandolfo, and W. Bender, “The impact of increased awareness while face-to-face,” Human-Computer Interaction, vol. 22, no. 1-2, pp. 47–96, 2007.
[92] Y. R. Tausczik and J. W. Pennebaker, “Improving teamwork using real-time language feedback,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 459–468, 2013.
[93] G. Leshed, Automated Language-Based Feedback for Teamwork Behaviors. Cornell University, 2009.
[94] T. B. Roby and J. T. Lanzetta, “Work group structure, communication, and group performance,” Sociometry, vol. 19, no. 2, pp. 105–113, 1956.
[95] B. A. Olaniran, “Group performance in computer-mediated and face-to-face communication media,” Management Communication Quarterly, vol. 7, no. 3, pp. 256–281, 1994.
[96] S. R. Hiltz, K. Johnson, and M. Turoff, “Experiments in group decision making: Communication process and outcome in face-to-face versus computerized conferences,” Human Communication Research, vol. 13, no. 2, pp. 225–252, 1986.
[97] R. F. Bales, “Interaction process analysis; a method for the study of small groups,” 1950.
[98] H. Bunt, J. Alexandersson, J.-W. Choe, A. C. Fang, K. Hasida, V. Petukhova, A. Popescu-Belis, and D. R. Traum, “ISO 24617-2: A semantically-based standard for dialogue annotation,” in LREC, pp. 430–437, 2012.
[99] C. A. Gorse and S. Emmitt, “Communication behaviour during management and design team meetings: A comparison of group interaction,” Construction Management and Economics, vol. 25, no. 11, pp. 1197–1213, 2007.
[100] C. A. Gorse, S. Emmitt, M. Lowis, and A. Howarth, “Project performance and management and design team communication,” in Proceedings of the Association of Researchers in Construction Management, 17th Annual Conference, pp. 5–7, 2001.
[101] J. R. Hackman and N. Katz, “Group behavior and performance,” 2010.
[102] S. A. Wheelan, The Handbook of Group Research and Practice. Sage, 2005.
[103] H. C. Foushee and K. L. Manos, “Information transfer within the cockpit: Problems in intracockpit communications,” C. E. Billings (Ames Research Center) and E. S. Cheaney (Battelle’s Columbus Division), Mountain View, California, p. 63, 1981.
[104] H. C. Foushee, “Dyads and triads at 35,000 feet: Factors affecting group process and aircrew performance,” American Psychologist, vol. 39, no. 8, p. 885, 1984.
[105] C. A. Gorse and S. Emmitt, “Informal interaction in construction progress meetings,” Construction Management and Economics, vol. 27, no. 10, pp. 983–993, 2009.
[106] S. Li, S. Okada, and J. Dang, “Interaction process label recognition in group discussion,” in 2019 International Conference on Multimodal Interaction, pp. 426–434, 2019.
[107] Y.-S. Lin and C.-C. Lee, “Predicting performance outcome with a conversational graph convolutional network for small group interactions,” in 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 8044–8048, IEEE, 2020.
[108] U. Kubasova and G. Murray, “Group performance prediction with limited context,” in Companion Publication of the 2020 International Conference on Multimodal Interaction, pp. 191–195, 2020.
[109] S.-C. Zhong, Y.-S. Lin, C.-M. Chang, Y.-C. Liu, and C.-C. Lee, “Predicting group performances using a personality composite-network architecture during collaborative task,” Proc. Interspeech 2019, pp. 1676–1680, 2019.
[110] I. McCowan, J. Carletta, W. Kraaij, S. Ashby, S. Bourban, M. Flynn, M. Guillemot, T. Hain, J. Kadlec, V. Karaiskos, et al., “The AMI meeting corpus,” in Proceedings of the 5th International Conference on Methods and Techniques in Behavioral Research, vol. 88, p. 100, Citeseer, 2005.
[111] F. Pianesi, M. Zancanaro, B. Lepri, and A. Cappelletti, “A multimodal annotated corpus of consensus decision making meetings,” Language Resources and Evaluation, vol. 41, no. 3-4, pp. 409–429, 2007.
[112] D. Sanchez-Cortes, O. Aran, M. S. Mast, and D. Gatica-Perez, “A nonverbal behavior approach to identify emergent leaders in small groups,” IEEE Transactions on Multimedia, vol. 14, no. 3, pp. 816–832, 2012.
[113] M. Braley and G. Murray, “The Group Affect and Performance (GAP) corpus,” in Proceedings of the ICMI 2018 Workshop on Group Interaction Frontiers in Technology (GIFT), 2018.
[114] I. Bhattacharya, M. Foley, C. Ku, N. Zhang, T. Zhang, C. Mine, M. Li, H. Ji, C. Riedl, B. F. Welles, et al., “The Unobtrusive Group Interaction (UGI) corpus,” in Proceedings of the 10th ACM Multimedia Systems Conference, pp. 249–254, ACM, 2019.
[115] J. E. McGrath, Groups: Interaction and Performance, vol. 14. Prentice-Hall, Englewood Cliffs, NJ, 1984.
[116] P. C. Bottger and P. W. Yetton, “An integration of process and decision scheme explanations of group problem solving performance,” Organizational Behavior and Human Decision Processes, vol. 42, no. 2, pp. 234–249, 1988.
[117] A. Wiedow and U. Konradt, “Two-dimensional structure of team process improvement: Team reflection and team adaptation,” Small Group Research, vol. 42, no. 1, pp. 32–54, 2011.
[118] A. Somech, “The effects of leadership style and team process on performance and innovation in functionally heterogeneous teams,” Journal of Management, vol. 32, no. 1, pp. 132–157, 2006.
[119] M. A. West, “Reflexivity, revolution and innovation in work teams,” in Product Development Teams, pp. 1–29, JAI Press, 2000.
[120] N. Clarke, “Emotional intelligence and learning in teams,” Journal of Workplace Learning, 2010.
[121] J. Ohlsson, “Team learning: Collective reflection processes in teacher teams,” Journal of Workplace Learning, 2013.
[122] A. W. Woolley, C. F. Chabris, A. Pentland, N. Hashmi, and T. W. Malone, “Evidence for a collective intelligence factor in the performance of human groups,” Science, vol. 330, no. 6004, pp. 686–688, 2010.
[123] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, “BERT: Pre-training of deep bidirectional transformers for language understanding,” arXiv preprint arXiv:1810.04805, 2018.
[124] V. Bazarevsky, Y. Kartynnik, A. Vakunov, K. Raveendran, and M. Grundmann, “BlazeFace: Sub-millisecond neural face detection on mobile GPUs,” arXiv preprint arXiv:1907.05047, 2019.
[125] Y. Kartynnik, A. Ablavatski, I. Grishchenko, and M. Grundmann, “Real-time facial surface geometry from monocular video on mobile GPUs,” arXiv preprint arXiv:1907.06724, 2019.
[126] V. Bazarevsky, I. Grishchenko, K. Raveendran, T. Zhu, F. Zhang, and M. Grundmann, “BlazePose: On-device real-time body pose tracking,” arXiv preprint arXiv:2006.10204, 2020.
[127] L. Le, A. Patterson, and M. White, “Supervised autoencoders: Improving generalization performance with unsupervised regularizers,” in Advances in Neural Information Processing Systems (S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, eds.), vol. 31, Curran Associates, Inc., 2018.
[128] S. K. Subburaj, A. E. Stewart, A. Ramesh Rao, and S. K. D’Mello, “Multimodal, multiparty modeling of collaborative problem solving performance,” in Proceedings of the 2020 International Conference on Multimodal Interaction, pp. 423–432, 2020.
[129] M. J. Martin and P. W. Foltz, “Automated team discourse annotation and performance prediction using LSA,” tech. rep., New Mexico State University, Las Cruces, 2004.
[130] A. L. Gonzales, J. T. Hancock, and J. W. Pennebaker, “Language style matching as a predictor of social dynamics in small groups,” Communication Research, vol. 37, no. 1, pp. 3–19, 2010.
[131] D. Reitter and J. D. Moore, “Alignment and task success in spoken dialogue,” Journal of Memory and Language, vol. 76, pp. 29–46, 2014.
[132] U. Fischer, L. McDonnell, and J. Orasanu, “Linguistic correlates of team performance: Toward a tool for monitoring team functioning during space missions,” Aviation, Space, and Environmental Medicine, vol. 78, no. 5, pp. B86–B95, 2007.
[133] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin, “Attention is all you need,” in Advances in Neural Information Processing Systems, pp. 5998–6008, 2017.
[134] Y.-S. Lin and C.-C. Lee, “Using interlocutor-modulated attention BLSTM to predict personality traits in small group interaction,” in Proceedings of the 2018 International Conference on Multimodal Interaction, pp. 163–169, ACM, 2018.
[135] F. Tschan, “Ideal cycles of communication (or cognitions) in triads, dyads, and individuals,” Small Group Research, vol. 33, no. 6, pp. 615–643, 2002.
[136] F. Tschan, “Communication enhances small group performance if it conforms to task requirements: The concept of ideal communication cycles,” Basic and Applied Social Psychology, vol. 17, no. 3, pp. 371–393, 1995.