[1] R. B. Zajonc, "Feeling and thinking: Preferences need no inferences," American Psychologist, vol. 35, no. 2, p. 151, 1980.
[2] B. De Martino, D. Kumaran, B. Seymour, and R. J. Dolan, "Frames, biases, and rational decision-making in the human brain," Science, vol. 313, no. 5787, pp. 684–687, 2006.
[3] J. S. Lerner and D. Keltner, "Beyond valence: Toward a model of emotion-specific influences on judgement and choice," Cognition & Emotion, vol. 14, no. 4, pp. 473–493, 2000.
[4] I. Blanchette and A. Richards, "The influence of affect on higher level cognition: A review of research on interpretation, judgement, decision making and reasoning," Cognition & Emotion, vol. 24, no. 4, pp. 561–595, 2010.
[5] G. Margolin, P. H. Oliver, E. B. Gordis, H. G. O'Hearn, A. M. Medina, C. M. Ghosh, and L. Morland, "The nuts and bolts of behavioral observation of marital and family interaction," Clinical Child and Family Psychology Review, vol. 1, no. 4, pp. 195–213, 1998.
[6] R. E. Heyman, "Observation of couple conflicts: Clinical assessment applications, stubborn truths, and shaky foundations," Psychological Assessment, vol. 13, no. 1, p. 5, 2001.
[7] J. C. Norcross and B. E. Wampold, "Evidence-based therapy relationships: Research conclusions and clinical practices," Psychotherapy, vol. 48, no. 1, p. 98, 2011.
[8] L. Sanders, C. Trinh, B. Sherman, and S. Banks, "Assessment of client satisfaction in a peer counseling substance abuse treatment program for pregnant and postpartum women," Evaluation and Program Planning, vol. 21, no. 3, pp. 287–296, 1998.
[9] C. Lord, S. Risi, L. Lambrecht, E. H. Cook Jr., B. L. Leventhal, P. C. DiLavore, A. Pickles, and M. Rutter, "The Autism Diagnostic Observation Schedule–Generic: A standard measure of social and communication deficits associated with the spectrum of autism," Journal of Autism and Developmental Disorders, vol. 30, no. 3, pp. 205–223, 2000.
[10] P. Borkenau, N. Mauer, R. Riemann, F. M. Spinath, and A. Angleitner, "Thin slices of behavior as cues of personality and intelligence," Journal of Personality and Social Psychology, vol. 86, no. 4, p. 599, 2004.
[11] N. Ambady and H. M. Gray, "On being sad and mistaken: Mood effects on the accuracy of thin-slice judgments," Journal of Personality and Social Psychology, vol. 83, no. 4, p. 947, 2002.
[12] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: A meta-analysis," Psychological Bulletin, vol. 111, no. 2, p. 256, 1992.
[13] N. Ambady and R. Rosenthal, "Half a minute: Predicting teacher evaluations from thin slices of nonverbal behavior and physical attractiveness," Journal of Personality and Social Psychology, vol. 64, no. 3, p. 431, 1993.
[14] K. A. Fowler, S. O. Lilienfeld, and C. J. Patrick, "Detecting psychopathy from thin slices of behavior," Psychological Assessment, vol. 21, no. 1, p. 68, 2009.
[15] L. P. Naumann, S. Vazire, P. J. Rentfrow, and S. D. Gosling, "Personality judgments based on physical appearance," Personality and Social Psychology Bulletin, 2009.
[16] T. F. Oltmanns, J. N. Friedman, E. R. Fiedler, and E. Turkheimer, "Perceptions of people with personality disorders based on thin slices of behavior," Journal of Research in Personality, vol. 38, no. 3, pp. 216–229, 2004.
[17] C. Oveis, J. Gruber, D. Keltner, J. L. Stamper, and W. T. Boyce, "Smile intensity and warm touch as thin slices of child and family affective style," Emotion, vol. 9, no. 4, p. 544, 2009.
[18] S. Narayanan and P. G. Georgiou, "Behavioral signal processing: Deriving human behavioral informatics from speech and language," Proceedings of the IEEE, vol. 101, no. 5, pp. 1203–1233, 2013.
[19] M. P. Black, A. Katsamanis, B. R. Baucom, C.-C. Lee, A. C. Lammert, A. Christensen, P. G. Georgiou, and S. S. Narayanan, "Toward automating a human behavioral coding system for married couples' interactions using speech acoustic features," Speech Communication, vol. 55, no. 1, pp. 1–21, 2013.
[20] P. G. Georgiou, M. P. Black, A. C. Lammert, B. R. Baucom, and S. S. Narayanan, "'That's aggravating, very aggravating': Is it possible to classify behaviors in couple interactions using automatically derived lexical features?" in Affective Computing and Intelligent Interaction. Springer, 2011, pp. 87–96.
[21] S.-W. Hsiao, H.-C. Sun, M.-C. Hsieh, M.-H. Tsai, H.-C. Lin, and C.-C. Lee, "A multimodal approach for automatic assessment of school principals' oral presentation during pre-service training program," in Interspeech, 2015, pp. 2529–2533.
[22] W.-Y. Huang, S.-W. Hsiao, H.-C. Sun, M.-C. Hsieh, M.-H. Tsai, and C.-C. Lee, "Enhancement of automatic oral presentation assessment system using latent n-grams word representation and part-of-speech information," in Interspeech, 2016, in press.
[23] Z. Yang, A. Metallinou, E. Erzin, and S. Narayanan, "Analysis of interaction attitudes using data-driven hand gesture phrases," in 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014, pp. 699–703.
[24] A. Metallinou, Z. Yang, C.-C. Lee, C. Busso, S. Carnicke, and S. Narayanan, "The USC CreativeIT database of multimodal dyadic interactions: From speech and full body motion capture to continuous emotional annotations," Language Resources and Evaluation, pp. 1–25, 2015.
[25] S. M. Carnicke, "The Knebel technique: Active analysis in practice," Actor Training, pp. 99–116, 2010.
[26] R. Cowie, E. Douglas-Cowie, S. Savvidou, E. McMahon, M. Sawey, and M. Schröder, "'FEELTRACE': An instrument for recording perceived emotion in real time," in ISCA Tutorial and Research Workshop (ITRW) on Speech and Emotion, 2000.
[27] P. Boersma et al., "Praat, a system for doing phonetics by computer," Glot International, vol. 5, no. 9/10, pp. 341–345, 2002.
[28] D. Ververidis and C. Kotropoulos, "Emotional speech recognition: Resources, features, and methods," Speech Communication, vol. 48, pp. 1162–1181, 2006.
[29] B. McFee et al., "librosa: Audio and music signal analysis in Python," in Proceedings of the 14th Python in Science Conference, 2015.
[30] A. Metallinou, A. Katsamanis, and S. Narayanan, "Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech information," Image and Vision Computing, vol. 31, no. 2, pp. 137–152, 2013.
[31] Y. Bengio, H. Schwenk, J.-S. Senécal, F. Morin, and J.-L. Gauvain, "Neural probabilistic language models," in Innovations in Machine Learning. Springer, 2006, pp. 137–186.
[32] T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, and J. Dean, "Distributed representations of words and phrases and their compositionality," in Advances in Neural Information Processing Systems, 2013, pp. 3111–3119.
[33] T. Mikolov, K. Chen, G. Corrado, and J. Dean, "Efficient estimation of word representations in vector space," arXiv preprint arXiv:1301.3781, 2013.
[34] F. Morin and Y. Bengio, "Hierarchical probabilistic neural network language model," in AISTATS, vol. 5. Citeseer, 2005, pp. 246–252.
[35] T. Mikolov, Q. V. Le, and I. Sutskever, "Exploiting similarities among languages for machine translation," arXiv preprint arXiv:1309.4168, 2013.
[36] J. Zhu et al., "Active learning with sampling by uncertainty and density for word sense disambiguation and text classification," in Proceedings of the 22nd International Conference on Computational Linguistics, vol. 1. Association for Computational Linguistics, 2008.
[37] F. Perronnin and C. Dance, "Fisher kernels on visual vocabularies for image categorization," in 2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2007, pp. 1–8.
[38] X. Peng, C. Zou, Y. Qiao, and Q. Peng, "Action recognition with stacked Fisher vectors," in Computer Vision–ECCV 2014. Springer, 2014, pp. 581–595.
[39] C. Sun and R. Nevatia, "Large-scale web video event classification by use of Fisher vectors," in 2013 IEEE Workshop on Applications of Computer Vision (WACV). IEEE, 2013, pp. 15–22.
[40] X. Peng, L. Wang, X. Wang, and Y. Qiao, "Bag of visual words and fusion methods for action recognition: Comprehensive study and good practice," arXiv preprint arXiv:1405.4506, 2014.
[41] F. Perronnin, J. Sánchez, and T. Mensink, "Improving the Fisher kernel for large-scale image classification," in Computer Vision–ECCV 2010. Springer, 2010, pp. 143–156.
[42] W.-C. Lin and C.-C. Lee, "A thin-slice perception of emotion? An information theoretic-based framework to identify locally emotion-rich behavior segments for global affect recognition," in 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016, pp. 5790–5794.