[1] Mindlab. 2014. Does playing music at work increase productivity? Retrieved April 8, 2015, from http://themindlab.co.uk/
[2] Gray, E. 2013. Music: a therapy for all? Perspectives in Public Health, 133(1), 14.
[3] Satoh, M., Ogawa, J. I., Tokita, T., Nakaguchi, N., Nakao, K., Kida, H., & Tomimoto, H. 2014. The effects of physical exercise with music on cognitive function of elderly people: Mihama-Kiho project. PLoS ONE, 9(4), e95230.
[4] Kenny, D. 2004. Treatment approaches for music performance anxiety: What works? Music Forum.
[5] Juslin, P. N., & Laukka, P. 2004. Expression, perception, and induction of musical emotions: A review and a questionnaire study of everyday listening. Journal of New Music Research, 33(3), 217-238.
[6] Feng, Y., Zhuang, Y., & Pan, Y. 2003. Popular music retrieval by detecting mood. In Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 375-376). ACM.
[7] Lu, L., Liu, D., & Zhang, H. J. 2006. Automatic mood detection and tracking of music audio signals. IEEE Transactions on Audio, Speech, and Language Processing, 14(1), 5-18.
[8] Yang, Y. H., Lin, Y. C., Su, Y. F., & Chen, H. H. 2008. A regression approach to music emotion recognition. IEEE Transactions on Audio, Speech, and Language Processing, 16(2), 448-457.
[9] Yang, Y. H., & Chen, H. H. 2012. Machine recognition of music emotion: A review. ACM Transactions on Intelligent Systems and Technology (TIST), 3(3), 40.
[10] Han, B. J., Rho, S., Jun, S., & Hwang, E. 2010. Music emotion classification and context-based music recommendation. Multimedia Tools and Applications, 47(3), 433-460.
[11] Russell, J. A. 1980. A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161.
[12] Watson, D., Clark, L. A., & Tellegen, A. 1988. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology, 54(6), 1063.
[13] Levenson, R. W. 2003. Blood, sweat, and fears. Annals of the New York Academy of Sciences, 1000(1), 348-366.
[14] Eysenck, H. J., & Eysenck, M. W. 1985. Personality and Individual Differences: A Natural Science Approach. New York: Plenum.
[15] Duda, R. O., Hart, P. E., & Stork, D. G. 2012. Pattern Classification. John Wiley & Sons.
[16] Fu, Z., Lu, G., Ting, K. M., & Zhang, D. 2011. A survey of audio-based music classification and annotation. IEEE Transactions on Multimedia, 13(2), 303-319.
[17] Serra, J., Gómez, E., Herrera, P., & Serra, X. 2008. Chroma binary similarity and local alignment applied to cover song identification. IEEE Transactions on Audio, Speech, and Language Processing, 16(6), 1138-1151.
[18] Yang, Y. H., Lin, Y. C., Su, Y. F., & Chen, H. H. 2008. A regression approach to music emotion recognition. IEEE Transactions on Audio, Speech, and Language Processing, 16(2), 448-457.
[19] Shen, J., Shepherd, J., Cui, B., & Tan, K. L. 2009. A novel framework for efficient automated singer identification in large music databases. ACM Transactions on Information Systems (TOIS), 27(3), 18.
[20] Watson, K. B. 1942. The nature and measurement of musical meanings. Psychological Monographs: General and Applied, 54(2), i-43.
[21] Fairbanks, G., & Pronovost, W. 1939. An experimental study of the pitch characteristics of the voice during the expression of emotion. Communication Monographs, 6(1), 87-104.
[22] Fairbanks, G., & Hoaglin, L. W. 1941. An experimental study of the durational characteristics of the voice during the expression of emotion. Communication Monographs, 8(1), 85-90.
[23] Gabrielsson, A., & Lindström, E. 2001. The influence of musical structure on emotional expression. In Juslin, P. N., & Sloboda, J. A. (Eds.), Music and Emotion: Theory and Research (pp. 223-248). Oxford: Oxford University Press.
[24] Schellenberg, E. G., Krysciak, A. M., & Campbell, R. J. 2000. Perceiving emotion in melody: Interactive effects of pitch and rhythm. Music Perception, 155-171.
[25] Dalla Bella, S., Peretz, I., Rousseau, L., & Gosselin, N. 2001. A developmental study of the affective value of tempo and mode in music. Cognition, 80(3), B1-B10.
[26] Khalfa, S., Schön, D., Anton, J. L., & Liégeois-Chauvel, C. 2005. Brain regions involved in the recognition of sadness and happiness in music. Neuroreport, 16(18), 1981-1984.
[27] Davitz, J. R. 1964. The Communication of Emotional Meaning. New York: McGraw-Hill.
[28] Fonagy, I., & Magdics, K. 1963. Emotional patterns in intonation and music. Zeitschrift für Phonetik, 16(1-3), 293-326.
[29] Michalski, R. S., Carbonell, J. G., & Mitchell, T. M. (Eds.). 2013. Machine Learning: An Artificial Intelligence Approach. Springer Science & Business Media.
[30] Kohavi, R., & Provost, F. 1998. Glossary of terms. Machine Learning, 30(2-3), 271-274.
[31] Russell, S., & Norvig, P. 1995. Artificial Intelligence: A Modern Approach. Prentice Hall.
[32] Weiss, S. M., & Kulikowski, C. A. 1991. Computer Systems That Learn: Classification and Prediction Methods from Statistics, Neural Networks, Machine Learning, and Expert Systems. San Francisco, CA: Morgan Kaufmann.
[33] Hand, D. J. 1981. Discrimination and Classification. Chichester, U.K.: Wiley.
[34] Han, J., & Kamber, M. 2001. Data Mining: Concepts and Techniques. San Francisco, CA: Morgan Kaufmann.
[35] Malik, M. 2008. Standard measurement of heart rate variability. Dynamic Electrocardiography, 13-21.
[36] Medicore. SA-3000P Clinical Manual ver. 3.0. Retrieved June 8, 2015, from http://medi-core.com/download/HRV_clinical_manual_ver3.0.pdf
[37] Mio Global. Retrieved June 8, 2015, from https://www.mioglobal.com/Default.aspx
[38] Samsung. Retrieved June 28, 2015, from http://www.samsung.com/global/microsite/gear/gearlive_design.html
[39] Brooke, J. 1996. SUS: A "quick and dirty" usability scale. In Usability Evaluation in Industry (pp. 189-194). London: Taylor & Francis.
[40] Newman, K. 2001. Interrogating SERVQUAL: A critical assessment of service quality measurement in a high street retail bank. International Journal of Bank Marketing, 19(3), 126-139.
[41] Hart, S. G., & Staveland, L. E. 1988. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology, 52, 139-183.
[42] Hsu, Y.-W., Chiu, M.-C., & Hwang, S.-L. 2014. Investigating the relationship between therapeutic music and emotion: A pilot study on healthcare service. In Proceedings of the 21st ISPE Inc. International Conference on Concurrent Engineering, 688-697.
[43] Nunnally, J. C. 1978. Psychometric Theory (2nd ed.). New York: McGraw-Hill.
[44] Fayers, P. M., & Machin, D. 2007. Scores and measurements: Validity, reliability, sensitivity. In Quality of Life: The Assessment, Analysis and Interpretation of Patient-Reported Outcomes (2nd ed.), 77-108.
[45] Bangor, A., Kortum, P., & Miller, J. 2009. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3), 114-123.
[46] Parasuraman, A., Zeithaml, V. A., & Berry, L. L. 1985. A conceptual model of service quality and its implications for future research. Journal of Marketing, 49(4), 41-50.
[47] Bailey, J. E., & Pearson, S. W. 1983. Development of a tool for measuring and analyzing computer user satisfaction. Management Science, 29(5), 530-545.
[48] Ives, B., Olson, M. H., & Baroudi, J. J. 1983. The measurement of user information satisfaction. Communications of the ACM, 26(10), 785-793.
[49] McClure, E. B., Pope, K., Hoberman, A. J., Pine, D. S., & Leibenluft, E. 2003. Facial expression recognition in adolescents with mood and anxiety disorders. American Journal of Psychiatry, 160(6), 1172-1174.