Abdul-Kader, S. A. (2015). Survey on chatbot design techniques in speech conversation systems. International Journal of Advanced Computer Science and Applications, 6(7), 72–80.
Araujo, T. (2018). Living up to the chatbot hype: The influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Computers in Human Behavior, 85, 183–189. https://doi.org/10.1016/j.chb.2018.03.051
Broadbent, E. (2017). Interactions with robots: The truths we reveal about ourselves. Annual Review of Psychology, 68, 627–652. https://doi.org/10.1146/annurev-psych-010416-043958
Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2018). In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems. https://doi.org/10.1016/j.future.2018.01.055
D'Alfonso, S., Santesteban-Echarri, O., Rice, S., Wadley, G., Lederman, R., Miles, C., … Alvarez-Jimenez, M. (2017). Artificial intelligence-assisted online social therapy for youth mental health. Frontiers in Psychology, 8, 796. https://doi.org/10.3389/fpsyg.2017.00796
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008
Epley, N., Keysar, B., Van Boven, L., & Gilovich, T. (2004). Perspective taking as egocentric anchoring and adjustment. Journal of Personality and Social Psychology, 87(3), 327–339. https://doi.org/10.1037/0022-3514.87.3.327
Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104. https://doi.org/10.1145/2818717
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785
Fornell, C., & Larcker, D. F. (1981). Structural equation models with unobservable variables and measurement error: Algebra and statistics. Journal of Marketing Research, 18(3), 382–388.
Gilbert, D. T., Gill, M. J., & Wilson, T. D. (2002). The future is now: Temporal correction in affective forecasting. Organizational Behavior and Human Decision Processes, 88(1), 430–444. https://doi.org/10.1006/obhd.2001.2982
Go, E., & Sundar, S. S. (2019). Humanizing chatbots: The effects of visual, identity and conversational cues on humanness perceptions. Computers in Human Behavior, 97, 304–316. https://doi.org/10.1016/j.chb.2019.01.020
Heerink, M., Kröse, B., Evers, V., & Wielinga, B. (2010). Assessing acceptance of assistive social agent technology by older adults: The Almere model. International Journal of Social Robotics, 2(4), 361–375. https://doi.org/10.1007/s12369-010-0068-5
Io, H. N., & Lee, C. B. (2018). Chatbots and conversational agents: A bibliometric analysis. 2017 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), 215–219. https://doi.org/10.1109/IEEM.2017.8289883
Kerly, A., Ellis, R., & Bull, S. (2008). CALMsystem: A conversational agent for learner modelling. Knowledge-Based Systems, 21(3), 238–246. https://doi.org/10.1016/j.knosys.2007.11.015
Kerly, A., Hall, P., & Bull, S. (2007). Bringing chatbots into education: Towards natural language negotiation of open learner models. Knowledge-Based Systems, 20(2), 177–185. https://doi.org/10.1016/j.knosys.2006.11.014
Kruger, J. (1999). Lake Wobegon be gone! The "below-average effect" and the egocentric nature of comparative ability judgments. Journal of Personality and Social Psychology, 77(2), 221–232. https://doi.org/10.1037/0022-3514.77.2.221
Lester, J., Branting, K., & Mott, B. (2004). Conversational agents. In The Practical Handbook of Internet Computing, 220–240. https://doi.org/10.1201/9780203507223
Li, X., Chen, Y.-N., Li, L., Gao, J., & Celikyilmaz, A. (2017). End-to-end task-completion neural dialogue systems. Retrieved from http://arxiv.org/abs/1703.01008
MacInnis, D. J., & Folkes, V. S. (2017). Humanizing brands: When brands seem to be like me, part of me, and in a relationship with me. Journal of Consumer Psychology, 27(3), 355–374. https://doi.org/10.1016/j.jcps.2016.12.003
Maslow, A. H. (1943). A theory of human motivation. Psychological Review, 50(4), 370–396. https://doi.org/10.1037/h0054346
Mathies, C., Chiew, T. M., & Kleinaltenkamp, M. (2016). The antecedents and consequences of humour for service: A review and directions for research. Journal of Service Theory and Practice. https://doi.org/10.1108/JSTP-09-2014-0187
Molina, A., Martín-Consuegra, D., & Esteban, Á. (2007). Relational benefits and customer satisfaction in retail banking. International Journal of Bank Marketing, 25(4), 253–271. https://doi.org/10.1108/02652320710754033
Mou, Y., & Xu, K. (2017). The media inequality: Comparing the initial human–human and human–AI social interactions. Computers in Human Behavior, 72, 432–440. https://doi.org/10.1016/j.chb.2017.02.067
Murray, K. B. (1991). A test of services marketing theory: Consumer information acquisition activities. Journal of Marketing, 55(1), 10–25. https://doi.org/10.2307/1252200
Nomura, T., & Kanda, T. (2015). Rapport-expectation with a robot scale. International Journal of Social Robotics. https://doi.org/10.1007/s12369-015-0293-z
Nowak, K. L. (2003). The effect of the agency and anthropomorphism on users' sense of telepresence, copresence, and social presence. Presence: Teleoperators and Virtual Environments, 12(5), 481–494.
Rautio, P. (2011). Writing about everyday beauty: Anthropomorphizing and distancing as literary practices. Environmental Communication, 5(1), 104–123. https://doi.org/10.1080/17524032.2010.540251
Seyama, J., & Nagayama, R. S. (2007). The uncanny valley: Effect of realism on the impression of artificial human faces. Presence: Teleoperators and Virtual Environments, 16(4), 337–351. https://doi.org/10.1162/pres.16.4.337
Shenhav, A., Rand, D. G., & Greene, J. D. (2012). Divine intuition: Cognitive style influences belief in God. Journal of Experimental Psychology: General, 141(3), 423–428. https://doi.org/10.1037/a0025391
Shum, H.-Y., He, X., & Li, D. (2018). From Eliza to XiaoIce: Challenges and opportunities with social chatbots. Frontiers of Information Technology & Electronic Engineering, 19(1), 10–26. Retrieved from http://arxiv.org/abs/1801.01957
Shum, H., He, X., & Li, D. (2016). From Eliza to XiaoIce: Challenges and opportunities with social chatbots.
Stock, R. M., & Merkle, M. (2017). A service robot acceptance model: User acceptance of humanoid robots during service encounters.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460.
Wang, Y. S., Lin, H. H., & Luarn, P. (2006). Predicting consumer intention to use mobile service. Information Systems Journal, 16(2), 157–179. https://doi.org/10.1111/j.1365-2575.2006.00213.x
Weizenbaum, J. (1983). ELIZA—A computer program for the study of natural language communication between man and machine. Communications of the ACM, 26(1), 23–28. https://doi.org/10.1145/357980.357991
Wiese, E., Metta, G., & Wykowska, A. (2017). Robots as intentional agents: Using neuroscientific methods to make robots appear more social. Frontiers in Psychology, 8, 1663. https://doi.org/10.3389/fpsyg.2017.01663
Wirtz, J., Patterson, P. G., Kunz, W. H., Gruber, T., Lu, V. N., Paluch, S., & Martins, A. (2018). Brave new world: Service robots in the frontline. Journal of Service Management, 29(5), 907–931. https://doi.org/10.1108/JOSM-04-2018-0119