
Detailed Record

Author (Chinese): 王俊人
Author (English): Wang, Chun-Jen
Title (Chinese): 聊天機器人人形化與幽默對使用者經驗之影響
Title (English): The effect of chatbot humanoid image and humor on user experience
Advisor (Chinese): 王貞雅
Advisor (English): Wang, Chen-Ya
Committee members (Chinese): 王俊程, 許裴舫
Committee members (English): Wang, Jyun-Cheng; Hsu, Pei-Fang
Degree: Master's
University: National Tsing Hua University
Department: Institute of Service Science
Student ID: 106078513
Year of publication (ROC calendar): 108 (2019)
Academic year of graduation: 107
Language: Chinese
Number of pages: 53
Keywords (Chinese): 聊天機器人, 幽默, 擬人論
Keywords (English): chatbot, humor, anthropomorphism
Abstract (Chinese): Chatbot applications now span many industries, and business, psychology, and technology-related research on chatbots has grown rapidly in recent years. This study examines chatbots' humanoid appearance and humor, asking whether high versus low levels of anthropomorphism and humor affect user experience. A 2 (level of humanoid appearance) x 2 (level of humor) x 2 (chatbot type) scenario-based experiment was conducted in which participants interacted with a chatbot themselves and then completed a questionnaire, yielding 126 valid samples. The analysis shows that, for task-oriented chatbots, a high level of humor positively enhances user experience, so weaving humorous dialogue into chatbot scripts helps improve users' service satisfaction and intention to reuse. A high level of humanoid image has no direct relationship with user experience, but it still exerts an influence mediated by rapport and trust. This implies that although a change in appearance may not directly affect user experience, it does have a psychological effect; future research can investigate the underlying causes and explore the effects of other affective mediating relationships.
Abstract (English): The application of chatbots in contemporary industries has grown rapidly, and there has been an increasing amount of research in business, psychology, and technology-related fields. This study explores the effects of a chatbot's humanoid image and humor on user experience through a scenario simulation experiment. The study employs a 2 (level of humanoid image) x 2 (level of chatbot humor) x 2 (chatbot type) experimental design. A total of 126 valid responses were collected. The results show that a chatbot with a high level of humor can positively enhance user experience (e.g., satisfaction and intention to use) for task-oriented chatbots. The level of humanoid image is not directly related to user experience, but affects it indirectly through rapport and trust. Future research may further explore other mediating mechanisms.
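The indirect effect described in the abstracts follows the structure of a standard simple-mediation model. As an illustrative sketch only (the path coefficients a, b, and c' are generic mediation notation, not symbols taken from the thesis):

\[
\text{Rapport/Trust} = a \cdot \text{HumanoidImage} + e_1,
\qquad
\text{UserExperience} = c' \cdot \text{HumanoidImage} + b \cdot \text{Rapport/Trust} + e_2
\]

Here the indirect effect of the humanoid image on user experience is the product a·b; the reported absence of a direct relationship corresponds to a non-significant c', while a significant a·b is what produces the influence "mediated by rapport and trust" that the abstracts describe.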
Table of Contents
Chapter 1: Introduction
  Section 1: Research Background and Motivation
  Section 2: Research Purpose and Questions
  Section 3: Research Process
Chapter 2: Literature Review
  Section 1: Chatbots
  Section 2: Anthropomorphism
  Section 3: Humor
  Section 4: Relational Elements
Chapter 3: Research Method
  Section 1: Experimental Design
  Section 2: Overview and Chatbot Setup
  Section 3: Experimental Procedure
  Section 4: Measurement
Chapter 4: Results and Analysis
  Section 1: Descriptive Statistics
  Section 2: Reliability and Validity Analysis
  Section 3: Manipulation Checks
  Section 4: Model Analysis and Hypothesis Testing
  Section 5: Indirect Effect Analysis
Chapter 5: Discussion
Chapter 6: Managerial Implications
Chapter 7: Research Limitations and Future Research Directions
Chapter 8: References
Appendix


(The full text of this thesis is not authorized for public access.)