
Detailed Record

Author (Chinese): 戴安庭
Author (English): Tai, An-Ting
Title (Chinese): 結合智慧眼鏡與擴增實境系統於機台操作輔助的資訊顯示模式之評估
Title (English): Information Display Evaluation of Augmented Reality Based Machine Assisted Operation Instruction System on Smart Glasses
Advisors (Chinese): 王茂駿、盧俊銘
Advisors (English): Wang, Mao-Jiun; Lu, Jun-Ming
Committee members (Chinese): 吳欣潔、唐硯漁
Committee members (English): Wu, Hsin-Chieh; Kang, Yen-Yu
Degree: Master's
University: National Tsing Hua University
Department: Department of Industrial Engineering and Engineering Management
Student ID: 104034555
Year of publication (ROC calendar): 106 (2017)
Academic year of graduation: 105
Language: Chinese
Number of pages: 61
Keywords (Chinese): 步驟提示、系統使用性評估、融入感、工作負荷
Keywords (English): step prompt, system usability scale, sense of immersion, workload
Usage statistics:
  • Recommendations: 0
  • Views: 87
  • Rating: *****
  • Downloads: 14
  • Bookmarks: 0
Abstract (Chinese):
As technology advances, the way employees are trained to operate machines has gradually shifted from word-of-mouth instruction and paper manuals to computerized instruction. With the recent rapid development of wearable devices, many companies have begun to train employees through wearables. Wearable devices include smart glasses, smart gloves, and smart watches; among them, smart glasses are hands-free and highly immersive, and their see-through displays, combined with augmented reality (AR) technology, substantially improve the efficiency and flexibility of machine-operation training.
Because smart glasses have only recently attracted wide attention, little prior research has addressed their information display modes and sense of immersion. This study therefore integrated AR technology with smart glasses to assist machine operation and evaluated different AR information display modes. Three display modes were defined: text description, animated icon, and text description with animated icon; each was combined with or without a directional prompt for the next step, yielding six combinations. Thirty participants (15 males and 15 females, aged 21 to 30) were recruited. Wearing Epson BT-200 smart glasses and guided by an AR assistance system developed in Unity 5.4, each participant completed a 14-step operation procedure on a material-rack straightening and feeding machine (TOMAC TLN2-300) once under each of the six display modes. Objective performance measures (task completion time and number of errors) and subjective measures (the NASA-TLX workload scale, a sense-of-immersion questionnaire, and the System Usability Scale, SUS) were collected to identify the more suitable display mode.
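The SUS scores mentioned above follow Brooke's (1996) standard scoring rule: each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the sum of the ten contributions is multiplied by 2.5 to give a 0-100 score. The minimal Python sketch below illustrates that published scoring rule only; it is not code from the thesis, and the example responses are hypothetical.

    def sus_score(responses):
        """System Usability Scale score (Brooke, 1996) from ten 1-5 responses."""
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS expects ten responses on a 1-5 scale")
        # Odd-numbered items (index 0, 2, ...) contribute (response - 1);
        # even-numbered items contribute (5 - response).
        contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                         for i, r in enumerate(responses)]
        # The 0-40 sum is scaled to the usual 0-100 range.
        return sum(contributions) * 2.5

    # Hypothetical questionnaire from one participant; prints 72.5.
    print(sus_score([4, 2, 4, 2, 4, 3, 4, 2, 4, 2]))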
The results showed that the information display mode had a significant effect on error rate and system usability, with text description with animated icon and text description alone performing better. In addition, conditions with a next-step prompt yielded shorter completion times, lower workload, and higher system usability than conditions without. Considering all subjective and objective measures, the most suitable display mode was text description with animated icon; however, the text should be concise and used only to clarify animated icons that are easily confused, and an arrow indicating the direction of the next step should be added to reduce users' workload, shorten completion time, and improve system usability.
Abstract (English):
With the advance of technology, the way people are trained to operate machines has changed from face-to-face instruction or paper manuals to computerized instruction. As wearable devices develop rapidly, more and more training is delivered through them. Wearable devices include head-mounted displays, smart gloves, smart watches, and so on. Smart glasses are among the most popular head-mounted displays, offering hands-free use, high immersion, and a see-through display. Combined with Augmented Reality (AR) techniques, smart glasses can significantly improve the efficiency and flexibility of training.
Because smart glasses have attracted extensive attention only in recent years, few studies have addressed the application of AR, information display, and immersion on such devices. In this study, smart glasses integrated with AR technology were applied to machine-operation instruction, and their usability was investigated. Three forms of information display were considered: text, animated image, and text with animated image. Each form was presented with or without a prompt for the next step, resulting in six combinations. Fifteen males and fifteen females aged 21 to 30 participated in the experiment. Participants wore Epson BT-200 smart glasses and completed a 14-step operational process on the TOMAC TLN2-300 machine under each combination of information display. Completion time, number of errors, and subjective measures (NASA-TLX, SUS, and sense of immersion) were collected to identify the better combinations of information display.
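NASA-TLX, referred to above, combines six subscale ratings (mental demand, physical demand, temporal demand, performance, effort, frustration). In the weighted procedure of Hart and Staveland (1988), each dimension's weight is the number of times it is chosen across 15 pairwise comparisons, and overall workload is the weighted sum of ratings divided by 15. The Python sketch below only illustrates this standard scoring rule; it is not the thesis's code, and the example ratings and weights are hypothetical.

    DIMENSIONS = ("mental", "physical", "temporal",
                  "performance", "effort", "frustration")

    def nasa_tlx(ratings, weights=None):
        """Overall NASA-TLX workload (0-100) from per-dimension ratings.

        `weights` holds the tallies from the 15 pairwise comparisons and
        must sum to 15; if omitted, the unweighted (raw TLX) mean is used.
        """
        if weights is None:
            return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)
        if sum(weights[d] for d in DIMENSIONS) != 15:
            raise ValueError("pairwise-comparison weights must sum to 15")
        return sum(ratings[d] * weights[d] for d in DIMENSIONS) / 15.0

    # Hypothetical participant: prints about 46.3 (weighted workload).
    ratings = {"mental": 60, "physical": 30, "temporal": 45,
               "performance": 25, "effort": 55, "frustration": 20}
    weights = {"mental": 4, "physical": 1, "temporal": 3,
               "performance": 2, "effort": 4, "frustration": 1}
    print(nasa_tlx(ratings, weights))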
Results showed that the information display mode had a significant influence on the number of errors and SUS scores, with the text-with-animated-image and text-only displays performing better. Moreover, participants provided with a prompt for the next step performed better in terms of completion time, workload, and SUS scores. Considering both subjective and objective indexes, the study concluded that the text-with-animated-image display was the best. It also suggests that the text should be concise and used only to describe animated images that are easily confused, and that arrows indicating the direction of the next step should be added to decrease users' workload, shorten completion time, and enhance system usability.
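The thesis's own statistical procedure is described in Section 3.5 and is not reproduced on this page, so the significance statements above cannot be tied to a specific test here. As a hedged illustration only, the Python sketch below shows how a two-way repeated-measures ANOVA, one plausible analysis for a 3 x 2 within-subjects design, could be run on synthetic data shaped like this experiment; the data, variable names, and effect sizes in the sketch are assumptions, not the study's results.

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Synthetic, illustrative data only: 30 participants x 3 display modes
    # x 2 prompt conditions, one completion time (seconds) per cell.
    rng = np.random.default_rng(0)
    rows = []
    for participant in range(1, 31):
        for display in ("text", "image", "text+image"):
            for prompt in ("with", "without"):
                base = 180.0 if prompt == "with" else 200.0  # assumed means
                rows.append({"participant": participant, "display": display,
                             "prompt": prompt,
                             "time": base + rng.normal(0.0, 15.0)})
    data = pd.DataFrame(rows)

    # Two within-subject factors: information display mode and step prompt.
    result = AnovaRM(data, depvar="time", subject="participant",
                     within=["display", "prompt"]).fit()
    print(result)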
Table of Contents
Abstract (Chinese) I
Abstract (English) II
Acknowledgements IV
List of Figures VII
List of Tables VIII
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 2
1.3 Research Framework 3
Chapter 2 Literature Review 5
2.1 Head-Mounted Displays 5
2.2 Augmented Reality 7
2.2.1 Definition of Augmented Reality 7
2.2.2 Examples and Applications of Augmented Reality 8
2.2.3 Augmented Reality Interfaces 10
2.3 Sense of Immersion 11
2.4 Workload 12
2.5 System Usability 13
2.6 Gender 14
2.7 Summary 15
Chapter 3 Research Methods 16
3.1 Experimental Materials and Equipment 16
3.1.1 Experimental Materials 16
3.1.2 Experimental Equipment 18
3.2 Experimental Design 21
3.2.1 Augmented Reality Interface Development 22
3.2.2 Independent Variables 23
3.2.3 Dependent Variables 25
3.2.4 Controlled Variables 27
3.3 Participant Information 27
3.4 Experimental Procedure 27
3.5 Statistical Methods 30
Chapter 4 Results 31
4.1 Completion Time 32
4.2 Number of Errors 33
4.3 System Usability Scale 34
4.4 Sense of Immersion Questionnaire 36
4.5 NASA-TLX Workload Scale 36
Chapter 5 Discussion 38
5.1 Comparison of Results by Gender 38
5.2 Comparison of Results by Information Display Mode 41
5.3 Comparison of Results by Step Prompt 42
5.4 General Discussion of Information Display Modes 43
5.5 Comparison of the Smart Glasses AR System with Traditional Methods and Operation Tasks 46
5.6 Equipment and Environmental Limitations 46
Chapter 6 Conclusions and Recommendations 48
6.1 Conclusions 48
6.2 Recommendations 49
References 50
Appendix 1: Sense of Immersion Questionnaire 55
Appendix 2: System Usability Scale 56
Appendix 3: NASA-TLX Workload Scale 57
Appendix 4: Marker Overview 58
Appendix 5: Machine Operation Procedure 59

References
1. Epson official website: http://www.epson.com.tw/Projectors/V11H560054/Overview
2. Google Glass official website: https://www.google.com/glass/start/
3. Recon Instruments Jet official website: http://www.reconinstruments.com/products/jet/
4. Sony official website: https://www.sony.com.tw/zh
5. 李傳房 (2014)。高齡使用者擴增實境互動導覽介面研究 [A study of augmented reality interactive guide interfaces for elderly users]。福祉科技與服務管理學刊, 2(3), 243-258. (in Chinese)
6. 游嘉智 (2009)。擴增實境行動遊戲設計研究─以古蹟寺廟導覽為例 [A study of augmented reality mobile game design: a case of historic temple guided tours]。Thesis, Department of Industrial Design, Tatung University, 1-201. (in Chinese)
7. 蔡妮馨 (2015)。Google Glass結合擴增實境於手機維護系統使用性評估 [Usability evaluation of Google Glass combined with augmented reality for a mobile phone maintenance system]。Thesis, Department of Industrial Engineering and Engineering Management, National Tsing Hua University, 1-91. (in Chinese)
8. 謝孟真 (2014)。美國國家航空暨太空總署工作心智負荷指標之中文化與信效度初探 [Chinese translation and preliminary reliability and validity of the NASA Task Load Index]。Thesis, Department of Nursing, Chang Gung University, 1-94. (in Chinese)
9. Azuma, R. T. (2004). Overview of augmented reality. In Proceeding of Siggraph.
10. Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4), 355-385.
11. Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. Intl. Journal of Human-Computer Interaction, 24(6), 574-594.
12. Bangor, A., Kortum, P., & Miller, J. (2009). Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3), 114-123.
13. Benbow, C. P., & Stanley, J. C. (1980). Sex differences in mathematical ability: Fact or artifact? Science, 210(4475), 1262-1264.
14. Birkfellner, W., Figl, M., Huber, K., Watzinger, F., Wanschitz, F., Hummel, J., & Bergmann, H. (2002). A head-mounted operating binocular for augmented reality visualization in medicine-design and initial evaluation. IEEE Transactions on Medical Imaging, 21(8), 991-997.
15. Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4-7.
16. Brown, E., & Cairns, P. (2004). A grounded investigation of game immersion. ACM Press, 1279-1300.
17. Cahill, L. (2005). His brain, her brain. Scientific American, 292(5), 40-47.
18. Cooper, G. E., & Harper Jr, R. P. (1969). The use of pilot rating in the evaluation of aircraft handling qualities. NASA Ames Research Center.
19. Darroch, I., Goodman, J., Brewster, S., & Gray, P. (2005). The effect of age and font size on reading text on handheld computers. Human-Computer Interaction-INTERACT 2005, 253-266.
20. De Crescenzio, F., Fantini, M., Persiani, F., Di Stefano, L., Azzari, P., & Salti, S. (2011). Augmented reality for aircraft maintenance training and operations support. IEEE Computer Graphics and Applications, 31(1), 96-101.
21. Didier, J. Y., Roussel, D., Mallem, M., Otmane, S., Naudet, S., Pham, Q. C., & Hocquard, A. (2005). AMRA: Augmented Reality Assistance in Train Maintenance Tasks. In Proceedings of the Workshop on Industrial Augmented Reality, 17-18.
22. Dini, G., & Dalle Mura, M. (2015). Application of Augmented Reality Techniques in Through-life Engineering Services. Procedia CIRP, 38, 14-23.
23. Doil, F., Schreiber, W., Alt, T., & Patron, C. (2003). Augmented reality for manufacturing planning. In Proceedings of the workshop on Virtual Environments, 71-76.
24. Dunleavy, M., Dede, C., & Mitchell, R. (2009). Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. Journal of Science Education and Technology, 18(1), 7-22.
25. Dünser, A., Steinbügl, K., Kaufmann, H., & Glück, J. (2006). Virtual and augmented reality as spatial ability training tools. In Proceedings of the 7th ACM SIGCHI New Zealand chapter's international conference on Computer-human interaction: design centered HCI, 125-132.
26. Egenhofer, M. (1999). Spatial information appliances: A next generation of geographic information systems. In 1st Brazilian Workshop on Geoinformatics, Campinas, Brazil, 1-4.
27. Fiorentino, M., Uva, A. E., Gattullo, M., Debernardis, S., & Monno, G. (2014). Augmented reality on large screen for interactive maintenance instructions. Computers in Industry, 65(2), 270-278.
28. Furmanski, C., Azuma, R., & Daily, M. (2002). Augmented-reality visualizations guided by cognition: Perceptual heuristics for combining visible and obscured information. International Symposium on Mixed and Augmented Reality, 215-224.
29. Gates, A. I. (1961). Sex differences in reading ability. The Elementary School Journal, 61(8), 431-434.
30. Hart, S. G. (2006). NASA-Task Load Index (NASA-TLX); 20 years later. In Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting, 904-908.
31. Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology, 52, 139-183.
32. Hart, S. G., Childress, M. E., & Hauser, J. R. (1982). Individual definitions of the term “workload”. In Eighth Symposium on Psychology in the Department of Defense, 478-485.
33. Huang, D. L., Rau, P. L. P., & Liu, Y. (2009). Effects of font size, display resolution and task type on reading Chinese fonts from mobile devices. International Journal of Industrial Ergonomics, 39(1), 81-89.
34. Jennett, C., Cox, A. L., Cairns, P., Dhoparee, S., Epps, A., Tijs, T., & Walton, A. (2008). Measuring and defining the experience of immersion in games. International Journal of Human-Computer Studies, 66(9), 641-661.
35. Kahneman, D. (1973). Attention and effort. Englewood Cliffs, NJ: Prentice-Hall.
36. Kang, Y. Y., Wang, M. J. J., & Lin, R. (2009). Usability evaluation of e-books. Displays, 30(2), 49-52.
37. Kato, H., & Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality, 85-94.
38. Kiyokawa, K., Kurata, Y., & Ohno, H. (2001). An optical see-through display for mutual occlusion with a real-time stereovision system. Computers and Graphics, 25 (5), 765-779.
39. Larson, P., Rizzo, A. A., Buckwalter, J. G., van Rooyen, A., Kratz, K., Neumann, U., Kesselman, C., Thiebaux, M. & van der Zaag, C. (1999). Gender issues in the use of virtual environments. CyberPsychology & Behavior, 2(2), 113-123.
40. Lawton, C. A., & Morrin, K. A. (1999). Gender differences in pointing accuracy in computer-simulated 3D mazes. Sex roles, 40(1), 73-92.
41. Lewis, J. R., & Sauro, J. (2009). The factor structure of the system usability scale. In International Conference on Human Centered Design, 94-103.
42. Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1995). Augmented reality: A class of displays on the reality-virtuality continuum. In Photonics for Industrial Applications, 282-292.
43. Miller, G. A. (1956). The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological review, 63(2), 81.
44. Moray, N. (Ed.). (2013). Mental workload: Its theory and measurement. New York, NY: Plenum.
45. Moroney, W. F., Biers, D. W., Eggemeier, F. T., & Mitchell, J. A. (1992). A comparison of two scoring procedures with the NASA task load index in a simulated flight task. In Proceedings of the IEEE National Aerospace and Electronics Conference, 734-740.
46. Navab, N., Bani-Kashemi, A., & Mitschke, M. (1999). Merging visible and invisible: Two camera-augmented mobile C-arm (CAMC) applications. In Proceedings of IEEE and ACM International Workshop on Augmented Reality, 134-41.
47. Okimoto, M. L. L., Okimoto, P. C., & Goldbach, C. E. (2015). User Experience in Augmented Reality Applied to the Welding Education. Procedia Manufacturing, 3, 6223-6227.
48. Rauschnabel, P. A., Brem, A., & Ivens, B. S. (2015). Who will buy smart glasses? Empirical results of two pre-market-entry studies on the role of personality in individual awareness and intended adoption of Google Glass wearables. Computers in Human Behavior, 49, 635-647.
49. Reid, G. B., & Nygren, T. E. (1988). The subjective workload assessment technique: A scaling procedure for measuring mental workload. Advances in Psychology, 52, 185-218.
50. Renkewitz, H., Kinder, V., Brandt, M., & Alexander, T. (2008). Optimal font size for head-mounted-displays in outdoor applications. In Information Visualisation, 2008. IV'08. 12th International Conference (503-508).
51. Rolland, J. P., & Hua, H. (2005). Head-mounted display systems. Encyclopedia of Optical Engineering, 1-13.
52. Schinke, T., Henze, N., & Boll, S. (2010). Visualization of off-screen objects in mobile augmented reality. In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services, 313-316.
53. Sheedy, J., & Bergstrom, N. (2002). Performance and comfort on near-eye computer displays. Optometry and Vision Science, 79(5), 306-312.
54. Shrouf, F., Ordieres, J., & Miragliotta, G. (2014). Smart factories in industry 4.0: a review of the concept and of energy management approached in production based on the Internet of things paradigm. In IEEE International Conference on Industrial Engineering and Engineering Management, 697-701.
55. Slater, M. (2003). A note on presence terminology. Retrieved from http://www.cs.ucl.ac.uk/research/vr/Projects/Presencia/ConsortiumPublications/ucl_cs_papers/presence-terminology.htm.
56. Steinicke, F., Bruder, G., Hinrichs, K., Kuhl, S., Lappe, M., & Willemsen, P. (2009). Judgment of natural perspective projections in head-mounted display environments. In Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology, 35-42.
57. Tang, A., Owen, C., Biocca, F., & Mou, W. (2003). Comparative effectiveness of augmented reality in object assembly. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems, 73-80.
58. Tullis, T. S., & Stetson, J. N. (2004). A comparison of questionnaires for assessing website usability. In Usability Professional Association Conference, 1-12.
59. Wickens, C.D. (1984). Engineering Psychology and Human Performance. Columbus, OH: Charles E. Merrill.
60. Wilson, J., Steingart, D., Romero, R., Reynolds, J., Mellers, E., Redfern, A. & Wright, P. (2005). Design of monocular head-mounted displays for increased indoor firefighting safety and efficiency. In Defense and Security, 103-114.
61. Yuan, M. L., Ong, S. K., & Nee, A. Y. C. (2008). Augmented reality for assembly guidance using a virtual interactive tool. International Journal of Production Research, 46(7), 1745-1767.