
Detailed Record

Author (Chinese): 王駿業
Author (English): Ong, Jun-Yee
Thesis Title (Chinese): 操作提示形態對於使用者使用擴增實境穿戴設備進行任務的表現之影響
Thesis Title (English): Effects of information types on task performance in wearing augmented reality glasses
Advisor (Chinese): 李昀儒
Advisor (English): Lee, Yun-Ju
Committee Members (Chinese): 盧俊銘、黃瀅瑛
Committee Members (English): Lu, Jun-Ming; Huang, Ying-Yin
Degree: Master's
University: National Tsing Hua University (國立清華大學)
Department: Department of Industrial Engineering and Engineering Management (工業工程與工程管理學系)
Student ID: 106034401
Year of Publication (ROC calendar): 109 (2020)
Academic Year of Graduation: 108 (2019-2020)
Language: Chinese
Number of Pages: 77
Keywords (Chinese): 擴增實境、訊息類型、任務表現、反應時間、慣用眼、眼動軌跡
Keywords (English): augmented reality; hint type; task performance; reaction time; dominant eye; eye movement
Augmented reality technologies have advanced rapidly in recent years, and consumer-grade hardware has begun to appear. Augmented reality is now applied in many fields, such as education and gaming. In these fields, receiving interface messages quickly and accurately is essential, as it affects both the efficiency of the equipment and the user experience. Still, there is little research on which types of interface hints are most relevant to the user's response speed. This study analyzes which of three hint types (text, image, and dynamic image) yields the shortest reaction time in use.
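The reaction-time comparison described above can be sketched in code. This is a hypothetical illustration only: the class name, trial structure, and timestamps are assumptions for clarity, not the software the thesis actually used.

```python
# Hypothetical sketch: one secondary-task trial, where reaction time is
# the interval between the hint appearing on the AR display and the
# user tapping the touchpad. All names and values are illustrative.
import time


class HintTrial:
    """One secondary-task trial with a single hint."""

    def __init__(self, hint_type):
        self.hint_type = hint_type   # e.g. "text", "image", "dynamic image"
        self.shown_at = None
        self.confirmed_at = None

    def show(self, now=None):
        # Record when the hint appears on the display.
        self.shown_at = time.monotonic() if now is None else now

    def confirm(self, now=None):
        # Record when the user taps the touchpad.
        self.confirmed_at = time.monotonic() if now is None else now

    def reaction_time(self):
        if self.shown_at is None or self.confirmed_at is None:
            raise ValueError("trial incomplete")
        return self.confirmed_at - self.shown_at


trial = HintTrial("text")
trial.show(now=10.00)     # hint onset (seconds)
trial.confirm(now=10.42)  # touchpad tap
print(round(trial.reaction_time(), 2))  # -> 0.42
```

Using a monotonic clock here (rather than wall-clock time) is a standard choice for interval timing, since it cannot jump backwards during a trial.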

In addition to the relationship between reaction time (the interval from the moment the hint is displayed to the user's confirmation) and the type of hint displayed, the experiment also analyzed whether the side of the user's dominant eye affects mental workload during operation, filling a gap in research on eye dominance and serving as a guide for developers of augmented reality wearable hardware. The mental workload parameters selected in this study were PERCLOS, fixation duration, number of fixations, saccade duration, number of saccades, and eye movement area. The research was divided into two parts: a pilot study and a formal experiment. In both, the subjects were required to perform a dynamic vision test (primary task) and to click the touchpad when a hint appeared on the AR device (secondary task).
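One of the workload measures named above, PERCLOS, is commonly defined as the proportion of time the eyelids are at least 80% closed. A minimal sketch follows; the sampling format, the 0.2 openness threshold, and the sample values are assumptions for illustration, not the thesis's data or implementation.

```python
# Hypothetical sketch: PERCLOS from a stream of eye-openness samples
# (0 = fully closed, 1 = fully open). A sample counts as "closed" when
# openness is at or below the threshold, i.e. the eyelid is >= 80% shut.

def perclos(openness_samples, closure_threshold=0.2):
    """Fraction of samples in which the eye is at least 80% closed."""
    if not openness_samples:
        raise ValueError("need at least one sample")
    closed = sum(1 for o in openness_samples if o <= closure_threshold)
    return closed / len(openness_samples)


# Eight equally spaced openness samples from a hypothetical recording:
samples = [1.0, 0.9, 0.15, 0.1, 0.8, 1.0, 0.05, 0.95]
print(perclos(samples))  # 3 of 8 samples closed -> 0.375
```

A higher PERCLOS value indicates the eyes were closed for a larger share of the recording window, which is why it is used as a fatigue and workload indicator.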

Through the pilot study, the hint with the shortest reaction time in each category (text, image, and dynamic image) was selected, from candidates composed of the chosen contrast and position parameters, for use in the formal experiment. The final choices were: text (upper left, low contrast), image (upper left, low contrast), and dynamic image (low contrast). The formal experimental results showed that the dominant eye affected neither the user's mental workload during the experiment nor the reaction time to hints. At the same time, the results showed that the type of hint used affected the user's PERCLOS value and reaction time. Fixation duration, number of fixations, saccade duration, number of saccades, and eye movement area were affected by neither eye dominance nor hint type.

The results of this study show that the type of hint does affect the user's reaction time when receiving messages. Although the dominant eye factor did not affect the selected mental workload parameters, it can serve as a reference for future studies and for the development of related software. Future research directions are to verify the impact of further hint types and to understand the conditions under which the dominant eye has an effect.
Table of Contents
Chapter 1: Introduction ---------- 8
1.1 Research Background and Motivation ---------- 8
1.2 Research Purpose and Scope ---------- 10
Chapter 2: Literature Review ---------- 11
2.1 Augmented Reality and Related Technologies ---------- 11
2.1.1 Survey of Commercial AR Wearable Devices ---------- 13
2.1.1.1 Google Glass ---------- 13
2.1.1.2 Microsoft HoloLens ---------- 16
2.1.1.3 Epson Moverio BT Series ---------- 18
2.1.2 Applications of Augmented Reality in Various Fields ---------- 20
2.2 Review of Vision-Related Topics ---------- 24
2.2.1 Causes and Detection of Color Blindness ---------- 24
2.2.2 The Dominant Eye ---------- 25
2.3 Effects of Hint Type ---------- 26
2.3.1 Effect of Display Color ---------- 26
2.3.2 Effect of Display Position ---------- 28
2.3.3 Display Carrier Type ---------- 33
2.4 Reaction Time in Task Performance ---------- 34
2.4.1 Effect of Environment on Reaction Time ---------- 34
2.4.2 Effect of Mental Workload on Reaction Time ---------- 35
2.4.3 Summary ---------- 35
2.5 Methods of Measuring Mental Workload ---------- 36
2.5.1 NASA-TLX ---------- 36
2.5.2 Eye Movement Range ---------- 37
2.5.3 Fixation and Saccade Durations and Counts ---------- 38
2.5.4 PERCLOS ---------- 38
2.5.5 Summary ---------- 39
Chapter 3: Research Methods ---------- 40
3.1 Pilot Study ---------- 40
3.1.1 Objectives ---------- 40
3.1.2 Equipment Selection ---------- 40
3.1.3 Subjects ---------- 41
3.1.4 Experimental Design ---------- 41
3.1.5 Experimental Procedure ---------- 43
3.1.6 Results ---------- 44
3.2 Formal Experiment Subjects ---------- 47
3.3 Experimental Setup and Equipment ---------- 48
3.4 Experimental Design ---------- 49
3.5 Data Analysis Methods ---------- 50
3.6 Statistical Analysis ---------- 51
Chapter 4: Experimental Results ---------- 52
4.1 Subject Demographics ---------- 52
4.2 PERCLOS ---------- 52
4.3 Fixation Duration ---------- 54
4.4 Number of Fixations ---------- 55
4.5 Saccade Duration ---------- 56
4.6 Number of Saccades ---------- 57
4.7 Eye Movement Area ---------- 58
4.8 Reaction Time ---------- 59
4.9 Summary ---------- 60
Chapter 5: Discussion ---------- 61
5.1 Effects of Contrast and Position ---------- 61
5.2 Hint Type and Reaction Time ---------- 63
5.3 Effect of the Dominant Eye ---------- 65
5.4 Mental Workload Analysis ---------- 67
Chapter 6: Conclusions and Future Research Directions ---------- 70
6.1 Conclusions ---------- 70
6.2 Future Research Directions ---------- 70
References ---------- 71
Appendix ---------- 77
 
 
 
 