Detailed Record

Author (Chinese): 劉宇倫
Author (English): Liu, Yu-Lun
Title (Chinese): 擴增實境中人機協作之輔助性互動介面研究
Title (English): An Experimental Study on Interactive Interface of Human Robot Collaboration in Augmented Reality
Advisor (Chinese): 瞿志行
Advisor (English): Chu, Chih-Hsing
Committee Members (Chinese): 黃瀅瑛, 王怡然
Committee Members (English): Huang, Ying-Yin; Wang, I-Jan
Degree: Master's
Institution: National Tsing Hua University
Department: Industrial Engineering and Engineering Management
Student ID: 109034558
Publication Year (ROC): 111 (2022)
Graduation Academic Year: 110 (2021-2022)
Language: Chinese
Number of Pages: 72
Keywords (Chinese): 人機協作、擴增實境、視覺回饋、觸覺回饋、人因評估、組裝作業
Keywords (English): Human robot collaboration; Augmented reality; Visual feedback; Haptic feedback; Ergonomic assessment; Manual assembly
Abstract (Chinese): Global manufacturing has gradually shifted from mass production toward customized, small-batch, high-flexibility production. Artificial intelligence is now widely deployed on the shop floor, yet many manual operations remain whose automation is too difficult or too costly; a more practical approach is to combine human decision-making and action with AI through appropriate assistance functions, realizing so-called human-robot collaboration. HRC technology has gradually matured: combining the high efficiency of robots with the human ability to respond to uncertain environments allows highly complex tasks to be performed effectively. When collaborating with machines, ensuring operator safety and avoiding mutual interference or even collision has become a key issue. This study uses augmented reality to develop an interactive interface between human and robot, targeting sorting and assembly task scenarios to improve safety and trust in HRC. Experimental results show that real-time information feedback improves work efficiency: through the AR visual assistance function, subjects learn the robot arm's motion path in advance, and once familiar with the haptic assistance function they can keep their visual attention on the task while still receiving enough information to judge the arm's motion path. Analyses of the subjects' eye-movement behavior and of their proximity to the robot arm likewise show the benefit of the feedback information. This study derives design guidelines for interactive interfaces across different sensory modalities that effectively improve HRC work efficiency, and it demonstrates an innovative application of augmented reality technology.
Abstract (English): The modern manufacturing industry has transformed from mass production to customization with small batches and high flexibility. Artificial intelligence (AI) has been applied to a wide spectrum of manufacturing activities, but manual operations remain indispensable on the shop floor due to the high complexity and/or cost involved in their automation. A more practical approach is to combine AI with human decisions and actions via assistive functions, thus realizing the idea of human-robot collaboration (HRC). With recent progress, HRC technologies have been able to integrate the high efficiency of machines with the human capability of exception handling under uncertainty to perform highly complex tasks. One critical issue in HRC is to assure the operator's safety by avoiding interference and collision with the robot during the collaboration process. This work develops an interface between human and robot using augmented reality (AR) technology, focusing on improving safety and trustworthiness in HRC. Experimental results show that instant feedback can improve the work performance of an operator collaborating with a robot. Subjects recognize the moving trajectory of the robot in advance when assisted by a visual interface in AR. Moreover, they can stay visually focused on the task at hand while still receiving sufficient information about the robot's movement via tactile feedback. Analysis of the subjects' eye-tracking behavior and of their proximity to the moving robot shows similar effectiveness of the feedback. This work derives important insights into the design of various sensory feedback mechanisms for assisting human-robot collaboration, and it opens up new AR applications in smart manufacturing.
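As an illustration of the proximity-based tactile feedback the abstract describes, the sketch below maps the hand-to-robot distance to a normalized vibration intensity. This is a minimal sketch, not the thesis's implementation: the thresholds WARN_DISTANCE and STOP_DISTANCE, the linear ramp, and all function names are illustrative assumptions, and the actual SenseGlove Nova driver calls are omitted.

```python
import math

# Hypothetical zone boundaries in metres; the record does not publish the
# thesis's actual values, so these are illustrative assumptions only.
WARN_DISTANCE = 0.60   # start of the warning zone around the robot arm
STOP_DISTANCE = 0.20   # minimum allowed hand-to-robot separation

def separation(hand_pos, effector_pos):
    """Euclidean distance between the tracked hand and the robot end-effector."""
    return math.dist(hand_pos, effector_pos)

def vibration_intensity(d):
    """Map hand-to-robot distance d (m) to a vibration amplitude in [0, 1].

    Outside the warning zone the glove stays silent; inside it, the
    intensity ramps up linearly and saturates at 1.0 at the stop distance.
    """
    if d >= WARN_DISTANCE:
        return 0.0
    if d <= STOP_DISTANCE:
        return 1.0
    return (WARN_DISTANCE - d) / (WARN_DISTANCE - STOP_DISTANCE)

if __name__ == "__main__":
    hand = (0.35, 0.10, 0.90)        # e.g. from an HTC VIVE tracker (hypothetical pose)
    effector = (0.60, 0.05, 0.95)    # e.g. from the robot controller (hypothetical pose)
    d = separation(hand, effector)
    print(f"separation: {d:.3f} m -> vibration intensity: {vibration_intensity(d):.2f}")
```

In a live system this intensity would be streamed to the glove at the control-loop rate rather than printed; the same distance signal could also drive the color coding of the AR trajectory preview.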
Abstract (Chinese) II
Abstract (English) III
List of Figures VI
List of Tables VIII
1. Introduction 1
1.1 Research Background 1
1.2 Research Objectives 1
2. Literature Review 4
2.1 Human-Robot Collaboration 4
2.2 Safety Assistance Functions for Human-Robot Collaboration 6
2.2.1 Visual Assistance Functions 6
2.2.2 Haptic Assistance Functions 7
2.2.3 Auditory Feedback 8
2.2.4 Multisensory Feedback 9
2.3 Summary 9
3. Research Methods 11
3.1 Experimental Content 11
3.1.1 Human-Robot Collaboration Experiment 11
3.2 Experimental Equipment 19
3.2.1 Robot Arm 19
3.2.2 Haptic Feedback Glove 21
3.2.3 Head-Mounted Display 22
3.2.4 System Integration 22
3.3 Experimental Design 26
3.3.1 Subjects 26
3.3.2 Experimental Procedure 26
3.3.3 Experimental Hypotheses 27
4. Results and Discussion 28
4.1 Experimental Data Analysis and Preprocessing 28
4.2 Experimental Data Analysis and Discussion 30
4.2.1 Learning Effect Analysis 30
4.2.2 Effect of Feedback Content on Work Efficiency 31
4.2.3 Subjective Questionnaire Analysis 33
4.2.4 Subject Behavior Analysis 40
4.3 Summary 46
5. Conclusions and Future Work 47
5.1 Conclusions 47
5.2 Future Work 48
References 49
Appendix I: Experimental Performance Data 53
Appendix II: Subject Behavior Analysis Data 57
Appendix III: NASA-TLX Data 61
Appendix IV: System Usability Scale Questionnaire Data 65
Appendix V: Subjective Safety Questionnaire Data 69