
Detailed Record

Author (Chinese): 黎百加
Author (English): Li, Pai-Chia
Thesis Title (Chinese): 擴增實境中基於機器編程示範之運動規劃
Thesis Title (English): Programming by Demonstration based Motion Planning in Augmented Reality
Advisor (Chinese): 瞿志行
Advisor (English): Chu, Chih-Hsing
Committee Members (Chinese): 黃瀅瑛, 郭嘉真
Committee Members (English): Huang, Ying-Yin; Kuo, Chia-Chen
Degree: Master's
University: National Tsing Hua University
Department: Department of Industrial Engineering and Engineering Management
Student ID: 104034508
Publication Year (ROC era): 106 (2017)
Academic Year of Graduation: 105
Language: Chinese
Number of Pages: 67
Keywords (Chinese): 擴增實境、示範學習、使用者介面、路徑規劃、點膠機、機器人
Keywords (English): augmented reality; programming by demonstration; user interface; path planning; dispenser; robotics
Abstract (Chinese): As product life cycles shorten and demand for variety grows, traditional mass production is giving way to small-batch, customized manufacturing. Production equipment has likewise shifted from dedicated machinery to intelligent flexible manufacturing systems built on two automation technologies: numerically controlled machine tools and robots. In practice, robot motion planning is commonly performed through Programming by Demonstration (PbD): engineers use a teach pendant to demonstrate the robot's operations in the real manufacturing scene, and the robot then carries out the actual task by imitating or replaying the demonstration. However, teach-pendant operation is complex and labor-intensive, and the operator's experience strongly affects planning efficiency and accuracy. To improve the effectiveness of PbD-based motion planning, this study develops a new kind of demonstration-learning user interface, using augmented reality (AR) as a highly interactive human-machine interface for planning the motion trajectories of real machines within a virtual three-dimensional scene. Virtual-real interaction provides real-time auxiliary information that enhances the user's perception of spatial geometry and helps complete path planning correctly and quickly. The concept is validated and demonstrated on two different flexible manufacturing devices. First, a five-axis haptic (force-feedback) device imitates the motion of a multi-axis robot arm: the user directly moves its end effector to plan machining paths over a virtual model. Second, with an open-architecture multi-axis dispensing machine, the user manipulates a real probe over a virtual model to determine the dispenser's motion trajectory. The depth camera, virtual workpiece, and machine coordinate systems are integrated; the depth camera tracks and identifies moving targets in space in real time, captures and records their trajectories, and generates auxiliary planning information on the fly. Interfaces tailored to different usage scenarios demonstrate the practical value of AR-based interaction for machine motion planning. This research opens up innovative industrial applications of augmented reality, as well as new topics for human-machine interaction in manufacturing.
As demand shifts away from mass production, traditional machinery is being replaced by intelligent flexible manufacturing machine tools. In practice, robot path planning is done via Programming by Demonstration (PbD): human operators guide the robot through a teach pendant, and the robot then completes the task by repeating the demonstration. Unfortunately, this is a time-consuming and complex process that usually requires a certain level of expertise. In this research, we propose a highly interactive human-machine interface (HMI) based on augmented reality (AR) to improve the efficiency and intuitiveness of the planning process. The system is implemented on two different flexible manufacturing machine tools to verify the concept and present the results. One is a 5-DOF haptic device that mimics the movement of a 5-axis robot arm; the other is a multi-axis dispensing machine with an open control architecture. Users plan a path over a virtual workpiece with a color-marked probe; guidance information is presented on the virtual model to avoid collisions, and virtual-real interaction driven by the user's behavior enhances the user's spatial perception. First, multiple coordinate frames are integrated, i.e., those of the RGB-D camera, the virtual workpiece, and the machine tool. An RGB-D camera then continuously gathers geometric information from the real world and tracks the colored target, whose trajectory is captured and recorded. This research reveals the capability and value of an AR-based HMI in robot path planning, and opens a new chapter in innovative industrial applications of augmented reality and human-machine interface technology.
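The coordinate-frame integration step described in the abstract (aligning the RGB-D camera, virtual workpiece, and machine-tool frames) can be sketched as a rigid registration between corresponding 3-D point sets, solvable with the SVD-based (Kabsch) method the thesis cites for singular value decomposition [36]. The function name and the sample calibration points below are illustrative assumptions, not taken from the thesis itself.

```python
import numpy as np

def rigid_transform(P, Q):
    """Estimate rotation R and translation t such that Q_i ≈ R @ P_i + t,
    given corresponding 3-D point sets P, Q of shape (N, 3), via Kabsch/SVD."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)    # centroid of each point set
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflected solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical calibration points expressed in the camera frame:
P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
# Ground-truth pose: a 30-degree rotation about z plus a translation.
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
Q = P @ R_true.T + t_true                      # same points in the machine frame

R_est, t_est = rigid_transform(P, Q)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # prints: True True
```

With exact correspondences the pose is recovered to machine precision; with noisy probe or camera measurements the same routine gives the least-squares rigid fit, which is what makes it suitable for chaining the camera, workpiece, and machine coordinate systems.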
Abstract II
List of Figures VI
List of Tables IX
Chapter 1: Introduction 1
1.1 Research Background 1
1.2 Literature Review 2
1.2.1 Programming by Demonstration 2
1.2.2 Applications of Augmented Reality in Industrial Manufacturing and Path Planning 3
1.2.3 Applications of Augmented Reality Combined with Computer-Aided Design Software 5
1.3 Research Objectives 6
Chapter 2: System Architecture 8
2.1 System Overview and Architecture 8
2.2 Application Scenario Description 10
Chapter 3: Methodology for Programming by Demonstration in Augmented Reality 13
3.1 Construction of the Augmented Reality Scene 13
3.1.1 Building the 3D Point Cloud of the Real Scene 13
3.1.2 Target Object Tracking 19
3.1.3 Coordinate System Integration 21
3.2 Development of Interactive Functions 26
3.2.1 Workpiece Presentation 26
3.2.2 Development of Guidance and Assistance Functions 29
3.3 Motion Path Generation 35
3.3.1 Path Recording and Geometric Information Extraction 36
3.3.2 Generating Machine Path Files 38
Chapter 4: System Implementation 42
4.1 Research Tools 42
4.2 Implementation Results 44
4.2.1 Phantom Omni® Implementation Results 45
4.2.2 Multi-axis Dispenser Implementation Results 51
4.3 Error Verification 54
Chapter 5: Conclusions and Future Work 61
Chapter 6: References 64

[1] Z. Pan, J. Polden, N. Larkin, S. Van Duin, and J. Norrish, "Recent progress on programming methods for industrial robots," Robotics and Computer-Integrated Manufacturing, vol. 28, pp. 87-94, 2012.
[2] J. N. Pires, A. Loureiro, T. Godinho, P. Ferreira, B. Fernando, and J. Morgado, "Object oriented and distributed software applied to industrial robotic welding," Industrial Robot: An International Journal, vol. 29, pp. 149-161, 2002.
[3] R. D. Schraft and C. Meyer, "The need for an intuitive teaching method for small and medium enterprises," VDI BERICHTE, vol. 1956, p. 95, 2006.
[4] G. Biggs and B. MacDonald, "A survey of robot programming systems," in Proceedings of the Australasian conference on robotics and automation, 2003, pp. 1-3.
[5] H. C. Fang, S. K. Ong, and A. Y. C. Nee, "A novel augmented reality-based interface for robot path planning," International Journal on Interactive Design and Manufacturing (IJIDeM), vol. 8, pp. 33-42, 2014.
[6] G. Reinhart, U. Munzert, and W. Vogl, "A programming system for robot-based remote-laser-welding with conventional optics," CIRP Annals - Manufacturing Technology, vol. 57, pp. 37-40, 2008.
[7] 中部線割/雷射股份有限公司 (Chung Pu wire-EDM and laser machining company)
(http://www.chungpu.com/)
[8] CTIMES - Robots Entering Future Life
(http://www.hope.com.tw/DispArt-tw.asp?O=HK01J8ZE63OARASTDK)
[9] B. Siciliano and O. Khatib, Springer handbook of robotics: Springer Science & Business Media, 2008.
[10] R. Dillmann, "Teaching and learning of robot tasks via observation of human performance," Robotics and Autonomous Systems, vol. 47, pp. 109-116, 2004.
[11] S. B. Kang and K. Ikeuchi, "A robot system that observes and replicates grasping tasks," in Computer Vision, 1995. Proceedings., Fifth International Conference on, 1995, pp. 1093-1099.
[12] F. J. Abu-Dakka, B. Nemec, A. Kramberger, A. G. Buch, N. Krüger, and A. Ude, "Solving peg-in-hole tasks by human demonstration and exception strategies," Industrial Robot: An International Journal, vol. 41, pp. 575-584, 2014.
[13] J. Lambrecht, M. Kleinsorge, M. Rosenstrauch, and J. Krüger, "Spatial programming for industrial robots through task demonstration," Int J Adv Robotic Sy, vol. 10, 2013.
[14] Gesture-based Programming by Demonstration for Industrial Robots
(https://youtu.be/Pnt2NY5J5Nk?list=PLnHv0ihDurWaOP-TsWy8CqXDIxJ6qajDx)
[15] H. C. Fang, S. K. Ong, and A. Y. C. Nee, "Interactive robot trajectory planning and simulation using Augmented Reality," Robotics and Computer-Integrated Manufacturing, vol. 28, pp. 227-237, 2012.
[16] C. Breazeal, A. Edsinger, P. Fitzpatrick, and B. Scassellati, "Active vision for sociable robots," IEEE Transactions on systems, man, and cybernetics-part A: Systems and Humans, vol. 31, pp. 443-453, 2001.
[17] C. A. Jara, F. A. Candelas, P. Gil, M. Fernández, and F. Torres, "An augmented reality interface for training robotics through the web," in Proceedings of the 40th International Symposium on Robotics", ISBN, 2005, pp. 978-84.
[18] R. Bischoff and A. Kazi, "Perspectives on augmented reality based human-robot interaction with industrial robots," in 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566), 2004, pp. 3226-3231 vol.4.
[19] R. Marin, P. J. Sanz, P. Nebot, and R. Wirz, "A multimodal interface to control a robot arm via the web: a case study on remote programming," IEEE Transactions on Industrial Electronics, vol. 52, pp. 1506-1520, 2005.
[20] G. A. Lee and G. J. Kim, "Immersive authoring of Tangible Augmented Reality content: A user study," Journal of Visual Languages & Computing, vol. 20, pp. 61-79, 2009.
[21] S. K. Ong, J. W. S. Chong, and A. Y. C. Nee, "A novel AR-based robot programming and path planning methodology," Robotics and Computer-Integrated Manufacturing, vol. 26, pp. 240-249, 2010.
[22] S. Ong, M. Yuan, and A. Nee, "Augmented reality applications in manufacturing: a survey," International journal of production research, vol. 46, pp. 2707-2742, 2008.
[23] A. Rastogi, P. Milgram, and J. J. Grodski, "Augmented telerobotic control: a visual interface for unstructured environments," in Proceedings of the KBS/Robotics Conference, 1995, pp. 16-18.
[24] M. F. Zäh, W. Vogl, and U. Munzert, "Accelerating the Teaching Process of Industrial Robots—Augmented Reality for Intuitive Man–Machine Interaction," Werkstattstechnik, vol. 94, pp. 438-441, 2004.
[25] J. W. S. Chong, S. K. Ong, A. Y. C. Nee, and K. Youcef-Youmi, "Robot programming using augmented reality: An interactive method for planning collision-free paths," Robotics and Computer-Integrated Manufacturing, vol. 25, pp. 689-701, 2009.
[26] B. D. Argall, S. Chernova, M. Veloso, and B. Browning, "A survey of robot learning from demonstration," Robotics and Autonomous Systems, vol. 57, pp. 469-483, 2009.
[27] N. Andersson, A. Argyrou, F. Nägele, F. Ubis, U. E. Campos, M. O. de Zarate, et al., "AR-Enhanced Human-Robot-Interaction-Methodologies, Algorithms, Tools," Procedia CIRP, vol. 44, pp. 193-198, 2016.
[28] O. Bimber and R. Raskar, Spatial augmented reality: merging real and virtual worlds: CRC press, 2005.
[29] Richard's blog - Camera Calibration - Part 1: Camera Model
(http://wycwang.blogspot.tw/2012/09/camera-calibration-part-1-camera-model.html)
[30] D. E. Knuth, The art of computer programming: sorting and searching vol. 3: Pearson Education, 1998.
[31] Y.-Y. Chuang, D. B. Goldman, B. Curless, D. H. Salesin, and R. Szeliski, "Shadow matting and compositing," in ACM Transactions on Graphics (TOG), 2003, pp. 494-500.
[32] H. Asada and H. Izumi, "Automatic program generation from teaching data for the hybrid control of robots," IEEE Transactions on Robotics and Automation, vol. 5, pp. 166-173, 1989.
[33] OpenCV - Miscellaneous Image Transformations
(http://docs.opencv.org/2.4/modules/imgproc/doc/miscellaneous_transformations.html)
[34] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on pattern analysis and machine intelligence, vol. 22, pp. 1330-1334, 2000.
[35] Richard's blog - Camera Calibration - Part 2: Calibration.
(http://wycwang.blogspot.tw/2012/10/camera-calibration-part-2-calibration.html)
[36] ccjou, 線代啟示錄 (linear algebra blog) - Singular Value Decomposition (SVD).
(https://ccjou.wordpress.com/2009/09/01/%E5%A5%87%E7%95%B0%E5%80%BC%E5%88%86%E8%A7%A3-svd/)
[37] Chua Hock-Chuan programming notes - 3D Graphics with OpenGL
(https://www.ntu.edu.sg/home/ehchua/programming/opengl/CG_BasicsTheory.html)
[38] tenos, JustDoIT (cnblogs) - Determining Whether a Point Lies Inside a Triangle in the 2D Plane.
(http://www.cnblogs.com/TenosDoIt/p/4024413.html)
[39] D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to linear regression analysis: John Wiley & Sons, 2015.
[40] J. Canny, "A computational approach to edge detection," IEEE Transactions on pattern analysis and machine intelligence, pp. 679-698, 1986.
[41] D. H. Ballard, "Generalizing the Hough transform to detect arbitrary shapes," Pattern recognition, vol. 13, pp. 111-122, 1981.
[42] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 24, pp. 381-395, 1981.
[43] Wikipedia - RGB Color Model.
(https://en.wikipedia.org/wiki/RGB_color_model)
[44] 陳瑋萱, "Improving the Edge Quality of Virtual-Real Occlusion in Augmented Reality," Master's thesis, Department of Industrial Engineering and Engineering Management, National Tsing Hua University, 2016.
[45] OpenCV
(http://opencv.org/)
[46] Jackraken's blog - OpenGL Tutorial: Introduction to OpenGL
(http://jackraken.github.io/2014/09/17/openGL_basic/)
[47] 李寧, Android 案例開發完全講義 (complete lectures on Android case development), GOTOP Information Inc., 2011.
[48] OpenCV 3.0 alpha
(http://opencv.org/opencv-3-0-alpha.html)
[49] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, and A. E. Saddik, "Evaluating and Improving the Depth Accuracy of Kinect for Windows v2," IEEE Sensors Journal, vol. 15, pp. 4275-4285, 2015.
[50] B. Karan, "Calibration of Kinect-type RGB-D sensors for robotic applications," FME Transactions, vol. 43, pp. 47-54, 2015.
[51] A. Staranowicz, G. R. Brown, F. Morbidi, and G. L. Mariottini, "Easy-to-use and accurate calibration of rgb-d cameras from spheres," Image and Video Technology, pp. 265-278, 2014.