[1] Z. Pan, J. Polden, N. Larkin, S. Van Duin, and J. Norrish, "Recent progress on programming methods for industrial robots," Robotics and Computer-Integrated Manufacturing, vol. 28, pp. 87-94, 2012.
[2] J. N. Pires, A. Loureiro, T. Godinho, P. Ferreira, B. Fernando, and J. Morgado, "Object oriented and distributed software applied to industrial robotic welding," Industrial Robot: An International Journal, vol. 29, pp. 149-161, 2002.
[3] R. D. Schraft and C. Meyer, "The need for an intuitive teaching method for small and medium enterprises," VDI Berichte, vol. 1956, p. 95, 2006.
[4] G. Biggs and B. MacDonald, "A survey of robot programming systems," in Proceedings of the Australasian Conference on Robotics and Automation, 2003, pp. 1-3.
[5] H. C. Fang, S. K. Ong, and A. Y. C. Nee, "A novel augmented reality-based interface for robot path planning," International Journal on Interactive Design and Manufacturing (IJIDeM), vol. 8, pp. 33-42, 2014.
[6] G. Reinhart, U. Munzert, and W. Vogl, "A programming system for robot-based remote-laser-welding with conventional optics," CIRP Annals - Manufacturing Technology, vol. 57, pp. 37-40, 2008.
[7] Chung Pu Wire-Cut/Laser Co., Ltd. (中部線割/雷射股份有限公司). (http://www.chungpu.com/)
[8] CTIMES, "Robots Move into the Life of the Future" (機器人進駐未來新生活). (http://www.hope.com.tw/DispArt-tw.asp?O=HK01J8ZE63OARASTDK)
[9] B. Siciliano and O. Khatib, Springer Handbook of Robotics. Springer Science & Business Media, 2008.
[10] R. Dillmann, "Teaching and learning of robot tasks via observation of human performance," Robotics and Autonomous Systems, vol. 47, pp. 109-116, 2004.
[11] S. B. Kang and K. Ikeuchi, "A robot system that observes and replicates grasping tasks," in Proceedings of the Fifth International Conference on Computer Vision, 1995, pp. 1093-1099.
[12] F. J. Abu-Dakka, B. Nemec, A. Kramberger, A. G. Buch, N. Krüger, and A. Ude, "Solving peg-in-hole tasks by human demonstration and exception strategies," Industrial Robot: An International Journal, vol. 41, pp. 575-584, 2014.
[13] J. Lambrecht, M. Kleinsorge, M. Rosenstrauch, and J. Krüger, "Spatial programming for industrial robots through task demonstration," International Journal of Advanced Robotic Systems, vol. 10, 2013.
[14] Gesture-based Programming by Demonstration for Industrial Robots. (https://youtu.be/Pnt2NY5J5Nk?list=PLnHv0ihDurWaOP-TsWy8CqXDIxJ6qajDx)
[15] H. C. Fang, S. K. Ong, and A. Y. C. Nee, "Interactive robot trajectory planning and simulation using Augmented Reality," Robotics and Computer-Integrated Manufacturing, vol. 28, pp. 227-237, 2012.
[16] C. Breazeal, A. Edsinger, P. Fitzpatrick, and B. Scassellati, "Active vision for sociable robots," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 31, pp. 443-453, 2001.
[17] C. A. Jara, F. A. Candelas, P. Gil, M. Fernández, and F. Torres, "An augmented reality interface for training robotics through the web," in Proceedings of the 40th International Symposium on Robotics, 2005.
[18] R. Bischoff and A. Kazi, "Perspectives on augmented reality based human-robot interaction with industrial robots," in 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2004, pp. 3226-3231.
[19] R. Marin, P. J. Sanz, P. Nebot, and R. Wirz, "A multimodal interface to control a robot arm via the web: a case study on remote programming," IEEE Transactions on Industrial Electronics, vol. 52, pp. 1506-1520, 2005.
[20] G. A. Lee and G. J. Kim, "Immersive authoring of Tangible Augmented Reality content: A user study," Journal of Visual Languages & Computing, vol. 20, pp. 61-79, 2009.
[21] S. K. Ong, J. W. S. Chong, and A. Y. C. Nee, "A novel AR-based robot programming and path planning methodology," Robotics and Computer-Integrated Manufacturing, vol. 26, pp. 240-249, 2010.
[22] S. Ong, M. Yuan, and A. Nee, "Augmented reality applications in manufacturing: a survey," International Journal of Production Research, vol. 46, pp. 2707-2742, 2008.
[23] A. Rastogi, P. Milgram, and J. J. Grodski, "Augmented telerobotic control: a visual interface for unstructured environments," in Proceedings of the KBS/Robotics Conference, 1995, pp. 16-18.
[24] M. F. Zäh, W. Vogl, and U. Munzert, "Accelerating the teaching process of industrial robots - augmented reality for intuitive man-machine interaction," Werkstattstechnik, vol. 94, pp. 438-441, 2004.
[25] J. W. S. Chong, S. K. Ong, A. Y. C. Nee, and K. Youcef-Toumi, "Robot programming using augmented reality: An interactive method for planning collision-free paths," Robotics and Computer-Integrated Manufacturing, vol. 25, pp. 689-701, 2009.
[26] B. D. Argall, S. Chernova, M. Veloso, and B. Browning, "A survey of robot learning from demonstration," Robotics and Autonomous Systems, vol. 57, pp. 469-483, 2009.
[27] N. Andersson, A. Argyrou, F. Nägele, F. Ubis, U. E. Campos, M. O. de Zarate, et al., "AR-enhanced human-robot interaction - methodologies, algorithms, tools," Procedia CIRP, vol. 44, pp. 193-198, 2016.
[28] O. Bimber and R. Raskar, Spatial Augmented Reality: Merging Real and Virtual Worlds. CRC Press, 2005.
[29] Richard's blog, "Camera Calibration (攝像頭校正) - Part 1: Camera Model." (http://wycwang.blogspot.tw/2012/09/camera-calibration-part-1-camera-model.html)
[30] D. E. Knuth, The Art of Computer Programming, Vol. 3: Sorting and Searching. Pearson Education, 1998.
[31] Y.-Y. Chuang, D. B. Goldman, B. Curless, D. H. Salesin, and R. Szeliski, "Shadow matting and compositing," ACM Transactions on Graphics (TOG), pp. 494-500, 2003.
[32] H. Asada and H. Izumi, "Automatic program generation from teaching data for the hybrid control of robots," IEEE Transactions on Robotics and Automation, vol. 5, pp. 166-173, 1989.
[33] OpenCV documentation, "Miscellaneous Image Transformations." (http://docs.opencv.org/2.4/modules/imgproc/doc/miscellaneous_transformations.html)
[34] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, pp. 1330-1334, 2000.
[35] Richard's blog, "Camera Calibration (攝像頭校正) - Part 2: Calibration." (http://wycwang.blogspot.tw/2012/10/camera-calibration-part-2-calibration.html)
[36] ccjou, "Singular Value Decomposition (SVD)" (奇異值分解), 線代啟示錄 (Notes on Linear Algebra). (https://ccjou.wordpress.com/2009/09/01/%E5%A5%87%E7%95%B0%E5%80%BC%E5%88%86%E8%A7%A3-svd/)
[37] Chua Hock-Chuan, Programming Notes, "3D Graphics with OpenGL." (https://www.ntu.edu.sg/home/ehchua/programming/opengl/CG_BasicsTheory.html)
[38] tenos, "Determining Whether a Point Lies Inside a Triangle in the 2D Plane" (二維平面上判斷點是否在三角形內), JustDoIT blog, cnblogs (博客園). (http://www.cnblogs.com/TenosDoIt/p/4024413.html)
[39] D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to Linear Regression Analysis. John Wiley & Sons, 2015.
[40] J. Canny, "A computational approach to edge detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 679-698, 1986.
[41] D. H. Ballard, "Generalizing the Hough transform to detect arbitrary shapes," Pattern Recognition, vol. 13, pp. 111-122, 1981.
[42] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 24, pp. 381-395, 1981.
[43] Wikipedia, "RGB color model." (https://en.wikipedia.org/wiki/RGB_color_model)
[44] 陳瑋萱, "Improving the Edge Quality of Virtual-Real Occlusion in Augmented Reality" (改善擴增實境中虛實遮蔽之邊緣品質), Master's thesis, Department of Industrial Engineering and Engineering Management, National Tsing Hua University, 2016.
[45] OpenCV. (http://opencv.org/)
[46] Jackraken's blog, "OpenGL Tutorial: Getting to Know OpenGL" (openGL教學 - 認識openGL). (http://jackraken.github.io/2014/09/17/openGL_basic/)
[47] 李寧, Complete Handbook of Android Case-Based Development (Android案例開發完全講義). GOTOP Information Inc. (碁峰資訊), 2011.
[48] OpenCV 3.0 alpha. (http://opencv.org/opencv-3-0-alpha.html)
[49] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, and A. E. Saddik, "Evaluating and improving the depth accuracy of Kinect for Windows v2," IEEE Sensors Journal, vol. 15, pp. 4275-4285, 2015.
[50] B. Karan, "Calibration of Kinect-type RGB-D sensors for robotic applications," FME Transactions, vol. 43, pp. 47-54, 2015.
[51] A. Staranowicz, G. R. Brown, F. Morbidi, and G. L. Mariottini, "Easy-to-use and accurate calibration of RGB-D cameras from spheres," Image and Video Technology, pp. 265-278, 2014.