Detailed Record

Author (Chinese): 李俊則
Author (English): Lee, Chun-Tse
Title (Chinese): 兼具視觸覺感知以及全驅動自適應機械手指實現機器人智慧操控之研究與開發
Title (English): Intelligent Robotic Manipulation Enabled by FASA Fingers and Visual-Tactile Perception
Advisor (Chinese): 張禎元
Advisor (English): Chang, Jen-Yuan
Committee members (Chinese): 宋震國、馮國華、林沛群、陳宗麟、彭志誠
Committee members (English): Sung, Cheng-Kuo; Feng, Guo-Hua; Lin, Pei-Chun; Chen, Tsung-Lin; Peng, Chih-Cheng
Degree: Doctoral
Institution: National Tsing Hua University
Department: Department of Power Mechanical Engineering
Student ID: 108033802
Year of publication (ROC calendar): 112 (2023)
Academic year of graduation: 111
Language: Chinese
Number of pages: 180
Keywords (Chinese): 全驅動機構、欠驅動機構、擬人型夾爪、動態靜力分析、手眼校正、視覺伺服控制、點雲配對、壓力感測器、支援向量機、影像物件辨識、隨機堆疊料籃夾取、基因演算法、逆向運動學
Keywords (English): full actuation, under-actuation, anthropomorphic robotic hand, kinetostatic analysis, hand-eye calibration, visual servo control, point cloud registration, tactile sensor, support vector machine, image object detection, random bin picking, genetic algorithm, inverse kinematics
In recent years, robots have gradually moved beyond the factory and into our daily lives, thanks to their operational stability, commercially viable prices, and mature automation technologies; examples include restaurant automation, hotel services, and home-care robots. Robots have clearly shifted from fixed, single-purpose operating environments to variable, previously unseen environments in which they perform a wide variety of tasks. The goal of this research is therefore to develop a perception-equipped anthropomorphic robotic system and its intelligent manipulation strategies for general-purpose application scenarios. This dissertation investigates three aspects of the robotic system: the mechatronic integration of an anthropomorphic hand, the sensor integration of the visual and tactile systems, and intelligent manipulation with artificial intelligence techniques. On the mechatronics side, this dissertation proposes an anthropomorphic hand design that combines full actuation with self-adaptivity; its motion performance is quantified through kinematic and kinetostatic analyses, and its independent joint motion and self-adaptive enveloping of objects are verified on a mechatronic integration platform. On the sensing side, a depth camera serves as the robot's means of perceiving the external world, and a fast hand-eye calibration procedure suited to the humanoid robot is developed; for tactile sensing, a magnetic three-axis pressure sensor is developed and installed as an array on the anthropomorphic fingertips to capture local contact information. For intelligent manipulation, three different schools of artificial intelligence are adopted, namely analogizers, connectionists, and evolutionaries, applied respectively to object weight estimation, random bin picking of stacked objects, and the inverse kinematics of multi-chain conjugate manipulators.
In recent years, robots have gradually entered our daily lives. They have moved from fixed operating environments to varied and previously unseen environments in which they perform a wide range of tasks. The goal of this research is therefore to develop a human-like robotic system with perception and intelligent control strategies for general-purpose scenarios. This dissertation covers three aspects: the mechatronics of a humanoid robotic hand, the vision and tactile sensor system, and intelligent robotic manipulation. First, it proposes the design of a hybrid fully actuated and self-adaptive mechanism for an anthropomorphic robotic finger, whose performance is quantified by kinematic and kinetostatic analyses. Second, a stereo camera is employed on the humanoid robot and a novel hand-eye calibration method is developed; for tactile sensing, a magnetic three-axis pressure sensor is developed and installed in the fingertips to perceive local contact information. Finally, three AI techniques are utilized for intelligent robotic manipulation, namely object weight estimation, a random bin picking system, and the inverse kinematics of multi-chain conjugate manipulators.
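The abstract casts hand-eye calibration as a point cloud registration problem. As an illustrative sketch only (not the author's implementation), the following Python fragment shows the standard SVD-based rigid registration step that such a formulation commonly relies on: it estimates the rotation and translation aligning two sets of corresponding 3-D points, for example calibration-object features expressed in the camera frame and in the robot base frame. The function name and NumPy interface are assumptions made for this example.

import numpy as np

def register_point_sets(P, Q):
    """Estimate R, t such that R @ p + t ~= q for corresponding rows of P and Q (N x 3)."""
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so that R is a proper rotation (det(R) = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t

In a hand-eye setting, feeding in the same physical points measured in the two frames returns the rigid transform relating them; robust variants and the dissertation's actual calibration procedure may differ in detail.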
Abstract (Chinese)
Abstract (English)
Acknowledgments
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Preface
1.2 Literature Review
1.2.1 Development of Anthropomorphic Robotic Arms and Hands
1.2.2 Trends in Sensing Systems for Robots
1.2.3 Schools of Thought in Intelligent Robotic Manipulation
1.3 Research Problems and Objectives
1.3.1 Research Problems
1.3.2 Research Objectives
Chapter 2 Mechatronic Development of an Anthropomorphic Finger Combining Full Actuation and Self-Adaptivity
2.1 Literature Review and Contributions
2.2 Mechanism Design and Mathematical Modeling
2.3 Kinetostatic and Kinematic Analysis
2.3.1 Kinetostatic Analysis
2.3.2 Kinematic Analysis
2.4 Experimental Setup and Functional Verification
2.5 Chapter Summary
Chapter 3 Development of a Hand-Eye Calibration Procedure Based on Visual Servo Control
3.1 Literature Review and Contributions
3.2 Novel Calibration Object Design and Innovative Hand-Eye Calibration Procedure
3.2.1 Novel Calibration Object Design
3.2.2 Overview of the Innovative Hand-Eye Calibration Procedure
3.3 Visual Servo Control Theory
3.3.1 Visual Servo Control Applied to the Camera Velocity Twist
3.3.2 Visual Servo Control Applied to the Robot Arm Velocity Twist
3.4 Hand-Eye Calibration Algorithm: Point Cloud Registration
3.5 Experimental Setup and Functional Verification
3.5.1 Visual Servo Control: Object Tracking and Robot Arm Motion Control
3.5.2 Absolute Accuracy Error of the Hand-Eye Calibration
3.6 Chapter Summary
Chapter 4 Research and Development of Intelligent Robotic Manipulation
4.1 Arrayed Three-Axis Tactile Sensors for Object Weight Estimation
4.1.1 Research Background and Problem Definition
4.1.2 Integration of Magnetic Three-Axis Force Sensors with the Anthropomorphic Hand
4.1.3 Bio-Inspired Data Collection and Support Vector Machines for Weight Estimation
4.2 Genetic Algorithm for the Inverse Kinematics of Conjugate Multi-Chain Mechanisms
4.2.1 Research Background and Problem Definition
4.2.2 Workspace Analysis of the Anthropomorphic Fingers
4.2.3 Improvements to the Genetic Algorithm and Workspace
4.2.4 Simultaneous Multi-Finger Inverse Kinematics
4.3 Development of a CAD-Free Random Bin Picking System
4.3.1 Research Background and Problem Definition
4.3.2 Applying Image Recognition: Mask R-CNN
4.3.3 Optimal Grasp Point Strategy and System Workflow
4.3.4 Validation Examples of the Random Bin Picking System
4.4 Integration of Intelligent Manipulation Technologies on the Humanoid Robot
4.5 Chapter Summary
Chapter 5 Conclusions and Future Work
Appendix I
References

(Full text available for external access after 2026-03-21)