
Detailed Record

Author (Chinese): 王若羽
Author (English): Wang, Jo-Yu
Title (Chinese): 結合OpenPose二維模型和人體計測進行三維關節角度計算
Title (English): 3D joint angle computation using OpenPose 2D model and anthropometry
Advisor (Chinese): 李昀儒
Advisor (English): Lee, Yun-Ju
Committee members (Chinese): 盧俊銘, 黃瀅瑛
Committee members (English): Lu, Jun-Ming; Huang, Ying-Yin
Degree: Master's
University: National Tsing Hua University
Department: Department of Industrial Engineering and Engineering Management
Student ID: 108034560
Year of publication (ROC calendar): 110
Graduation academic year: 109
Language: Chinese
Number of pages: 140
Keywords (Chinese): 職業傷害, 動作分析, 影像處理, 三維重建
Keywords (English): Occupational injury, Motion analysis, Image processing, 3D Reconstruction
Abstract (Chinese):
Musculoskeletal disorders are among the most common occupational injuries, because awkward postures accumulated over long periods cause musculoskeletal damage. When assessing injury risk, industry mostly adopts internationally accepted ergonomic assessment scales, judging risk from videos recorded at the worksite and from on-site observation, which costs considerable labor and time. This study uses image-analysis techniques to obtain human keypoint information through joint 2D keypoint prediction, building on the resulting 2D human skeleton. To account for human motion in three dimensions, the 2D keypoints output by the image-analysis technology OpenPose are combined with two kinds of segment lengths to estimate depth.
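The depth recovery described above exploits the foreshortening of a known-length body segment in the image plane: the shorter a segment appears in 2D, the more it points out of the plane. A minimal sketch of this idea, assuming metrically scaled 2D coordinates (the `segment_depth` helper and its clamping behaviour are illustrative, not the thesis's actual code):

```python
import numpy as np

def segment_depth(p2d_a, p2d_b, segment_length):
    """Estimate the out-of-plane depth difference between two joints.

    Given the 2D coordinates of a segment's endpoints (in the same units
    as segment_length) and the segment's true length, the Pythagorean
    relation gives the depth offset: dz = sqrt(L^2 - dx^2 - dy^2).
    Note the sign of dz is ambiguous from a single view.
    """
    dx, dy = np.asarray(p2d_b, float) - np.asarray(p2d_a, float)
    planar_sq = dx**2 + dy**2
    if planar_sq > segment_length**2:
        # Projection longer than the segment: measurement noise; clamp.
        return 0.0
    return float(np.sqrt(segment_length**2 - planar_sq))

# Example: a 30 cm upper arm projecting to 18 cm in the image plane
dz = segment_depth((0.0, 0.0), (18.0, 0.0), 30.0)  # -> 24.0
```

The segment length can come either from measuring the subject (the first group in the study) or from an anthropometric database (the second group).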
Two cameras, placed directly in front of and to the side of the subject at an included angle of 90 degrees, filmed simultaneously from the two directions while subjects bent over to pick up an object and then rotated to face the other side, a motion used for angle verification. The 2D keypoint trajectories output by OpenPose were then smoothed and repaired, and depth was estimated with two kinds of segment lengths: the subject's own body-segment lengths and lengths from anthropometric data. Ten healthy subjects were recruited and divided into two groups: the first group used their own body-segment lengths, and the second group used segment lengths from an anthropometric database, to compute depth and construct 3D coordinates, from which joint angles were calculated.
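The trajectory repair step, which fills gaps in the OpenPose keypoint tracks and smooths them with a moving average, could look roughly like this. This is an assumed sketch; the thesis does not specify the window size or interpolation scheme, and missing detections are represented here as NaN:

```python
import numpy as np

def repair_trajectory(traj, window=3):
    """Fill missing frames (NaN) of one keypoint coordinate by linear
    interpolation, then apply a centred moving average (odd window)."""
    traj = np.asarray(traj, dtype=float)
    idx = np.arange(len(traj))
    valid = ~np.isnan(traj)
    # Linear interpolation over the gaps left by failed detections.
    filled = np.interp(idx, idx[valid], traj[valid])
    # Edge-pad so the smoothed output keeps the original length.
    pad = window // 2
    padded = np.pad(filled, pad, mode='edge')
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode='valid')

# One keypoint's x-coordinate with a dropped frame at index 2.
smoothed = repair_trajectory([0.0, 1.0, np.nan, 3.0, 4.0, 5.0, 6.0])
```

Each keypoint's x- and y-series would be repaired independently before the depth computation.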
Using single-camera data for depth estimation, this study found that rotation of the body induces a coordinate-axis conversion; applying the correction when the body has rotated to 90 degrees yields the smallest error. The results show that 3D coordinates and joint angles estimated from the subject's own body segments are more accurate. The mean total joint-distance errors were 11.94 cm (front camera) and 9.35 cm (side camera) for the first group, versus 17.81 cm and 12.40 cm for the second group. The summed mean 3D joint-angle errors over the sagittal, frontal, and horizontal planes were 11.42 degrees (first group) and 14.84 degrees (second group) from the front camera, and 8.97 degrees and 10.76 degrees from the side camera. The verified angles were the joint angles of the trunk, shoulders, elbows, and knees. This study's automated system estimates human 3D joint angles by recovering depth information in a simple way, avoiding the 2D limitation of considering joint angles only in a single plane, and achieves this with RGB images from a single camera. For segment measurement, applying image-recognition techniques to obtain body-segment lengths would make the system more convenient, and the system can be combined with ergonomic assessment scales for risk evaluation.
Abstract (English):
Musculoskeletal disorders are among the most common occupational injuries, because awkward postures can cause musculoskeletal damage through long-term accumulation. When assessing injury risk, most of the industry adopts internationally accepted ergonomic scales; risk is judged from videos recorded at the worksite and from on-site observation, which costs labor and time. To account for three-dimensional human motion, this study combines the 2D keypoint information output by the image-based motion-capture technology OpenPose with segment lengths to estimate depth.
Two cameras were placed in front of and to the side of the subject, with a 90-degree angle between them, filming simultaneously from the two directions. The subjects bent over to pick up an object and then rotated to the other side, a motion used for angle verification. Gaps in each keypoint trajectory output by OpenPose were repaired with a moving average. These 2D outputs were then used to estimate depth with two types of segment lengths. A total of 10 healthy subjects were recruited: participants in the first group used their own segment lengths, while participants in the second group used segment lengths from an anthropometric database, to calculate depth, construct the 3D coordinates, and then compute the joint angles.
This study demonstrated that single-camera data can be used for depth estimation. When the human body rotates, a coordinate-axis conversion occurs; the best correction point was when the body had rotated to 90 degrees, which also yielded the smallest error. The average errors of total joint distance were 11.94 cm (front camera) and 9.35 cm (side camera) for the first group, versus 17.81 cm and 12.40 cm for the second group. The average joint-angle error, summed over the sagittal, coronal, and horizontal planes, was 11.42 degrees (first group) and 14.84 degrees (second group) from the front camera, and 8.97 degrees (first group) and 10.76 degrees (second group) from the side camera. The verified angles were the joint angles of the trunk, shoulders, elbows, and knees. This study estimated the 3D joint angles of the human body in a simple way by adding depth information, avoiding the problem that only a single-plane joint angle can be considered in a 2D setting. For future research and applications, image-recognition models could obtain segment lengths, making the system more convenient and reducing joint-angle calculation errors. Furthermore, the system is expected to be applied with occupational assessment scales for musculoskeletal risk evaluation.
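Once 3D coordinates are constructed, each verified joint angle (trunk, shoulder, elbow, knee) reduces to the angle between the two limb vectors meeting at the joint. A generic sketch of that computation, not the thesis's actual implementation:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, formed by 3D points a-b-c.

    E.g. for the elbow: a = shoulder, b = elbow, c = wrist.
    """
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against rounding slightly outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))

# Example: two perpendicular limb vectors give a right angle.
print(joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0)))  # -> 90.0
```

Projecting the limb vectors onto the sagittal, frontal, or horizontal plane before applying the same formula would give the per-plane angles reported in the results.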

Abstract
List of Tables
List of Figures
Chapter 1 Introduction
1.1 Research Background and Motivation
1.2 Research Objectives and Scope
1.3 Research Framework and Procedure
Chapter 2 Literature Review
2.1 The Relationship between Posture and Occupational Injury
2.2 Motion Capture Technology
2.2.1 Non-image-based Techniques
2.2.2 Image-based Techniques and OpenPose
2.3 3D Reconstruction Methods
2.4 Joint Angle Selection
2.5 Summary
Chapter 3 Research Methods
3.1 Problem Definition and Description
3.2 System Architecture and Data Acquisition
3.3 Data Collection
3.3.1 Signal Processing for Occluded Keypoints
3.3.2 3D Estimation
3.3.3 Angle Calculation
Chapter 4 Results and Discussion
4.1 3D Coordinate Results
4.1.1 Coordinate-Axis Correction
4.1.2 3D Distance
4.2 Angle Verification Results
4.3 Research Limitations
Chapter 5 Conclusions and Future Directions
References
Appendix (1)