Detailed Record

Author (Chinese): 范憶偉
Author (English): Fan, Yi-Wei
Title (Chinese): 整合多台深度相機數據於動作偵測
Title (English): The integration of multiple depth cameras for posture identification
Advisor (Chinese): 張堅琦
Advisor (English): Chang, Chien-Chi
Committee members (Chinese): 吳欣潔、孫天龍
Committee members (English): Wu, Hsin-Chieh; Sun, Tien-Lung
Degree: Master's
Institution: National Tsing Hua University
Department: Industrial Engineering and Engineering Management
Student ID: 106034571
Year of publication (ROC calendar): 108 (2019)
Academic year of graduation: 107
Language: Chinese
Number of pages: 67
Keywords (Chinese): 人工物料搬運、肌肉骨骼傷害、深度相機、上肢關節角度
Keywords (English): Manual Material Handling; Musculoskeletal injury; Depth camera; Upper limbs joint angle
Usage statistics:
  • Recommendations: 0
  • Views: 376
  • Rating: *****
  • Downloads: 0
  • Bookmarks: 0
Abstract (Chinese):
Manual material handling (MMH) remains one of the major causes of work-related musculoskeletal injuries. To assess and prevent musculoskeletal injuries caused by MMH, motion capture systems are commonly used in laboratory environments to record and analyze movement data. Although traditional motion capture systems offer high accuracy, many limitations remain when they are applied in real work environments. Many studies have therefore begun to explore lower-cost, portable depth cameras as an alternative, and to integrate multiple depth cameras for data collection in order to expand the detection range. However, most of these studies have focused on validating and analyzing lower-limb gait data, and the occlusion problem commonly encountered during material handling tasks in real work environments is rarely discussed.
Accordingly, this study proposes a method that uses multiple depth cameras to estimate overall upper-limb joint angles during MMH tasks. Using data from a traditional motion capture system as the reference, it examines the joint angle tracking performance of different camera configurations (a single front-facing depth camera, a single side-facing depth camera, and an integrated setup combining the front-facing and side-facing cameras) with and without occlusion from the handled box.
The results show that, when tracking was not occluded by the box, there was no significant difference between the integrated setup and the single front-facing camera (p = 1.00), while the single front-facing camera performed significantly better than the single side-facing camera (p < 0.01). When the box was actually being handled, the tracking accuracy of the integrated setup was significantly higher than that of the side-facing camera (p < 0.01); although the integrated setup and the front-facing camera did not differ significantly under this condition (p = 0.11), the integrated setup (mean error: 19.41°) still tended to outperform the front-facing camera (mean error: 24.27°). For future depth-camera-based estimation of upper-limb joint angles during MMH tasks, integrating multiple depth cameras can improve tracking performance when occlusions are present in the environment; when the environment is free of occlusions, a single depth camera placed in front of the worker should provide an acceptable level of accuracy.
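The camera-integration step described above can be illustrated with a minimal sketch (a hypothetical illustration, not the author's implementation): joints seen by the side-facing (90°) camera are mapped into the front-facing (0°) camera's coordinate frame with a pre-calibrated rigid transform, the two estimates of each joint are averaged, and an upper-limb joint angle is computed from the fused 3-D positions. The calibration values, joint coordinates, and equal-weight fusion below are illustrative assumptions.

import numpy as np

def to_front_frame(p_side, R, t):
    # Map a 3-D joint position from the side (90°) camera's frame into the front (0°) camera's frame.
    return R @ p_side + t

def fuse(p_front, p_side_in_front, w_front=0.5):
    # Weighted average of the two cameras' estimates of the same joint (equal trust by default).
    return w_front * p_front + (1.0 - w_front) * p_side_in_front

def joint_angle_deg(a, b, c):
    # Angle in degrees at joint b, formed by the segments b->a and b->c.
    u, v = a - b, c - b
    cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

# Hypothetical extrinsic calibration (identity rotation, 1 m lateral offset) and joint positions in metres.
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
front = {"shoulder": np.array([0.00, 1.40, 2.0]),
         "elbow":    np.array([0.05, 1.10, 2.0]),
         "wrist":    np.array([0.10, 0.85, 1.9])}
side  = {"shoulder": np.array([-1.00, 1.40, 2.0]),
         "elbow":    np.array([-0.95, 1.10, 2.0]),
         "wrist":    np.array([-0.90, 0.85, 1.9])}

fused = {j: fuse(front[j], to_front_frame(side[j], R, t)) for j in front}
print(f"Fused elbow angle: {joint_angle_deg(fused['shoulder'], fused['elbow'], fused['wrist']):.1f} deg")

In practice the rigid transform would come from an extrinsic calibration of the two cameras, and the fusion weights could be driven by each camera's per-joint tracking confidence rather than being fixed at 0.5.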
Abstract (English):
Manual Material Handling (MMH) is one of the major causes of work-related musculoskeletal injury. To assess and prevent such injuries caused by MMH, a motion capture system is commonly used to monitor real-time working postures in the laboratory environment. However, such a system is often not practical in real work environments due to its cost and environmental limitations. Several studies have evaluated the possibility of using lower-cost, portable depth cameras as an alternative for motion tracking, validated against the output of a motion capture system. The integration of multiple depth cameras for data collection has also been reported as a way to expand the detection range. Most of those studies focused on verifying and analyzing lower-limb gait tracking performance, and the influence of occlusion, a common problem when monitoring MMH postures in real work environments, has rarely been investigated. The objective of this study is to propose an integration of multiple depth cameras for overall upper-limb joint angle estimation. Data extracted from a motion capture system are used as the reference to evaluate depth camera tracking performance under different tracking conditions (with occlusion / without occlusion) and camera combinations (0° camera, 90° camera, multi-camera: 0° camera + 90° camera) during MMH tasks. The results show that, without occlusion, the 0° camera performs significantly better than the 90° camera (p < 0.01) and there is no significant difference in tracking accuracy between the 0° camera and the multi-camera setup (p = 1.00). When occlusions are present in the tracking environment, the tracking error of the multi-camera setup is significantly lower than that of the 90° camera (p < 0.01); the 0° camera and the multi-camera setup do not differ significantly (p = 0.11), although the average tracking error of the multi-camera setup (19.41°) is lower than that of the 0° camera (24.27°). According to these results, integrating multiple depth cameras can improve overall upper-limb joint angle estimation when occlusions are present in the tracking environment; when no occlusions are present, a single 0° camera is acceptable for MMH motion tracking.
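The accuracy figures quoted in the abstract are average joint-angle errors relative to the motion capture reference. The sketch below shows one way such a comparison could be computed; the angle series, condition names, and noise levels are synthetic assumptions for illustration, not the study's data.

import numpy as np

def mean_abs_angle_error(estimated_deg, reference_deg):
    # Mean absolute error in degrees between an estimated and a reference joint-angle series.
    est = np.asarray(estimated_deg, dtype=float)
    ref = np.asarray(reference_deg, dtype=float)
    return float(np.mean(np.abs(est - ref)))

rng = np.random.default_rng(0)
frames = 200
# Synthetic "ground truth" elbow angle over one lifting cycle (degrees), standing in for mocap data.
reference = 60.0 + 30.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, frames))

# Synthetic depth-camera estimates: noisier for a single camera, less noisy for the fused setup.
conditions = {
    "single camera (synthetic)": reference + rng.normal(0.0, 30.0, frames),
    "multi-camera (synthetic)": reference + rng.normal(0.0, 24.0, frames),
}
for name, estimate in conditions.items():
    print(f"{name}: mean absolute error = {mean_abs_angle_error(estimate, reference):.2f} deg")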
Acknowledgements.......... i
Abstract (Chinese).......... ii
Abstract (English).......... iv
List of Figures.......... ix
List of Tables.......... x
Chapter 1 Introduction.......... 1
1.1 Research Background and Motivation.......... 1
1.2 Research Objectives.......... 4
Chapter 2 Literature Review.......... 5
2.1 Musculoskeletal Injury.......... 5
2.1.1 Manual Material Handling.......... 5
2.1.2 Musculoskeletal Injury Surveys.......... 5
2.1.3 Biomechanical Methods.......... 7
2.2 Motion Analysis Equipment and Software.......... 8
2.2.1 Laboratory Motion Capture Systems and Their Research Applications.......... 8
2.2.2 Motion Analysis Software.......... 9
2.3 Applications of Depth Cameras.......... 10
2.3.1 Depth Cameras.......... 10
2.3.2 Depth Cameras for Musculoskeletal Assessment.......... 11
2.3.3 Joint Angle Estimation with Depth Cameras.......... 12
2.3.4 Factors Affecting Depth Camera Tracking Performance.......... 13
2.3.5 Studies on Improving Depth Camera Tracking Performance.......... 13
Chapter 3 Research Methods.......... 15
3.1 Experimental Instruments and Equipment.......... 15
3.2 Experimental Design.......... 18
3.2.1 Participants.......... 19
3.2.2 Environment Setup.......... 19
3.2.3 Experimental Procedure.......... 22
3.3 Data Analysis.......... 25
3.3.1 Coordinate System Construction.......... 25
3.3.3 Human Skeletal Model Construction.......... 32
3.4 Statistical Analysis.......... 33
Chapter 4 Results.......... 34
4.1 Descriptive Statistics.......... 35
4.2 Two-Way Repeated-Measures ANOVA.......... 38
4.3 Post Hoc Analysis.......... 40
4.4 Simple Main Effects Analysis.......... 42
4.4.1 Post Hoc Analysis of Kinect™ v2 Camera Combinations.......... 44
4.4.2 Post Hoc Analysis of Lifting Tasks.......... 46
Chapter 5 Discussion and Recommendations.......... 48
5.1 Factors with Significant Main Effects.......... 48
5.1.1 Effect of Different Kinect Camera Setups on Joint Angle Estimation Error.......... 48
5.1.2 Effect of Different Lifting Tasks on Joint Angle Estimation Error.......... 49
5.2 Post Hoc Analysis of Simple Main Effects.......... 50
5.2.1 Comparison of Kinect™ v2 Camera Combinations within the Same Lifting Task.......... 50
5.2.2 Comparison of Lifting Tasks within the Same Kinect™ v2 Camera Combination.......... 51
5.3 Tracking Performance of the Multi-Kinect Setup.......... 54
5.4 Research Limitations.......... 58
Chapter 6 Conclusion.......... 59
References.......... 60


(The full text of this thesis is not authorized for public access.)