
Detail Display

Author (Chinese): 鍾少軒
Author (English): Chung, Shao Hsuan
Title (Chinese): RGB-D相機之人體靜態動作追蹤表現評估
Title (English): Evaluating the Performance of RGB-D Camera for Static Posture Tracking
Advisor (Chinese): 張堅琦
Advisor (English): Chang, Chien Chi
Committee Members (Chinese): 邱銘傳, 黃思皓
Committee Members (English): Chiu, Ming Chuan; Huang, Szu Hao
Degree: Master
University: National Tsing Hua University
Department: Industrial Engineering and Engineering Management
Student ID: 103034562
Publication Year (ROC): 105 (2016)
Graduation Academic Year: 104
Language: Chinese
Pages: 67
Keywords (Chinese): RGB-D相機、Kinect v2、座標系校正
Keywords (English): RGB-D Camera, Kinect v2, Coordinate System Calibration
Statistics:
  • Recommendations: 0
  • Views: 104
  • Rating: *****
  • Downloads: 0
  • Bookmarks: 0
The introduction of large numbers of mechanized and automated devices has replaced part of the manual workforce, yet many highly repetitive and physically demanding tasks still expose workers to high risk factors for musculoskeletal injury. Assessing these risk factors requires capturing workers' working postures, and a human posture can usually be defined by the body's major joint points. The RGB-D camera is regarded as an alternative to traditional motion tracking systems for ergonomic assessment and has been widely applied in many studies, but few studies have objectively evaluated the camera's capabilities before using it. In view of this, this study employs a practical coordinate-system calibration fixture that allows the coordinate systems of an RGB-D camera and a traditional motion tracking system to be aligned easily. Using the traditional motion tracking system's data as the reference, 12 static standing postures and 9 static sitting postures were used to provide a preliminary baseline, examining how placing the Kinect v2 at three different angles (0°, 45°, and 90°) affects the detection error of individual joint points and of different joint groups. Through this study, researchers who intend to use this system can understand the capabilities of the Kinect v2 under different conditions and apply it to develop a complete worker-monitoring system for factories.
Many automated machines are used in modern factories to replace manual work. However, many physically demanding tasks still need to be performed manually, which can put workers at high risk of musculoskeletal injuries. To identify the potential for injury when workers perform various jobs, their working postures need to be collected and analyzed. Traditionally, a complex and expensive motion tracking system is used to collect these movement data. Recently, many researchers have proposed the use of low-cost RGB-D cameras as an alternative, but few have objectively evaluated the capability of RGB-D cameras before using them. In this study, we evaluated the accuracy of an RGB-D camera when used to track several static postures. Data from a traditional motion tracking system were compared with the outputs of the RGB-D camera. A total of 12 static standing postures and 9 static sitting postures were evaluated. The effect of the RGB-D camera's viewing angle (0°, 45°, 90°) was also examined.
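The calibration step described in the abstract — expressing the RGB-D camera's joint coordinates in the reference frame of the traditional motion tracking system — can be sketched as a least-squares rigid-body alignment (the Kabsch/Procrustes method). This is only an illustrative sketch, not the thesis's fixture-based procedure from Chapter 3; the function names and the synthetic data below are hypothetical:

```python
import numpy as np

def rigid_calibration(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst
    (Kabsch/Procrustes via SVD). src, dst: paired (N, 3) point arrays."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def per_joint_error(src, dst, R, t):
    """Euclidean error of each joint after mapping src into dst's frame."""
    return np.linalg.norm(src @ R.T + t - dst, axis=1)

# Hypothetical demo: reference-system joints vs. a camera frame that is
# rotated 45 degrees about the vertical axis and translated.
rng = np.random.default_rng(0)
ref = rng.normal(size=(25, 3))                  # 25 reference joint positions
theta = np.deg2rad(45.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
cam = ref @ R_true.T + t_true                   # same joints seen by the camera

R, t = rigid_calibration(ref, cam)
errors = per_joint_error(ref, cam, R, t)        # ~0 for noise-free data
```

After calibration, the per-joint Euclidean distances between the camera's joint estimates and the reference system's markers yield the error measures that can then be summarized per joint, per joint group, and per viewing angle.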
Abstract (Chinese) i
Abstract (English) ii
Table of Contents iii
List of Tables vi
Chapter 1: Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 3
Chapter 2: Literature Review 4
2.1 Motion Tracking Systems 4
2.1.1 Optical Motion Tracking Systems 4
2.1.2 Non-optical Motion Tracking Systems 5
2.2 RGB-D Cameras 6
2.2.1 RGB-D Camera Applications 8
2.2.2 RGB-D Camera Capability Evaluation 9
2.3 Camera Coordinate System Calibration 9
Chapter 3: Experimental Methods and Procedures 11
3.1 Equipment Setup 11
3.2 Camera Coordinate Calibration 13
3.2.1 Coordinate System Calibration Fixture 13
3.2.2 Coordinate System Calibration 14
3.3 Accuracy Measurement 16
3.3.1 Joint Point Definitions 17
3.3.2 Static Posture Definitions 19
3.3.3 Error Correction 22
3.3.4 Data Collection and Analysis 23
Chapter 4: Results 26
4.1 Fixed Error Measurement 26
4.2 Error Analysis of Kinect v2 in Static Tracking 27
4.2.1 Overall Error Performance 27
4.2.2 Overall Error in Standing and Sitting Postures 29
4.2.3 Joint Points 31
4.2.4 Joint Groups in Standing and Sitting Postures 44
4.2.5 Upper and Lower Body in Standing and Sitting Postures 54
Chapter 5: Suggestions and Discussion 58
5.1 Discussion 58
Chapter 6: Conclusions and Future Work 62
6.1 Conclusions 62
6.2 Future Research Directions 62
References 64

(Full text not authorized for public release)