[1] Wang, L., Liu, S., Liu, H., & Wang, X. V. (2020). Overview of human-robot collaboration in manufacturing. In Proceedings of the 5th International Conference on the Industry 4.0 Model for Advanced Manufacturing (pp. 15-58).
[2] Pini, F., Ansaloni, M., & Leali, F. (2016, September). Evaluation of operator relief for an effective design of HRC workcells. In 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA) (pp. 1-6). IEEE.
[3] Schlotzhauer, A., Kaiser, L., Wachter, J., Brandstötter, M., & Hofbaur, M. (2019, August). On the trustability of the safety measures of collaborative robots: 2D collision-force-map of a sensitive manipulator for safe HRC. In 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE) (pp. 1676-1683). IEEE.
[4] Human's field of view: https://vrui-research.gitbook.io/researchonvrui/ergonomic-issues/jie-mian-zui-yu/ren-yan-kan-neng-li
[5] Wickens, C. D. (2008). Multiple resources and mental workload. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(3), 449-455.
[6] Kumar, S., Savur, C., & Sahin, F. (2020). Survey of human-robot collaboration in industrial settings: Awareness, intelligence, and compliance. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 51(1), 280-297.
[7] Villani, V., Pini, F., Leali, F., & Secchi, C. (2018). Survey on human-robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics, 55, 248-266.
[8] Bdiwi, M., Pfeifer, M., & Sterzing, A. (2017). A new strategy for ensuring human safety during various levels of interaction with industrial robots. CIRP Annals, 66(1), 453-456.
[9] International Organization for Standardization. (2016). Robots and robotic devices - Collaborative robots (ISO Standard No. 15066:2016). https://www.iso.org/standard/62996.html
[10] Green, S. A., Billinghurst, M., Chen, X., & Chase, J. G. (2008). Human-robot collaboration: A literature review and augmented reality approach in design. International Journal of Advanced Robotic Systems, 5(1), 1.
[11] Marquardt, A., Trepkowski, C., Eibich, T. D., Maiero, J., & Kruijff, E. (2019, October). Non-visual cues for view management in narrow field of view augmented reality displays. In 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 190-201). IEEE.
[12] Lipowski, Z. J. (1975). Sensory and information inputs overload: Behavioral effects. Comprehensive Psychiatry.
[13] Hietanen, A., Pieters, R., Lanz, M., Latokartano, J., & Kämäräinen, J. K. (2020). AR-based interaction for human-robot collaborative manufacturing. Robotics and Computer-Integrated Manufacturing, 63, 101891.
[14] Bolano, G., Fu, Y., Roennau, A., & Dillmann, R. (2021, July). Deploying multi-modal communication using augmented reality in a shared workspace. In 2021 18th International Conference on Ubiquitous Robots (UR) (pp. 302-307). IEEE.
[15] Grushko, S., Vysocký, A., Oščádal, P., Vocetka, M., Novák, P., & Bobovský, Z. (2021). Improved mutual understanding for human-robot collaboration: Combining human-aware motion planning with haptic feedback devices for communicating planned trajectory. Sensors, 21(11), 3673.
[16] Scheggi, S., Chinello, F., & Prattichizzo, D. (2012, June). Vibrotactile haptic feedback for human-robot interaction in leader-follower tasks. In Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments (pp. 1-4).
[17] Eckert, M., Blex, M., & Friedrich, C. M. (2018, January). Object detection featuring 3D audio localization for Microsoft HoloLens. In Proceedings of the 11th International Joint Conference on Biomedical Engineering Systems and Technologies (Vol. 5, pp. 555-561).
[18] Zijlstra, A. T. (2017). Using the HoloLens' Spatial Sound System to aid the Visually Impaired when Navigating Indoors (Doctoral dissertation, Faculty of Science and Engineering, University of Groningen).
[19] Ménélas, B., Picinali, L., Katz, B. F., & Bourdot, P. (2010, March). Audio haptic feedbacks for an acquisition task in a multi-target context. In 2010 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 51-54). IEEE.
[20] Dehais, F., Sisbot, E. A., Alami, R., & Causse, M. (2011). Physiological and subjective evaluation of a human-robot object hand-over task. Applied Ergonomics, 42(6), 785-791.
[21] Lasota, P. A., Fong, T., & Shah, J. A. (2017). A survey of methods for safe human-robot interaction. Now Publishers.
[22] NASA-TLX questionnaire: https://humansystems.arc.nasa.gov/groups/tlx/
[23] Salvendy, G. (Ed.). (2006). Handbook of Human Factors and Ergonomics (Vol. 144). New York: Wiley.
[24] Deci, E. L., Vallerand, R. J., Pelletier, L. G., & Ryan, R. M. (1991). Motivation and education: The self-determination perspective. Educational Psychologist, 26(3-4), 325-346.
[25] Marquardt, A., Trepkowski, C., Eibich, T. D., Maiero, J., Kruijff, E., & Schöning, J. (2020). Comparing non-visual and visual guidance methods for narrow field of view augmented reality displays. IEEE Transactions on Visualization and Computer Graphics, 26(12), 3389-3401.
[26] Trossen Robotics ROS research arm: WidowX 250 Robot Arm, https://www.trossenrobotics.com/widowx-250-robot-arm.aspx
[27] SenseGlove Nova: https://www.senseglove.com/product/nova/
[28] HTC VIVE Base Station 2.0: https://www.vive.com/us/accessory/base-station2/
[29] HTC VIVE Tracker 2.0: https://www.vive.com/nz/accessory/vive-tracker/
[30] Microsoft HoloLens 2: https://www.microsoft.com/en-us/HoloLens/hardware
[31] Windows Subsystem for Linux documentation: https://docs.microsoft.com/en-us/windows/wsl/
[32] ROS - Robot Operating System: https://www.ros.org/
[33] System Usability Scale (SUS): https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
[34] ROS MoveIt: https://moveit.ros.org/