Robot Safety Monitoring using Programmable Light Curtains (2404.03556v1)
Abstract: As factories continue to evolve into collaborative spaces with multiple robots working together with human supervisors in the loop, ensuring safety for all actors involved becomes critical. Currently, laser-based light curtain sensors are widely used in factories for safety monitoring. While these conventional safety sensors meet high accuracy standards, they are difficult to reconfigure and can only monitor a fixed user-defined region of space. Furthermore, they are typically expensive. Instead, we leverage a controllable depth sensor, programmable light curtains (PLC), to develop an inexpensive and flexible real-time safety monitoring system for collaborative robot workspaces. Our system projects virtual dynamic safety envelopes that tightly envelop the moving robot at all times and detects any objects that intrude into the envelope. Furthermore, we develop an instrumentation algorithm that optimally places (multiple) PLCs in a workspace to maximize the visibility coverage of robots. Our work enables fence-less human-robot collaboration, while scaling to monitor multiple robots with few sensors. We analyze our system in a real manufacturing testbed with four robot arms and demonstrate its capabilities as a fast, accurate, and inexpensive safety monitoring solution.
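As a rough illustration of the envelope-and-intrusion idea in the abstract, the Python sketch below builds a padded convex hull around sampled robot link positions and flags any 3D light-curtain return points that fall inside it. This is only a minimal geometric sketch under assumptions not stated in the paper: the function names, the isotropic inflation margin, the use of SciPy's Delaunay containment test, and the assumption that returns on the robot's own surface have already been filtered out are all illustrative; the actual system images the dynamic envelope surface directly with the PLC rather than post-processing a point cloud this way.

```python
import numpy as np
from scipy.spatial import Delaunay


def build_safety_envelope(link_points: np.ndarray, margin: float = 0.15) -> Delaunay:
    """Build an inflated convex hull around the robot's current link positions.

    link_points: (N, 3) array of 3D points sampled along the robot's links
                 (e.g. from forward kinematics at the current joint state).
    margin:      isotropic padding in metres added around each sample point.
    """
    # Inflate each link sample by the corners of a small cube around it;
    # the hull of the inflated samples approximates a padded safety envelope.
    offsets = margin * np.array([[dx, dy, dz]
                                 for dx in (-1, 1)
                                 for dy in (-1, 1)
                                 for dz in (-1, 1)])
    inflated = (link_points[:, None, :] + offsets[None, :, :]).reshape(-1, 3)
    return Delaunay(inflated)


def detect_intrusions(envelope: Delaunay, curtain_returns: np.ndarray) -> np.ndarray:
    """Return the light-curtain return points that fall inside the envelope.

    curtain_returns: (M, 3) array of 3D points where the curtain saw a surface
                     (robot self-returns assumed to be filtered out upstream).
    A non-empty result means something has entered the safety envelope and the
    robot should be slowed or stopped.
    """
    inside = envelope.find_simplex(curtain_returns) >= 0
    return curtain_returns[inside]


if __name__ == "__main__":
    # Toy example: a straight "arm" along the x-axis and two curtain returns,
    # one inside the inflated envelope and one well outside it.
    arm = np.stack([np.linspace(0.0, 0.8, 9),
                    np.zeros(9),
                    np.full(9, 0.5)], axis=1)
    env = build_safety_envelope(arm, margin=0.1)
    returns = np.array([[0.4, 0.05, 0.5],   # intruder near the arm
                        [2.0, 1.00, 0.5]])  # far away, ignored
    print(detect_intrusions(env, returns))
```

The envelope here would be recomputed at every control step as the robot moves, which is what makes it "dynamic"; the sensor-placement optimization mentioned in the abstract is a separate problem and is not sketched here.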