
Improving Robotic Arms through Natural Language Processing, Computer Vision, and Edge Computing (2405.17665v3)

Published 27 May 2024 in cs.RO

Abstract: This paper introduces a prototype for a new approach to assistive robotics, integrating edge computing with NLP and computer vision to enhance the interaction between humans and robotic systems. Our proof of concept demonstrates the feasibility of using LLMs and vision systems in tandem for interpreting and executing complex commands conveyed through natural language. This integration aims to improve the intuitiveness and accessibility of assistive robotic systems, making them more adaptable to the nuanced needs of users with disabilities. By leveraging the capabilities of edge computing, our system has the potential to minimize latency and support offline capability, enhancing the autonomy and responsiveness of assistive robots. Experimental results from our implementation on a robotic arm show promising outcomes in terms of accurate intent interpretation and object manipulation based on verbal commands. This research lays the groundwork for future developments in assistive robotics, focusing on creating highly responsive, user-centric systems that can significantly improve the quality of life for individuals with disabilities. For video demonstrations and source code, please refer to: https://tinyurl.com/EnhancedArmEdgeNLP.
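The abstract describes a pipeline in which a spoken command is interpreted by an LLM, a vision system localizes the referenced object, and the resolved intent is turned into an arm motion. The sketch below illustrates that flow only in outline: the intent parser is a toy rule-based stand-in for the LLM call, the object table is a stand-in for the camera/AprilTag detection stage, and every name and data structure is hypothetical rather than the authors' actual implementation.

```python
# Illustrative sketch of the command pipeline: utterance -> intent
# (rule-based stand-in for the LLM) -> object pose (stand-in for the
# vision stage) -> a motion request for the arm. All names are assumptions.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ArmCommand:
    action: str                      # e.g. "pick" or "place"
    target: str                      # object name resolved from the utterance
    position: Tuple[float, float, float]  # (x, y, z) from the vision stage


# Stand-in for the vision system: object name -> detected 3D position (metres).
DETECTED_OBJECTS = {"cup": (0.30, 0.10, 0.05), "spoon": (0.25, -0.05, 0.02)}


def parse_intent(utterance: str) -> dict:
    """Toy intent parser standing in for the LLM call in the real system."""
    words = utterance.lower().split()
    action = "pick" if ("pick" in words or "grab" in words) else "place"
    target = next((w for w in words if w in DETECTED_OBJECTS), None)
    return {"action": action, "target": target}


def plan_command(utterance: str) -> Optional[ArmCommand]:
    """Combine intent and vision output into a single arm command."""
    intent = parse_intent(utterance)
    if intent["target"] is None:
        return None  # referenced object not visible: ask the user to rephrase
    return ArmCommand(intent["action"], intent["target"],
                      DETECTED_OBJECTS[intent["target"]])


cmd = plan_command("please grab the cup")
print(cmd)
```

In the paper's actual system the `parse_intent` step would be an LLM invocation (potentially running on an edge device for low latency and offline use) and `DETECTED_OBJECTS` would be populated by the live vision pipeline; the structure of handing a resolved `(action, target, position)` triple to the arm controller is the part this sketch is meant to convey.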
