DynaCon: Dynamic Robot Planner with Contextual Awareness via LLMs (2309.16031v1)
Abstract: Mobile robots often rely on pre-existing maps for effective path planning and navigation. However, when these maps are unavailable, particularly in unfamiliar environments, a different approach becomes essential. This paper introduces DynaCon, a novel system designed to provide mobile robots with contextual awareness and dynamic adaptability during navigation, eliminating the reliance on traditional maps. DynaCon integrates real-time feedback with an object server, prompt engineering, and navigation modules. By harnessing the capabilities of LLMs, DynaCon not only understands patterns within given numeric series but also excels at categorizing objects into matched spaces. This facilitates a dynamic path planner imbued with contextual awareness. We validated the effectiveness of DynaCon through an experiment in which a robot successfully navigated to its goal using reasoning. Source code and experiment videos for this work can be found at: https://sites.google.com/view/dynacon.
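The abstract describes a pipeline (object server → prompt engineering → navigation) but gives no implementation detail. Below is a minimal, hypothetical sketch of what the prompt-engineering step could look like: detected object labels are formatted into a prompt that asks an LLM to group objects by the space they most likely belong to and to nominate the next object to approach. All names here (`ObjectReport`, `build_categorization_prompt`, `choose_next_goal`, the generic `llm` callable) are illustrative assumptions, not the authors' code.

```python
"""Hypothetical sketch of an LLM-driven object-to-space categorization step.

Assumptions: an object server streams labels of objects detected near the robot,
and a generic `llm` callable (any chat model wrapper) returns plain text.
"""
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ObjectReport:
    """One detection reported by the (assumed) object server."""
    label: str         # e.g., "microwave", "monitor"
    distance_m: float  # distance from the robot, in meters


def build_categorization_prompt(goal: str, detections: List[ObjectReport]) -> str:
    """Ask the LLM to map detected objects to candidate spaces and pick a target.

    This mirrors the abstract's idea of "categorizing objects into matched
    spaces"; the exact wording is an illustrative guess.
    """
    object_lines = "\n".join(
        f"- {d.label} ({d.distance_m:.1f} m away)" for d in detections
    )
    return (
        f"You are guiding a mobile robot that has no map. Its goal is: {goal}.\n"
        f"Objects currently visible:\n{object_lines}\n"
        "1. For each object, name the room or space it most likely belongs to.\n"
        "2. State whether the goal is likely in one of those spaces.\n"
        "3. Answer with exactly one line: NEXT: <object label to approach>."
    )


def choose_next_goal(goal: str, detections: List[ObjectReport],
                     llm: Callable[[str], str]) -> str:
    """Return the object label the (assumed) navigation module should drive toward."""
    reply = llm(build_categorization_prompt(goal, detections))
    for line in reply.splitlines():
        if line.startswith("NEXT:"):
            return line.removeprefix("NEXT:").strip()
    # Fallback if the LLM ignores the requested format: head for the first detection.
    return detections[0].label if detections else ""


if __name__ == "__main__":
    # Stub LLM so the sketch runs without any API; a real system would call a chat model.
    fake_llm = lambda prompt: "The microwave suggests a kitchen.\nNEXT: microwave"
    seen = [ObjectReport("microwave", 3.2), ObjectReport("sofa", 6.5)]
    print(choose_next_goal("find the coffee machine", seen, fake_llm))  # -> microwave
```

In a full system, the stub `fake_llm` would be replaced by a call to the deployed language model, and the returned label would be handed to the navigation module (e.g., as a local goal for a planner such as the dynamic window approach).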