
UIVNAV: Underwater Information-driven Vision-based Navigation via Imitation Learning (2309.08806v2)

Published 15 Sep 2023 in cs.RO

Abstract: Autonomous navigation in the underwater environment is challenging due to limited visibility, dynamic changes, and the lack of a cost-efficient, accurate localization system. We introduce UIVNav, a novel end-to-end underwater navigation solution designed to drive robots over Objects of Interest (OOI) while avoiding obstacles, without relying on localization. UIVNav uses imitation learning and is inspired by the navigation strategies used by human divers, who do not rely on localization. UIVNav consists of the following phases: (1) generating an intermediate representation (IR), and (2) training the navigation policy on human-labeled IR. By training the navigation policy on IR instead of raw data, the second phase is domain-invariant: the navigation policy does not need to be retrained if the domain or the OOI changes. We show this by deploying the same navigation policy for surveying two different OOIs, oyster and rock reefs, in two different domains, simulation and a real pool. We compared our method with complete coverage and random walk methods, which showed that our method is more efficient in gathering information about OOIs while also avoiding obstacles. The results show that UIVNav chooses to visit the areas with larger extents of oysters or rocks, with no prior information about the environment or localization. Moreover, a robot using UIVNav surveys on average 36% more oysters than the complete coverage method when traveling the same distance. We also demonstrate the feasibility of real-time deployment of UIVNav in pool experiments with a BlueROV underwater robot for surveying a bed of oyster shells.
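
The two-phase design described in the abstract (an intermediate representation fed into an imitation-learned navigation policy) can be illustrated with a minimal behavioral-cloning sketch. This is a hypothetical illustration, not the authors' implementation: the network layout, the three-channel IR encoding, the discrete yaw actions, and the use of PyTorch are all assumptions made here for clarity; the actual UIVNav IR generation and robot control interface are described in the paper.

```python
# Minimal behavioral-cloning sketch of an IR-based navigation policy.
# All module, tensor, and action names are hypothetical; the real UIVNav
# pipeline (IR generation from camera data, human labeling, BlueROV control)
# is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IRNavigationPolicy(nn.Module):
    """Maps an intermediate representation (e.g. a 3-channel mask of
    OOI / obstacle / background) to a discrete yaw command."""

    def __init__(self, num_actions: int = 3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # e.g. turn left / go straight / turn right
        self.head = nn.Linear(32, num_actions)

    def forward(self, ir: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(ir))


def behavioral_cloning_step(policy, optimizer, ir_batch, action_batch):
    """One imitation-learning update on human-labeled IR frames."""
    logits = policy(ir_batch)
    loss = F.cross_entropy(logits, action_batch)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    policy = IRNavigationPolicy()
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
    # Synthetic stand-ins for human-labeled IR masks and yaw labels.
    ir_batch = torch.rand(8, 3, 64, 64)
    action_batch = torch.randint(0, 3, (8,))
    print(behavioral_cloning_step(policy, optimizer, ir_batch, action_batch))
```

Because the policy in this sketch consumes only the IR, switching from oyster to rock reefs, or from simulation to a pool, would only require swapping the IR front-end rather than retraining the policy, which mirrors the domain-invariance argument in the abstract.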
