
Adaptive Landmark Color for AUV Docking in Visually Dynamic Environments (2310.02944v2)

Published 4 Oct 2023 in cs.RO and cs.CV

Abstract: Autonomous Underwater Vehicles (AUVs) conduct missions underwater without the need for human intervention. A docking station (DS) can extend the mission time of an AUV by providing a location for the AUV to recharge its batteries and receive updated mission information. Various methods for locating and tracking a DS exist, but most rely on expensive acoustic sensors or are vision-based, which is significantly affected by water quality. In this paper, we present a vision-based method that utilizes adaptive color LED markers and dynamic color filtering to maximize landmark visibility in varying water conditions. Both the AUV and the DS use cameras to determine the water background color and from it calculate the desired marker color, so no communication between the AUV and the DS is needed to agree on a marker color. Experiments conducted in a pool and a lake show our method performs 10 times better than static color thresholding methods as the background color varies. DS detection is possible at a range of 5 meters in clear water with minimal false positives.
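The core idea — each vehicle independently estimates the water background color and derives the marker color from it, so both arrive at the same color without communicating — can be sketched as follows. The paper does not state its exact color-selection rule or filter; this is a minimal illustration assuming complementary-hue selection and a simple HSV hue band as the "dynamic color filter". The function names and the tolerance value are hypothetical.

```python
import colorsys

def mean_background_hue(pixels):
    """Average hue (0-1) over RGB pixels assumed to show open water."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in pixels]
    return sum(hues) / len(hues)

def marker_hue(bg_hue):
    """Pick the complementary hue, i.e. maximally distinct from the water."""
    return (bg_hue + 0.5) % 1.0

def hue_filter(pixels, target_hue, tol=0.08):
    """Dynamic filter: flag pixels whose hue is within tol of the target,
    accounting for hue wrap-around at 0/1."""
    mask = []
    for r, g, b in pixels:
        h = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
        dist = min(abs(h - target_hue), 1 - abs(h - target_hue))
        mask.append(dist <= tol)
    return mask
```

Because both the AUV and the DS run the same deterministic rule on a similar view of the water, they converge on the same LED color; the detection side then thresholds for exactly that hue rather than a fixed, pre-tuned one.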

