Compositional Oil Spill Detection Based on Object Detector and Adapted Segment Anything Model from SAR Images (2401.07502v2)

Published 15 Jan 2024 in cs.CV

Abstract: Semantic segmentation-based methods have attracted extensive attention in oil spill detection from SAR images. However, existing approaches require a large number of finely annotated segmentation samples in the training stage. To alleviate this issue, we propose a composite oil spill detection framework, SAM-OIL, comprising an object detector (e.g., YOLOv8), an Adapted Segment Anything Model (SAM), and an Ordered Mask Fusion (OMF) module. SAM-OIL is the first application of the powerful SAM in oil spill detection. Specifically, SAM-OIL uses YOLOv8 to obtain the categories and bounding boxes of oil spill-related objects, feeds the bounding boxes into the Adapted SAM to retrieve category-agnostic masks, and finally adopts the OMF module to fuse the masks and categories. The Adapted SAM, which combines a frozen SAM with a learnable Adapter module, enhances SAM's ability to segment ambiguous objects. The OMF module, a parameter-free method, effectively resolves pixel category conflicts within SAM. Experimental results demonstrate that SAM-OIL surpasses existing semantic segmentation-based oil spill detection methods, achieving an mIoU of 69.52%. The results also indicate that both the OMF and Adapter modules effectively improve accuracy in SAM-OIL.
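The abstract describes the SAM-OIL pipeline only at a high level, so the sketch below illustrates one plausible reading of it in Python. The run_yolov8 and run_adapted_sam names are hypothetical placeholders, not the authors' released code, and the ordering rule used in the fusion step (painting masks from largest to smallest so that smaller objects overwrite larger ones) is an assumption; the abstract states only that OMF is parameter-free and resolves pixel category conflicts.

import numpy as np

def ordered_mask_fusion(masks, labels):
    """Fuse category-agnostic binary masks into one semantic label map.

    Assumed ordering rule (not stated in the abstract): masks are painted
    from largest to smallest, so smaller objects overwrite larger ones and
    pixel category conflicts are resolved without learnable parameters.
    """
    if len(masks) == 0:
        raise ValueError("no masks to fuse")
    height, width = masks[0].shape
    semantic = np.zeros((height, width), dtype=np.int64)  # 0 = background
    areas = [int(m.sum()) for m in masks]
    for idx in np.argsort(areas)[::-1]:  # largest area first
        semantic[masks[idx].astype(bool)] = labels[idx]
    return semantic

# Hypothetical end-to-end flow; the detector and SAM calls are placeholders:
#   boxes, labels = run_yolov8(sar_image)                   # categories + bounding boxes
#   masks = [run_adapted_sam(sar_image, b) for b in boxes]  # box-prompted, category-agnostic masks
#   semantic_map = ordered_mask_fusion(masks, labels)       # OMF-style fusion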
