Active learning for efficient annotation in precision agriculture: a use-case on crop-weed semantic segmentation (2404.02580v1)

Published 3 Apr 2024 in cs.CV and cs.AI

Abstract: Optimizing deep learning models requires large amounts of annotated images, a process that is both time-intensive and costly, especially for semantic segmentation models in which every pixel must be annotated. A potential strategy to mitigate this annotation effort is active learning, which facilitates the identification and selection of the most informative images from a large unlabelled pool. The underlying premise is that these selected images improve the model's performance faster than random selection, thereby reducing annotation effort. While active learning has demonstrated promising results on benchmark datasets such as Cityscapes, its performance in the agricultural domain remains largely unexplored. This study addresses this research gap by conducting a comparative study of three acquisition functions: Bayesian Active Learning by Disagreement (BALD), stochastic BALD (PowerBALD), and Random sampling. The acquisition functions were tested on two agricultural datasets, Sugarbeet and Corn-Weed, both containing three semantic classes: background, crop, and weed. Our results indicate that active learning, especially PowerBALD, yields higher performance than Random sampling on both datasets. However, due to the relatively large standard deviations, the observed differences were minimal; this was partly caused by high image redundancy and imbalanced classes. Specifically, more than 89% of the pixels belonged to the background class in both datasets. The absence of significant results on either dataset indicates that further research is required before active learning can be applied effectively to agricultural datasets, especially those with high class imbalance and redundant images. Recommendations and insights to potentially resolve such issues are provided in this paper.
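
For context, the two uncertainty-based acquisition functions compared in the paper can be summarised in a short sketch. The Python code below is a minimal illustration, not the authors' implementation: it assumes that per-image uncertainty is obtained by averaging a per-pixel BALD (mutual information) map computed from T Monte Carlo dropout forward passes, and that PowerBALD replaces deterministic top-k selection with Gumbel-perturbed sampling proportional to score^beta. Function names, array shapes, and the per-image aggregation choice are illustrative assumptions.

```python
import numpy as np


def bald_scores(mc_probs, eps=1e-12):
    """Per-pixel BALD (mutual information) scores from MC-dropout outputs.

    mc_probs: softmax probabilities of shape (T, C, H, W), where T is the
    number of stochastic forward passes and C the number of classes
    (e.g. background, crop, weed). Returns an (H, W) score map.
    """
    mean_probs = mc_probs.mean(axis=0)                                    # (C, H, W)
    entropy_of_mean = -(mean_probs * np.log(mean_probs + eps)).sum(axis=0)
    mean_of_entropies = -(mc_probs * np.log(mc_probs + eps)).sum(axis=1).mean(axis=0)
    return entropy_of_mean - mean_of_entropies                            # (H, W)


def power_bald_select(image_scores, k, beta=1.0, rng=None):
    """PowerBALD-style stochastic batch selection.

    Instead of taking the k images with the highest scores, perturb the
    log-scores with Gumbel noise and take the top-k, which is equivalent to
    sampling without replacement with probability proportional to score**beta.
    """
    rng = np.random.default_rng() if rng is None else rng
    gumbel_noise = rng.gumbel(size=image_scores.shape)
    perturbed = beta * np.log(image_scores + 1e-12) + gumbel_noise
    return np.argsort(-perturbed)[:k]


# Hypothetical usage: score each unlabelled image by its mean per-pixel BALD
# value, then select a batch of 10 images to annotate.
# image_scores = np.array([bald_scores(p).mean() for p in all_mc_probs])
# selected = power_bald_select(image_scores, k=10, beta=1.0)
```

In this formulation, a large beta makes the selection approach standard top-k BALD, while a small beta approaches uniform random sampling; the intermediate regime is what lets a stochastic acquisition function reduce redundancy within an acquired batch.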

Authors (8)
  1. Bart M. van Marrewijk (2 papers)
  2. Charbel Dandjinou (1 paper)
  3. Dan Jeric Arcega Rustia (1 paper)
  4. Nicolas Franco Gonzalez (1 paper)
  5. Boubacar Diallo (6 papers)
  6. Jérôme Dias (4 papers)
  7. Paul Melki (4 papers)
  8. Pieter M. Blok (6 papers)
Citations (1)
