Cross Domain Early Crop Mapping using CropSTGAN (2401.07398v2)

Published 15 Jan 2024 in cs.CV, cs.LG, and eess.IV

Abstract: Driven by abundant satellite imagery, machine learning-based approaches have recently been promoted to generate high-resolution crop cultivation maps that support many agricultural applications. One of the major challenges these approaches face is the limited availability of ground truth labels. In the absence of ground truth, existing work usually adopts a "direct transfer strategy": a classifier is trained on historical labels collected from other regions and then applied to the target region. Unfortunately, the spectral features of crops exhibit inter-region and inter-annual variability due to changes in soil composition, climate conditions, and crop progress, so the resulting models perform poorly on new and unseen regions or years. Despite recent efforts to tackle these cross-domain challenges, such as applying the domain-adversarial neural network (DANN) structure in the deep adaptation crop classification network (DACCN), their effectiveness diminishes significantly when the source and target regions are highly dissimilar. This paper introduces the Crop Mapping Spectral-temporal Generative Adversarial Neural Network (CropSTGAN), a novel solution to the cross-domain challenge that does not require target-domain labels. CropSTGAN learns to transform the target domain's spectral features into those of the source domain, effectively bridging large dissimilarities. Additionally, it employs an identity loss to preserve the intrinsic local structure of the data. Comprehensive experiments across various regions and years demonstrate the benefits and effectiveness of the proposed approach. In these experiments, CropSTGAN is benchmarked against various state-of-the-art (SOTA) methods and significantly outperforms them in scenarios with large data-distribution dissimilarities between the target and source domains.
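The identity loss mentioned in the abstract follows the CycleGAN convention: a generator trained to map target-domain spectra into the source domain should leave samples that are already in the source distribution approximately unchanged, which discourages it from distorting the intrinsic local structure of the data. A minimal NumPy sketch of that term (toy generators and shapes are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

def identity_loss(generator, source_batch):
    """L1 identity loss: a target->source generator, when fed
    source-domain samples, should return them nearly unchanged."""
    translated = generator(source_batch)
    return float(np.mean(np.abs(translated - source_batch)))

# Toy spectral-temporal profiles: (batch, timesteps, bands),
# e.g. 23 compositing periods of 6 spectral bands per pixel.
rng = np.random.default_rng(0)
source = rng.random((4, 23, 6))

identity_gen = lambda x: x      # perfect identity mapping -> loss 0
shifted_gen = lambda x: x + 0.1  # generator that biases the spectra

assert identity_loss(identity_gen, source) == 0.0
assert identity_loss(shifted_gen, source) > 0.0
```

In the full model this term would be added, with a weighting coefficient, to the adversarial and cycle-consistency losses during training.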

