
Lightweight, Pre-trained Transformers for Remote Sensing Timeseries

Published 27 Apr 2023 in cs.CV and cs.AI | arXiv:2304.14065v4

Abstract: Machine learning methods for satellite data have a range of societally relevant applications, but labels used to train models can be difficult or impossible to acquire. Self-supervision is a natural solution in settings with limited labeled data, but current self-supervised models for satellite data fail to take advantage of the characteristics of that data, including the temporal dimension (which is critical for many applications, such as monitoring crop growth) and availability of data from many complementary sensors (which can significantly improve a model's predictive performance). We present Presto (the Pretrained Remote Sensing Transformer), a model pre-trained on remote sensing pixel-timeseries data. By designing Presto specifically for remote sensing data, we can create a significantly smaller but performant model. Presto excels at a wide variety of globally distributed remote sensing tasks and performs competitively with much larger models while requiring far less compute. Presto can be used for transfer learning or as a feature extractor for simple models, enabling efficient deployment at scale.


Summary

Overview of "Lightweight, Pre-trained Transformers for Remote Sensing Timeseries"

The paper "Lightweight, Pre-trained Transformers for Remote Sensing Timeseries" presents the development and evaluation of Presto (the Pretrained Remote Sensing Transformer), a model designed to efficiently handle remote sensing data, which is characteristically temporal and multimodal. Presto is a transformer-based model pre-trained with a self-supervised masked autoencoding objective and is well suited to remote sensing tasks involving pixel-timeseries data. It addresses key challenges such as the scarcity of labeled satellite data and the computational demands typically associated with model deployment, aiming to provide a performant yet energy-efficient solution.

Motivation and Context

Machine learning applications in remote sensing have the potential to deliver significant societal benefits, including disaster management, environmental monitoring, and resource tracking aligned with sustainable development goals. Despite this potential, the effectiveness of such models is often hampered by the limited availability of labeled data, particularly in under-resourced areas. Furthermore, current self-supervised learning models for remote sensing tend to be computationally expensive and to overlook the temporal dimension, which is crucial for applications such as crop monitoring and land cover classification.

Presto's Methodological Framework

Presto is specifically tailored to harness the temporal patterns inherent in remote sensing data and to integrate information from many complementary sensors. Notably, it employs a lightweight transformer architecture that vastly reduces the number of trainable parameters and the floating-point operations required for inference compared with larger models such as Vision Transformers (ViTs) or ResNets. This reduction in computational requirements makes Presto well suited to deployment in resource-constrained environments.
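To make the "lightweight" design concrete, the sketch below builds a small transformer encoder over per-pixel timesteps and counts its parameters. The dimensions, band count, and pooling are illustrative assumptions for this example, not the paper's actual Presto configuration:

```python
import torch
import torch.nn as nn

# Illustrative sketch (not the paper's exact architecture): each timestep's
# multi-sensor bands are projected into a low-dimensional embedding, and a
# shallow transformer encoder attends over the timesteps of a single pixel.
class TinyTimeseriesEncoder(nn.Module):
    def __init__(self, n_bands=17, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)  # per-timestep band embedding
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, x):            # x: (batch, timesteps, bands)
        h = self.encoder(self.embed(x))
        return h.mean(dim=1)         # pooled per-pixel embedding

model = TinyTimeseriesEncoder()
n_params = sum(p.numel() for p in model.parameters())
print(n_params)                      # well under a million parameters
out = model(torch.randn(8, 12, 17))  # 8 pixels, 12 monthly timesteps
print(out.shape)                     # torch.Size([8, 64])
```

Because the model operates on single-pixel timeseries rather than large image patches, even these small embedding sizes yield a usable representation, which is what keeps inference cheap.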

The model uses a masked autoencoding strategy during pre-training to promote robust feature extraction across diverse remote sensing modalities: masked segments of the input are reconstructed, fostering resilience to the missing data that is common in remote sensing due to factors such as cloud cover.
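The objective can be sketched as follows. The uniform entry-level masking and zero-filling here are simplifying assumptions for illustration; the paper's actual masking strategies over timesteps and channels differ in detail:

```python
import torch

# Minimal sketch of a masked-autoencoding objective on pixel-timeseries
# (assumed details): hide a random subset of timestep/band entries and
# score the model only on reconstructing the entries it could not see.
def masked_reconstruction_loss(model, x, mask_ratio=0.5):
    # x: (batch, timesteps, bands)
    mask = torch.rand_like(x) < mask_ratio   # True = hidden entry
    x_masked = x.masked_fill(mask, 0.0)      # zero out hidden values
    x_hat = model(x_masked)                  # reconstruct the full input
    return ((x_hat - x) ** 2)[mask].mean()   # loss on hidden entries only

# A stand-in "model" for demonstration: identity reconstruction, which
# cannot recover the masked values and therefore incurs a positive loss.
identity = torch.nn.Identity()
x = torch.randn(4, 12, 17)
loss = masked_reconstruction_loss(identity, x)
print(float(loss))
```

Training an encoder-decoder to drive this loss down forces the encoder to infer hidden observations from the visible ones, which is exactly the skill needed when clouds or sensor gaps remove parts of a real timeseries.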

Empirical Evaluation

Presto demonstrates strong performance across a variety of globally distributed tasks, such as land cover classification and crop type mapping, often matching or outperforming much larger models while using significantly fewer resources. It accommodates different downstream tasks either as a transferable feature extractor for simple models or through task-specific fine-tuning. Presto also remains robust with incomplete inputs, which is crucial given that many remote sensing datasets contain only partial observations.
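The feature-extractor workflow can be sketched with NumPy alone. Here a fixed random linear projection stands in for the frozen pre-trained encoder, and a nearest-centroid rule stands in for the simple downstream model; both are hypothetical placeholders, as real usage would pool the transformer's outputs and fit, e.g., a regression or random forest on them:

```python
import numpy as np

# Illustrative only: a frozen "encoder" (here a fixed random projection,
# NOT the actual Presto model) turns flattened pixel-timeseries into
# embeddings, and a trivial classifier is fit on those embeddings.
rng = np.random.default_rng(0)

def frozen_encoder(x):
    # x: (n_pixels, timesteps * bands) flattened pixel-timeseries
    W = np.random.default_rng(42).normal(size=(x.shape[1], 32))
    return x @ W                     # (n_pixels, 32) fixed embeddings

# Toy labeled pixels: two classes with shifted band means.
X = np.concatenate([rng.normal(0.0, 1.0, (100, 12 * 17)),
                    rng.normal(0.5, 1.0, (100, 12 * 17))])
y = np.array([0] * 100 + [1] * 100)

Z = frozen_encoder(X)                # extract features once, up front
centroids = np.stack([Z[y == c].mean(axis=0) for c in (0, 1)])
preds = np.argmin(                   # nearest-centroid classifier
    ((Z[:, None, :] - centroids[None]) ** 2).sum(axis=2), axis=1)
acc = (preds == y).mean()
print(acc)                           # well above chance on this toy data
```

The key efficiency point is that the expensive step (encoding) runs once per pixel, after which arbitrarily many cheap downstream models can be trained and served on the cached embeddings.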

Implications and Future Directions

The development of Presto highlights the feasibility of applying self-supervised transformers in remote sensing, enabling efficient and effective analysis of pixel-timeseries data in a way that prioritizes computational efficiency. The implications of this work suggest a broader adoption of lightweight models for geospatial tasks, potentially expanding accessibility to machine learning technology for numerous practical applications in environmental monitoring.

For future research, enhanced spatio-temporal aggregation techniques are a promising avenue for further improving Presto's efficacy on scene classification tasks. Scaling Presto to higher spatial resolutions and incorporating additional data sources could also extend its utility. Continued innovation in this direction could further lower the barriers to deploying machine learning solutions for critical societal challenges in remote sensing.
