Semi-supervised learning for joint SAR and multispectral land cover classification (2108.09075v2)

Published 20 Aug 2021 in eess.IV and cs.CV

Abstract: Semi-supervised learning techniques are gaining popularity due to their capability of building models that are effective even when scarce amounts of labeled data are available. In this paper, we present a framework and specific tasks for self-supervised pretraining of multichannel models, such as the fusion of multispectral and synthetic aperture radar images. We show that the proposed self-supervised approach is highly effective at learning features that correlate with the labels for land cover classification. This is enabled by an explicit design of pretraining tasks which promotes bridging the gaps between sensing modalities and exploiting the spectral characteristics of the input. In a semi-supervised setting, when limited labels are available, using the proposed self-supervised pretraining, followed by supervised finetuning for land cover classification with SAR and multispectral data, outperforms conventional approaches such as purely supervised learning, initialization from training on ImageNet and other recent self-supervised approaches.
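The pretrain-then-finetune recipe the abstract describes can be illustrated with a toy sketch. This is not the paper's method: the linear "encoder" below stands in for the deep multichannel network, the cross-modal pretext task (predicting multispectral bands from SAR channels) is only loosely inspired by the paper's goal of bridging sensing modalities, and all array sizes, learning rates, and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N pixels with 2 SAR channels and 4 multispectral bands
# (hypothetical sizes). Labels are derived from band 0 to keep the
# example self-contained; no real land-cover data is used.
N = 500
sar = rng.normal(size=(N, 2))
W_true = rng.normal(size=(2, 4))
ms = sar @ W_true + 0.1 * rng.normal(size=(N, 4))
labels = (ms[:, 0] > 0).astype(int)

# --- Stage 1: self-supervised pretraining (no labels) ---
# Pretext task: regress multispectral bands from SAR via gradient
# descent on a mean-squared-error loss.
W = np.zeros((2, 4))
lr = 0.1
for _ in range(200):
    pred = sar @ W
    W -= lr * (sar.T @ (pred - ms)) / N

# Pretrained features: SAR projected through the learned encoder.
feats = sar @ W

# --- Stage 2: supervised finetuning with scarce labels ---
# Only 20 of the 500 pixels are labeled; train a logistic head.
idx = rng.choice(N, size=20, replace=False)
w = np.zeros(4)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats[idx] @ w + b)))
    g = p - labels[idx]
    w -= 0.5 * (feats[idx].T @ g) / len(idx)
    b -= 0.5 * g.mean()

# Evaluate on all pixels.
probs = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
acc = ((probs > 0.5).astype(int) == labels).mean()
```

Because the pretext task forces the encoder to recover the multispectral structure from SAR alone, the downstream classifier needs only a handful of labels, which is the core semi-supervised argument the abstract makes.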

Authors (4)
  1. Antonio Montanaro (6 papers)
  2. Diego Valsesia (36 papers)
  3. Giulia Fracastoro (24 papers)
  4. Enrico Magli (58 papers)
Citations (9)
