
MSMatch: Semi-Supervised Multispectral Scene Classification with Few Labels

Published 18 Mar 2021 in cs.LG and cs.CV (arXiv:2103.10368v2)

Abstract: Supervised learning techniques are at the center of many tasks in remote sensing. Unfortunately, these methods, especially recent deep learning methods, often require large amounts of labeled data for training. Even though satellites acquire large amounts of data, labeling them is often tedious and expensive and requires expert knowledge. Hence, improved methods that require fewer labeled samples are needed. We present MSMatch, the first semi-supervised learning approach competitive with supervised methods on scene classification on the EuroSAT and UC Merced Land Use benchmark datasets. We test both RGB and multispectral images of EuroSAT and perform various ablation studies to identify the critical parts of the model. The trained neural network achieves state-of-the-art results on EuroSAT, with an accuracy up to 19.76% better than previous methods depending on the number of labeled training examples. With just five labeled examples per class, we reach 94.53% and 95.86% accuracy on the EuroSAT RGB and multispectral datasets, respectively. On the UC Merced Land Use dataset, we outperform previous works by up to 5.59% and reach 90.71% with five labeled examples. Our results show that MSMatch greatly reduces the need for labeled data. It translates well to multispectral data and should enable various applications that are currently infeasible due to a lack of labeled data. We provide the source code of MSMatch online to enable easy reproduction and quick adoption.
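
MSMatch belongs to the FixMatch family of semi-supervised methods: confident pseudo-labels are taken from weakly augmented unlabeled images and enforced on strongly augmented views of the same images, alongside ordinary cross-entropy on the few labeled samples. The sketch below illustrates one such training step in PyTorch. It is an assumption-laden illustration, not the authors' implementation (which they release as source code): the function name `semi_supervised_step`, the `weak_aug`/`strong_aug` placeholders, the 0.95 threshold, and the toy model in the smoke test are all hypothetical.

```python
# Illustrative FixMatch-style semi-supervised training step (a sketch, not
# the MSMatch reference code). Assumptions: placeholder augmentations, a
# 0.95 confidence threshold, and an unweighted sum of the two losses.
import torch
import torch.nn.functional as F


def semi_supervised_step(model, labeled_batch, unlabeled_batch,
                         weak_aug, strong_aug,
                         threshold=0.95, lambda_u=1.0):
    x_l, y_l = labeled_batch   # few labeled images per class
    x_u = unlabeled_batch      # many unlabeled images

    # Supervised cross-entropy on the labeled samples.
    logits_l = model(weak_aug(x_l))
    loss_l = F.cross_entropy(logits_l, y_l)

    # Pseudo-labels from weakly augmented unlabeled samples; keep only
    # predictions above the confidence threshold.
    with torch.no_grad():
        probs_u = torch.softmax(model(weak_aug(x_u)), dim=-1)
        max_probs, pseudo_labels = probs_u.max(dim=-1)
        mask = (max_probs >= threshold).float()

    # Consistency loss: strongly augmented views must match the pseudo-labels.
    logits_u = model(strong_aug(x_u))
    loss_u = (F.cross_entropy(logits_u, pseudo_labels,
                              reduction="none") * mask).mean()

    return loss_l + lambda_u * loss_u


if __name__ == "__main__":
    # Smoke test with random data shaped like EuroSAT multispectral
    # (13 bands, 10 classes); the model and augmentations are stand-ins.
    model = torch.nn.Sequential(torch.nn.Flatten(),
                                torch.nn.Linear(13 * 32 * 32, 10))
    weak = lambda x: x
    strong = lambda x: x + 0.1 * torch.randn_like(x)
    x_l, y_l = torch.randn(8, 13, 32, 32), torch.randint(0, 10, (8,))
    x_u = torch.randn(32, 13, 32, 32)
    loss = semi_supervised_step(model, (x_l, y_l), x_u, weak, strong)
    loss.backward()
    print(float(loss))
```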

Citations (30)

Authors (2)
