
Mixed-domain Training Improves Multi-Mission Terrain Segmentation (2209.13674v1)

Published 27 Sep 2022 in cs.CV

Abstract: Planetary rover missions must utilize machine learning-based perception to continue extra-terrestrial exploration with little to no human presence. Martian terrain segmentation is critical for rover navigation and hazard avoidance in support of further exploratory tasks, e.g., soil sample collection and searching for organic compounds. Current Martian terrain segmentation models require a large amount of labeled data to achieve acceptable performance, and also require retraining for deployment across different domains (i.e., different rover missions) or different tasks (i.e., geological identification vs. navigation). This research proposes a semi-supervised learning approach that leverages unsupervised contrastive pretraining of a backbone for multi-mission semantic segmentation of Martian surfaces. The model expands upon current Martian segmentation capabilities by deploying across different Martian rover missions for terrain navigation, using a mixed-domain training set that ensures feature diversity. Evaluation using average pixel accuracy shows that the semi-supervised mixed-domain approach improves on both single-domain and supervised training, reaching 97% accuracy for the Mars Science Laboratory's Curiosity rover and 79.6% for the Mars 2020 Perseverance rover. Further, applying different weighting methods to the loss function improved the model's recall on minority (rare) classes by over 30% compared to standard cross-entropy loss. These results can inform future multi-mission and multi-task semantic segmentation for rover missions in a data-efficient manner.
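The abstract's reported recall gain on rare terrain classes comes from re-weighting the segmentation loss. The paper does not specify its weighting scheme here, but a common choice is inverse-frequency class weights applied to cross-entropy; the sketch below (NumPy, illustrative only, not the authors' code) shows how rare classes receive larger weights and thus contribute more to the loss:

```python
import numpy as np

def inverse_frequency_weights(labels, num_classes):
    """Weight each class by the inverse of its pixel frequency,
    so rare terrain classes contribute more to the loss.
    Weights are normalized to have mean 1."""
    counts = np.bincount(labels.ravel(), minlength=num_classes).astype(float)
    freq = counts / counts.sum()
    weights = 1.0 / np.maximum(freq, 1e-8)  # guard against absent classes
    return weights / weights.sum() * num_classes

def weighted_cross_entropy(probs, labels, weights):
    """Mean per-pixel cross-entropy, with each pixel's term
    scaled by the weight of its true class.
    probs: (N, num_classes) softmax outputs; labels: (N,) class ids."""
    eps = 1e-12
    picked = probs[np.arange(labels.size), labels.ravel()]
    w = weights[labels.ravel()]
    return float(np.mean(-w * np.log(picked + eps)))
```

With a 5:1 class imbalance, the rare class ends up weighted five times more heavily than the common one, so misclassifying rare pixels is penalized accordingly; this is the mechanism behind the recall improvement the abstract reports.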

Citations (2)
