
ROAM: Random Layer Mixup for Semi-Supervised Learning in Medical Imaging (2003.09439v4)

Published 20 Mar 2020 in cs.CV

Abstract: Medical image segmentation is one of the major challenges addressed by machine learning methods. Yet, deep learning methods depend profoundly on large amounts of annotated data, which are time-consuming and costly to obtain. Semi-supervised learning methods address this problem by leveraging an abundant amount of unlabeled data along with a small amount of labeled data during training. Recently, the MixUp regularizer has been successfully introduced into semi-supervised learning methods, showing superior performance. MixUp augments the model with new data points through linear interpolation of the data in the input space. We argue that this option is limited; instead, we propose ROAM, a RandOm lAyer Mixup, which encourages the network to be less confident for data points interpolated at a randomly selected layer. ROAM generates data points that have never been seen before, which helps avoid over-fitting and enhances generalization. We conduct extensive experiments to validate our method on three publicly available whole-brain image segmentation datasets. ROAM achieves state-of-the-art (SOTA) results in the fully supervised (89.5%) and semi-supervised (87.0%) settings, with relative improvements of up to 2.40% and 16.50%, respectively, for whole-brain segmentation.
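
The core mechanism described in the abstract, interpolating activations at a randomly chosen layer and mixing the corresponding targets with the same coefficient, can be illustrated with a minimal PyTorch sketch. The network, layer selection, and hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of random-layer mixup (ROAM-style); illustrative only.
# A mixing coefficient lam ~ Beta(alpha, alpha) is drawn, a layer k is chosen
# at random, and the hidden activations of two samples are interpolated at
# layer k; the loss mixes the two targets with the same coefficient.
import numpy as np
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self, in_ch=1, num_classes=4):
        super().__init__()
        # Stack of blocks; mixup may be applied at the input (k=0) or after any block.
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(32, 32, 3, padding=1), nn.ReLU()),
        ])
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x, x2=None, lam=1.0, mix_layer=0):
        # If a second batch x2 is given, interpolate activations before block `mix_layer`.
        h, h2 = x, x2
        for i, block in enumerate(self.blocks):
            if h2 is not None and i == mix_layer:
                h = lam * h + (1.0 - lam) * h2
                h2 = None            # continue with the mixed activations only
            h = block(h)
            if h2 is not None:
                h2 = block(h2)
        if h2 is not None:           # mixing requested after the last block
            h = lam * h + (1.0 - lam) * h2
        return self.head(h)

def roam_step(model, x, y, alpha=0.4):
    """One supervised training step with random-layer mixup (illustrative)."""
    lam = float(np.random.beta(alpha, alpha))
    k = np.random.randint(0, len(model.blocks) + 1)   # randomly selected layer
    perm = torch.randperm(x.size(0))                  # pair each sample with another
    logits = model(x, x[perm], lam=lam, mix_layer=k)
    ce = nn.CrossEntropyLoss()
    # Targets are mixed with the same coefficient as the activations.
    return lam * ce(logits, y) + (1.0 - lam) * ce(logits, y[perm])

if __name__ == "__main__":
    model = TinySegNet()
    x = torch.randn(4, 1, 32, 32)            # toy image batch
    y = torch.randint(0, 4, (4, 32, 32))      # toy per-pixel labels
    loss = roam_step(model, x, y)
    loss.backward()
    print(loss.item())
```

When k = 0 this reduces to standard input-space MixUp; drawing k at random over the layers is what distinguishes the random-layer variant sketched here.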

Citations (10)
