Semi-Supervised Deep Learning for Multi-Tissue Segmentation from Multi-Contrast MRI

Published 7 Sep 2020 in eess.IV (arXiv:2009.03128v2)

Abstract: Segmentation of thigh tissues (muscle, fat, inter-muscular adipose tissue (IMAT), bone, and bone marrow) from magnetic resonance imaging (MRI) scans is useful for clinical and research investigations of various conditions such as aging, diabetes mellitus, obesity, metabolic syndrome, and their associated comorbidities. Towards a fully automated, robust, and precise quantification of thigh tissues, we designed a novel semi-supervised segmentation algorithm based on deep network architectures. Built upon the Tiramisu segmentation engine, our proposed deep networks use variational and specially designed targeted dropouts for faster and more robust convergence, and take multi-contrast MRI scans as input data. In our experiments, we used 150 scans from 50 distinct subjects in the Baltimore Longitudinal Study of Aging (BLSA). The proposed system made use of both labeled and unlabeled data with high efficacy for training, and outperformed the current state-of-the-art methods with Dice scores of 97.52%, 94.61%, 80.14%, 95.93%, and 96.83% for muscle, fat, IMAT, bone, and bone marrow, respectively. Our results indicate that the proposed system can be useful for clinical research studies where volumetric and distributional tissue quantification is pivotal and labeling is a significant burden. To the best of our knowledge, this is the first attempt at multi-tissue segmentation using a single end-to-end semi-supervised deep learning framework for multi-contrast thigh MRI scans.
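The abstract names the Tiramisu (FC-DenseNet) architecture with dropout variants and multi-contrast input, but the paper's code is not reproduced here. Below is a minimal, illustrative PyTorch sketch of one Tiramisu-style dense layer plus the standard Dice overlap used for evaluation. The framework choice (PyTorch), growth rate, dropout rate, and channel counts are all assumptions, and the plain `nn.Dropout2d` stands in where the authors use variational and targeted dropout; this is not the authors' implementation.

```python
# Illustrative sketch only: hyperparameters and the use of plain Dropout2d
# (in place of the paper's variational/targeted dropout) are assumptions.
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One FC-DenseNet (Tiramisu) layer: BN -> ReLU -> 3x3 Conv -> Dropout."""
    def __init__(self, in_channels: int, growth_rate: int = 16, p_drop: float = 0.2):
        super().__init__()
        self.block = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, kernel_size=3, padding=1, bias=False),
            nn.Dropout2d(p_drop),  # the paper swaps this for variational/targeted dropout
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Dense connectivity: concatenate new feature maps onto the layer input.
        return torch.cat([x, self.block(x)], dim=1)

def dice(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Dice overlap 2|A∩B| / (|A| + |B|) between binary masks, the reported metric."""
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

# Multi-contrast MRI enters as a multi-channel image: co-registered contrasts
# stacked along the channel axis (three channels here is an assumption).
x = torch.randn(1, 3, 128, 128)          # (batch, contrasts, H, W)
layer = DenseLayer(in_channels=3)
print(layer(x).shape)                    # torch.Size([1, 19, 128, 128])
```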
