Automatic Breast Lesion Classification by Joint Neural Analysis of Mammography and Ultrasound (2009.11009v1)
Abstract: Mammography and ultrasound are extensively used by radiologists as complementary modalities to achieve better performance in breast cancer diagnosis. However, existing computer-aided diagnosis (CAD) systems for the breast are generally based on a single modality. In this work, we propose a deep-learning-based method for classifying breast cancer lesions from their respective mammography and ultrasound images. We present various approaches and show a consistent improvement in performance when utilizing both modalities. The proposed approach is based on a GoogLeNet architecture, fine-tuned for our data in two training steps. First, a distinct neural network is trained separately for each modality, generating high-level features. Then, the aggregated features originating from each modality are used to train a multimodal network to provide the final classification. In quantitative experiments, the proposed approach achieves an AUC of 0.94, outperforming state-of-the-art models trained over a single modality. Moreover, it performs similarly to an average radiologist, surpassing two out of four radiologists participating in a reader study. The promising results suggest that the proposed method may become a valuable decision support tool for breast radiologists.
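The two-step scheme in the abstract (per-modality feature extraction followed by a multimodal fusion classifier) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 1024-dimensional per-branch feature size (GoogLeNet's pooled feature width), the stand-in feature extractor, and the single linear-plus-sigmoid fusion head are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 1024  # assumed per-branch feature size (GoogLeNet pooled features)

def branch_features(image, dim=FEAT_DIM):
    """Stand-in for a fine-tuned CNN branch: returns a high-level feature vector.

    In the actual system this would be the forward pass of a GoogLeNet
    fine-tuned on one modality (mammography or ultrasound).
    """
    return rng.standard_normal(dim)

def fuse_and_classify(mammo_feat, us_feat, weights, bias):
    """Concatenate per-modality features and apply a sigmoid classifier head."""
    fused = np.concatenate([mammo_feat, us_feat])  # shape (2 * FEAT_DIM,)
    logit = fused @ weights + bias
    return 1.0 / (1.0 + np.exp(-logit))            # probability of malignancy

# Toy usage with random "images" and an untrained fusion head.
mammo = branch_features(None)
us = branch_features(None)
w = rng.standard_normal(2 * FEAT_DIM) * 0.01
prob = fuse_and_classify(mammo, us, w, 0.0)
print(float(prob))
```

In the paper, the fusion head is itself a trained multimodal network rather than a single linear layer; the sketch only shows where the aggregated features enter the final classifier.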
- Gavriel Habib
- Nahum Kiryati
- Miri Sklair-Levy
- Anat Shalmon
- Osnat Halshtok Neiman
- Renata Faermann Weidenfeld
- Yael Yagil
- Eli Konen
- Arnaldo Mayer