
A Mutual Bootstrapping Model for Automated Skin Lesion Segmentation and Classification (1903.03313v4)

Published 8 Mar 2019 in cs.CV

Abstract: Automated skin lesion segmentation and classification are two most essential and related tasks in the computer-aided diagnosis of skin cancer. Despite their prevalence, deep learning models are usually designed for only one task, ignoring the potential benefits in jointly performing both tasks. In this paper, we propose the mutual bootstrapping deep convolutional neural networks (MB-DCNN) model for simultaneous skin lesion segmentation and classification. This model consists of a coarse segmentation network (coarse-SN), a mask-guided classification network (mask-CN), and an enhanced segmentation network (enhanced-SN). On one hand, the coarse-SN generates coarse lesion masks that provide a prior bootstrapping for mask-CN to help it locate and classify skin lesions accurately. On the other hand, the lesion localization maps produced by mask-CN are then fed into enhanced-SN, aiming to transfer the localization information learned by mask-CN to enhanced-SN for accurate lesion segmentation. In this way, both segmentation and classification networks mutually transfer knowledge between each other and facilitate each other in a bootstrapping way. Meanwhile, we also design a novel rank loss and jointly use it with the Dice loss in segmentation networks to address the issues caused by class imbalance and hard-easy pixel imbalance. We evaluate the proposed MB-DCNN model on the ISIC-2017 and PH2 datasets, and achieve a Jaccard index of 80.4% and 89.4% in skin lesion segmentation and an average AUC of 93.8% and 97.7% in skin lesion classification, which are superior to the performance of representative state-of-the-art skin lesion segmentation and classification methods. Our results suggest that it is possible to boost the performance of skin lesion segmentation and classification simultaneously via training a unified model to perform both tasks in a mutual bootstrapping way.

Authors (4)
  1. Yutong Xie (68 papers)
  2. Jianpeng Zhang (35 papers)
  3. Yong Xia (141 papers)
  4. Chunhua Shen (404 papers)
Citations (249)

Summary

Automated Skin Lesion Segmentation and Classification via Mutual Bootstrapping

Automating skin lesion segmentation and classification holds significant promise for dermatological diagnostics, mitigating issues such as operator bias and the inefficiency of manual diagnosis. The paper "A Mutual Bootstrapping Model for Automated Skin Lesion Segmentation and Classification" presents an approach that exploits the symbiotic relationship between segmentation and classification: a mutual bootstrapping deep convolutional neural networks (MB-DCNN) model that capitalizes on the interrelation between the two tasks to improve performance on both.

Proposed Methodology

The core of the MB-DCNN model comprises three intertwined components: the coarse segmentation network (coarse-SN), the mask-guided classification network (mask-CN), and the enhanced segmentation network (enhanced-SN). The workflow begins with coarse-SN producing coarse lesion masks that inform mask-CN, equipping it with the lesion localization cues needed for accurate classification. In turn, the lesion localization maps produced by mask-CN are fed into enhanced-SN, guiding it toward more accurate segmentation. Unlike conventional models that treat segmentation and classification in isolation, this integration lets each component's learning reinforce the other's.
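The three-stage data flow described above can be sketched as a simple function composition. The network arguments below are illustrative stand-ins for the actual trained architectures, and the function and variable names are assumptions made for clarity, not identifiers from the paper:

```python
def mb_dcnn_forward(image, coarse_sn, mask_cn, enhanced_sn):
    """One forward pass through the mutual-bootstrapping pipeline (sketch).

    coarse_sn:   image -> coarse lesion mask
    mask_cn:     (image, coarse mask) -> (class label, localization map)
    enhanced_sn: (image, localization map) -> refined lesion mask
    """
    # Step 1: coarse-SN produces a coarse lesion mask as a spatial prior.
    coarse_mask = coarse_sn(image)
    # Step 2: mask-CN uses that prior to locate and classify the lesion.
    label, loc_map = mask_cn(image, coarse_mask)
    # Step 3: enhanced-SN uses the learned localization map for fine segmentation.
    fine_mask = enhanced_sn(image, loc_map)
    return fine_mask, label
```

Each arrow in the pipeline transfers knowledge learned by one network to the next, which is the bootstrapping the paper's title refers to.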

To mitigate class imbalance and the difficulty of hard-to-segment pixels, the authors propose a hybrid loss that combines the Dice loss with a novel rank loss. By focusing training on both class-imbalanced and hard pixels, this combination helps the network handle variability in lesion boundaries more effectively.
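As a rough illustration of such a hybrid objective, the sketch below pairs a soft Dice loss with a hinge-style rank loss over the hardest pixels. The pixel-selection scheme, margin, and weighting here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - Dice overlap between prediction and ground truth."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def rank_loss(pred, target, k=5, margin=0.3):
    """Hinge-style rank loss on the hardest pixels (illustrative).

    Selects the k lowest-scoring lesion pixels and the k highest-scoring
    background pixels, then penalizes pairs where the lesion pixel does
    not outrank the background pixel by at least `margin`.
    """
    fg = np.sort(pred[target == 1])[:k]        # hardest foreground (lowest score)
    bg = np.sort(pred[target == 0])[::-1][:k]  # hardest background (highest score)
    if fg.size == 0 or bg.size == 0:
        return 0.0
    hinge = margin - (fg[None, :] - bg[:, None])  # pairwise hinge terms
    return float(np.mean(np.maximum(0.0, hinge)))

def hybrid_loss(pred, target, alpha=0.1):
    """Dice loss plus a weighted rank-loss term (alpha is an assumed weight)."""
    return dice_loss(pred, target) + alpha * rank_loss(pred, target)
```

The Dice term addresses class imbalance by normalizing over region overlap rather than pixel counts, while the rank term concentrates the gradient signal on the hard-easy pixel imbalance.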

Results

Empirical validation on two benchmark datasets, ISIC-2017 and PH2, demonstrates strong performance by the MB-DCNN model. The reported Jaccard indices of 80.4% and 89.4% on ISIC-2017 and PH2, respectively, indicate superior segmentation capability. For skin lesion classification, the model achieves an average area under the curve (AUC) of 93.8% and 97.7% on the same datasets. These figures advance the state of the art and demonstrate the potential of mutual bootstrapping for complex diagnostic tasks.

Implications and Future Developments

The MB-DCNN model offers a coherent framework for integrating segmentation with classification, pushing forward the boundaries in computer-aided dermatological diagnosis. Practically, such a model can fortify diagnostic systems against the prevalent obstacles of manual analyses, like subjective biases and operational overheads. Theoretical advancement is also substantial, as it reinforces the pertinence of shared learning across closely aligned visual tasks.

Looking forward, potential advancements may involve extending this mutual integration to diagnostic tasks beyond segmentation and classification. Refining the end-to-end training paradigm for greater robustness, efficiency, and accuracy remains a viable goal, and applying the architecture to other medical domains could open new avenues of research. The interplay between distinct but related tasks reflects an emerging theme in machine learning: synthesizing task-specific learning can unlock more comprehensive automated systems and accelerate the translation of AI capabilities into real-world benefits.

This paper sets a baseline for future explorations, encouraging the continued integration of salient AI techniques for precise, reliable computer-aided diagnostic solutions in the healthcare sector.