An Efficient Polyp Segmentation Network (2203.04118v2)

Published 8 Mar 2022 in eess.IV and cs.CV

Abstract: Cancer is a disease that occurs as a result of the uncontrolled division and proliferation of cells. Colon cancer is one of the most common types of cancer in the world. Polyps in the large intestine can cause cancer if they are not removed through early intervention. Deep learning and image segmentation techniques are used to minimize the number of polyps that go unnoticed by experts during these interventions. Although these techniques perform well in terms of accuracy, they require too many parameters. We propose a new model to address this problem. Our proposed model requires fewer parameters and outperforms state-of-the-art models. We use EfficientNetB0 as the encoder, since it performs well on various tasks while requiring few parameters. We use a partial decoder, which reduces the number of parameters while achieving high segmentation accuracy. Since polyps vary in appearance and size, we use an asymmetric convolution block instead of a classic convolution block. We then weight each feature map with a squeeze-and-excitation block to improve the segmentation results. We use different splits of the Kvasir and CVC-ClinicDB datasets for training, validation, and testing, and the CVC-ColonDB, ETIS, and EndoScene datasets for testing. Our model outperforms state-of-the-art models with Dice scores of 71.8% on the CVC-ColonDB test set, 89.3% on the EndoScene test set, and 74.8% on the ETIS test set, while requiring fewer parameters. Our model requires 2,626,337 parameters in total, while the closest state-of-the-art model, U-Net++, requires 9,042,177 parameters.
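
The abstract describes two of the model's building blocks: an asymmetric convolution block and channel-wise reweighting via squeeze-and-excitation. Below is a minimal PyTorch sketch of both, assuming a common formulation (parallel 3x3 / 1x3 / 3x1 branches for the asymmetric block, and a standard SE bottleneck with a reduction ratio of 16); channel sizes, the reduction ratio, and the exact block design in the paper may differ, so this is an illustration rather than the authors' implementation.

```python
import torch
import torch.nn as nn


class AsymmetricConvBlock(nn.Module):
    """Parallel 3x3, 1x3, and 3x1 convolutions whose outputs are summed
    (a common asymmetric-convolution formulation; the paper's exact block
    may differ)."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.square = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.horizontal = nn.Conv2d(in_ch, out_ch, kernel_size=(1, 3), padding=(0, 1), bias=False)
        self.vertical = nn.Conv2d(in_ch, out_ch, kernel_size=(3, 1), padding=(1, 0), bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.square(x) + self.horizontal(x) + self.vertical(x)))


class SqueezeExcitation(nn.Module):
    """Standard squeeze-and-excitation: global average pool, bottleneck MLP,
    and a sigmoid gate that reweights each feature map."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights


# Example: apply both blocks to a hypothetical decoder feature map.
feat = torch.randn(1, 64, 88, 88)
feat = AsymmetricConvBlock(64, 64)(feat)
feat = SqueezeExcitation(64)(feat)
print(feat.shape)  # torch.Size([1, 64, 88, 88])
```

In this sketch the asymmetric block replaces a plain 3x3 convolution to better handle polyps of varying shape and size, and the SE block then rescales each channel before the feature map is passed further through the decoder.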

Citations (2)
