
Few-Shot Domain Adaptation with Polymorphic Transformers (2107.04805v1)

Published 10 Jul 2021 in cs.CV

Abstract: Deep neural networks (DNNs) trained on one set of medical images often experience a severe performance drop on unseen test images, due to various domain discrepancies between the training images (source domain) and the test images (target domain), which raises a domain adaptation issue. In clinical settings, it is difficult to collect enough annotated target domain data in a short period. Few-shot domain adaptation, i.e., adapting a trained model with a handful of annotations, is highly practical and useful in this case. In this paper, we propose a Polymorphic Transformer (Polyformer), which can be incorporated into any DNN backbone for few-shot domain adaptation. Specifically, after the polyformer layer is inserted into a model trained on the source domain, it extracts a set of prototype embeddings, which can be viewed as a "basis" of the source-domain features. On the target domain, the polyformer layer adapts by only updating a projection layer which controls the interactions between image features and the prototype embeddings. All other model weights (except BatchNorm parameters) are frozen during adaptation. Thus, the chance of overfitting the annotations is greatly reduced, and the model can perform robustly on the target domain after being trained on a few annotated images. We demonstrate the effectiveness of Polyformer on two medical segmentation tasks (i.e., optic disc/cup segmentation, and polyp segmentation). The source code of Polyformer is released at https://github.com/askerlee/segtran.
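To make the mechanism concrete, here is a minimal PyTorch sketch of the idea the abstract describes: a layer holding learnable prototype embeddings, a projection mediating feature-prototype attention, and a helper that freezes everything except that projection and BatchNorm parameters for few-shot adaptation. The names (`PolyformerLayer`, `prepare_for_few_shot_adaptation`), the prototype count, and the single-head attention form are illustrative assumptions, not the released implementation; see the linked segtran repository for the authors' actual code.

```python
import torch
import torch.nn as nn


class PolyformerLayer(nn.Module):
    """Sketch of a polyformer-style adapter layer (simplified, hypothetical).

    Holds K prototype embeddings (a "basis" of source-domain features).
    On the target domain, only `self.proj` is meant to be updated.
    """

    def __init__(self, feat_dim: int, num_prototypes: int = 64):
        super().__init__()
        # Prototypes are learned on the source domain, then frozen.
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, feat_dim))
        # Projection controlling feature-prototype interactions;
        # the only weight updated during few-shot adaptation.
        self.proj = nn.Linear(feat_dim, feat_dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_tokens, feat_dim) flattened image features.
        q = self.proj(x)
        # Attend from projected features to the prototype "basis".
        attn = torch.softmax(
            q @ self.prototypes.t() / x.shape[-1] ** 0.5, dim=-1
        )
        # Re-express features as prototype mixtures, with a residual path.
        return x + attn @ self.prototypes


def prepare_for_few_shot_adaptation(model: nn.Module) -> None:
    """Freeze all weights except polyformer projections and BatchNorm."""
    for p in model.parameters():
        p.requires_grad = False
    for m in model.modules():
        if isinstance(m, PolyformerLayer):
            for p in m.proj.parameters():
                p.requires_grad = True
        elif isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)):
            for p in m.parameters():
                p.requires_grad = True
```

Because only the projection (plus BatchNorm) remains trainable, the number of adapted parameters is small, which is what limits overfitting when only a few annotated target-domain images are available.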

Authors (10)
  1. Shaohua Li (43 papers)
  2. Xiuchao Sui (7 papers)
  3. Jie Fu (229 papers)
  4. Huazhu Fu (185 papers)
  5. Xiangde Luo (31 papers)
  6. Yangqin Feng (4 papers)
  7. Xinxing Xu (33 papers)
  8. Yong Liu (721 papers)
  9. Daniel Ting (19 papers)
  10. Rick Siow Mong Goh (59 papers)
Citations (18)
