Deformable-Detection Transformer for Microbubble Localization in Ultrasound Localization Microscopy (2308.09845v1)

Published 18 Aug 2023 in eess.IV

Abstract: To overcome the half-wavelength resolution limit of ultrasound imaging, microbubbles (MBs) have been widely utilized in the field. Conventional MB localization methods are limited either by exhaustive parameter tuning or by assuming a fixed Point Spread Function (PSF) for MBs, which calls their adaptability to different imaging settings or depths into question. As a result, developing methods that do not rely on manually adjusted parameters is crucial. Previously, we used a transformer-based approach, the DEtection TRansformer (DETR) (arXiv:2005.12872v3 and arXiv:2209.11859v1), to address the above issues. However, DETR suffers from long training times and lower precision for smaller objects. In this paper, we propose the application of DEformable DETR (DE-DETR) (arXiv:2010.04159) for MB localization to mitigate DETR's aforementioned challenges. As opposed to DETR, where attention is cast over all grid pixels, DE-DETR uses multi-scale deformable attention to distribute attention within a limited budget. To evaluate the proposed strategy, a pre-trained DE-DETR was fine-tuned on a subset of the dataset provided by the IEEE IUS Ultra-SR challenge organizers using transfer-learning principles, and the network was subsequently tested on the rest of the dataset, excluding highly correlated frames. The results show an improvement in both precision and recall, as well as in the final super-resolution maps, compared to DETR.
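To illustrate the key difference from standard DETR attention, the sketch below shows a simplified, single-scale version of deformable attention: each query samples only a small number of learned locations around its reference point instead of attending to every pixel of the feature map. This is an illustrative approximation, not the paper's implementation; the actual Deformable DETR module is multi-scale and multi-head, and the class and parameter names here (DeformableAttentionSketch, n_points, ref_points) are assumptions chosen for clarity.

```python
# Minimal single-scale sketch of deformable attention (Deformable DETR, arXiv:2010.04159).
# Illustrative only: the real module is multi-scale, multi-head, and uses a custom kernel.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DeformableAttentionSketch(nn.Module):
    def __init__(self, d_model=256, n_points=4):
        super().__init__()
        self.n_points = n_points
        # Each query predicts n_points sampling offsets and attention weights
        self.offset_proj = nn.Linear(d_model, n_points * 2)
        self.weight_proj = nn.Linear(d_model, n_points)
        self.value_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, queries, ref_points, feat_map):
        """
        queries:    (B, Nq, C)   object queries
        ref_points: (B, Nq, 2)   normalized (x, y) reference points in [0, 1]
        feat_map:   (B, C, H, W) encoder feature map
        """
        B, Nq, _ = queries.shape
        H, W = feat_map.shape[-2:]

        # Project values (applied per pixel on the channel dimension)
        value = self.value_proj(feat_map.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)

        # Offsets are predicted from the query and added to its reference point
        offsets = self.offset_proj(queries).view(B, Nq, self.n_points, 2)
        weights = self.weight_proj(queries).softmax(-1)          # (B, Nq, K)

        # Normalize offsets by the feature-map size and map to [-1, 1] for grid_sample
        norm = torch.tensor([W, H], dtype=queries.dtype, device=queries.device)
        locs = ref_points.unsqueeze(2) + offsets / norm          # (B, Nq, K, 2)
        grid = 2.0 * locs - 1.0

        # Bilinearly sample K values per query instead of attending to all H*W pixels
        sampled = F.grid_sample(value, grid, align_corners=False)  # (B, C, Nq, K)
        out = (sampled * weights.unsqueeze(1)).sum(-1).transpose(1, 2)  # (B, Nq, C)
        return self.out_proj(out)
```

Because each query touches only n_points locations (per scale, in the full model) rather than the whole grid, the attention cost no longer scales with the feature-map area, which is what shortens training and helps with small objects such as microbubble PSFs.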

Citations (5)
