
PET image denoising based on denoising diffusion probabilistic models (2209.06167v2)

Published 13 Sep 2022 in eess.IV, cs.CV, and physics.med-ph

Abstract: Due to various physical degradation factors and limited counts received, PET image quality needs further improvement. Denoising diffusion probabilistic models (DDPM) are distribution learning-based models that transform a normal distribution into a specific data distribution through iterative refinements. In this work, we proposed and evaluated different DDPM-based methods for PET image denoising. Under the DDPM framework, one way to perform PET image denoising is to provide the PET image and/or the prior image as the network input. Another way is to supply the prior image as the input with the PET image included in the refinement steps, which can fit scenarios with different noise levels. 120 18F-FDG datasets and 140 18F-MK-6240 datasets were utilized to evaluate the proposed DDPM-based methods. Quantitative results show that the DDPM-based frameworks with PET information included can generate better results than the nonlocal mean and Unet-based denoising methods. Adding an additional MR prior to the model can help achieve better performance and further reduce the uncertainty during image denoising. Solely relying on the MR prior while ignoring the PET information can result in large bias. Regional and surface quantification shows that employing the MR prior as the network input while embedding the PET image as a data-consistency constraint during inference can achieve the best performance. In summary, DDPM-based PET image denoising is a flexible framework that can efficiently utilize prior information and achieve better performance than the nonlocal mean and Unet-based denoising methods.
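The inference scheme the abstract highlights — reverse diffusion conditioned on a prior image, with the measured PET image embedded as a data-consistency constraint at each refinement step — can be sketched as below. This is a minimal toy illustration, not the paper's method: the `toy_denoiser` is a placeholder for a trained score network, and the noise schedule and blending weight `dc_weight` are assumed values chosen for demonstration.

```python
import numpy as np

def make_schedule(T=50, beta_start=1e-4, beta_end=0.02):
    # Standard linear DDPM variance schedule (assumed, not the paper's).
    betas = np.linspace(beta_start, beta_end, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    return betas, alphas, alpha_bars

def toy_denoiser(x_t, prior, t):
    # Placeholder for a trained network eps_theta(x_t, prior, t):
    # crudely predicts the noise by assuming x_0 is close to the prior.
    return x_t - prior

def ddpm_denoise(noisy_pet, mr_prior, T=50, dc_weight=0.5, seed=0):
    """Reverse diffusion with the prior image as conditioning input and
    the measured PET image blended in as a data-consistency step."""
    rng = np.random.default_rng(seed)
    betas, alphas, alpha_bars = make_schedule(T)
    x = rng.standard_normal(noisy_pet.shape)       # start from pure noise
    for t in reversed(range(T)):
        eps = toy_denoiser(x, mr_prior, t)         # conditioned on prior
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        x = (x - coef * eps) / np.sqrt(alphas[t])  # DDPM posterior mean
        if t > 0:
            x += np.sqrt(betas[t]) * rng.standard_normal(x.shape)
        # Data consistency: anchor the refinement to the measurement.
        x = (1.0 - dc_weight) * x + dc_weight * noisy_pet
    return x

# Usage on a small synthetic phantom.
truth = np.zeros((8, 8)); truth[2:6, 2:6] = 1.0
rng = np.random.default_rng(1)
noisy = truth + 0.3 * rng.standard_normal(truth.shape)
prior = truth + 0.05 * rng.standard_normal(truth.shape)  # sharp "MR" prior
result = ddpm_denoise(noisy, prior)
```

The data-consistency blend is what distinguishes the best-performing variant in the abstract: the prior image steers the refinement while the PET measurement keeps the output from drifting toward the prior alone.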

Authors (5)
  1. Kuang Gong (27 papers)
  2. Keith A. Johnson (3 papers)
  3. Georges El Fakhri (52 papers)
  4. Quanzheng Li (122 papers)
  5. Tinsu Pan (6 papers)
Citations (50)
