- The paper introduces a novel image restoration approach by integrating a pre-trained flow matching model within a PnP forward-backward splitting framework.
- It details a method using a time-dependent denoiser and reprojection onto flow trajectories to balance data fidelity with denoising performance.
- Extensive experiments on datasets like CelebA and AFHQ-Cat show that PnP-Flow outperforms existing methods in PSNR and SSIM metrics.
PnP-Flow: Plug-and-Play Image Restoration with Flow Matching
This paper introduces a novel Plug-and-Play (PnP) algorithm called PnP-Flow Matching for image restoration tasks. The method leverages the strengths of pre-trained Flow Matching (FM) models within an optimization framework, offering a computationally efficient and memory-friendly alternative to existing approaches.
Methodology
The core idea is to define a time-dependent denoiser using a pre-trained FM model, which is then integrated into a Forward-Backward Splitting (FBS) PnP framework. The algorithm alternates between three key steps: a gradient descent step on the data-fidelity term, a reprojection step onto the learned FM path, and a denoising step using the time-dependent denoiser. The time-dependent denoiser is defined as $D_t = \mathrm{Id} + (1-t)\,v_t^\theta$, where $v_t^\theta$ is the learned velocity field from the FM model. This design is motivated by the fact that $D_t(x)$ approximates the conditional expectation $\mathbb{E}[X_1 \mid X_t = x]$, where $X_1$ is a sample from the target distribution and $X_t$ is a point along the flow path.
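As a rough illustration of this denoiser (a minimal sketch in PyTorch, not the authors' code), assuming a hypothetical pre-trained velocity network `velocity_net(x, t)` standing in for $v_t^\theta$:

```python
import torch

def denoise(x: torch.Tensor, t: float, velocity_net) -> torch.Tensor:
    """Time-dependent denoiser sketch: D_t(x) = x + (1 - t) * v_t^theta(x).

    `velocity_net` is a placeholder for the pre-trained FM velocity field;
    its (x, t) call signature is an assumption made for illustration.
    """
    return x + (1.0 - t) * velocity_net(x, t)
```

At $t = 1$ this reduces to the identity, while at small $t$ the denoiser moves the input all the way toward the predicted endpoint of the flow.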
A key aspect of the method is the reprojection step, where iterates are reprojected onto flow trajectories via linear interpolation. Given a current iterate $x$ and a time $t$, the reprojected point is computed as $\tilde{z} = (1-t)\,\epsilon + t\,z$, where $\epsilon$ is a noise sample drawn from the latent distribution $P_0$ and $z$ results from a gradient step on the data-fidelity term. This step ensures that the input to the denoiser lies within the support of the flow path, improving the effectiveness of the denoising operation.
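Putting the three steps together, a minimal sketch of one iteration is shown below; it assumes a linear forward operator with an `adjoint` method, a Gaussian noise model for the data-fidelity gradient, and the hypothetical `velocity_net` from above.

```python
import torch

def pnp_flow_step(x, t, y, forward_op, velocity_net, gamma_t, sigma2=1.0):
    """One PnP-Flow iteration (sketch): gradient step on the data fidelity,
    reprojection onto the flow path, then time-dependent denoising.

    `forward_op` is an assumed linear operator A exposing `adjoint`; the
    data-fidelity term is taken to be ||A x - y||^2 / (2 * sigma2).
    """
    # 1) Gradient descent step on the data-fidelity term
    grad = forward_op.adjoint(forward_op(x) - y) / sigma2
    z = x - gamma_t * grad

    # 2) Reprojection onto the flow trajectory via linear interpolation
    eps = torch.randn_like(z)            # fresh sample from the latent P_0
    z_tilde = (1.0 - t) * eps + t * z

    # 3) Time-dependent denoising: D_t(z) = z + (1 - t) * v_t^theta(z)
    return z_tilde + (1.0 - t) * velocity_net(z_tilde, t)
```

In practice the iteration would be run over an increasing time grid $t_0 < t_1 < \dots < 1$, with $\gamma_t$ chosen as discussed in the implementation details below.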
Figure 1: Our method on a 2D denoising task ($\sigma = 1.5$) with Gaussian distributions.
Implementation Details
The algorithm's performance is sensitive to the choice of the learning rate. The authors suggest using a time-dependent learning rate of the form $\gamma_t = (1-t)^\alpha$, where $\alpha \in (0,1]$. This helps balance the contributions of the data-fidelity term and the denoiser, preventing the algorithm from simply returning the noisy input at later time steps.
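For illustration only, one way to precompute such a schedule (the grid size and the value of $\alpha$ here are arbitrary choices, not taken from the paper):

```python
import torch

# Time-dependent learning rate gamma_t = (1 - t)^alpha with alpha in (0, 1].
n_steps, alpha = 100, 1.0                          # illustrative values
ts = torch.linspace(0.0, 1.0, n_steps + 1)[:-1]    # time grid t_n in [0, 1)
gammas = (1.0 - ts) ** alpha                       # decays to 0 as t -> 1
```

As $t \to 1$, $\gamma_t \to 0$, so the data-fidelity step contributes less in the final iterations.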
The paper also presents a convergence result: if the sequence of iterates produced by the algorithm is bounded and the time sequence $(t_n)_{n \in \mathbb{N}}$ satisfies $\sum_{n=0}^{\infty} (1 - t_n) < +\infty$, then the sequence of iterates converges.
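As a concrete illustration (ours, not a schedule prescribed by the paper), the time sequence $t_n = 1 - (n+1)^{-2}$ satisfies this summability condition:

```latex
\sum_{n=0}^{\infty} (1 - t_n) \;=\; \sum_{n=0}^{\infty} \frac{1}{(n+1)^2} \;=\; \frac{\pi^2}{6} \;<\; +\infty .
```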
Experimental Results
The authors conducted extensive experiments on image denoising, deblurring, inpainting, and super-resolution tasks, using the CelebA and AFHQ-Cat datasets. The results demonstrate that PnP-Flow Matching consistently outperforms state-of-the-art FM-based and PnP methods in terms of PSNR and SSIM metrics. Furthermore, the method exhibits stability across different tasks, unlike some competing approaches that perform well on certain tasks but struggle on others.
For instance, on the CelebA dataset, PnP-Flow achieved a PSNR of 32.45 dB on the denoising task, outperforming methods like OT-ODE (30.50 dB) and Flow-Priors (29.26 dB). Similarly, on the super-resolution task, PnP-Flow achieved a PSNR of 31.49 dB, surpassing OT-ODE (31.05 dB) and other baselines. Similar trends are seen with SSIM scores.
Figure 2: Results for random inpainting using PnP-Flow across different iterations (time steps) with corresponding PSNR values.
Advantages
The method offers several advantages over existing approaches:
- It is computationally efficient and memory-friendly, as it avoids backpropagation through ODEs and trace computations.
- It is simple to implement and requires few hyper-parameters.
- It supports different latent distributions and flexible initialization.
- It delivers strong performance across various inverse problems.
Limitations
The authors note that the reconstructions produced by PnP-Flow Matching tend to be slightly over-smoothed, which they attribute to the denoising operation acting as a minimum mean squared error estimator.
Implications and Future Work
PnP-Flow Matching offers a promising framework for integrating generative models into image restoration tasks. The method's computational efficiency and versatility make it a valuable tool for practitioners working on real-world imaging problems.
Future research directions include applying PnP-Flow Matching to other types of measurement noise, such as Poisson noise, and investigating different latent distributions to model categorical data. Tighter theoretical convergence guarantees also remain an open direction.
Conclusion
The paper presents a compelling approach to image restoration by combining PnP methods with Flow Matching. The proposed PnP-Flow Matching algorithm demonstrates strong empirical performance and offers several practical advantages over existing techniques. The method's versatility and efficiency make it a valuable contribution to the field of computational imaging.