On Bayesian posterior mean estimators in imaging sciences and Hamilton-Jacobi Partial Differential Equations (2003.05572v1)

Published 12 Mar 2020 in math.ST and stat.TH

Abstract: Variational and Bayesian methods are two approaches that have been widely used to solve image reconstruction problems. In this paper, we propose original connections between Hamilton--Jacobi (HJ) partial differential equations and a broad class of Bayesian methods and posterior mean estimators with Gaussian data fidelity term and log-concave prior. Whereas solutions to certain first-order HJ PDEs with initial data describe maximum a posteriori estimators in a Bayesian setting, here we show that solutions to some viscous HJ PDEs with initial data describe a broad class of posterior mean estimators. These connections allow us to establish several representation formulas and optimal bounds involving the posterior mean estimate. In particular, we use these connections to HJ PDEs to show that some Bayesian posterior mean estimators can be expressed as proximal mappings of twice continuously differentiable functions, and furthermore we derive a representation formula for these functions.
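
For orientation, here is a minimal sketch of the kind of correspondence the abstract describes. The notation below (a log-concave prior potential J, a noise/temperature parameter ε, and a data-fidelity weight t) is our own and the exact scalings used in the paper may differ; this is an illustration of the standard setting, not the paper's precise statement.

\[
  p(u \mid x) \;\propto\; \exp\!\Big(-\tfrac{1}{\epsilon}\big(\tfrac{1}{2t}\|x-u\|_2^2 + J(u)\big)\Big),
  \qquad
  u_{\mathrm{PM}}(x) \;=\; \int u\, p(u \mid x)\, \mathrm{d}u .
\]

The MAP estimator minimizes the exponent in the posterior; under this quadratic (Gaussian) data-fidelity term, its value function

\[
  S(x,t) \;=\; \min_{u}\Big\{ J(u) + \tfrac{1}{2t}\|x-u\|_2^2 \Big\},
  \qquad S(x,0) = J(x),
\]

solves a first-order HJ equation of the form \(\partial_t S + \tfrac{1}{2}\|\nabla_x S\|_2^2 = 0\). The result announced in the abstract instead concerns the posterior mean \(u_{\mathrm{PM}}\), which it relates to solutions of a viscous analogue of the form \(\partial_t S_\epsilon + \tfrac{1}{2}\|\nabla_x S_\epsilon\|_2^2 = \tfrac{\epsilon}{2}\,\Delta_x S_\epsilon\).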
