
A perturbed preconditioned gradient descent method for the unconstrained minimization of composite objectives

Published 22 Dec 2025 in math.OC and math.NA (arXiv:2512.19532v1)

Abstract: We introduce a perturbed preconditioned gradient descent (PPGD) method for the unconstrained minimization of a strongly convex objective $G$ with a locally Lipschitz continuous gradient. We assume that $G(v)=E(v)+F(v)$ and that the gradient of $F$ is only known approximately. Our analysis is conducted in infinite dimensions with a preconditioner built into the framework. We prove a linear rate of convergence, up to an error term dependent on the gradient approximation. We apply the PPGD to the stationary Cahn-Hilliard equations with variable mobility under periodic boundary conditions. Numerical experiments are presented to validate the theoretical convergence rates and explore how the mobility affects the computation.
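The abstract describes the PPGD iteration only at a high level. Below is a minimal sketch in Python, assuming the standard perturbed preconditioned form $v_{k+1} = v_k - \tau B^{-1}\big(\nabla E(v_k) + \tilde g(v_k)\big)$, where $B$ is the preconditioner and $\tilde g$ is the approximate gradient of $F$. The function names, the fixed step size, and the stopping test are illustrative assumptions, not the paper's actual scheme, and the toy problem is a finite-dimensional quadratic rather than the Cahn-Hilliard application.

```python
import numpy as np

def ppgd(grad_E, approx_grad_F, apply_precond_inv, v0, step, tol=1e-5, max_iter=10_000):
    """Generic perturbed preconditioned gradient descent sketch (not the paper's exact scheme).

    Iterates v_{k+1} = v_k - step * B^{-1} (grad_E(v_k) + g_tilde(v_k)),
    where g_tilde only approximates the gradient of F, so convergence is linear
    only up to an error floor set by the gradient approximation (as stated in
    the abstract).
    """
    v = np.asarray(v0, dtype=float).copy()
    for k in range(1, max_iter + 1):
        g = grad_E(v) + approx_grad_F(v)      # inexact gradient of G = E + F
        d = apply_precond_inv(g)              # preconditioned direction B^{-1} g
        v_next = v - step * d
        if np.linalg.norm(v_next - v) <= tol * max(1.0, np.linalg.norm(v)):
            return v_next, k                  # simple relative-change stopping test (assumed)
        v = v_next
    return v, max_iter


# Toy usage: strongly convex quadratic G = E + F with a noisy gradient for F.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.diag([1.0, 10.0, 100.0])                        # Hessian of E
    grad_E = lambda v: A @ v
    approx_grad_F = lambda v: 0.5 * v + 1e-6 * rng.standard_normal(v.shape)
    precond_inv = lambda g: np.linalg.solve(A + 0.5 * np.eye(3), g)  # exact Hessian preconditioner
    v_star, iters = ppgd(grad_E, approx_grad_F, precond_inv, np.ones(3), step=1.0)
    print(iters, np.linalg.norm(v_star))                   # converges near 0, up to the gradient noise
```

With an exact preconditioner and unit step, the iterate drops to the noise floor of the approximate gradient in a couple of steps, which mirrors the abstract's claim of linear convergence up to an error term dependent on the gradient approximation.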
