Convergence Rate Analysis of Continuous- and Discrete-Time Smoothing Gradient Algorithms (2312.04192v1)

Published 7 Dec 2023 in math.OC

Abstract: This paper addresses the gradient flow -- the continuous-time representation of the gradient method -- with a smooth approximation of a non-differentiable objective function and presents a convergence analysis framework. Like the gradient method, the gradient flow is inapplicable to non-differentiable function minimization; this paper therefore addresses the smoothing gradient method, which exploits a decreasing smoothing parameter sequence in the smooth approximation. The convergence analysis is carried out with conventional Lyapunov-function-based techniques, and a Lyapunov function applicable to both strongly convex and non-strongly convex objective functions is provided by taking into account the effect of the smooth approximation. Based on the equivalence of the stepsize in the smoothing gradient method and the discretization step in the forward Euler scheme for the numerical integration of the smoothing gradient flow, the sampled values of the exact solution of the smoothing gradient flow are compared with the state variable of the smoothing gradient method, and the equivalence of the convergence rates is shown.
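
As an illustration (not the paper's exact algorithm or parameters), the sketch below applies a smoothing gradient method to f(x) = |x| using the common smooth approximation f_mu(x) = sqrt(x^2 + mu^2) and an assumed schedule mu_k = 1/k; the update with stepsize h coincides with one forward Euler step for the smoothing gradient flow dx/dt = -grad f_{mu(t)}(x(t)).

import math

def grad_f_mu(x: float, mu: float) -> float:
    # Gradient of the smooth approximation f_mu(x) = sqrt(x^2 + mu^2) of |x|.
    return x / math.sqrt(x * x + mu * mu)

def smoothing_gradient_method(x0: float, h: float = 0.1, iters: int = 200) -> float:
    # x_{k+1} = x_k - h * grad f_{mu_k}(x_k): one forward Euler step (size h)
    # of the smoothing gradient flow per iteration, with mu_k decreasing to 0.
    x = x0
    for k in range(1, iters + 1):
        mu_k = 1.0 / k  # assumed decreasing smoothing parameter sequence
        x -= h * grad_f_mu(x, mu_k)
    return x

print(smoothing_gradient_method(x0=3.0))  # approaches the minimizer x* = 0, up to O(h) with this fixed stepsize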

Citations (2)
