Parametrized Accelerated Methods Free of Condition Number (1802.10235v1)

Published 28 Feb 2018 in cs.LG and math.OC

Abstract: Analyses of accelerated (momentum-based) gradient descent usually assume a bounded condition number to obtain exponential convergence rates. However, in many real problems, e.g., kernel methods or deep neural networks, the condition number, even locally, can be unbounded, unknown, or mis-estimated. This poses problems in both implementing and analyzing accelerated algorithms. In this paper, we address this issue by proposing parametrized accelerated methods that treat the condition number as a free parameter. We provide spectral-level analysis for several important accelerated algorithms, obtain explicit expressions, and improve worst-case convergence rates. Moreover, we show that these algorithms converge exponentially even when the condition number is unknown or mis-estimated.
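
As a rough illustration of the idea, the sketch below runs a constant-momentum Nesterov scheme in which the condition-number estimate is a free parameter that may differ from the true value. This is a minimal sketch only: the function name, the momentum rule beta = (sqrt(kappa) - 1)/(sqrt(kappa) + 1), and the toy quadratic are illustrative assumptions, not the paper's exact parametrization or analysis.

```python
import numpy as np


def nesterov_accelerated_gd(grad, x0, lipschitz, kappa_est, n_iters=500):
    """Constant-momentum Nesterov scheme with the condition number treated
    as a free (possibly mis-estimated) parameter `kappa_est` (illustrative)."""
    # Momentum weight derived from the *estimated* condition number.
    beta = (np.sqrt(kappa_est) - 1.0) / (np.sqrt(kappa_est) + 1.0)
    x_prev = np.array(x0, dtype=float)
    y = x_prev.copy()
    x = x_prev
    for _ in range(n_iters):
        x = y - (1.0 / lipschitz) * grad(y)   # gradient step at the extrapolated point
        y = x + beta * (x - x_prev)           # extrapolation with the momentum weight
        x_prev = x
    return x


# Toy strongly convex quadratic f(x) = 0.5 * x^T A x with eigenvalues in [1, 100],
# so the true condition number is 100 (hypothetical example, not from the paper).
rng = np.random.default_rng(0)
d = 50
eigs = np.linspace(1.0, 100.0, d)
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
A = Q @ np.diag(eigs) @ Q.T
grad = lambda x: A @ x

x0 = rng.standard_normal(d)
x_true = nesterov_accelerated_gd(grad, x0, lipschitz=100.0, kappa_est=100.0)   # correct kappa
x_mis = nesterov_accelerated_gd(grad, x0, lipschitz=100.0, kappa_est=1000.0)   # over-estimated kappa
print(np.linalg.norm(x_true), np.linalg.norm(x_mis))  # both iterates approach the minimizer at 0
```

In this toy setting, running the method with an over-estimated condition number still drives the iterates toward the minimizer, which is the qualitative behavior the abstract describes for mis-estimated condition numbers.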

Citations (3)
