Backtracking linesearch for conditional gradient sliding (2006.05272v1)
Abstract: We present a modification of the conditional gradient sliding (CGS) method originally developed in \cite{lan2016conditional}. While the CGS method is a theoretical breakthrough in the theory of projection-free first-order methods, being the first to reach the theoretical performance limit, its implementation requires knowledge of the Lipschitz constant $L$ of the gradient of the objective function and the total number of gradient evaluations $N$. Such requirements impose difficulties in practice, not only because it can be difficult to choose values of $L$ and $N$ that satisfy the conditions for convergence, but also because conservative choices of $L$ and $N$ can deteriorate the practical numerical performance of the CGS method. Our proposed method, called the conditional gradient sliding method with linesearch (CGS-ls), requires knowledge of neither $L$ nor $N$, and is able to terminate early, before the theoretically required number of iterations. While more practical in numerical implementation, the theoretical performance of the CGS-ls method is as good as that of the CGS method. We present numerical experiments to show the efficiency of our proposed method in practice.
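The abstract's central ingredient is a backtracking search that adaptively estimates the gradient's Lipschitz constant instead of requiring $L$ up front, together with a computable stopping criterion that removes the need to fix $N$ in advance. As a minimal illustrative sketch of that generic backtracking idea only, the following Python snippet applies it to a plain Frank-Wolfe (conditional gradient) loop; it is not the authors' CGS-ls algorithm, and all names here (`fw_backtracking`, `lmo`, the doubling/halving constants) are hypothetical choices for illustration.

```python
import numpy as np

def fw_backtracking(f, grad, lmo, x0, L0=1.0, tol=1e-6, max_iter=1000):
    """Frank-Wolfe with a backtracking estimate of the gradient's
    Lipschitz constant. Illustrative sketch of the generic backtracking
    idea only, not the CGS-ls method from the paper."""
    x, L = x0, L0
    for _ in range(max_iter):
        g = grad(x)
        s = lmo(g)              # linear minimization oracle over the feasible set
        d = s - x
        gap = -g @ d            # Frank-Wolfe gap: computable optimality certificate
        if gap <= tol:
            break               # early termination, no preset iteration count N
        L = max(L / 2.0, 1e-12) # optimistically shrink the Lipschitz estimate
        while True:
            # step size minimizing the quadratic upper model with constant L
            gamma = min(gap / (L * (d @ d)), 1.0)
            y = x + gamma * d
            # accept if the quadratic upper bound with constant L holds at y
            if f(y) <= f(x) + gamma * (g @ d) + 0.5 * L * gamma**2 * (d @ d):
                break
            L *= 2.0            # otherwise backtrack: double L and retry
        x = y
    return x

# Usage example (assumed setup): minimize 0.5*||Ax - b||^2 over the
# probability simplex, whose LMO returns a coordinate basis vector.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 10)), rng.normal(size=20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
x_opt = fw_backtracking(f, grad, lmo, np.ones(10) / 10)
```

The halving step before the inner loop keeps the estimate of $L$ from only growing over the run, so step sizes can recover after a locally flat region; the doubling step guarantees the inner loop terminates whenever the gradient is locally Lipschitz.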