
A Variable Metric and Nesterov Extrapolated Proximal DCA with Backtracking for A Composite DC Program (2205.02667v1)

Published 5 May 2022 in math.OC

Abstract: In this paper, we consider a composite difference-of-convex (DC) program whose objective function is the sum of a smooth convex function with Lipschitz continuous gradient, a proper closed convex function, and a continuous concave function. This problem has many applications in machine learning and data science. The proximal DCA (pDCA), a special case of the classical DCA, as well as two Nesterov-type extrapolated DCAs -- ADCA (Phan et al. IJCAI:1369--1375, 2018) and pDCAe (Wen et al. Comput Optim Appl 69:297--324, 2018) -- can solve this problem. The step sizes of pDCA, pDCAe, and ADCA are fixed and determined by estimating a priori the smoothness parameter of the loss function. However, such an estimate may be hard to obtain, or inaccurate, in some real-world applications. Motivated by this difficulty, we propose a variable metric and Nesterov extrapolated proximal DCA with backtracking (SPDCAe), which combines a backtracking line search procedure (not necessarily monotone) with Nesterov's extrapolation for potential acceleration; moreover, a variable metric is incorporated for a better local approximation. Numerical simulations on sparse binary logistic regression and compressed sensing with Poisson noise demonstrate the effectiveness of the proposed method.
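
For concreteness, the composite DC program described in the abstract can be written as follows. The symbols f, g, and h are our own labels, reconstructed from the abstract's description rather than taken from the paper:

```latex
\min_{x \in \mathbb{R}^n} \; F(x) \;=\; f(x) + g(x) - h(x)
```

Here f is convex with an L-Lipschitz continuous gradient, g is proper, closed, and convex, and h is continuous and convex, so that -h is the continuous concave term. DCA-type methods linearize -h at the current iterate and solve the resulting convex proximal subproblem.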
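
Although the paper's SPDCAe is not reproduced here, the following minimal Python sketch illustrates the ingredients the abstract names: a proximal DC step with Nesterov extrapolation and a backtracking line search that removes the need for an a priori Lipschitz estimate. It is applied to a standard DC test problem (least squares with an l1 minus l2 regularizer); the variable-metric component is simplified to a scalar step size, the line search is made monotone, and all function and parameter names are our own assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pdca_backtracking(A, b, lam, x0, max_iter=500, t0=1.0, shrink=0.5, tol=1e-8):
    """Sketch of a proximal DCA with Nesterov extrapolation and backtracking
    for min_x 0.5*||Ax - b||^2 + lam*(||x||_1 - ||x||_2).

    f(x) = 0.5*||Ax - b||^2 is the smooth convex part, g = lam*||.||_1 the
    proper closed convex part, and h = lam*||.||_2 the convex function whose
    negation supplies the concave term. This is a simplified illustration,
    not the paper's SPDCAe.
    """
    x_prev = x0.copy()
    x = x0.copy()
    theta_prev = theta = 1.0
    t = t0
    for _ in range(max_iter):
        # Nesterov-style extrapolation point (FISTA-type weights, an assumption).
        y = x + ((theta_prev - 1.0) / theta) * (x - x_prev)
        # A subgradient of h = lam*||.||_2 at the current iterate x.
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0.0 else np.zeros_like(x)
        grad = A.T @ (A @ y - b)                      # gradient of f at y
        f_y = 0.5 * np.linalg.norm(A @ y - b) ** 2
        # Backtracking: shrink t until the quadratic upper bound on f holds,
        # so no smoothness parameter needs to be estimated in advance.
        while True:
            x_new = soft_threshold(y - t * (grad - xi), t * lam)
            d = x_new - y
            f_new = 0.5 * np.linalg.norm(A @ x_new - b) ** 2
            if f_new <= f_y + grad @ d + (0.5 / t) * (d @ d) + 1e-12:
                break
            t *= shrink
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x_prev, x = x, x_new
        theta_prev, theta = theta, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta**2))
    return x

if __name__ == "__main__":
    # Tiny sparse-recovery example on synthetic data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x_hat = pdca_backtracking(A, b, lam=0.1, x0=np.zeros(100))
    print("nonzeros recovered:", np.sum(np.abs(x_hat) > 1e-4))
```

The backtracking test is the usual quadratic upper-bound (sufficient-decrease) condition: once 1/(2t) dominates the local curvature of f, the inequality holds and the step is accepted, which is what lets the method adapt when the Lipschitz constant is unknown or loosely estimated.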
