SOS-convex Semi-algebraic Programs and its Applications to Robust Optimization: A Tractable Class of Nonsmooth Convex Optimization (1702.02299v1)

Published 8 Feb 2017 in math.OC

Abstract: In this paper, we introduce a new class of nonsmooth convex functions, called SOS-convex semi-algebraic functions, extending the recently proposed notion of SOS-convex polynomials. This class of nonsmooth convex functions covers many common nonsmooth functions arising in applications, such as the Euclidean norm, the maximum eigenvalue function, and least squares functions with $\ell_1$-regularization or elastic net regularization used in statistics and compressed sensing. We show that, under commonly used strict feasibility conditions, the optimal value and an optimal solution of SOS-convex semi-algebraic programs can be found by solving a single semi-definite programming (SDP) problem. We achieve these results by using tools from semi-algebraic geometry, the convex-concave minimax theorem, and a recently established Jensen-type inequality for SOS-convex polynomials. As an application, we outline how the derived results can be applied to show that robust SOS-convex optimization problems under restricted spectrahedron data uncertainty enjoy exact SDP relaxations. This extends the existing exact SDP relaxation result for restricted ellipsoidal data uncertainty and answers the open questions left in [Optimization Letters 9, 1-18 (2015)] on how to recover a robust solution from the semi-definite programming relaxation in this broader setting.
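
To make the class of problems concrete, the sketch below sets up one of the nonsmooth SOS-convex semi-algebraic objectives named in the abstract ($\ell_1$-regularized least squares with a Euclidean-norm constraint) and hands it to a conic solver. This is an illustrative example with synthetic data and hypothetical parameter choices, not the paper's construction; the paper's contribution is an exact single-SDP reformulation for the general SOS-convex semi-algebraic class, whereas the snippet only shows the kind of problem that class covers, solved here via cvxpy's standard conic reformulation.

```python
# Hypothetical illustration (not the paper's method): minimize
#   ||A x - b||_2^2 + lam * ||x||_1   subject to   ||x||_2 <= r,
# an SOS-convex semi-algebraic program, using cvxpy with an SDP-capable
# conic solver (SCS). Data and parameters are made up for the example.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n = 20, 10
A = rng.standard_normal((m, n))   # synthetic design matrix
b = rng.standard_normal(m)        # synthetic observations
lam, r = 0.1, 10.0                # regularization weight and norm-ball radius (assumed)

x = cp.Variable(n)
objective = cp.Minimize(cp.sum_squares(A @ x - b) + lam * cp.norm1(x))
constraints = [cp.norm(x, 2) <= r]   # Euclidean-norm constraint (nonsmooth, SOS-convex semi-algebraic)

prob = cp.Problem(objective, constraints)
prob.solve(solver=cp.SCS)            # SCS solves the conic reformulation

print("optimal value:", prob.value)
print("optimal x:", x.value)
```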
