
CS-TSSOS: Correlative and term sparsity for large-scale polynomial optimization (2005.02828v2)

Published 6 May 2020 in math.OC and cs.MS

Abstract: This work proposes a new moment-SOS hierarchy, called CS-TSSOS, for solving large-scale sparse polynomial optimization problems. Its novelty is to exploit correlative sparsity and term sparsity simultaneously by combining the advantages of two existing frameworks for sparse polynomial optimization. The former is due to Waki et al., while the latter was initially proposed by Wang et al. and later exploited in the TSSOS hierarchy. In doing so we obtain CS-TSSOS -- a two-level hierarchy of semidefinite programming relaxations with (i) the crucial property of involving blocks of SDP matrices and (ii) the guarantee of convergence to the global optimum under certain conditions. We demonstrate its efficiency and scalability on several large-scale instances of the celebrated Max-Cut problem and the important industrial optimal power flow problem, involving up to six thousand variables and tens of thousands of constraints.
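
The hierarchy builds on the standard SOS-to-SDP reduction: certifying a lower bound gamma on a polynomial p amounts to finding a positive semidefinite Gram matrix Q such that p - gamma equals m^T Q m for a monomial basis m; correlative and term sparsity then let CS-TSSOS replace one large Gram matrix by many small blocks. As a rough illustration of the underlying SDP step only (not of the CS-TSSOS block construction, and not the authors' implementation), the following minimal sketch uses Python with cvxpy to compute a dense SOS lower bound for the toy problem min x^4 - 3x^2 + 1; the monomial basis, coefficient matching, and solver choice are illustrative assumptions.

```python
import cvxpy as cp

# Monomial basis m = [1, x, x^2]; Gram matrix Q is 3x3 symmetric PSD.
Q = cp.Variable((3, 3), symmetric=True)
gamma = cp.Variable()

# Match coefficients of p(x) - gamma = m^T Q m for p(x) = x^4 - 3x^2 + 1.
constraints = [
    Q >> 0,                       # Gram matrix must be positive semidefinite
    Q[0, 0] == 1 - gamma,         # constant term
    2 * Q[0, 1] == 0,             # x coefficient
    2 * Q[0, 2] + Q[1, 1] == -3,  # x^2 coefficient
    2 * Q[1, 2] == 0,             # x^3 coefficient
    Q[2, 2] == 1,                 # x^4 coefficient
]

prob = cp.Problem(cp.Maximize(gamma), constraints)
prob.solve(solver=cp.SCS)
print("SOS lower bound:", gamma.value)  # approximately -1.25, the global minimum
```

For this univariate toy problem the SOS bound is tight. The point of CS-TSSOS is that, for large structured instances such as Max-Cut or optimal power flow, the analogous Gram matrices can be decomposed into many small blocks according to the correlative and term sparsity patterns, which drastically shrinks the resulting SDP relaxations.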

Citations (61)
