Counterexample to global convergence of DSOS and SDSOS hierarchies (1707.02964v2)
Abstract: We exhibit a convex polynomial optimization problem for which the diagonally-dominant sum-of-squares (DSOS) and the scaled diagonally-dominant sum-of-squares (SDSOS) hierarchies, based on linear programming and second-order conic programming respectively, do not converge to the global infimum. The same holds for the r-DSOS and r-SDSOS hierarchies. This refutes the claim in the literature that the DSOS and SDSOS hierarchies can solve any polynomial optimization problem to arbitrary accuracy. In contrast, the Lasserre hierarchy based on semidefinite programming yields the global infimum and the global minimizer at the first-order relaxation. We further observe that the dual to the SDSOS hierarchy is the moment hierarchy in which every positive semidefinite constraint is relaxed to all necessary second-order conic constraints. As a result, the number of second-order conic constraints grows quadratically as a function of the size of the positive semidefinite constraints in the Lasserre hierarchy. Together with the counterexample, this suggests that DSOS and SDSOS are not necessarily more tractable alternatives to sum-of-squares.
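As a rough illustration of the quadratic growth mentioned above, here is a minimal sketch assuming the relaxed moment problem imposes positive semidefiniteness of every 2x2 principal submatrix of an n x n moment matrix, each of which is a rotated second-order conic constraint (the exact constraint count in the paper may differ):

```latex
% For an n x n moment matrix M, positive semidefiniteness of the 2x2 principal
% submatrix indexed by (i, j) with i < j reads
%   M_{ii} \ge 0, \quad M_{jj} \ge 0, \quad M_{ii} M_{jj} \ge M_{ij}^2,
% which is a rotated second-order cone constraint. Counting all index pairs:
\[
  \#\{\text{second-order conic constraints}\}
    \;=\; \binom{n}{2}
    \;=\; \frac{n(n-1)}{2}
    \;=\; \mathcal{O}(n^2),
\]
% i.e. quadratic in the size n of the single positive semidefinite constraint
% appearing at the same order of the Lasserre hierarchy.
```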