Computation of sum of squares polynomials from data points (1812.02444v5)
Abstract: We propose an iterative algorithm for the numerical computation of sums of squares of polynomials approximating given data at prescribed interpolation points. The method is based on the definition of a convex functional $G$ arising from the dualization of a quadratic regression over the Cholesky factors of the sum of squares decomposition. In order to justify the construction, the domain of $G$, the boundary of the domain and the behavior at infinity are analyzed in detail. When the data interpolate a positive univariate polynomial, we show that, in the context of the Lukacs sum of squares representation, $G$ is coercive and strictly convex, which yields a unique critical point and a corresponding decomposition in sums of squares. For multivariate polynomials which admit a sum of squares decomposition, and up to a small perturbation of size $\varepsilon$, the perturbed functional $G_\varepsilon$ is always coercive, so its minimum yields an approximate sum of squares decomposition. Various unconstrained descent algorithms are proposed to minimize $G$. Numerical examples are provided for univariate and bivariate polynomials.
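To make the general idea concrete, the sketch below fits a sum of squares of polynomials to sampled data by unconstrained descent over the coefficients of the squared factors. This is only a minimal illustration of the parametrization-by-factors viewpoint, not the paper's dual functional $G$ or its Cholesky-based construction; the degree `d`, the number of factors `n_factors`, the synthetic data, and the plain least-squares objective are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data (assumed for illustration): noisy samples of a positive polynomial.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 20)
y = 1.0 + x**2 + 0.05 * rng.standard_normal(x.size)

d = 2          # degree of each factor q_j (assumed)
n_factors = 2  # number of squares in the decomposition (assumed)

# Vandermonde matrix: V[i, k] = x_i**k, so q_j(x_i) = V @ c_j.
V = np.vander(x, d + 1, increasing=True)

def sos_values(theta):
    """Evaluate p(x_i) = sum_j q_j(x_i)**2 from the stacked factor coefficients."""
    C = theta.reshape(n_factors, d + 1)
    Q = V @ C.T                      # Q[i, j] = q_j(x_i)
    return np.sum(Q**2, axis=1)

def objective(theta):
    """Plain least-squares misfit; a stand-in for, not the paper's, functional G."""
    r = sos_values(theta) - y
    return 0.5 * np.dot(r, r)

theta0 = 0.1 * rng.standard_normal(n_factors * (d + 1))
res = minimize(objective, theta0, method="BFGS")  # unconstrained descent

print("final misfit:", objective(res.x))
print("fitted values are nonnegative by construction:", np.all(sos_values(res.x) >= 0.0))
```

Because the approximant is built as a sum of squares of the factors, nonnegativity of the fitted polynomial is automatic for any value of the unconstrained parameters, which is the appeal of working over the factors rather than over the polynomial coefficients directly.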