A Parallelizable Dual Smoothing Method for Large Scale Convex Regression Problems (1608.02227v1)
Abstract: Convex regression (CR) is an approach for fitting a convex function to a finite number of observations. It arises in applications from diverse fields such as statistics, operations research, economics, and electrical engineering. The least squares (LS) estimator, which can be computed by solving a quadratic program (QP), is an intuitive method for convex regression with strong, well-established theoretical guarantees. On the other hand, since the number of constraints in the QP formulation grows quadratically in the number of observed data points, the QP quickly becomes impractical to solve with traditional interior point methods. To address this issue, we propose a first-order method based on dual smoothing that carefully manages memory usage through parallelization in order to efficiently compute the LS estimator in practice for large-scale CR instances.
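For context, the standard QP formulation of the LS estimator referenced in the abstract (not spelled out there; this is the usual shape-constrained least squares program from the convex regression literature) can be sketched as follows. Given observations $(x_i, y_i)$, $i = 1, \dots, n$, with $x_i \in \mathbb{R}^d$, one fits values $\theta_i \approx f(x_i)$ and subgradients $\xi_i$ of a convex function $f$:

```latex
\begin{aligned}
\min_{\theta \in \mathbb{R}^n,\; \xi_1, \dots, \xi_n \in \mathbb{R}^d}
  \quad & \sum_{i=1}^{n} (\theta_i - y_i)^2 \\
\text{subject to}
  \quad & \theta_j \geq \theta_i + \xi_i^\top (x_j - x_i),
  \qquad \forall\, i \neq j .
\end{aligned}
```

The $n(n-1)$ pairwise subgradient inequalities enforce convexity of the fitted function at the data points, which is the quadratic growth in constraints that motivates the paper's first-order, parallelizable approach.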