
A Parallelizable Dual Smoothing Method for Large Scale Convex Regression Problems (1608.02227v1)

Published 7 Aug 2016 in math.OC

Abstract: Convex regression (CR) is an approach for fitting a convex function to a finite number of observations. It arises in various applications from diverse fields such as statistics, operations research, economics, and electrical engineering. The least squares (LS) estimator, which can be computed by solving a quadratic program (QP), is an intuitive method for convex regression with strong, already-established theoretical guarantees. On the other hand, since the number of constraints in the QP formulation increases quadratically in the number of observed data points, the QP quickly becomes impractical to solve using traditional interior point methods. To address this issue, we propose a first-order method based on dual smoothing that carefully manages memory usage through parallelization in order to efficiently compute the LS estimator in practice for large-scale CR instances.
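To see why the constraint count grows quadratically, it helps to write out the standard LS convex regression QP; this is the textbook formulation (the paper's exact notation may differ). Given observations $(x_i, y_i)$, $i = 1, \dots, n$, with $x_i \in \mathbb{R}^d$, one fits function values $\theta_i$ and subgradients $\xi_i$:

```latex
\min_{\theta \in \mathbb{R}^n,\; \xi_1,\dots,\xi_n \in \mathbb{R}^d}
  \sum_{i=1}^{n} (y_i - \theta_i)^2
\quad \text{s.t.} \quad
  \theta_j \geq \theta_i + \xi_i^\top (x_j - x_i)
  \quad \forall\, i \neq j .
```

Each constraint requires the fitted value at $x_j$ to lie above the supporting hyperplane anchored at $x_i$, which is exactly convexity of the fitted piecewise-linear function; there are $n(n-1)$ such constraints, hence the quadratic growth in $n$ that makes interior point methods impractical at scale.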
