
Coordinate Descent Algorithm for Least Absolute Deviations Regression

Published 19 Mar 2026 in stat.ME | (2603.19336v1)

Abstract: Least Absolute Deviations (LAD) regression provides a robust alternative to ordinary least squares by minimizing the sum of absolute residuals. However, its widespread use has been limited by the computational cost of existing solvers, particularly simplex-based methods in high-dimensional settings. We propose a coordinate descent algorithm for LAD regression that avoids matrix inversion, naturally accommodates the non-differentiability of the objective function, and remains well-defined even when the number of predictors exceeds the number of observations. The key observation is that each coordinate update reduces to a one-dimensional minimization admitting a closed-form solution given by a median or weighted median. The resulting algorithm has per-iteration complexity $O(p\,n \log n)$ and is provably convergent due to the convexity of the LAD objective and the exactness of each coordinate update. Experiments on synthetic and real datasets show that the method matches the accuracy of linear-programming-based LAD solvers while offering improved scalability and stability in high-dimensional regimes, including cases where $p \ge n$. The method is easy to implement, requires no specialized optimization software, and provides a practical tool for robust linear models.
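The coordinate update described in the abstract — each one-dimensional minimization solved exactly by a weighted median — can be sketched as follows. This is an illustrative implementation written for this summary, not the authors' code; the function names, the cyclic update order, and the fixed iteration count are our own assumptions.

```python
import numpy as np

def weighted_median(values, weights):
    """Return a weighted median of `values`: a point t minimizing
    sum_i weights[i] * |values[i] - t|."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    # first index where the cumulative weight reaches half the total
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return v[idx]

def lad_coordinate_descent(X, y, n_iter=100):
    """Cyclic coordinate descent for min_beta sum_i |y_i - x_i . beta|.
    Each coordinate update is an exact weighted-median step, so no
    matrix inversion or linear-programming solver is needed."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta  # current residuals
    for _ in range(n_iter):
        for j in range(p):
            xj = X[:, j]
            mask = xj != 0  # zero entries do not constrain coordinate j
            if not mask.any():
                continue
            # partial residuals with coordinate j's contribution removed,
            # rescaled so the 1-D problem becomes a weighted median
            z = (r[mask] + xj[mask] * beta[j]) / xj[mask]
            new_bj = weighted_median(z, np.abs(xj[mask]))
            r -= xj * (new_bj - beta[j])  # keep residuals in sync
            beta[j] = new_bj
    return beta
```

The per-coordinate sort in `weighted_median` is what gives the $O(p\,n \log n)$ per-iteration cost quoted in the abstract, and because the data matrix never needs to be inverted, the update remains well-defined when $p \ge n$.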

Authors (2)
