
Optimal Reduced Isotonic Regression

Published 9 Dec 2014 in stat.CO and cs.DS | arXiv:1412.2844v1

Abstract: Isotonic regression is a shape-constrained nonparametric regression in which the fitted function is an increasing step function. For $n$ data points, the number of steps in the isotonic regression may be as large as $n$. As a result, standard isotonic regression has been criticized as overfitting the data or making the representation too complicated. So-called "reduced" isotonic regression constrains the outcome to a specified number of steps $b$, $b \leq n$. However, because the previous algorithms for finding the reduced $L_2$ regression took $\Theta(n+bm^2)$ time, where $m$ is the number of steps of the unconstrained isotonic regression, researchers felt that the algorithms were too slow and instead used approximations. Other researchers obtained only approximate results because they used a greedy top-down approach. Here we give an algorithm to find an exact solution in $\Theta(n+bm)$ time, and a simpler algorithm taking $\Theta(n+bm \log m)$ time. These algorithms also determine optimal $k$-means clustering of weighted 1-dimensional data.
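To make the setup concrete, the following is a minimal sketch of the $\Theta(n+bm^2)$ baseline the abstract refers to, not the paper's faster $\Theta(n+bm)$ algorithm: first compute the unconstrained isotonic regression with Pool Adjacent Violators (PAVA), producing $m$ steps, then run a dynamic program that merges those steps into $b$ contiguous groups minimizing the weighted $L_2$ error. All function names here are illustrative, and the claim that an optimal reduced solution only merges consecutive PAVA blocks is the standard fact underlying the prior algorithms.

```python
def pava(y, w):
    """Pool Adjacent Violators: weighted L2 isotonic regression.
    Returns blocks (mean, total_weight, count) with nondecreasing means."""
    blocks = []  # each entry: [weighted_sum, weight, count]
    for yi, wi in zip(y, w):
        blocks.append([wi * yi, wi, 1])
        # merge while the last block's mean drops below the previous one's
        while len(blocks) > 1 and (blocks[-2][0] * blocks[-1][1]
                                   > blocks[-1][0] * blocks[-2][1]):
            s, wt, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += wt
            blocks[-1][2] += c
    return [(s / wt, wt, c) for s, wt, c in blocks]

def reduced_isotonic(y, w, b):
    """Best L2 fit with at most b nondecreasing steps (illustrative DP).
    Partitions the m PAVA blocks into b contiguous groups; this is the
    Theta(n + b m^2) approach, since each DP cell scans O(m) split points."""
    steps = pava(y, w)
    m = len(steps)
    b = min(b, m)
    # prefix sums of weight, weighted sum, and weighted sum of squares
    W = [0.0] * (m + 1); S = [0.0] * (m + 1); Q = [0.0] * (m + 1)
    for i, (mu, wt, _) in enumerate(steps):
        W[i + 1] = W[i] + wt
        S[i + 1] = S[i] + wt * mu
        Q[i + 1] = Q[i] + wt * mu * mu

    def cost(i, j):
        # extra squared error from pooling blocks i..j-1 to one level
        wt = W[j] - W[i]; s = S[j] - S[i]
        return (Q[j] - Q[i]) - s * s / wt

    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(b + 1)]
    cut = [[0] * (m + 1) for _ in range(b + 1)]
    dp[0][0] = 0.0
    for k in range(1, b + 1):          # number of groups used so far
        for j in range(k, m + 1):      # groups cover blocks 0..j-1
            for i in range(k - 1, j):  # last group is blocks i..j-1
                c = dp[k - 1][i] + cost(i, j)
                if c < dp[k][j]:
                    dp[k][j], cut[k][j] = c, i
    # backtrack the group boundaries, then expand to per-point fitted values
    groups, j = [], m
    for k in range(b, 0, -1):
        i = cut[k][j]
        groups.append((i, j))
        j = i
    fit = []
    for i, j in reversed(groups):
        level = (S[j] - S[i]) / (W[j] - W[i])
        npts = sum(steps[t][2] for t in range(i, j))
        fit.extend([level] * npts)
    return fit
```

Because the data are effectively sorted after PAVA, the same DP with `y` taken as sorted 1-D points and `b` as the number of clusters is exactly the optimal weighted 1-D $k$-means computation the abstract mentions.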
