
Optimal Reduced Isotonic Regression (1412.2844v1)

Published 9 Dec 2014 in stat.CO and cs.DS

Abstract: Isotonic regression is a shape-constrained nonparametric regression in which the fitted function is an increasing step function. For $n$ data points, the isotonic regression may have as many as $n$ steps, so standard isotonic regression has been criticized as overfitting the data or producing an overly complicated representation. So-called "reduced" isotonic regression constrains the result to have a specified number of steps $b$, $b \leq n$. However, because previous algorithms for finding the exact reduced $L_2$ regression took $\Theta(n+bm^2)$ time, where $m$ is the number of steps of the unconstrained isotonic regression, researchers considered them too slow and used approximations instead; other results were approximate because they used a greedy top-down approach. Here we give an algorithm that finds an exact solution in $\Theta(n+bm)$ time, and a simpler algorithm taking $\Theta(n+bm \log m)$ time. These algorithms also determine optimal $k$-means clustering of weighted 1-dimensional data.
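The overall structure the abstract describes can be sketched in two stages: run the Pool Adjacent Violators Algorithm (PAVA) to obtain the $m$ blocks of the unconstrained isotonic fit, then merge consecutive blocks into $b$ levels by dynamic programming. The sketch below is *not* the paper's $\Theta(n+bm)$ algorithm; it is a straightforward $\Theta(n+bm^2)$ DP of the kind the abstract attributes to prior work, shown only to make the problem concrete. The function name `reduced_isotonic` and the return convention (fitted values plus the extra $L_2$ error incurred by reducing from $m$ to $b$ steps) are illustrative choices, and positive weights are assumed.

```python
def reduced_isotonic(y, w, b):
    """Reduced L2 isotonic regression sketch: PAVA, then a DP that merges
    the m PAVA blocks into at most b contiguous groups (steps).

    Returns (fit, extra_err): the fitted step value for each point, and the
    additional weighted L2 error beyond the unconstrained isotonic fit.
    Runs in Theta(n + b*m^2) time -- the "slow" bound, not the paper's.
    """
    # --- Stage 1: PAVA. Collapse the data into m blocks, each a
    # (weighted mean, total weight, point count) triple, merging whenever
    # a block's mean is not strictly below its successor's.
    means, weights, counts = [], [], []
    for yi, wi in zip(y, w):
        means.append(float(yi)); weights.append(float(wi)); counts.append(1)
        while len(means) > 1 and means[-2] >= means[-1]:
            m2, w2, c2 = means.pop(), weights.pop(), counts.pop()
            m1, w1, c1 = means.pop(), weights.pop(), counts.pop()
            wt = w1 + w2
            means.append((w1 * m1 + w2 * m2) / wt)
            weights.append(wt); counts.append(c1 + c2)
    m = len(means)
    b = min(b, m)

    # Prefix sums over blocks give O(1) cost of merging blocks i..j-1
    # into a single level at their weighted mean.
    A = [0.0]; B = [0.0]; W = [0.0]
    for mu, wt in zip(means, weights):
        A.append(A[-1] + wt * mu)
        B.append(B[-1] + wt * mu * mu)
        W.append(W[-1] + wt)

    def cost(i, j):
        # Weighted L2 cost of replacing block means i..j-1 by one level.
        return B[j] - B[i] - (A[j] - A[i]) ** 2 / (W[j] - W[i])

    # --- Stage 2: DP. D[t][j] = min cost of covering the first j blocks
    # with t levels; back[t][j] records the split point for backtracking.
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(b + 1)]
    D[0][0] = 0.0
    back = [[0] * (m + 1) for _ in range(b + 1)]
    for t in range(1, b + 1):
        for j in range(t, m + 1):
            best, arg = INF, t - 1
            for i in range(t - 1, j):
                v = D[t - 1][i] + cost(i, j)
                if v < best:
                    best, arg = v, i
            D[t][j], back[t][j] = best, arg

    # Backtrack the b groups and expand each level to its original points.
    segs, j = [], m
    for t in range(b, 0, -1):
        i = back[t][j]
        segs.append((i, j))
        j = i
    fit = []
    for i, j in reversed(segs):
        level = (A[j] - A[i]) / (W[j] - W[i])
        fit.extend([level] * sum(counts[i:j]))
    return fit, D[b][m]
```

Because block means are strictly increasing after PAVA, merged group means are automatically increasing as well, so the DP never needs an explicit monotonicity check. For example, `reduced_isotonic([1, 3, 2, 4, 6, 5], [1]*6, 2)` reduces the 4-block PAVA fit to the 2-step fit `[2, 2, 2, 5, 5, 5]`.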

Citations (3)
