
Uniform regret bounds over $\mathbb{R}^d$ for the sequential linear regression problem with the square loss (1805.11386v2)

Published 29 May 2018 in stat.ML, cs.LG, math.ST, and stat.TH

Abstract: We consider the setting of online linear regression for arbitrary deterministic sequences, with the square loss. We are interested in the aim set by Bartlett et al. (2015): obtain regret bounds that hold uniformly over all competitor vectors. When the feature sequence is known at the beginning of the game, they provided closed-form regret bounds of $2 d B^2 \ln T + \mathcal{O}_T(1)$, where $T$ is the number of rounds and $B$ is a bound on the observations. Instead, we derive bounds with an optimal constant of $1$ in front of the $d B^2 \ln T$ term. In the case of sequentially revealed features, we also derive an asymptotic regret bound of $d B^2 \ln T$ for any individual sequence of features and bounded observations. All our algorithms are variants of the online non-linear ridge regression forecaster, either with a data-dependent regularization or with almost no regularization.
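To make the protocol concrete, here is a minimal sketch of an online ridge regression forecaster with clipped predictions, and of the regret it is measured against (cumulative square loss minus the loss of the best fixed comparator vector). The regularization parameter `lam` and the clipping to $[-B, B]$ are illustrative assumptions, standing in for the paper's non-linear ridge variants, not the authors' exact algorithms.

```python
import numpy as np

def online_clipped_ridge_loss(features, observations, lam=1.0, B=1.0):
    """Cumulative square loss of a clipped online ridge forecaster.

    At round t, predict with the ridge estimate fitted on rounds
    1..t-1, clipped to [-B, B]; then observe y_t and update.
    (lam and the clipping are illustrative choices, not the paper's.)
    """
    d = features.shape[1]
    A = lam * np.eye(d)   # regularized Gram matrix of past features
    b = np.zeros(d)       # sum of y_s * x_s over past rounds
    total_loss = 0.0
    for x, y in zip(features, observations):
        theta = np.linalg.solve(A, b)           # ridge estimate
        pred = float(np.clip(theta @ x, -B, B))  # clipped prediction
        total_loss += (pred - y) ** 2
        A += np.outer(x, x)                      # update statistics
        b += y * x
    return total_loss

def best_fixed_loss(features, observations):
    """Square loss of the best fixed comparator u in R^d (least squares)."""
    u, *_ = np.linalg.lstsq(features, observations, rcond=None)
    residual = features @ u - observations
    return float(residual @ residual)
```

The regret over $T$ rounds is then `online_clipped_ridge_loss(X, y) - best_fixed_loss(X, y)`; the paper's bounds state that, uniformly over all comparators, this quantity grows like $d B^2 \ln T$ with the optimal leading constant $1$.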

Authors (4)
  1. Pierre Gaillard (44 papers)
  2. Sébastien Gerchinovitz (21 papers)
  3. Malo Huard (3 papers)
  4. Gilles Stoltz (30 papers)
Citations (19)
