
PAC-Bayes unleashed: generalisation bounds with unbounded losses (2006.07279v2)

Published 12 Jun 2020 in stat.ML, cs.LG, math.ST, and stat.TH

Abstract: We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss functions. This extends the relevance and applicability of the PAC-Bayes learning framework, where most of the existing literature focuses on supervised learning problems with a bounded loss function (typically assumed to take values in the interval [0;1]). In order to relax this assumption, we propose a new notion called HYPE (standing for HYPothesis-dependent rangE), which effectively allows the range of the loss to depend on each predictor. Based on this new notion we derive a novel PAC-Bayesian generalisation bound for unbounded loss functions, and we instantiate it on a linear regression problem. To make our theory usable by the largest audience possible, we include discussions on actual computation, practicality and limitations of our assumptions.
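To illustrate the HYPE idea concretely, here is a minimal sketch (not taken from the paper; the function name and the bounded-data assumption are ours): for linear regression with inputs in [-B, B]^d and labels in [-C, C], the squared loss of a linear predictor h_w(x) = ⟨w, x⟩ admits the predictor-dependent range bound (y - ⟨w, x⟩)² ≤ (C + B‖w‖₁)², so the loss range scales with w instead of being one global constant.

```python
import numpy as np

def hype_range(w, B, C):
    """Hypothesis-dependent upper bound on the squared loss of the
    linear predictor h_w(x) = <w, x>, assuming x in [-B, B]^d and
    y in [-C, C]. (Illustrative sketch, not the paper's notation.)"""
    return (C + B * np.abs(w).sum()) ** 2

rng = np.random.default_rng(0)
B, C, d = 1.0, 2.0, 5
w = rng.normal(size=d)

# Empirically, no bounded sample exceeds the hypothesis-dependent range.
X = rng.uniform(-B, B, size=(1000, d))
y = rng.uniform(-C, C, size=1000)
losses = (y - X @ w) ** 2
assert losses.max() <= hype_range(w, B, C)
```

The point of the sketch is only that the bound on the right-hand side grows with ‖w‖₁, i.e. the range of the loss is a function of the hypothesis, which is exactly the flexibility the HYPE notion formalises.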

Authors (4)
  1. Maxime Haddouche
  2. Benjamin Guedj
  3. Omar Rivasplata
  4. John Shawe-Taylor
Citations (51)
