Leave-One-Out Learning with Log-Loss (2511.12718v1)

Published 16 Nov 2025 in cs.IT

Abstract: We study batch learning with log-loss in the individual setting, where the outcome sequence is deterministic. Because empirical statistics are not directly applicable in this regime, obtaining regret guarantees for batch learning has long posed a fundamental challenge. We propose a natural criterion based on leave-one-out regret and analyze its minimax value for several hypothesis classes. For the multinomial simplex over $m$ symbols, we show that the minimax regret is $\frac{m-1}{N} + o\!\left(\frac{1}{N}\right)$, and compare it to the stochastic realizable case where it is $\frac{m-1}{2N} + o\!\left(\frac{1}{N}\right)$. More generally, we prove that every hypothesis class of VC dimension $d$ is learnable in the individual batch-learning problem, with regret at most $\frac{d\log(N)}{N} + o\!\left(\frac{\log(N)}{N}\right)$, and we establish matching lower bounds for certain classes. We further derive additional upper bounds that depend on structural properties of the hypothesis class. These results establish, for the first time, that universal batch learning with log-loss is possible in the individual setting.
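
To make the leave-one-out criterion concrete, the following minimal Python sketch evaluates it for the multinomial-simplex case: the per-symbol leave-one-out log-loss of an add-beta (Laplace-style) predictor minus the per-symbol log-loss of the best fixed distribution in hindsight. The add-beta rule, the per-symbol normalization, and all names here are illustrative assumptions based only on the abstract, not the paper's construction.

import numpy as np

def loo_regret(x, m, beta=1.0):
    """Per-symbol leave-one-out log-loss regret (illustrative sketch).

    x    : integer array over {0, ..., m-1}, a deterministic (individual) sequence
    m    : alphabet size
    beta : smoothing constant of the assumed add-beta leave-one-out learner
    """
    x = np.asarray(x)
    N = len(x)
    counts = np.bincount(x, minlength=m)

    # When predicting x_i, the learner sees only the other N-1 symbols,
    # so the count of symbol x_i available to it is counts[x_i] - 1.
    q = (counts[x] - 1 + beta) / (N - 1 + m * beta)
    loo_loss = -np.log(q).sum()

    # The best hypothesis in hindsight over the simplex is the empirical
    # distribution; its cumulative log-loss is N times the empirical entropy.
    p_hat = counts / N
    nz = counts > 0
    hindsight_loss = -(counts[nz] * np.log(p_hat[nz])).sum()

    return (loo_loss - hindsight_loss) / N

rng = np.random.default_rng(0)
for N in (100, 1000, 10000):
    x = rng.integers(0, 3, size=N)      # an arbitrary sequence over m = 3 symbols
    print(N, N * loo_regret(x, m=3))    # N * regret approaches m - 1 = 2

For beta = 1 the cumulative regret simplifies algebraically to $N\log\frac{N+m-1}{N}$, independent of the sequence, which tends to $m-1$ as $N$ grows. This is consistent with the minimax value $\frac{m-1}{N} + o\!\left(\frac{1}{N}\right)$ quoted above, although whether the paper's minimax-optimal learner coincides with this add-one rule cannot be inferred from the abstract alone.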

