Online Multiclass Boosting (1702.07305v3)

Published 23 Feb 2017 in stat.ML and cs.LG

Abstract: Recent work has extended the theoretical analysis of boosting algorithms to multiclass problems and to online settings. However, the multiclass extension is in the batch setting and the online extensions only consider binary classification. We fill this gap in the literature by defining, and justifying, a weak learning condition for online multiclass boosting. This condition leads to an optimal boosting algorithm that requires the minimal number of weak learners to achieve a certain accuracy. Additionally, we propose an adaptive algorithm which is near optimal and enjoys an excellent performance on real data due to its adaptive property.
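To make the online boosting setup concrete, below is a minimal sketch of a generic online multiclass boosting loop: several weak learners see a stream of examples one at a time, the booster predicts by simple vote, and each learner receives an upweighted copy of examples it misclassified. This is an illustrative toy (the `WeakLearner` here is a hypothetical one-feature threshold counter), not the paper's optimal or adaptive algorithm, which use a carefully designed cost-matrix framework.

```python
import random

class WeakLearner:
    """Hypothetical online weak learner: Laplace-smoothed per-class
    counts in two buckets of a single feature (illustrative only)."""
    def __init__(self, n_classes):
        self.counts = [[1] * n_classes, [1] * n_classes]

    def _bucket(self, x):
        return 0 if x < 0.5 else 1

    def predict(self, x):
        scores = self.counts[self._bucket(x)]
        return max(range(len(scores)), key=lambda k: scores[k])

    def update(self, x, y, weight=1.0):
        self.counts[self._bucket(x)][y] += weight

class OnlineBooster:
    """Combines weak learners by unweighted vote; each round it
    upweights examples a learner got wrong (a crude stand-in for
    the paper's cost-matrix feedback)."""
    def __init__(self, n_learners, n_classes):
        self.learners = [WeakLearner(n_classes) for _ in range(n_learners)]
        self.n_classes = n_classes

    def predict(self, x):
        votes = [0] * self.n_classes
        for wl in self.learners:
            votes[wl.predict(x)] += 1
        return max(range(self.n_classes), key=lambda k: votes[k])

    def update(self, x, y):
        for wl in self.learners:
            w = 2.0 if wl.predict(x) != y else 1.0
            wl.update(x, y, w)

# Stream a simple separable toy problem: label = 1 iff x >= 0.5.
random.seed(0)
booster = OnlineBooster(n_learners=5, n_classes=2)
correct = 0
stream = [(x, 0 if x < 0.5 else 1)
          for x in (random.random() for _ in range(200))]
for x, y in stream:
    if booster.predict(x) == y:
        correct += 1          # predict first, then learn (online protocol)
    booster.update(x, y)
accuracy = correct / len(stream)
```

After a handful of mistakes early in the stream, the learners' counts align with the true threshold and the vote is essentially always correct on this toy data.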

Authors (3)
  1. Young Hun Jung (11 papers)
  2. Jack Goetz (9 papers)
  3. Ambuj Tewari (134 papers)
Citations (29)
