Risk Bounds for Over-parameterized Maximum Margin Classification on Sub-Gaussian Mixtures (2104.13628v2)

Published 28 Apr 2021 in cs.LG, math.ST, stat.ML, and stat.TH

Abstract: Modern machine learning systems such as deep neural networks are often highly over-parameterized so that they can fit the noisy training data exactly, yet they can still achieve small test errors in practice. In this paper, we study this "benign overfitting" phenomenon of the maximum margin classifier for linear classification problems. Specifically, we consider data generated from sub-Gaussian mixtures, and provide a tight risk bound for the maximum margin linear classifier in the over-parameterized setting. Our results precisely characterize the condition under which benign overfitting can occur in linear classification problems, and improve on previous work. They also have direct implications for over-parameterized logistic regression.
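The setting in the abstract can be reproduced numerically. Below is a minimal sketch, assuming a Gaussian-noise instance of the sub-Gaussian mixture model (x = y*mu + z with z standard Gaussian) and using scikit-learn's LinearSVC with a large C as a stand-in for the hard-margin maximum margin classifier; the dimensions, signal strength, and noise rate are illustrative choices, not the exact conditions from the paper.

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Over-parameterized regime: feature dimension p much larger than sample size n.
n, p = 50, 2000
noise_rate = 0.05        # fraction of training labels flipped (label noise)

# Mixture mean; the signal strength ||mu|| is an illustrative choice.
mu = np.zeros(p)
mu[0] = p ** 0.3

def sample(m):
    """Draw m points from the two-component mixture x = y*mu + z."""
    y = rng.choice([-1.0, 1.0], size=m)
    x = y[:, None] * mu[None, :] + rng.standard_normal((m, p))
    return x, y

X, y_clean = sample(n)
y = y_clean.copy()
flip = rng.random(n) < noise_rate
y[flip] *= -1            # noisy labels the classifier will fit exactly

# Large C approximates the hard-margin (maximum margin) linear classifier;
# with p >> n the noisy training set is linearly separable with high probability.
clf = LinearSVC(C=1e6, loss="hinge", dual=True, max_iter=100_000)
clf.fit(X, y)

train_err = np.mean(clf.predict(X) != y)   # typically 0: the noisy data are interpolated
X_test, y_test = sample(5000)
test_err = np.mean(clf.predict(X_test) != y_test)

print(f"train error on noisy labels: {train_err:.3f}")
print(f"test error: {test_err:.3f}")

With these settings the classifier typically attains zero training error on the flipped labels while the test error stays small, which is the benign overfitting behavior the paper's risk bound characterizes; shrinking ||mu|| or growing p relative to n degrades the test error, consistent with the bound depending on the signal-to-noise regime.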

Authors (3)
  1. Yuan Cao (201 papers)
  2. Quanquan Gu (198 papers)
  3. Mikhail Belkin (76 papers)
Citations (51)
