Improved Classification Rates for Localized SVMs (1905.01502v2)

Published 4 May 2019 in math.ST and stat.TH

Abstract: Localized support vector machines solve SVMs on many spatially defined small chunks, and one of their main characteristics, besides the computational benefit compared to global SVMs, is the freedom to choose an arbitrary kernel and regularization parameter on each cell. We take advantage of this observation to derive global learning rates for localized SVMs with Gaussian kernels and hinge loss. Under certain assumptions, our rates outperform known classification rates for localized SVMs, for global SVMs, and for other learning algorithms based on, e.g., plug-in rules, trees, or DNNs. These rates are achieved under a set of margin conditions that describe the behavior of the data-generating distribution, where no assumption on the existence of a density is made. We observe that a margin condition relating the distance to the decision boundary to the amount of noise is crucial to obtain rates. The statistical analysis relies on a careful analysis of the excess risk, which includes a separation of the input space into a subset that is close to the decision boundary and a subset that is sufficiently far away. Moreover, we show that our rates are obtained adaptively, that is, without knowing the parameters resulting from the margin conditions.
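The per-cell training scheme the abstract describes can be illustrated with a short sketch. The following is a minimal, hypothetical Python illustration (using scikit-learn and k-means cells, neither of which is specified by the paper): it partitions the input space into spatial cells and fits an independent Gaussian-kernel SVM on each cell, with the kernel width and regularization parameter tuned separately per cell. It is not the paper's algorithm or analysis, only the general localized-SVM idea.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

class LocalizedSVM:
    """Sketch of a localized SVM: one Gaussian-kernel SVM per spatial cell.

    Assumes integer class labels; cell shapes and parameter grids are
    illustrative choices, not those analyzed in the paper.
    """

    def __init__(self, n_cells=10, param_grid=None):
        self.n_cells = n_cells
        self.param_grid = param_grid or {"C": [0.1, 1.0, 10.0],
                                         "gamma": [0.01, 0.1, 1.0]}

    def fit(self, X, y):
        # Spatially defined cells, here obtained by k-means on the inputs.
        self.partition_ = KMeans(n_clusters=self.n_cells, n_init=10).fit(X)
        cells = self.partition_.predict(X)
        self.models_ = {}
        for c in range(self.n_cells):
            mask = cells == c
            Xc, yc = X[mask], y[mask]
            classes, counts = np.unique(yc, return_counts=True)
            if len(classes) < 2:
                # Degenerate or empty cell: remember a constant label.
                self.models_[c] = int(classes[0]) if len(classes) else 0
                continue
            if counts.min() < 3:
                # Too few samples for cross-validation: use default parameters.
                self.models_[c] = SVC(kernel="rbf").fit(Xc, yc)
                continue
            # Per-cell choice of kernel width (gamma) and regularization (C).
            search = GridSearchCV(SVC(kernel="rbf"), self.param_grid, cv=3)
            search.fit(Xc, yc)
            self.models_[c] = search.best_estimator_
        return self

    def predict(self, X):
        cells = self.partition_.predict(X)
        y_pred = np.empty(len(X), dtype=int)
        for c in range(self.n_cells):
            mask = cells == c
            if not mask.any():
                continue
            model = self.models_[c]
            y_pred[mask] = model if isinstance(model, int) else model.predict(X[mask])
        return y_pred
```

Because each cell trains on only its own data, the overall cost scales with the cell sizes rather than the full sample, which is the computational benefit the abstract mentions; the per-cell grid search reflects the freedom to pick kernel and regularization parameters cell by cell.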
