
Kernel-based L_2-Boosting with Structure Constraints

Published 16 Sep 2020 in cs.LG and stat.ML | arXiv:2009.07558v1

Abstract: Developing efficient kernel methods for regression has been very popular over the past decade. In this paper, applying boosting to kernel-based weak learners, we propose a novel kernel-based learning algorithm: kernel-based re-scaled boosting with truncation, dubbed KReBooT. The proposed KReBooT controls the structure of estimators, produces sparse estimates, and is nearly resistant to overfitting. We conduct both theoretical analysis and numerical simulations to illustrate the power of KReBooT. Theoretically, we prove that KReBooT achieves an almost optimal numerical convergence rate for nonlinear approximation. Furthermore, using the recently developed integral-operator approach and a variant of Talagrand's concentration inequality, we derive fast learning rates for KReBooT that set a new record among boosting-type algorithms. Numerically, we carry out a series of simulations demonstrating KReBooT's good generalization, near-resistance to overfitting, and structure constraints.
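The abstract names the algorithm's ingredients (L2-boosting on kernel weak learners, a re-scaling step, and truncation) but not its exact update rule. The sketch below illustrates one plausible reading of those ingredients: weak learners are single columns of a Gaussian kernel matrix, the estimate is shrunk by a factor (1 - alpha) each round before a greedy least-squares step, and coefficients are clipped to [-tau, tau]. The function name, kernel choice, and all parameter values are illustrative assumptions, not the paper's KReBooT specification.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=0.2):
    """Pairwise Gaussian kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rescaled_l2_boost(X, y, n_iter=200, alpha=0.01, tau=5.0, sigma=0.2):
    """Illustrative re-scaled L2-boosting with coefficient truncation.

    Weak learners are single kernel columns K[:, j]. Each round the
    current estimate is shrunk by (1 - alpha) (re-scaling), the column
    most correlated with the residual is chosen greedily, a least-squares
    step is taken along it, and the coefficient vector is clipped to
    [-tau, tau] (truncation, acting as a structure constraint).
    This is a generic sketch, not the paper's exact KReBooT update.
    """
    K = gaussian_kernel(X, X, sigma)
    coef = np.zeros(len(y))
    for _ in range(n_iter):
        coef *= 1.0 - alpha                      # re-scaling step
        resid = y - K @ coef
        j = int(np.argmax(np.abs(K.T @ resid)))  # greedy weak-learner pick
        kj = K[:, j]
        coef[j] += (kj @ resid) / (kj @ kj)      # least-squares step size
        np.clip(coef, -tau, tau, out=coef)       # truncation
    return coef, K
```

The clipping keeps every coefficient inside [-tau, tau] throughout, which is one simple way to realize the "structure constraints" and sparsity-control the abstract advertises; the greedy single-column update is what makes the resulting estimate sparse in the kernel dictionary.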


Authors (3)
