Robust Kernel-based Distribution Regression (2104.10637v1)

Published 21 Apr 2021 in cs.LG, math.FA, and stat.ML

Abstract: Regularization schemes for regression have been widely studied in learning theory and inverse problems. In this paper, we study distribution regression (DR), which involves two stages of sampling and aims at regressing from probability measures to real-valued responses over a reproducing kernel Hilbert space (RKHS). Recently, theoretical analysis of DR has been carried out via kernel ridge regression, and several learning behaviors have been observed. However, the topic has not been explored or understood beyond least-squares-based DR. By introducing a robust loss function $l_{\sigma}$ for two-stage sampling problems, we present a novel robust distribution regression (RDR) scheme. With an appropriately chosen windowing function $V$ and scaling parameter $\sigma$, $l_{\sigma}$ covers a wide range of popular loss functions, enriching the theme of DR. Moreover, the loss $l_{\sigma}$ is not necessarily convex, hence substantially enlarging the class of regression losses (beyond least squares) treated in the DR literature. The learning rates under different regularity ranges of the regression function $f_{\rho}$ are comprehensively studied and derived via integral operator techniques. The scaling parameter $\sigma$ is shown to be crucial in providing both robustness and satisfactory learning rates for RDR.
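
The abstract does not spell out the concrete form of $l_{\sigma}$. As a minimal sketch, the snippet below assumes the standard construction from the robust-regression literature, $l_{\sigma}(u) = \sigma^2 V(u^2/\sigma^2)$, with the Welsch window $V(t) = 1 - e^{-t}$ as one example choice; both the formula and the window are illustrative assumptions, not details confirmed by the abstract.

```python
import numpy as np

def l_sigma(u, sigma, V=lambda t: 1.0 - np.exp(-t)):
    """Robust loss l_sigma(u) = sigma^2 * V(u^2 / sigma^2) (assumed form).

    With V(t) = 1 - exp(-t) this is the non-convex, bounded Welsch loss;
    other choices of V recover other popular robust losses.
    """
    return sigma**2 * V((u / sigma) ** 2)

residuals = np.linspace(-4.0, 4.0, 9)

# Small sigma: the loss saturates, so large residuals (outliers in the
# second-stage samples) have bounded influence.
print(np.round(l_sigma(residuals, sigma=1.0), 3))

# Large sigma: sigma^2 * V(u^2 / sigma^2) ~ V'(0) * u^2, so the squared loss
# is recovered in the limit, consistent with the claim that l_sigma subsumes
# the least-squares-based DR setting.
print(np.round(l_sigma(residuals, sigma=1e3), 3))
print(np.round(residuals**2, 3))
```

In this reading, $\sigma$ trades robustness against approximation: small $\sigma$ caps the contribution of outlying residuals, while large $\sigma$ recovers the classical least-squares DR analyzed in earlier work.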

Citations (10)
