
Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator (2305.13588v2)

Published 23 May 2023 in stat.ML and cs.LG

Abstract: Reproducing kernel Hilbert $C^*$-module (RKHM) is a generalization of reproducing kernel Hilbert space (RKHS) by means of $C^*$-algebra, and the Perron-Frobenius operator is a linear operator related to the composition of functions. Combining these two concepts, we present deep RKHM, a deep learning framework for kernel methods. We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators. By virtue of $C^*$-algebra, the dependency of the bound on output dimension is milder than existing bounds. We show that $C^*$-algebra is a suitable tool for deep learning with kernels, enabling us to take advantage of the product structure of operators and to provide a clear connection with convolutional neural networks. Our theoretical analysis provides a new lens through which one can design and analyze deep kernel methods.
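The abstract describes composing kernel layers into a deep kernel method. As a rough illustration of that idea only (not the paper's RKHM construction), the sketch below stacks a kernel feature map with kernel ridge regression on top; the centers, bandwidths, and ridge parameter are hypothetical choices for the toy data.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# "Layer 1": nonlinear representation via kernel evaluations against a
# set of centers (an arbitrary illustrative choice, not the paper's method).
C = rng.uniform(-1, 1, size=(20, 1))
H = rbf_kernel(X, C, gamma=5.0)

# "Layer 2": kernel ridge regression on the layer-1 representation,
# so the overall predictor is a composition of two kernel maps.
K = rbf_kernel(H, H, gamma=0.5)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), y)
y_hat = K @ alpha

mse = np.mean((y_hat - y) ** 2)
```

The paper's contribution is the operator-algebraic treatment of such compositions; the sketch only shows why composing kernel stages gives a "deep" hypothesis class at all.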

Authors (3)
  1. Yuka Hashimoto (22 papers)
  2. Masahiro Ikeda (95 papers)
  3. Hachem Kadri (32 papers)
Citations (6)
