Automatic reproducing kernel and regularization for learning convolution kernels
Abstract: Learning convolution kernels in operators from data arises in numerous applications and represents an ill-posed inverse problem of broad interest. With scant prior information, kernel methods offer a natural nonparametric approach with built-in regularization. However, a major challenge is selecting a proper reproducing kernel, especially as operators and data vary. We show that the input data and the convolution operator themselves induce an automatic, data-adaptive reproducing kernel Hilbert space (DA-RKHS), obviating manual kernel selection. In particular, when the observation data are discrete and finite, a finite set of automatic basis functions suffices to represent the estimators in the DA-RKHS, including the minimal-norm least-squares, Tikhonov, and conjugate-gradient estimators. We develop direct Tikhonov algorithms as well as scalable iterative and hybrid algorithms using the automatic basis functions. Numerical experiments on integral, nonlocal, and aggregation operators confirm that our automatic RKHS regularization consistently outperforms standard ridge regression and Gaussian process methods with preselected kernels.
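To make the inverse problem in the abstract concrete, the following is a minimal sketch of recovering a discretized convolution kernel by Tikhonov regularization, the baseline the paper's automatic-RKHS estimators improve upon. It uses plain ridge regression with a hand-picked regularization strength, not the paper's DA-RKHS construction; all names (`f`, `k_true`, `lam`) and sizes are illustrative assumptions.

```python
import numpy as np

# Sketch: recover a convolution kernel k from noisy observations y = f * k + noise.
# This is ordinary Tikhonov (ridge) regularization with an identity penalty,
# i.e. the baseline method, NOT the paper's data-adaptive RKHS regularizer.

rng = np.random.default_rng(0)
n, m = 200, 15                      # signal length, kernel support size (assumed)

f = rng.standard_normal(n)          # known input signal
k_true = np.exp(-0.5 * np.linspace(-2, 2, m) ** 2)  # ground-truth kernel
k_true /= k_true.sum()

# Forward map A so that A @ k == np.convolve(f, k, mode="valid"):
# row r, column j holds f[r + m - 1 - j].
A = np.column_stack([f[i : n - m + 1 + i] for i in range(m)])[:, ::-1]
y = A @ k_true + 0.01 * rng.standard_normal(A.shape[0])  # noisy data

lam = 1e-2                          # regularization strength (hand-tuned here)
# Tikhonov estimator: argmin ||A k - y||^2 + lam ||k||^2
k_hat = np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)

rel_err = np.linalg.norm(k_hat - k_true) / np.linalg.norm(k_true)
```

The normal-equations solve above is fine for a small kernel support; for large-scale problems one would instead use the iterative (conjugate-gradient) or hybrid approach the abstract describes, which avoids forming `A.T @ A` explicitly.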