
The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy

Published 8 Nov 2011 in stat.ML (arXiv:1111.1915v1)

Abstract: The popular cubic smoothing spline estimate of a regression function arises as the minimizer of the penalized sum of squares $\sum_j (Y_j - \mu(t_j))^2 + \lambda \int_a^b [\mu''(t)]^2\, dt$, where the data are $(t_j, Y_j)$, $j = 1, \dots, n$. The minimization is taken over an infinite-dimensional function space, the space of all functions with square-integrable second derivatives. But the calculations can be carried out in a finite-dimensional space. The reduction from minimizing over an infinite-dimensional space to minimizing over a finite-dimensional space occurs for more general objective functions: the data may be related to the function $\mu$ in another way, the sum of squares may be replaced by a more suitable expression, or the penalty, $\int_a^b [\mu''(t)]^2\, dt$, might take a different form. This paper reviews the Reproducing Kernel Hilbert Space structure that provides a finite-dimensional solution for a general minimization problem. Particular attention is paid to penalties based on linear differential operators. In this case, one can sometimes easily calculate the minimizer explicitly, using Green's functions.
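The finite-dimensional reduction described in the abstract can be illustrated with a simple discrete analogue (the Whittaker smoother, not the paper's RKHS construction): replacing the penalty $\int_a^b [\mu''(t)]^2\, dt$ by a squared second-difference penalty on the fitted values turns the problem into minimizing $\|y - \mu\|^2 + \lambda \|D_2 \mu\|^2$ over $\mu \in \mathbb{R}^n$, which has the closed-form solution $\mu = (I + \lambda D_2^\top D_2)^{-1} y$. A minimal sketch, with the function name `whittaker_smooth` chosen here for illustration:

```python
import numpy as np

def whittaker_smooth(y, lam):
    """Discrete analogue of the cubic smoothing spline:
    minimize ||y - mu||^2 + lam * ||D2 @ mu||^2 over mu in R^n,
    where D2 is the (n-2) x n second-difference matrix.
    The closed-form minimizer is mu = (I + lam * D2.T @ D2)^{-1} y."""
    n = len(y)
    # Rows of D2 are e_i - 2*e_{i+1} + e_{i+2}: second differences.
    D2 = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, np.asarray(y, float))

# Example: noisy samples of a smooth curve.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(50)
mu = whittaker_smooth(y, lam=1.0)
```

With `lam=0` the penalty vanishes and the minimizer reproduces the data exactly; as `lam` grows, the fit is pulled toward functions with small second differences, mirroring how $\lambda$ trades off fidelity against roughness in the continuous problem.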

Citations (15)


Authors (1)
