Reproducing Kernel Hilbert Spaces
Given that reproducing kernel Hilbert spaces (RKHS) behave more like Euclidean space than general Hilbert spaces do, it is somewhat surprising that they are not better known. I have uploaded to arXiv a primer on reproducing kernel Hilbert spaces that differs from other introductory material in a number of ways.
It is perhaps the first to start by considering finite-dimensional spaces rather than jumping straight to infinite-dimensional ones. A surprising amount can be gained by treating the finite-dimensional case first. The primer also assumes no prior knowledge of Hilbert spaces and goes to great lengths to explain clearly the distinction between closed and non-closed subspaces, and what the completion of a space is. (For example, a space that is not complete can be pictured as having a ‘scratched surface’.)
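To make the finite-dimensional starting point concrete, here is a minimal numpy sketch (my own illustration, not taken from the primer). Vectors in R^n are viewed as functions on {0, …, n−1}, an inner product is given by a positive-definite matrix Q, and the reproducing kernel turns out to be K = Q⁻¹: pairing any f with the i-th column of K recovers the value f(i).

```python
import numpy as np

# A finite-dimensional "function space": vectors f in R^3, thought of as
# functions f: {0, 1, 2} -> R, with f(i) the i-th component of f.
rng = np.random.default_rng(0)

# An arbitrary inner product <u, v> = u^T Q v, with Q symmetric positive definite.
A = rng.standard_normal((3, 3))
Q = A @ A.T + 3 * np.eye(3)

# The reproducing kernel is K = Q^{-1}: its i-th column k_i = K[:, i]
# satisfies <f, k_i> = f^T Q Q^{-1} e_i = f(i) for every f.
K = np.linalg.inv(Q)

f = rng.standard_normal(3)
for i in range(3):
    assert np.isclose(f @ Q @ K[:, i], f[i])  # the reproducing property
```

With the Euclidean inner product (Q = I) the kernel sections are just the standard basis vectors; a non-trivial Q shows that the kernel depends on both the space and the inner product placed on it.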
Whereas other material refers to RKHS theory as providing a “coordinate-free approach”, I argue that the presence of a Hilbert space already provides a coordinate-free approach; the added benefit of a reproducing kernel Hilbert space is that a canonical coordinate system is associated with each point of the Hilbert space.
The primer explains that RKHS theory studies the configuration of an inner product space sitting inside a function space: it describes an extrinsic geometry, because how the space is oriented inside the larger function space matters. This way of thinking helps determine when RKHS theory might be relevant to a problem at hand: if the solution of the problem varies continuously as the configuration varies continuously, then the solution may well be expressible in terms of the kernel in a nice way.
How RKHS theory can be used to solve linear equations is discussed along with some novel descriptions and extensions, as is the role of RKHS theory in signal detection.
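As a hedged illustration of the kind of linear problem a kernel handles (a standard construction, not necessarily the novel treatment in the primer): among all functions in an RKHS satisfying evaluation constraints f(x_j) = y_j, the minimum-norm one is a linear combination of kernel sections, f = Σ_j a_j k(·, x_j), with coefficients found by solving the Gram system G a = y. The Gaussian kernel and the data points below are assumptions chosen purely for the sketch.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """An illustrative choice of reproducing kernel."""
    return np.exp(-((x - z) ** 2) / (2 * sigma**2))

x = np.array([0.0, 1.0, 2.5])   # evaluation points (hypothetical data)
y = np.array([1.0, -0.5, 2.0])  # prescribed values f(x_j) = y_j

# Gram matrix G_ij = k(x_i, x_j); the constraints become the linear system G a = y.
G = gaussian_kernel(x[:, None], x[None, :])
a = np.linalg.solve(G, y)

def f(t):
    """The minimum-norm interpolant, expressed via kernel sections."""
    return gaussian_kernel(t, x) @ a

assert np.allclose([f(t) for t in x], y)  # all constraints are satisfied
```

The point of the sketch is that the infinite-dimensional optimisation collapses to a finite linear system precisely because the kernel encodes how point evaluations act on the space.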
My colleague contributed substantially by describing some of the recent work in the statistical learning theory literature, including explaining how RKHS theory can be used for testing for independence, for nonlinear filtering, and so forth.