[[File:Different Views on RKHS.png|thumb|right|Figure illustrating related but varying approaches to viewing RKHS]]
In [[functional analysis]] (a branch of [[mathematics]]), a '''reproducing kernel Hilbert space''' ('''RKHS''') is a [[Hilbert space]] of functions in which point evaluation is a continuous linear [[Functional (mathematics)|functional]]. Roughly speaking, this means that if two functions <math>f</math> and <math>g</math> in the RKHS are close in norm, i.e., <math>\|f-g\|</math> is small, then <math>f</math> and <math>g</math> are also pointwise close, i.e., <math>|f(x)-g(x)|</math> is small for all <math>x</math>. The converse need not be true. Informally, this can be illustrated with the [[Uniform norm|supremum norm]]: the sequence of functions <math>\sin^n (x)</math> converges pointwise, but does not converge [[Uniform Convergence|uniformly]], i.e., does not converge with respect to the supremum norm (this is not a counterexample because the supremum norm does not arise from any [[inner product]], as it does not satisfy the [[parallelogram law]]).
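The pointwise-but-not-uniform convergence of <math>\sin^n(x)</math> can be checked numerically. The sketch below (the grid resolution and the sample point <math>x=1</math> are arbitrary choices) restricts attention to <math>[0,\pi]</math>, where the pointwise limit is 0 except at <math>x=\pi/2</math>:

```python
import math

# On [0, pi], f_n(x) = sin(x)^n converges pointwise to 0 except at
# x = pi/2 (where every f_n equals 1), but not uniformly: near pi/2
# the functions remain close to 1 for every n.
def f_n(x, n):
    return math.sin(x) ** n

for n in (1, 10, 100, 1000):
    pointwise = f_n(1.0, n)  # value at a fixed point away from pi/2
    grid = (i * math.pi / 10000 for i in range(1, 10000) if i != 5000)
    sup = max(f_n(x, n) for x in grid)  # sup over a grid avoiding pi/2
    print(f"n={n:4d}  f_n(1.0)={pointwise:.2e}  sup~{sup:.4f}")
```

The value at the fixed point shrinks rapidly with <math>n</math>, while the supremum over the grid stays near 1, so no uniform convergence occurs.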
It is not entirely straightforward to construct a Hilbert space of functions which is not an RKHS.<ref>Alpay, D., and T. M. Mills. "A family of Hilbert spaces which are not reproducing kernel Hilbert spaces." ''J. Anal. Appl.'' 1.2 (2003): 107–111.</ref> Some examples, however, have been found.<ref>Z. Pasternak-Winiarski, "On weights which admit reproducing kernel of Bergman type", ''International Journal of Mathematics and Mathematical Sciences'', vol. 15, Issue 1, 1992.</ref><ref>T. Ł. Żynda, "On weights which admit reproducing kernel of Szegő type", ''Journal of Contemporary Mathematical Analysis'' (Armenian Academy of Sciences), 55, 2020.</ref>
:<math> K(x,y) = (\alpha\langle x,y \rangle + 1)^d, \qquad \alpha \in \R, d \in \N </math>
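As a quick check of the formula above, a minimal Python sketch (the vectors and the parameters <math>\alpha</math> and <math>d</math> are illustrative choices) evaluating the polynomial kernel and verifying its symmetry:

```python
# Minimal sketch of the polynomial kernel K(x, y) = (alpha * <x, y> + 1)^d;
# the inputs and parameters below are illustrative, not canonical.
def poly_kernel(x, y, alpha=1.0, d=3):
    inner = sum(xi * yi for xi, yi in zip(x, y))  # Euclidean inner product
    return (alpha * inner + 1.0) ** d

x, y = (1.0, 2.0), (3.0, 0.5)
print(poly_kernel(x, y))                       # (4 + 1)^3 = 125.0
print(poly_kernel(x, y) == poly_kernel(y, x))  # kernel symmetry: True
```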
===[[Radial basis function kernel]]s===
These are another common class of kernels which satisfy <math> K(x,y) = K(\|x - y\|)</math>. Some examples include:
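One standard member of this class is the Gaussian kernel; the sketch below (the bandwidth <math>\sigma</math> is an illustrative choice) checks that it depends only on <math>\|x - y\|</math>, i.e., is unchanged when both arguments are translated:

```python
import math

# Minimal sketch of the Gaussian kernel, a standard radial basis
# function kernel; sigma is an illustrative bandwidth choice.
def gaussian_kernel(x, y, sigma=1.0):
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))  # ||x - y||^2
    return math.exp(-sq_dist / (2 * sigma ** 2))

# K depends only on ||x - y||: translating both arguments by the
# same vector leaves the kernel value unchanged.
a = gaussian_kernel((1.0, 2.0), (3.0, 4.0))
b = gaussian_kernel((0.0, 0.0), (2.0, 2.0))
print(a == b)  # True
```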
Lastly, we remark that the above theory can be further extended to spaces of functions with values in function spaces, but obtaining kernels for these spaces is a more difficult task.<ref>Rosasco</ref>
== Connection between RKHS and the ReLU function ==
The [[Rectifier (neural networks)|ReLU function]] is commonly defined as <math>f(x)=\max \{0, x\}</math> and is a mainstay in the architecture of neural networks, where it is used as an activation function. One can construct a ReLU-like nonlinear function using the theory of reproducing kernel Hilbert spaces. Below, we derive this construction and show how it implies the representation power of neural networks with ReLU activations.
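Before turning to the RKHS construction, the definition <math>f(x)=\max\{0,x\}</math> itself can be stated as a short Python sketch, applied elementwise as it would be in a network layer (the pre-activation values are illustrative):

```python
# Minimal sketch of the ReLU activation f(x) = max{0, x}, applied
# elementwise to a vector of pre-activations (values are illustrative).
def relu(x):
    return max(0.0, x)

pre = [-2.0, -0.5, 0.0, 1.5]
post = [relu(v) for v in pre]
print(post)  # [0.0, 0.0, 0.0, 1.5]
```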