In [[machine learning]], the ('''Gaussian''') '''[[radial basis function]] kernel''', or '''RBF kernel''', is a popular [[Positive-definite kernel|kernel function]] used in various [[kernel method|kernelized]] learning algorithms. In particular, it is commonly used in [[support vector machine]] [[statistical classification|classification]].<ref name="Chang2010">Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, Michael Ringgaard and Chih-Jen Lin (2010). [http://jmlr.org/papers/v11/chang10a.html "Training and testing low-degree polynomial data mappings via linear SVM"]. ''J. Machine Learning Research'' '''11''':1471–1490.</ref>
The RBF kernel on two samples '''x''' and '''x'''', represented as feature vectors in some ''input space'', is defined as<ref name="primer" />
:<math>K(\mathbf{x}, \mathbf{x'}) = \exp\left(-\frac{||\mathbf{x} - \mathbf{x'}||^2}{2\sigma^2}\right)</math>
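The formula above translates directly into code; the following is a minimal sketch in Python (function and parameter names are illustrative, not from any particular library):

```python
import math

def rbf_kernel(x, x_prime, sigma=1.0):
    """Gaussian RBF kernel: K(x, x') = exp(-||x - x'||^2 / (2 * sigma^2))."""
    # Squared Euclidean distance between the two feature vectors.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, x_prime))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))
```

Note that the kernel equals 1 when the two samples coincide and decays toward 0 as they move apart, with the bandwidth <math>\sigma</math> controlling how fast.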
where <math>\textstyle\varphi</math> is the implicit mapping embedded in the RBF kernel.
One way to construct such a ''z'' is to randomly sample from the [[Fourier transformation]] of the kernel.<ref>Ali Rahimi and Benjamin Recht (2007). [http://www.eecs.berkeley.edu/~brecht/papers/07.rah.rec.nips.pdf "Random features for large-scale kernel machines"]. ''Neural Information Processing Systems''.</ref> Another approach uses the [[Nyström method]] to approximate the [[eigendecomposition]] of the [[Gramian matrix|Gram matrix]] ''K'', using only a random sample of the training set.<ref>Christopher K. I. Williams and Matthias Seeger (2001). [https://papers.nips.cc/paper/1866-using-the-nystrom-method-to-speed-up-kernel-machines "Using the Nyström method to speed up kernel machines"]. ''Advances in Neural Information Processing Systems'' '''13'''.</ref>
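The random-Fourier-features construction can be sketched as follows. This is an illustrative implementation, not Rahimi and Recht's reference code: for the Gaussian kernel with bandwidth <math>\sigma</math>, frequencies are drawn from its Fourier transform (a Gaussian with standard deviation <math>1/\sigma</math>), and the inner product of the resulting feature vectors approximates the kernel value.

```python
import math
import random

def rbf_kernel(x, x_prime, sigma=1.0):
    """Exact Gaussian RBF kernel, for comparison with the approximation."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, x_prime))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def random_fourier_features(dim, n_features, sigma=1.0, rng=None):
    """Return a map z such that z(x) . z(x') approximates the RBF kernel.

    Frequencies w ~ N(0, sigma^-2 I) are samples from the kernel's Fourier
    transform; phases b ~ Uniform[0, 2*pi).
    """
    rng = rng or random.Random()
    ws = [[rng.gauss(0.0, 1.0 / sigma) for _ in range(dim)]
          for _ in range(n_features)]
    bs = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_features)]
    scale = math.sqrt(2.0 / n_features)

    def z(x):
        # Each feature is sqrt(2/D) * cos(w . x + b).
        return [scale * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + b)
                for w, b in zip(ws, bs)]

    return z
```

With enough random features the Monte Carlo error shrinks as <math>O(1/\sqrt{D})</math>, so a few thousand features typically give a close approximation in low dimensions.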
==External links==