Multiple kernel learning refers to a set of machine learning methods that use a predefined set of [[Kernel method|kernels]] and learn an optimal linear or non-linear combination of kernels as part of the algorithm. Reasons to use multiple kernel learning include a) the ability to select for an optimal kernel and parameters from a larger set of kernels, reducing bias due to kernel selection while allowing for more automated machine learning methods, and b) combining data from different sources (e.g. sound and images from a video) that have different notions of similarity and thus require different kernels. Instead of creating a new kernel, multiple kernel algorithms can be used to combine kernels already established for each individual data source.
Multiple kernel learning approaches have been used in many applications, such as event recognition in video, object recognition in images, and biomedical data fusion.
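As a minimal illustration of the combined-kernel idea described above, the following Python sketch (using NumPy and scikit-learn) forms a fixed convex combination of two precomputed Gram matrices and trains an SVM on the result. The choice of base kernels and the weights are arbitrary assumptions for the example; an actual MKL algorithm would learn the weights as part of training rather than fixing them.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.svm import SVC

# Toy data standing in for two "sources"; in real MKL each base kernel
# could come from a different data modality (e.g. audio vs. images).
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Precompute the base Gram matrices K_1 and K_2.
K_rbf = rbf_kernel(X, X, gamma=0.5)
K_lin = linear_kernel(X, X)

# Fixed convex combination K' = beta_1 * K_1 + beta_2 * K_2.
# An MKL algorithm would learn these weights; here they are assumed.
beta = np.array([0.7, 0.3])
K_combined = beta[0] * K_rbf + beta[1] * K_lin

# Train an SVM on the combined precomputed kernel.
clf = SVC(kernel="precomputed").fit(K_combined, y)
print("training accuracy:", clf.score(K_combined, y))
</syntaxhighlight>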
==Algorithms==
===Unsupervised learning===
[[Unsupervised learning|Unsupervised]] multiple kernel learning algorithms have also been proposed. These methods construct a combined kernel from unlabeled data based on criteria such as the following:
*The optimal kernel minimizes the approximation error over the data.
*The optimal kernel minimizes the distortion over all training data.
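The sketch below gives a toy illustration of these two criteria under one simple, assumed reading of them: the approximation error measures how well each point is reconstructed as a kernel-weighted average of the other points, and the distortion penalizes large combined-kernel values between points that are far apart. The base kernels, the candidate weightings, and the exact form of both quantities are assumptions for the example and differ in detail from the formulations in the literature; the weights are only evaluated here, not optimized.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def combined_kernel(X, beta, gammas=(0.1, 1.0)):
    """Convex combination of two RBF base kernels (assumed base set)."""
    kernels = [rbf_kernel(X, X, gamma=g) for g in gammas]
    return sum(b * K for b, K in zip(beta, kernels))

def approximation_error(X, K):
    """How well each x_i is reconstructed as a kernel-weighted
    average of all points (one simple reading of the first criterion)."""
    W = K / K.sum(axis=1, keepdims=True)   # row-normalised weights
    X_hat = W @ X
    return np.sum((X - X_hat) ** 2)

def distortion(X, K):
    """Penalise large kernel values between far-apart points
    (one simple reading of the second criterion)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.sum(K * sq_dists)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Compare two candidate weightings; an unsupervised MKL method
# would search over beta instead of enumerating a few candidates.
for beta in ([0.9, 0.1], [0.5, 0.5]):
    K = combined_kernel(X, beta)
    print(beta, approximation_error(X, K), distortion(X, K))
</syntaxhighlight>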
==MKL Libraries==
Available MKL libraries include:
* [http://www.cs.cornell.edu/~ashesh/pubs/code/SPG-GMKL/download.html SPG-GMKL]: A scalable C++ MKL SVM library that can handle a million kernels.<ref>Ashesh Jain, S. V. N. Vishwanathan and Manik Varma. SPG-GMKL: Generalized multiple kernel learning with a million kernels. In Proceedings of the ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Beijing, China, August 2012.</ref>
* [http://research.microsoft.com/en-us/um/people/manik/code/GMKL/download.html GMKL]: Generalized Multiple Kernel Learning code in [[MATLAB]], does <math>\ell_1</math> and <math>\ell_2</math> regularization for supervised learning.<ref>M. Varma and B. R. Babu. More generality in efficient multiple kernel learning. In Proceedings of the International Conference on Machine Learning, Montreal, Canada, June 2009.</ref>
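As a generic aside on what <math>\ell_1</math> versus <math>\ell_2</math> regularization of the kernel weights means in practice (this is not the GMKL code or API, only an assumed illustration with made-up weight values): the <math>\ell_1</math> penalty tends to set some kernel weights exactly to zero, giving a sparse kernel combination, while the <math>\ell_2</math> penalty shrinks all weights without eliminating any. The sketch compares the closed-form minimizers of a simple penalized least-squares problem under the two penalties.

<syntaxhighlight lang="python">
import numpy as np

# Unregularised "ideal" kernel weights (assumed values for illustration).
v = np.array([0.05, 0.40, 0.02, 0.53])
lam = 0.1

# Minimiser of 0.5*||b - v||^2 + lam*||b||_1: soft-thresholding
# drives small weights to exactly zero (sparse kernel combination).
beta_l1 = np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Minimiser of 0.5*||b - v||^2 + (lam/2)*||b||^2: uniform shrinkage
# keeps every kernel active.
beta_l2 = v / (1.0 + lam)

print("l1:", beta_l1)   # small weights become exactly zero
print("l2:", beta_l2)   # all weights shrink but stay non-zero
</syntaxhighlight>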