{{context|date=May 2012}}
'''Regularization perspectives on support-vector machines''' provide a way of interpreting [[support-vector machine]]s (SVMs) within the framework of [[regularization (mathematics)|regularization]]-based machine learning. From this perspective, the SVM is a special case of [[Tikhonov regularization]] in which the loss function is the [[hinge loss]], which allows SVMs to be analysed and compared with other regularization-based learning algorithms.
==Theoretical background==
In the [[statistical learning theory]] framework, an [[algorithm]] is a strategy for choosing a [[function (mathematics)|function]] <math>f \colon \mathbf{X} \to \mathbf{Y}</math> given a training set <math>S = \{(x_1, y_1), \ldots, (x_n, y_n)\}</math> of inputs <math>x_i</math> and their labels <math>y_i</math>. [[Regularization (mathematics)|Regularization]] strategies avoid [[overfitting]] by choosing a function that fits the data well but is not too complex. A [[Tikhonov regularization]] scheme chooses

<math>f = \underset{f \in \mathcal{H}}{\operatorname{argmin}} \left\{ \frac{1}{n} \sum_{i=1}^n V(y_i, f(x_i)) + \lambda \|f\|_\mathcal{H}^2 \right\},</math>

where <math>\mathcal{H}</math> is a [[hypothesis space]]<ref>A hypothesis space is the set of functions used to model the data in a machine-learning problem.</ref> of functions, <math>V \colon \mathbf{Y} \times \mathbf{Y} \to \mathbb{R}</math> is the [[loss function]], <math>\|\cdot\|_\mathcal{H}</math> is a [[norm (mathematics)|norm]] on the hypothesis space, and <math>\lambda > 0</math> is the regularization parameter controlling the trade-off between fitting the data and the complexity of <math>f</math>.

When <math>\mathcal{H}</math> is a [[reproducing kernel Hilbert space]], there exists a [[kernel function]] <math>K \colon \mathbf{X} \times \mathbf{X} \to \mathbb{R}</math>, and by the [[representer theorem]] the minimizer can be written as <math>f(x) = \sum_{j=1}^n c_j K(x, x_j)</math>, with <math>\|f\|_\mathcal{H}^2 = \sum_{i=1}^n \sum_{j=1}^n c_i c_j K(x_i, x_j) = c^\mathsf{T} \mathbf{K} c</math>, where <math>\mathbf{K}</math> is the <math>n \times n</math> [[kernel matrix]] with entries <math>\mathbf{K}_{ij} = K(x_i, x_j)</math>. The regularization problem therefore reduces to finding the coefficient vector <math>c \in \mathbb{R}^n</math>.
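As a concrete illustration of this setup with a loss other than the hinge loss, the square loss <math>V(y_i, f(x_i)) = (y_i - f(x_i))^2</math> yields the coefficients in closed form, <math>c = (\mathbf{K} + \lambda n \mathbf{I})^{-1} y</math> (regularized least squares). The following minimal sketch solves this system; the Gaussian kernel, its width, and the synthetic data are assumptions made here only for illustration.

<syntaxhighlight lang="python">
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix with entries exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

# Synthetic training set (illustration only): 2-D inputs with +/-1 labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

lam = 0.1                          # regularization parameter lambda
n = len(y)
K = gaussian_kernel(X, X)          # n x n kernel matrix

# For the square loss, the Tikhonov minimizer over the representer-theorem
# coefficients c solves the linear system (K + lambda * n * I) c = y.
c = np.linalg.solve(K + lam * n * np.eye(n), y)

def f(X_new):
    """Evaluate f(x) = sum_j c_j K(x, x_j) at new inputs."""
    return gaussian_kernel(X_new, X) @ c

print(np.sign(f(X)))               # predicted labels on the training inputs
</syntaxhighlight>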
==Special properties of the hinge loss==
[[File:Hinge and Misclassification Loss.png|thumb|right|Hinge and misclassification loss functions]]
The simplest and most intuitive loss function for categorization is the misclassification loss, or 0–1 loss, which is 0 if <math>f(x_i) = y_i</math> and 1 if <math>f(x_i) \neq y_i</math>, i.e. the [[Heaviside step function]] on <math>-y_i f(x_i)</math>. Under this loss, the ideal classifier, the [[Bayes classifier]], predicts whichever label is more probable given the input:

<math>f_b(x) = \begin{cases} 1, & p(1\mid x) > p(-1\mid x), \\ -1, & p(1\mid x) < p(-1\mid x). \end{cases}</math>

The misclassification loss, however, is neither convex nor continuous, which makes minimizing it directly computationally intractable. The hinge loss <math>(1 - y f(x))_+</math> is a [[convex function|convex]] upper bound on it that can be minimized efficiently and whose minimizer agrees in sign with the Bayes classifier.
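The relationship between the two losses can also be seen numerically; the following sketch (the example scores are arbitrary) evaluates both losses for a positive example and shows that the hinge loss is never smaller than the misclassification loss.

<syntaxhighlight lang="python">
import numpy as np

def misclassification_loss(y, fx):
    """0-1 loss: 1 when sign(f(x)) disagrees with the label y, else 0."""
    return float(np.sign(fx) != y)

def hinge_loss(y, fx):
    """Hinge loss (1 - y*f(x))_+, a convex upper bound on the 0-1 loss."""
    return max(0.0, 1.0 - y * fx)

# Compare both losses for a positive example (y = +1) across a range of scores.
for fx in np.linspace(-2.0, 2.0, 9):
    print(f"f(x) = {fx:+.1f}   0-1 loss = {misclassification_loss(1, fx):.0f}"
          f"   hinge loss = {hinge_loss(1, fx):.1f}")
</syntaxhighlight>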
==Derivation==
The Tikhonov regularization problem can be shown to be equivalent to traditional formulations of SVM by expressing it in terms of the hinge loss.<ref>For a detailed derivation, see {{cite book |last=Rifkin |first=Ryan |title=Everything Old is New Again: A Fresh Look at Historical Approaches in Machine Learning |year=2002 |publisher=MIT (PhD thesis) |url=http://web.mit.edu/~9.520/www/Papers/thesis-rifkin.pdf}}</ref> With the hinge loss
<math>V(y_i, f(x_i)) = (1 - y_i f(x_i))_+,</math>
where <math>(s)_+ = \max(s, 0)</math>, the regularization problem becomes
<math>f = \underset{f \in \mathcal{H}}{\operatorname{argmin}} \left\{ \frac{1}{n} \sum_{i=1}^n (1 - y_i f(x_i))_+ + \lambda \|f\|^2_\mathcal{H} \right\}.</math>
Multiplying by <math>1/(2\lambda)</math> yields
<math>f = \underset{f \in \mathcal{H}}{\operatorname{argmin}} \left\{ C \sum_{i=1}^n (1 - y_i f(x_i))_+ + \frac{1}{2} \|f\|^2_\mathcal{H} \right\},</math>
with <math>C = 1/(2\lambda n)</math>, which is equivalent to the standard SVM minimization problem.
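Since the two objective functions differ only by the positive constant factor <math>1/(2\lambda)</math>, they have the same minimizer. This scaling can be checked numerically; in the following sketch the kernel matrix, labels, and coefficient vectors are synthetic and serve only as an illustration.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 30
lam = 0.05                         # regularization parameter lambda
C = 1.0 / (2.0 * lam * n)          # corresponding SVM parameter

# Synthetic positive-definite kernel matrix K and +/-1 labels (illustration only).
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)
y = rng.choice([-1.0, 1.0], size=n)

def tikhonov_objective(c):
    """(1/n) * sum of hinge losses + lambda * ||f||_H^2, with f(x_i) = (Kc)_i."""
    fx = K @ c
    return np.mean(np.maximum(0.0, 1.0 - y * fx)) + lam * (c @ K @ c)

def svm_objective(c):
    """C * sum of hinge losses + (1/2) * ||f||_H^2."""
    fx = K @ c
    return C * np.sum(np.maximum(0.0, 1.0 - y * fx)) + 0.5 * (c @ K @ c)

# The two objectives differ only by the constant factor 1/(2*lambda), so scaling
# the SVM objective by 2*lambda recovers the Tikhonov objective exactly.
for _ in range(3):
    c = rng.normal(size=n)
    print(2.0 * lam * svm_objective(c), tikhonov_objective(c))
</syntaxhighlight>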
==References==
{{Reflist}}
* {{cite journal|last=Evgeniou|first=Theodoros |author2=Massimiliano Pontil |author3=Tomaso Poggio|title=Regularization Networks and Support Vector Machines|journal=Advances in Computational Mathematics|year=2000|volume=13|issue=1|pages=1–50|doi=10.1023/A:1018946025316|url=http://cbcl.mit.edu/projects/cbcl/publications/ps/evgeniou-reviewall.pdf}}
* {{cite web|last=Joachims|first=Thorsten|title=SVMlight|url=http://svmlight.joachims.org/}}
* {{cite book|last=Vapnik|first=Vladimir|title=The Nature of Statistical Learning Theory|year=1999|publisher=Springer-Verlag|___location=New York|isbn=0-387-98780-0|url=https://books.google.com/books?hl=en&lr=&id=sna9BaxVbj8C&oi=fnd&pg=PR7&dq=vapnik+the+nature+of+statistical+learning+theory&ots=onJeJ-it9b&sig=5g3uQT1umnkJKqcPaKUqpi10DMQ#v=onepage&q=vapnik%20the%20nature%20of%20statistical%20learning%20theory&f=false}}
[[Category:Support vector machines]]