Computational learning theory
 
There are several different approaches to computational learning theory based on making different assumptions about the
[[inference]] principles used to generalize from limited data. These include different definitions of [[probability]] (see [[frequency probability]], [[Bayesian probability]]) and different assumptions about how samples are generated.{{citation needed|date=October 2017}} The different approaches include:{{citation needed|date=October 2017}}
 
* Exact learning, proposed by [[Dana Angluin]]{{citation needed|date=October 2017}};
* [[Probably approximately correct learning]] (PAC learning), proposed by [[Leslie Valiant]]<ref>{{cite journal |last1=Valiant |first1=Leslie |title=A Theory of the Learnable |journal=Communications of the ACM |date=1984 |volume=27 |issue=11 |pages=1134–1142 |url=https://www.montefiore.ulg.ac.be/~geurts/Cours/AML/Readings/Valiant.pdf |ref=ValTotL}}</ref>;
* [[VC theory]], proposed by [[Vladimir Vapnik]] and [[Alexey Chervonenkis]]<ref>{{cite journal |last1=Vapnik |first1=V. |last2=Chervonenkis |first2=A. |title=On the uniform convergence of relative frequencies of events to their probabilities |journal=Theory of Probability and Its Applications |date=1971 |volume=16 |issue=2 |pages=264–280 |url=https://courses.engr.illinois.edu/ece544na/fa2014/vapnik71.pdf |ref=VCdim}}</ref>;
* [[Bayesian inference]]{{citation needed|date=October 2017}};
* [[Algorithmic learning theory]], from the work of [[E. Mark Gold]]<ref>{{Cite journal | last1 = Gold | first1 = E. Mark | year = 1967 | title = Language identification in the limit | journal = Information and Control | volume = 10 | issue = 5 | pages = 447–474 | doi = 10.1016/S0019-9958(67)91165-5 | url=http://web.mit.edu/~6.863/www/spring2009/readings/gold67limit.pdf | doi-access = free }}</ref>;
* [[Online machine learning]], from the work of Nick Littlestone{{citation needed|date=October 2017}}.
 
While its primary goal is to understand learning abstractly, computational learning theory has led to the development of practical algorithms. For example, PAC theory inspired [[Boosting (meta-algorithm)|boosting]], VC theory led to [[support vector machine]]s, and Bayesian inference led to [[belief networks]].
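The flavour of a PAC-style guarantee can be made concrete with a toy example. The sketch below is an illustration, not drawn from any of the cited works: the interval concept class, the uniform distribution on [0, 1], the target interval (0.3, 0.7), and all function names are assumptions chosen for the demo. The learner returns the tightest interval containing the positive examples, and its true error shrinks as the sample size grows.

```python
import random

def pac_learn_interval(m, target=(0.3, 0.7), seed=0):
    """Learn an interval on [0, 1] from m labelled uniform samples.

    The learner outputs the tightest interval containing every
    positive example -- a classic PAC-learnable concept class.
    """
    rng = random.Random(seed)
    lo, hi = target
    positives = [x for x in (rng.random() for _ in range(m)) if lo <= x <= hi]
    if not positives:           # no positive examples observed
        return (0.0, 0.0)      # an (effectively) empty hypothesis
    return (min(positives), max(positives))

def true_error(hypothesis, target=(0.3, 0.7)):
    """Probability, under the uniform distribution on [0, 1], that
    the hypothesis disagrees with the target interval."""
    (a, b), (lo, hi) = hypothesis, target
    # The learned interval always lies inside the target, so the
    # error is the mass of the target that the hypothesis misses.
    return (hi - lo) - max(0.0, b - a)

# The error decreases as the number of samples m grows.
for m in (10, 100, 1000):
    h = pac_learn_interval(m)
    print(m, round(true_error(h), 4))
```

Because the hypothesis only errs on the uncovered fringes of the target interval, a standard PAC argument bounds the number of samples needed so that, with probability at least 1 − δ, the true error is below ε.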