where <math>PD(\mathcal H)</math> is [[VC dimension#Generalizations|Pollard's pseudo-dimension]] of <math>\mathcal H</math>.
 
==Other settings==
In addition to the supervised learning setting, sample complexity is relevant to [[semi-supervised learning]] problems including [[Active learning (machine learning)|active learning]],<ref name="Balcan">{{cite journal |doi = 10.1007/s10994-010-5174-y|title = The true sample complexity of active learning|journal = Machine Learning|date = 2010|volume = 80|issue = 2–3|pages = 111–139|last1=Balcan|first1=Maria-Florina|authorlink1= Maria-Florina Balcan|last2=Hanneke|first2=Steve|last3=Wortman Vaughan|first3=Jennifer|doi-access = free}}</ref> where the algorithm can ask for labels to specifically chosen inputs in order to reduce the cost of obtaining many labels. The concept of sample complexity also shows up in [[reinforcement learning]],<ref>{{citation |last = Kakade | first = Sham | title = On the Sample Complexity of Reinforcement Learning | place = University College London | publisher = Gatsby Computational Neuroscience Unit. | series = PhD Thesis | year = 2003 | url = http://www.ias.tu-darmstadt.de/uploads/Research/NIPS2006/SK.pdf}}</ref> [[online machine learning|online learning]], and unsupervised algorithms, e.g. for [[dictionary learning]].<ref>{{cite journal |last1 = Vainsencher | first1 = Daniel | last2 = Mannor | first2 = Shie | last3 = Bruckstein | first3 = Alfred | title = The Sample Complexity of Dictionary Learning | journal = Journal of Machine Learning Research | volume = 12 | pages = 3259–3281 | date = 2011 | url = http://www.jmlr.org/papers/volume12/vainsencher11a/vainsencher11a.pdf}}</ref>