Sample complexity: Difference between revisions

==Unrestricted hypothesis space: infinite sample complexity==
<span id="No Free Lunch Theorem"></span>
One can ask whether there exists a learning algorithm so that the sample complexity is finite in the strong sense, that is, there is a bound on the number of samples needed so that the algorithm can learn any distribution over the input–output space with a specified target error. More formally, one asks whether there exists a learning algorithm <math>ALG</math> such that, for all ''ε'', ''δ'' > 0, there exists a positive integer ''N'' such that for all ''n'' ≥ ''N'', we have<math display="block">
\sup_\rho\left(\Pr_{\rho^n}[\mathcal E(h_n) - \mathcal E^*_\mathcal{H}\geq\varepsilon]\right)<\delta,
</math>
where the supremum ranges over all distributions ''ρ'' on the input–output space.
 
=== Sample-complexity bounds ===
<span id="bounds"></span>
Suppose <math>\mathcal H</math> is a class of binary functions (functions taking values in {0, 1}). Then <math>\mathcal H</math> is <math>(\epsilon,\delta)</math>-PAC-learnable with a sample of size:
<ref>{{Cite journal|title=The optimal sample complexity of PAC learning
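The bound that follows the colon is cut off in this revision. As a sketch of the standard statement, assuming <math>\mathcal H</math> has finite VC dimension ''d'' and writing ''C'' for an absolute constant, the sample-size bound takes the form:

```latex
% Sample-complexity bound for (eps, delta)-PAC learning.
% Assumptions: the class H has finite VC dimension d; C is an
% absolute constant independent of d, eps, and delta.
\[
  n \;=\; C \cdot \frac{d + \log(1/\delta)}{\epsilon}
  \;=\; O\!\left(\frac{d + \log(1/\delta)}{\epsilon}\right)
\]
```

That is, up to the constant ''C'', the number of samples needed grows linearly in the VC dimension, logarithmically in 1/''δ'', and inversely in the target error ''ε''.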