In [[machine learning]], the '''sample complexity''' of a machine learning algorithm is, roughly speaking, the number of training samples needed for the algorithm to successfully learn a target function. More specifically, the sample complexity is the number of samples needed for the function returned by the algorithm to be within an arbitrarily small error of the best possible function, with probability arbitrarily close to 1.
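This informal statement can be sketched formally; the symbols below (the tolerance <math>\epsilon</math>, the failure probability <math>\delta</math>, and the notation <math>N(\epsilon, \delta)</math>) are illustrative rather than fixed conventions. For an error tolerance <math>\epsilon > 0</math> and a confidence parameter <math>\delta \in (0, 1)</math>, the sample complexity <math>N(\epsilon, \delta)</math> is the smallest number of samples <math>n</math> such that the hypothesis <math>h_n</math> returned by the algorithm satisfies

:<math>\Pr\left[\mathcal{E}(h_n) - \min_{h \in \mathcal{H}} \mathcal{E}(h) \le \epsilon\right] \ge 1 - \delta,</math>

where <math>\mathcal{E}(h)</math> denotes the expected error of a hypothesis <math>h</math> from the hypothesis space <math>\mathcal{H}</math>. Making <math>\epsilon</math> and <math>\delta</math> small corresponds to "arbitrarily small error" and "probability arbitrarily close to 1" in the informal description above.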
There are two variants of sample complexity. The weak variant fixes a particular input-output distribution, while the strong variant takes the worst-case sample complexity over all input-output distributions. A natural question in statistical learning is to ask, for a given hypothesis space, whether the sample complexity is finite in the strong sense; that is, whether there is a bound on the number of samples needed so that an algorithm can approximately solve every possible learning problem on a particular input-output space, no matter the distribution of data over that space. The No Free Lunch Theorem, discussed below, says that this is impossible if the hypothesis space is unconstrained.<ref name=":0" />
==Definition==