K-nearest neighbors algorithm
For multi-class ''k-''NN classification, [[Thomas M. Cover|Cover]] and [[Peter E. Hart|Hart]] (1967) prove an upper bound error rate of
<math display="block">R^* \ \leq \ R_{k\mathrm{NN}} \ \leq \ R^* \left(2 - \frac{MR^*}{M-1}\right)</math>
where <math>R^*</math> is the Bayes error rate (the minimal error rate achievable by any classifier), <math>R_{k\mathrm{NN}}</math> is the asymptotic ''k''-NN error rate, and {{mvar|M}} is the number of classes in the problem. This bound is tight in the sense that both the lower and the upper bound are attained by some distribution.<ref>Devroye, L., Györfi, L. & Lugosi, G. A Probabilistic Theory of Pattern Recognition. Discrete Appl Math 73, 192–194 (1997).</ref> For <math>M=2</math> and as the Bayes error rate <math>R^*</math> approaches zero, this limit reduces to "not more than twice the Bayes error rate".
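The two-class case follows by direct substitution: setting <math>M=2</math> in the upper bound gives
<math display="block">R^* \left(2 - \frac{2R^*}{2-1}\right) \ = \ 2R^*\left(1 - R^*\right) \ \leq \ 2R^*,</math>
so in the binary setting the asymptotic nearest-neighbor error rate never exceeds twice the Bayes error rate, with the gap vanishing as <math>R^* \to 0</math>.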
 
==Error rates==