* In ''k''-NN ''regression'', the output is the property value for the object. This value is the average of the values of the ''k'' nearest neighbors. If ''k'' = 1, then the output is simply the value of that single nearest neighbor.
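The averaging step can be illustrated with a minimal sketch (Python with NumPy is assumed here; the function name <code>knn_regress</code>, the use of Euclidean distance, and the example data are illustrative only, not part of any standard implementation):

<syntaxhighlight lang="python">
import numpy as np

def knn_regress(X_train, y_train, x_query, k=3):
    """Predict a value for x_query as the mean of the k nearest training targets."""
    distances = np.linalg.norm(X_train - x_query, axis=1)  # Euclidean distance to every training point
    nearest = np.argsort(distances)[:k]                     # indices of the k closest points
    return y_train[nearest].mean()                          # average their known property values

# With k = 1 the prediction is simply the value of the single nearest neighbor.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 4.0, 9.0])
print(knn_regress(X, y, np.array([1.4]), k=2))  # averages the targets of x = 1.0 and x = 2.0 -> 2.5
</syntaxhighlight>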
''k''-NN is a type of [[classification]] where the function is only approximated locally and all computation is deferred until function evaluation. Since this algorithm relies on distance for classification, if the features represent different physical units or come in vastly different scales, then feature-wise [[Normalization (statistics)|normalization]] of the training data can greatly improve its accuracy.
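As an illustration of why rescaling matters, the following sketch assumes NumPy, Euclidean distance, and min-max rescaling as one possible normalization; the feature values (height in metres, income in dollars) are invented for the example:

<syntaxhighlight lang="python">
import numpy as np

# Two features on very different scales: height in metres and income in dollars.
# Without rescaling, the income feature dominates every Euclidean distance.
X_train = np.array([[1.60,  30000.0],
                    [1.95,  31000.0],
                    [1.62, 120000.0]])

# Min-max rescaling is one common choice; z-score standardization is another.
mins, maxs = X_train.min(axis=0), X_train.max(axis=0)
X_scaled = (X_train - mins) / (maxs - mins)

# The query point must be rescaled with the same statistics as the training data.
x_query = (np.array([1.61, 32000.0]) - mins) / (maxs - mins)
distances = np.linalg.norm(X_scaled - x_query, axis=1)
print(distances.argsort())  # neighbor ranking now reflects both features, not just income
</syntaxhighlight>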
The neighbors are taken from a set of objects for which the class (for ''k''-NN classification) or the object property value (for ''k''-NN regression) is known. This can be thought of as the training set for the algorithm, though no explicit training step is required.
A peculiarity (sometimes even a disadvantage) of the ''k''-NN algorithm is its sensitivity to the local structure of the data.
==Statistical setting==