It is efficient to scan the training examples in order of decreasing border ratio.<ref name="MirkesKnn">Mirkes, Evgeny M.; [http://www.math.le.ac.uk/people/ag153/homepage/KNN/KNN3.html ''KNN and Potential Energy: applet''], University of Leicester, 2011</ref> The border ratio of a training example ''x'' is defined as
: <math>a(x) = \frac{\left\|x'-y\right\|}{\left\|x-y\right\|}</math>
where <math>\|x-y\|</math> is the distance to the closest example <math>y</math> having a different label than <math>x</math>, and <math>\|x'-y\|</math> is the distance from <math>y</math> to its closest example <math>x'</math> with the same label as <math>x</math>.
The border ratio is in the interval [0,1] because <math>\|x'-y\|</math> never exceeds <math>\|x-y\|</math>; examples with ratio close to 1 lie near the class border, so scanning in decreasing order of <math>a(x)</math> considers border points first.
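The border ratio can be computed directly from the definition above. The following is a minimal sketch, not code from the cited applet; the function and variable names are illustrative:

```python
from math import dist

def border_ratio(x, label, examples):
    """Border ratio a(x) = ||x' - y|| / ||x - y||.

    `examples` is a list of (point, label) pairs that includes x itself.
    y  is the closest example with a different label than x;
    x' is the example with x's label that is closest to y (possibly x),
    so the ratio never exceeds 1.
    """
    # y: nearest example whose label differs from x's
    y, _ = min(((p, l) for p, l in examples if l != label),
               key=lambda pl: dist(x, pl[0]))
    # x': example with x's label that is closest to y
    xp, _ = min(((p, l) for p, l in examples if l == label),
                key=lambda pl: dist(y, pl[0]))
    return dist(xp, y) / dist(x, y)
```

Because <math>x'</math> is chosen as the closest same-label example to <math>y</math> and <math>x</math> is itself a candidate, the returned value is automatically at most 1.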
Below is an illustration of CNN in a series of figures. There are three classes (red, green and blue). Fig. 1: initially there are 60 points in each class. Fig. 2 shows the 1NN classification map: each pixel is classified by 1NN using all the data. Fig. 3 shows the 5NN classification map. White areas correspond to the unclassified regions, where the 5NN vote is tied (for example, if there are two green, two red and one blue point among the 5 nearest neighbors). Fig. 4 shows the reduced data set. The crosses are the class-outliers selected by the (3,2)NN rule (all three nearest neighbors of these instances belong to other classes); the squares are the prototypes, and the empty circles are the absorbed points. The bottom left corner shows the number of class-outliers, prototypes and absorbed points for all three classes. The number of prototypes varies from 15% to 20% for different classes in this example. Fig. 5 shows that the 1NN classification map built with the prototypes is very similar to the one built with the initial data set. The figures were produced using the Mirkes applet.<ref name = "MirkesKnn" />
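The reduction shown in the figures follows Hart's CNN loop: an example is kept as a prototype whenever the current prototype set misclassifies it under 1NN, and passes over the training set repeat until nothing new is added. A minimal sketch under those assumptions (names are illustrative; this is not the applet's code, and it omits the border-ratio ordering and the (3,2)NN outlier filter):

```python
from math import dist

def condensed_nn(examples):
    """Hart's condensed nearest neighbor reduction (sketch).

    `examples` is a list of (point, label) pairs. Returns the prototype
    subset U such that 1NN over U classifies every training example
    correctly; the remaining points are "absorbed" by U.
    """
    def predict(protos, x):
        # 1NN label of x using the current prototypes
        return min(protos, key=lambda pl: dist(x, pl[0]))[1]

    prototypes = [examples[0]]          # seed U with one example
    changed = True
    while changed:                      # repeat passes until stable
        changed = False
        for p, label in examples:
            if predict(prototypes, p) != label:
                prototypes.append((p, label))
                changed = True
    return prototypes
```

On well-separated classes this keeps only a small fraction of the points, consistent with the 15–20% prototype counts reported for the example in the figures.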