Examples of instance-based learning algorithms include the [[k-nearest neighbors algorithm|''k''-nearest neighbors algorithm]], [[kernel method|kernel machines]] and [[Radial basis function network|RBF networks]].<ref>{{cite book |author=Tom Mitchell |title=Machine Learning |year=1997 |publisher=McGraw-Hill}}</ref>{{rp|ch. 8}} These store (a subset of) their training set; when predicting a value/class for a new instance, they compute distances or similarities between this instance and the training instances in order to make a decision.
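A minimal sketch of this idea (an illustrative example, not the method of any cited source): a 1-nearest-neighbour classifier that simply stores the training set and, at prediction time, returns the label of the closest stored instance under Euclidean distance.

```python
import math

def predict(training_set, query):
    """training_set: list of (feature_vector, label) pairs.
    Returns the label of the stored instance nearest to query (1-NN)."""
    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # "Learning" is just storage; the work happens here, at query time.
    _, label = min(training_set, key=lambda pair: dist(pair[0], query))
    return label

train = [((0.0, 0.0), "A"), ((1.0, 1.0), "B"), ((0.9, 1.2), "B")]
print(predict(train, (0.1, 0.2)))  # nearest stored point is (0.0, 0.0) -> A
```

Note that no model is fit in advance: all computation is deferred until a query arrives, which is why such methods are also called "lazy" learners.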
To reduce the memory cost of storing all training instances, as well as the risk of [[overfitting]] to noise in the training set, ''instance reduction'' algorithms have been proposed.<ref>{{cite journal |title=Reduction techniques for instance-based learning algorithms |author1=D. Randall Wilson |author2=Tony R. Martinez |journal=[[Machine Learning (journal)|Machine Learning]]}}</ref>
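One classic instance-reduction scheme, shown here as an illustrative sketch (it is one of many techniques and not necessarily the one a given survey covers), is Hart's condensed nearest neighbour rule: starting from a single stored instance, it keeps only those training instances that the current stored subset would misclassify.

```python
import math

def nn_label(kept, query):
    # Label assigned by a 1-NN classifier over the kept subset.
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(kept, key=lambda pair: dist(pair[0], query))[1]

def condense(training_set):
    """Hart's condensed nearest neighbour: repeatedly add any training
    instance that the currently kept subset misclassifies, until stable."""
    kept = [training_set[0]]
    changed = True
    while changed:
        changed = False
        for point, label in training_set:
            if nn_label(kept, point) != label:
                kept.append((point, label))
                changed = True
    return kept

train = [((0.0, 0.0), "A"), ((0.1, 0.1), "A"),
         ((1.0, 1.0), "B"), ((1.1, 0.9), "B")]
reduced = condense(train)
# The reduced set still classifies every training instance correctly,
# while typically storing far fewer instances than the original set.
```

The trade-off is that aggressive reduction can discard instances near the decision boundary that would have helped on future queries, which is why later algorithms combine reduction with noise filtering.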
Gagliardi<ref name=Gagliardi2011>{{cite journal|last=Gagliardi|first=F|title=Instance-based classifiers applied to medical databases: Diagnosis and knowledge extraction|journal=Artificial Intelligence in Medicine|year=2011|volume=52|issue=3|pages=123–139}}</ref> applied this family of classifiers to medical databases, for diagnosis and for knowledge extraction.
One of these classifiers (called the ''Prototype exemplar learning classifier'', [[PEL-C]]) is able to extract a mixture of abstracted prototypical cases (which are [[syndrome]]s) and selected atypical clinical cases.