In [[machine learning]], '''instance-based learning''' (sometimes called '''memory-based learning'''<ref>{{cite book |author1=Walter Daelemans |authorlink1=Walter Daelemans |author2=Antal van den Bosch |authorlink2=Antal van den Bosch |year=2005 |title=Memory-Based Language Processing |publisher=Cambridge University Press}}</ref>) is a family of learning algorithms that, instead of performing explicit generalization, compares new problem instances with instances seen in training, which have been stored in memory.
It is called instance-based because it constructs hypotheses directly from the training instances themselves.<ref name='aima733'>[[Stuart J. Russell|Stuart Russell]] and [[Peter Norvig]] (2003). ''[[Artificial Intelligence: A Modern Approach]]'', second edition, p. 733. Prentice Hall.</ref>
This means that the hypothesis complexity can grow with the data:<ref name='aima733'/> in the worst case, a hypothesis is a list of ''n'' training items and the computational complexity of [[Classification (machine learning)|classifying]] a single new instance is [[Big O notation|''O'']](''n''). One advantage that instance-based learning has over other methods of machine learning is its ability to adapt its model to previously unseen data. Instance-based learners may simply store a new instance or throw an old instance away.
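The behaviour described above can be illustrated with a minimal sketch of the simplest instance-based learner, a 1-nearest-neighbour classifier; the class and function names here are illustrative, not from any particular library:

```python
def squared_distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class NearestNeighbour:
    """Illustrative instance-based learner (1-nearest neighbour)."""

    def __init__(self):
        # "Training" just stores the labelled instances in memory.
        self.instances = []

    def fit(self, xs, ys):
        # Adapting to new data is simply storing more instances.
        self.instances.extend(zip(xs, ys))

    def predict(self, x):
        # Classifying one query scans all n stored instances: O(n).
        _, label = min(self.instances,
                       key=lambda inst: squared_distance(inst[0], x))
        return label

clf = NearestNeighbour()
clf.fit([(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)], ["a", "a", "b"])
print(clf.predict((4.0, 4.5)))  # nearest stored instance is (5.0, 5.0) -> "b"
```

Note that there is no explicit generalization step: the stored instances themselves are the hypothesis, so memory use and prediction cost both grow with the number of training items.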