Prior knowledge for pattern recognition

Prior knowledge, as defined in [Scholkopf02], refers to all information about the problem available in addition to the training data. However, in this most general form, determining a [[Model (abstract)|model]] from a finite set of samples without prior knowledge is an [[ill-posed]] problem, in the sense that a unique model may not exist. Many classifiers incorporate the general smoothness assumption that a test pattern similar to one of the training samples tends to be assigned to the same class.
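A minimal illustration of the smoothness assumption (not drawn from the article itself) is the nearest-neighbour classifier: a test pattern is simply assigned the class of the most similar training sample, which is exactly the assumption that nearby patterns share a class. The function and sample data below are hypothetical, for illustration only.

```python
def nearest_neighbour_classify(train, test_point):
    """Assign test_point the label of its closest training sample.

    train: list of (feature_vector, label) pairs.
    This embodies the smoothness assumption: patterns close in
    feature space are presumed to belong to the same class.
    """
    def sq_dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    _, label = min(train, key=lambda pair: sq_dist(pair[0], test_point))
    return label

# Two training samples from two classes; a test point near (0, 0)
# is assigned class "A" by proximity alone.
train = [((0.0, 0.0), "A"), ((5.0, 5.0), "B")]
print(nearest_neighbour_classify(train, (0.5, 0.2)))  # prints A
```

Without some such assumption linking unseen patterns to the training data, the finite sample alone does not determine a unique model, which is the ill-posedness noted above.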
 
The importance of prior knowledge in machine learning is suggested by its role in search and optimization. Loosely, the [[no free lunch theorem]] states that all search algorithms have the same average performance over all problems, and thus implies that to gain performance on a certain application one must use a specialized algorithm that includes some prior knowledge about the problem.
 
The different types of prior knowledge encountered in pattern recognition can be grouped into two main categories: class-invariance and knowledge of the data.
 
== Class-invariance ==