{{Short description|Method of using a pool of algorithms}}
In [[machine learning]], '''weighted majority algorithm (WMA)''' is a [[meta-learning (computer science)|meta learning]] [[algorithm]] used to construct a compound algorithm from a [[Pool (computer science)|pool]] of prediction algorithms, which could be any type of learning algorithms, classifiers, or even real human experts.<ref name="LW94">{{cite journal
| last1 = Littlestone
| first1 = N.
| last2 = Warmuth
| first2 = M.
| title = The Weighted Majority Algorithm
| journal = Information and Computation
| volume = 108
| issue = 2
| date = 1994
| pages = 212–261
| doi=10.1006/inco.1994.1009| doi-access = free
| url = http://www.dklevine.com/archive/refs4575.pdf
}}</ref><ref name="LW89">{{cite conference
| last1 = Littlestone
| first1 = N.
| last2 = Warmuth
| first2 = M.
| title = Weighted Majority Algorithm
 | conference = IEEE Symposium on Foundations of Computer Science
| date = 1989}}</ref>
The algorithm assumes that we have no prior knowledge about the accuracy of the algorithms in the pool, but there are sufficient reasons to believe that one or more will perform well.
 
Assume that the problem is a binary [[decision problem]]. To construct the compound algorithm, a positive weight is given to each of the algorithms in the pool. The compound algorithm then collects weighted votes from all the algorithms in the pool, and gives the prediction that has a higher vote. If the compound algorithm makes a mistake, the algorithms in the pool that contributed to the wrong prediction will be discounted by a certain ratio β, where 0 < β < 1.

It can be shown that the [[Upper and lower bounds|upper bound]] on the number of mistakes made in a given sequence of predictions from a pool of algorithms <math>\mathbf{A}</math> is

:<math>O(\log|\mathbf{A}|+m)</math>

if one algorithm in <math>\mathbf{A}</math> makes at most <math>m</math> mistakes.
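The voting and discounting steps above can be sketched as follows. This is a minimal illustration, not code from the cited papers; the function name, the representation of experts as callables, and the example inputs are all invented for this sketch.

```python
def weighted_majority(experts, examples, beta=0.5):
    """Run the weighted majority algorithm on a stream of labeled examples.

    experts  -- list of callables mapping an input x to a binary prediction (0 or 1)
    examples -- iterable of (x, label) pairs
    beta     -- discount ratio, with 0 < beta < 1
    Returns the number of mistakes made by the compound algorithm.
    """
    weights = [1.0] * len(experts)
    mistakes = 0
    for x, label in examples:
        preds = [e(x) for e in experts]
        # Collect the weighted votes for each of the two outcomes.
        vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
        vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote1 > vote0 else 0
        if guess != label:
            mistakes += 1
        # Discount every expert that predicted wrongly on this round.
        weights = [w * beta if p != label else w
                   for w, p in zip(weights, preds)]
    return mistakes
```

Because every wrong expert loses a constant fraction of its weight on each mistake, the weight of the pool concentrates on the best-performing expert, which is what yields the logarithmic dependence on the pool size in the bound above.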
 
There are many variations of the weighted majority algorithm to handle different situations, like shifting targets, infinite pools, or randomized predictions. The core mechanism remains similar, with the final performance of the compound algorithm bounded by a function of the performance of the '''specialist''' (best performing algorithm) in the pool.
 
== See also ==
* [[Randomized weighted majority algorithm]]
 
==References==
<references/>
 
[[Category:Machine learning algorithms]]
 
[[Category:Algorithms]]