=== Different cost functions and regularizations ===
There are different types of non-negative matrix factorizations.
The different types arise from using different [[Loss function|cost function]]s for measuring the divergence between {{math|'''V'''}} and {{math|'''WH'''}} and possibly from [[regularization (mathematics)|regularization]] of the {{math|'''W'''}} and/or {{math|'''H'''}} matrices.<ref name="dhillon">{{Citation
| last1 = Dhillon | first1 = Inderjit S.
| last2 = Sra | first2 = Suvrit
| contribution = Generalized Nonnegative Matrix Approximations with Bregman Divergences
| contribution-url = https://proceedings.neurips.cc/paper/2005/hash/d58e2f077670f4de9cd7963c857f2534-Abstract.html
| pages = 283–290
| title = Advances in Neural Information Processing Systems 18 [Neural Information Processing Systems, NIPS 2005, December 5-8, 2005, Vancouver, British Columbia, Canada]
| year = 2005}}</ref>
Two simple divergence functions studied by Lee and Seung are the squared error (or [[Frobenius norm]]) and an extension of the Kullback–Leibler divergence to positive matrices (the original [[Kullback–Leibler divergence]] is defined on probability distributions).
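In the usual formulation these two cost functions are

:<math>\left\| \mathbf{V} - \mathbf{W}\mathbf{H} \right\|_F^2 = \sum_{i,j} \left( V_{ij} - (\mathbf{W}\mathbf{H})_{ij} \right)^2</math>

:<math>D_{\mathrm{KL}}(\mathbf{V} \,\|\, \mathbf{W}\mathbf{H}) = \sum_{i,j} \left( V_{ij} \log \frac{V_{ij}}{(\mathbf{W}\mathbf{H})_{ij}} - V_{ij} + (\mathbf{W}\mathbf{H})_{ij} \right).</math>

As a minimal illustrative sketch (the helper names below are ad hoc and not taken from the cited references), both cost functions can be evaluated directly with NumPy for given non-negative matrices:

<syntaxhighlight lang="python">
import numpy as np

def frobenius_cost(V, W, H):
    # Squared reconstruction error ||V - WH||_F^2.
    return np.sum((V - W @ H) ** 2)

def generalized_kl_cost(V, W, H, eps=1e-12):
    # Generalized Kullback-Leibler (I-)divergence D(V || WH);
    # eps guards against log(0) and division by zero for zero entries.
    WH = W @ H
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

# Toy usage with random non-negative matrices of compatible shapes.
rng = np.random.default_rng(0)
V = rng.random((6, 4))
W = rng.random((6, 2))
H = rng.random((2, 4))
print(frobenius_cost(V, W, H), generalized_kl_cost(V, W, H))
</syntaxhighlight>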