The different types arise from using different [[cost function]]s (divergence functions) and/or by [[regularization (mathematics)|regularization]] of the '''W''' and/or '''H''' matrices.<ref>[[Inderjit S. Dhillon]], [[Suvrit Sra]], "[http://books.nips.cc/papers/files/nips18/NIPS2005_0203.pdf Generalized Nonnegative Matrix Approximations with Bregman Divergences]", [[NIPS]], 2005.</ref>
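As an illustration of the most common choice of cost function, the squared [[Frobenius norm]] <math>\|\mathbf{V} - \mathbf{WH}\|_F^2</math> can be minimized with the well-known multiplicative update rules of Lee and Seung. The following is a minimal numpy sketch (the function name, rank, and iteration count are illustrative, not from any particular library):

```python
import numpy as np

def nmf_frobenius(V, r, n_iter=200, seed=0):
    """Multiplicative updates (Lee & Seung) minimizing ||V - WH||_F^2.

    Nonnegativity of W and H is preserved automatically, because each
    update multiplies the current factor by a ratio of nonnegative terms.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3   # strictly positive initialization
    H = rng.random((r, n)) + 1e-3
    eps = 1e-12                     # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Factor a small nonnegative matrix at rank 3
V = np.random.default_rng(1).random((6, 5))
W, H = nmf_frobenius(V, 3)
```

Swapping in a different divergence (e.g. Kullback–Leibler) changes only the update ratios, which is why the "different types" of NMF share the same overall alternating structure.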
== Relation to clustering ==
Although NMF was initially considered purely as a matrix factorization technique, it has been shown
<ref>
Chris Ding, Xiaofeng He, and Horst D. Simon. "On the Equivalence of Nonnegative Matrix Factorization and Spectral Clustering". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp:606-610, April 2005.</ref>
that NMF is equivalent to a relaxed [[K-means clustering]] with the Frobenius norm objective function, where '''W''' contains the cluster centroids and '''H''' contains the cluster membership indicators; NMF therefore provides a new framework for data clustering. It is also known
<ref>
Chris Ding and Xiaofeng He, " Nonnegative Matrix Factorization and Probabilistic Latent Semantic Indexing: Equivalence, Chi-square Statistic, and a Hybrid Method", Proc. AAAI National Conf. on Artificial Intelligence (AAAI-06), July 2006.</ref>
that NMF is identical to [[probabilistic latent semantic analysis]] with the [[Kullback–Leibler divergence]] objective function, which reduces to a chi-square statistic under a first-order approximation.
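The k-means connection can be checked directly in the limiting (hard-assignment) case: if '''H''' is a binary cluster-indicator matrix and the columns of '''W''' are the cluster centroids, then <math>\|\mathbf{V} - \mathbf{WH}\|_F^2</math> equals the k-means sum of squared distances to centroids. A minimal numpy sketch with an arbitrary illustrative labeling:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((2, 8))                       # 8 data points in R^2, one per column
labels = np.array([0, 0, 1, 1, 0, 1, 0, 1])  # illustrative hard assignment
k = 2

# Hard cluster-indicator matrix H (k x n): H[c, j] = 1 iff point j is in cluster c
H = np.zeros((k, V.shape[1]))
H[labels, np.arange(V.shape[1])] = 1.0

# W holds the cluster centroids as columns
W = np.column_stack([V[:, labels == c].mean(axis=1) for c in range(k)])

# k-means objective: sum of squared distances of points to their centroids
kmeans_obj = sum(np.sum((V[:, j] - W[:, labels[j]]) ** 2)
                 for j in range(V.shape[1]))

# NMF objective with the same W and the indicator H: identical value,
# since column j of WH is exactly the centroid of point j's cluster
nmf_obj = np.linalg.norm(V - W @ H, 'fro') ** 2
```

The "relaxed" k-means of the cited equivalence arises when the hard 0/1 constraint on '''H''' is loosened to mere nonnegativity.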
== Uniqueness ==
The factorization is not unique: an [[inverse matrix|invertible]] matrix '''B''' and its inverse can be used to transform the two factorization matrices, e.g.,
: <math>\mathbf{WH} = \mathbf{WBB}^{-1}\mathbf{H}</math>
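A minimal numpy sketch of this non-uniqueness: rescaling by a positive diagonal '''B''' (one choice of '''B''' that keeps both transformed factors nonnegative) yields a different factor pair with the same product.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((4, 2))
H = rng.random((2, 5))

# A positive diagonal B: W @ B and inv(B) @ H stay nonnegative
B = np.diag([2.0, 0.5])
W2 = W @ B
H2 = np.linalg.inv(B) @ H

# (W2, H2) differs from (W, H), yet W2 @ H2 == W @ B @ inv(B) @ H == W @ H
```

For a general invertible '''B''' the product is likewise unchanged, but the transformed factors need not remain nonnegative, so not every '''B''' yields a valid alternative NMF.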