Multilinear principal component analysis

'''Multilinear Principal Component Analysis''' (MPCA) concerns dimension reduction of multidimensional data. Such data is represented by an array indexed by three or more indices, referred to as a tensor. Multilinear PCA is a family of algorithms and approaches that extend [[principal component analysis]] (PCA) to handle data of multidimensional format. Just as for usual PCA, the aim of such methods is to compress data and to facilitate its analysis. For example, it can be used for data visualization.
 
MPCA can also refer to one particular algorithm of the same name, one approach among several to extending PCA to multidimensional data. It works as follows: a set of orthonormal matrices is computed, one associated with each mode of the data tensor, and the data is expressed with respect to this matrix basis. The matrices are analogues of the vector principal components that occur in usual PCA. The transformation aims to account for as much of the variance in the data as possible, given a restricted number of basis matrices.
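
The following is a minimal single-pass sketch of this construction in NumPy. The helper names <code>unfold</code> and <code>multilinear_pca</code> are illustrative rather than from any particular library, and published MPCA algorithms typically refine the per-mode factors by alternating iterations rather than stopping after one pass over the modes:

<syntaxhighlight lang="python">
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: a matrix whose rows are indexed by the chosen
    # mode and whose columns run over all remaining modes.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def multilinear_pca(tensor, ranks):
    # One truncated SVD per mode yields an orthonormal factor matrix,
    # the analogue of PCA's principal directions for that mode.
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    # Express the data with respect to the retained matrix bases
    # (mode-n products of the tensor with each transposed factor).
    core = tensor
    for mode, u in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Example: a stack of 30 grayscale images of size 40x50,
# compressed to a 5x8x8 core tensor.
X = np.random.rand(30, 40, 50)
core, factors = multilinear_pca(X, ranks=(5, 8, 8))
</syntaxhighlight>

Each factor matrix plays the role of that mode's principal components, and the core tensor is the compressed representation of the data.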
 
As a family of methods, higher-dimensional PCA can be traced back to the [[Tucker decomposition]]<ref>{{Cite journal|last1=Tucker| first1=Ledyard R
| authorlink1 = Ledyard R Tucker
| title = Some mathematical notes on three-mode factor analysis
| journal = Psychometrika
|date=September 1966
| doi = 10.1007/BF02289464
}}</ref> and Peter Kroonenberg's "M-mode PCA/3-mode PCA" work.<ref name="Kroonenberg1980">P. M. Kroonenberg and J. de Leeuw, [http://www.springerlink.com/content/c8551t1p31236776/ Principal component analysis of three-mode data by means of alternating least squares algorithms], ''Psychometrika'', 45 (1980), pp. 69–97.</ref> In 2000, De Lathauwer et al. restated Tucker and Kroonenberg's work in clear and concise numerical computational terms in their SIAM paper entitled [[Multilinear Singular Value Decomposition]] (HOSVD)<ref name="DeLathauwer2000a">L. De Lathauwer, B. De Moor, J. Vandewalle (2000) [http://portal.acm.org/citation.cfm?id=354398 "A multilinear singular value decomposition"], ''SIAM Journal on Matrix Analysis and Applications'', 21 (4), 1253–1278.</ref> and in their paper "On the Best Rank-1 and Rank-(R<sub>1</sub>, R<sub>2</sub>, ..., R<sub>N</sub>) Approximation of Higher-order Tensors".<ref name=DeLathauwer2000b>L. De Lathauwer, B. De Moor, J. Vandewalle (2000) [http://portal.acm.org/citation.cfm?id=354405 "On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors"], ''SIAM Journal on Matrix Analysis and Applications'', 21 (4), 1324–1342.</ref>
 
Circa 2001, Vasilescu reframed the data analysis, recognition and synthesis problems as multilinear tensor problems, based on the insight that most observed data are the compositional consequence of several causal factors of data formation and are thus well suited to multi-modal data tensor analysis. The power of the tensor framework was showcased in a visually and mathematically compelling manner by decomposing and representing joint angles or images in terms of their causal factors of data formation, in works such as ''Human Motion Signatures'' and ''TensorTextures''.<ref name="Vasilescu2004">M. A. O. Vasilescu and D. Terzopoulos (2004) [http://www.media.mit.edu/~maov/tensortextures/Vasilescu_siggraph04.pdf "TensorTextures: Multilinear Image-Based Rendering"], in ''Computer Graphics Proceedings, Annual Conference Series'' (Proc. ACM SIGGRAPH 2004, Los Angeles, CA, August 2004), 336–342.</ref> Multilinear PCA has also been applied to data whose observations are treated as matrices and concatenated into a data tensor.<ref name="MPCA2008">H. Lu, K. N. Plataniotis, and A. N. Venetsanopoulos (2008) [http://www.dsp.utoronto.ca/~haiping/Publication/MPCA_TNN08_rev2010.pdf "MPCA: Multilinear principal component analysis of tensor objects"], ''IEEE Trans. Neural Netw.'', 19 (1), 18–39.</ref>
 
MPCA computes a set of orthonormal matrices associated with each mode of the data tensor, analogous to the orthonormal row and column spaces of a matrix computed by the matrix SVD. This transformation aims to capture as much as possible of the variability in the data associated with each mode (axis) of the data tensor.
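
To illustrate this per-mode variance interpretation, the fraction of variation retained along a given mode can be read off the singular values of that mode's unfolding. This is a sketch under the same NumPy assumptions as above, with an illustrative helper name:

<syntaxhighlight lang="python">
import numpy as np

def mode_variation_retained(tensor, mode, r):
    # Squared singular values of the mode-n unfolding measure the
    # variation along that mode; keep the top r of them.
    mat = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
    s = np.linalg.svd(mat, compute_uv=False)
    return (s[:r] ** 2).sum() / (s ** 2).sum()

X = np.random.rand(30, 40, 50)
print(mode_variation_retained(X, mode=1, r=8))  # fraction of mode-1 variation kept
</syntaxhighlight>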
 
== The algorithm ==