Tensor decomposition
{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition'''<ref name=VasilescuDSP>{{Cite journal |last1=Vasilescu |first1=M.A.O. |last2=Terzopoulos |first2=D. |year=2007 |title=Multilinear (tensor) image synthesis, analysis, and recognition [exploratory dsp] |journal=IEEE Signal Processing Magazine |volume=24 |issue=6 |pages=118–123}}</ref><ref>{{Cite journal |last=Kolda |first=Tamara G. |last2=Bader |first2=Brett W. |date=2009-08-06 |title=Tensor Decompositions and Applications |url=http://epubs.siam.org/doi/10.1137/07070111X |journal=SIAM Review |language=en |volume=51 |issue=3 |pages=455–500 |doi=10.1137/07070111X |issn=0036-1445}}</ref><ref>{{Cite journal |last=Sidiropoulos |first=Nicholas D. |last2=De Lathauwer |first2=Lieven |last3=Fu |first3=Xiao |last4=Huang |first4=Kejun |last5=Papalexakis |first5=Evangelos E. |last6=Faloutsos |first6=Christos |date=2017-07-01 |title=Tensor Decomposition for Signal Processing and Machine Learning |url=http://ieeexplore.ieee.org/document/7891546 |journal=IEEE Transactions on Signal Processing |language=en |volume=65 |issue=13 |pages=3551–3582 |doi=10.1109/TSP.2017.2690524 |issn=1053-587X}}</ref> is any scheme for expressing a "[[tensor|data tensor]]" (M-way array) as a sequence of elementary operations acting on other, often simpler tensors.
Many tensor decompositions generalize some [[matrix decomposition]]s.<ref>{{Cite journal|date=2013-05-01|title=General tensor decomposition, moment matrices and applications|url=https://www.sciencedirect.com/science/article/pii/S0747717112001290|journal=Journal of Symbolic Computation|language=en|volume=52|pages=51–71|doi=10.1016/j.jsc.2012.05.012|issn=0747-7171|arxiv=1105.1229|last1=Bernardi |first1=A. |last2=Brachat |first2=J. |last3=Comon |first3=P. |last4=Mourrain |first4=B. |s2cid=14181289 }}</ref>
 
[[Tensors]] are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional fields.<ref name="VasilescuDSP"/><ref>{{Cite journal |last=Rabanser |first=Stephan |last2=Shchur |first2=Oleksandr |last3=Günnemann |first3=Stephan |date=2017 |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |url=https://arxiv.org/abs/1711.10781 |doi=10.48550/ARXIV.1711.10781}}</ref>
The main tensor decompositions are:
* [[Tensor rank decomposition]]<ref>{{Cite journal |last=Papalexakis |first=Evangelos E. |date=2016-06-30 |title=Automatic Unsupervised Tensor Mining with Quality Assessment |url=https://epubs.siam.org/doi/10.1137/1.9781611974348.80 |journal=Proceedings of the 2016 SIAM International Conference on Data Mining |language=en |publisher=Society for Industrial and Applied Mathematics |pages=711–719 |doi=10.1137/1.9781611974348.80 |isbn=978-1-61197-434-8}}</ref>;
* [[Higher-order singular value decomposition]];<ref>{{Cite journal|first1=M.A.O.|last1=Vasilescu
|first2=D.|last2=Terzopoulos
|url=http://www.cs.toronto.edu/~maov/tensorfaces/Springer%20ECCV%202002_files/eccv02proceeding_23500447.pdf
|title=Multilinear Analysis of Image Ensembles: TensorFaces
|series=Lecture Notes in Computer Science; (Presented at Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark)
|publisher=Springer, Berlin, Heidelberg
|volume=2350
|doi=10.1007/3-540-47969-4_30
|isbn=978-3-540-43745-1
|year=2002
}}</ref>;
* [[Tucker decomposition]];
* [[matrix product state]]s and operators, also known as tensor trains;
* [[Online Tensor Decompositions]];<ref>{{Cite journal |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos E. |title=SamBaTen: Sampling-based Batch Incremental Tensor Decomposition |journal=Proceedings of the 2018 SIAM International Conference on Data Mining |date=7 May 2018 |doi=10.1137/1.9781611975321}}</ref><ref>{{Cite journal |last1=Gujral |first1=Ekta |last2=Papalexakis |first2=Evangelos E. |title=OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors |journal=2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA) |date=9 October 2020 |doi=10.1109/DSAA49011.2020.00029}}</ref><ref name="ektagujral">{{Cite journal |last=Gujral |first=Ekta |date=2022 |title=Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition |url=https://arxiv.org/abs/2210.04404 |doi=10.48550/ARXIV.2210.04404}}</ref>
* [[hierarchical Tucker decomposition]]<ref name=Vasilescu2019>{{Citation |first1=M.A.O.|last1=Vasilescu|first2=E.|last2=Kim|date=2019|title=Compositional Hierarchical Tensor Factorization: Representing Hierarchical Intrinsic and Extrinsic Causal Factors|work=In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD’19): Tensor Methods for Emerging Data Science Challenges|url=https://arxiv.org/pdf/1911.04180.pdf}}</ref>; and
* [[block term decomposition]].<ref>{{Cite journal |last=De Lathauwer |first=Lieven |title=Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness |url=http://epubs.siam.org/doi/10.1137/070690729 |journal=SIAM Journal on Matrix Analysis and Applications |language=en |doi=10.1137/070690729}}</ref><ref>{{Citation|first1=M.A.O.|last1=Vasilescu|first2=E.|last2=Kim|first3=X.S.|last3=Zeng|title=CausalX: Causal eXplanations and Block Multilinear Factor Analysis|work=Conference Proc. of the 2020 25th International Conference on Pattern Recognition (ICPR 2020)}}</ref><ref name="Vasilescu2019"/><ref>{{Cite journal |last=Gujral |first=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos |date=2020-04-20 |title=Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs |url=https://dl.acm.org/doi/10.1145/3366423.3380129 |journal=Proceedings of The Web Conference 2020 |language=en |___location=Taipei Taiwan |publisher=ACM |pages=452–462 |doi=10.1145/3366423.3380129 |isbn=978-1-4503-7023-3}}</ref>
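As a concrete illustration of the first item in this list, a rank-''R'' tensor rank (CP) decomposition expresses a three-way tensor as a sum of ''R'' outer products of vectors. The following is a minimal NumPy sketch of the reconstruction step only (not a fitting algorithm); the factor-matrix names and sizes are illustrative assumptions, not from any particular source:

```python
import numpy as np

# Build a third-order tensor of CP rank at most 2 from factor matrices:
#   X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
# i.e. X is a sum of R = 2 outer products a_r (x) b_r (x) c_r.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))  # mode-1 factor matrix (I x R)
B = rng.standard_normal((5, 2))  # mode-2 factor matrix (J x R)
C = rng.standard_normal((6, 2))  # mode-3 factor matrix (K x R)

X = np.einsum('ir,jr,kr->ijk', A, B, C)  # 4 x 5 x 6 data tensor

print(X.shape)  # (4, 5, 6)
```

A quick consistency check: every mode-1 unfolding row of such a tensor lies in the span of the ''R'' columns of ''A'', so the matrix rank of the unfolding is at most ''R''.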
==Preliminary Definitions and Notation==
This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used throughout this article is given in the table below.
|}
==Introduction==
A multi-way graph with K perspectives is a collection of K matrices <math>\{X_1, X_2, \dots, X_K\}</math>, each of dimensions I × J (where I and J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term "dimension", we call an I × J × K tensor a three-"mode" tensor, where the "modes" are the indices used to address the entries of the tensor.
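The stacking described above can be sketched in a few lines of NumPy. The two toy adjacency matrices below are illustrative assumptions (any K matrices of matching shape would do):

```python
import numpy as np

# A toy multi-way graph: the same I = J = 3 nodes observed under
# K = 2 perspectives, each giving a 3 x 3 adjacency matrix.
X1 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]])
X2 = np.array([[0, 0, 1],
               [0, 0, 1],
               [1, 1, 0]])

# Stack the K matrices along a new third axis: an I x J x K tensor
# with one "mode" per index (nodes, nodes, perspectives).
X = np.stack([X1, X2], axis=-1)

print(X.shape)  # (3, 3, 2)
```

Each frontal slice `X[:, :, k]` recovers the adjacency matrix of the k-th perspective unchanged.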