{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition''' is any scheme for expressing a [[Tensor (machine learning)|"data tensor"]] (M-way array) as a sequence of elementary operations acting on other, often simpler tensors.<ref name="VasilescuDSP">{{cite journal |first1=M.A.O. |last1=Vasilescu |first2=D. |last2=Terzopoulos |title=Multilinear (tensor) image synthesis, analysis, and recognition [exploratory dsp] |journal=IEEE Signal Processing Magazine |date=2007 |volume=24 |issue=6 |pages=118–123 |doi=10.1109/MSP.2007.906024 |bibcode=2007ISPM...24R.118V}}</ref><ref>{{Cite journal |last1=Kolda |first1=Tamara G. |last2=Bader |first2=Brett W. |date=2009-08-06 |title=Tensor Decompositions and Applications |url=http://epubs.siam.org/doi/10.1137/07070111X |journal=SIAM Review |language=en |volume=51 |issue=3 |pages=455–500 |doi=10.1137/07070111X |bibcode=2009SIAMR..51..455K |s2cid=16074195 |issn=0036-1445 |url-access=subscription}}</ref><ref>{{Cite journal |last1=Sidiropoulos |first1=Nicholas D. |last2=De Lathauwer |first2=Lieven |last3=Fu |first3=Xiao |last4=Huang |first4=Kejun |last5=Papalexakis |first5=Evangelos E. |last6=Faloutsos |first6=Christos |date=2017-07-01 |title=Tensor Decomposition for Signal Processing and Machine Learning |journal=IEEE Transactions on Signal Processing |volume=65 |issue=13 |pages=3551–3582 |doi=10.1109/TSP.2017.2690524 |arxiv=1607.01668 |bibcode=2017ITSP...65.3551S |s2cid=16321768 |issn=1053-587X}}</ref> Many tensor decompositions generalize some [[matrix decomposition]]s.<ref>{{Cite journal |date=2013-05-01 |title=General tensor decomposition, moment matrices and applications |url=https://www.sciencedirect.com/science/article/pii/S0747717112001290 |journal=Journal of Symbolic Computation |language=en |volume=52 |pages=51–71 |doi=10.1016/j.jsc.2012.05.012 |issn=0747-7171 |arxiv=1105.1229 |last1=Bernardi |first1=A. |last2=Brachat |first2=J. |last3=Comon |first3=P. |last4=Mourrain |first4=B. |hdl=11572/134905 |s2cid=14181289}}</ref>

[[Tensor]]s are generalizations of matrices to higher dimensions, or rather to higher orders, i.e. a greater number of indices, and can consequently be treated as multidimensional fields.<ref name="VasilescuDSP" /><ref>{{Cite arXiv |last1=Rabanser |first1=Stephan |last2=Shchur |first2=Oleksandr |last3=Günnemann |first3=Stephan |date=2017 |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |class=stat.ML |eprint=1711.10781}}</ref>
The main tensor decompositions are:
* [[Tensor rank decomposition]] (see the sketch following this list);<ref>{{Cite book |last=Papalexakis |first=Evangelos E. |chapter=Automatic Unsupervised Tensor Mining with Quality Assessment |date=2016-06-30 |title=Proceedings of the 2016 SIAM International Conference on Data Mining |chapter-url=https://epubs.siam.org/doi/10.1137/1.9781611974348.80 |language=en |publisher=Society for Industrial and Applied Mathematics |pages=711–719 |doi=10.1137/1.9781611974348.80 |arxiv=1503.03355 |isbn=978-1-61197-434-8 |s2cid=10147789}}</ref>
* [[Higher-order singular value decomposition]];<ref>{{Cite book
|first1 = M.A.O.
|last1 = Vasilescu
|first2 = D.
|last2 = Terzopoulos
|url = http://www.cs.toronto.edu/~maov/tensorfaces/Springer%20ECCV%202002_files/eccv02proceeding_23500447.pdf
|title = Multilinear Analysis of Image Ensembles: TensorFaces
|series = Lecture Notes in Computer Science; (Presented at Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark)
|publisher = Springer, Berlin, Heidelberg
|volume = 2350
|doi = 10.1007/3-540-47969-4_30
|isbn = 978-3-540-43745-1
|year = 2002
|archive-date = 2022-12-29
|access-date = 2023-03-19
|archive-url = https://web.archive.org/web/20221229090931/http://www.cs.toronto.edu/~maov/tensorfaces/Springer%20ECCV%202002_files/eccv02proceeding_23500447.pdf
|url-status = dead
}}</ref>
* [[Tucker decomposition]];
* [[matrix product state]]s and operators, or tensor trains;
* [[Online Tensor Decompositions]];<ref>{{Cite book |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos E. |editor-first1=Martin |editor-last1=Ester |editor-first2=Dino |editor-last2=Pedreschi |chapter=SamBaTen: Sampling-based Batch Incremental Tensor Decomposition |title=Proceedings of the 2018 SIAM International Conference on Data Mining |date=7 May 2018 |doi=10.1137/1.9781611975321 |isbn=978-1-61197-532-1 |s2cid=21674935 |hdl=10536/DRO/DU:30109588 |hdl-access=free}}</ref><ref>{{Cite book |last1=Gujral |first1=Ekta |last2=Papalexakis |first2=Evangelos E. |title=2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA) |chapter=OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors |date=9 October 2020 |pages=168–177 |doi=10.1109/DSAA49011.2020.00029 |isbn=978-1-7281-8206-3 |s2cid=227123356}}</ref><ref name="ektagujral">{{Cite arXiv |last=Gujral |first=Ekta |date=2022 |title=Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition |class=cs.SI |eprint=2210.04404}}</ref>
* [[hierarchical Tucker decomposition]];<ref name="Vasilescu2019">{{cite conference |first1=M.A.O. |last1=Vasilescu |first2=E. |last2=Kim |date=2019 |title=Compositional Hierarchical Tensor Factorization: Representing Hierarchical Intrinsic and Extrinsic Causal Factors |conference=The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD'19): Tensor Methods for Emerging Data Science Challenges |eprint=1911.04180}}</ref> and
* [[block term decomposition]].<ref>{{Cite journal |last=De Lathauwer |first=Lieven |title=Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness |url=http://epubs.siam.org/doi/abs/10.1137/070690729 |journal=SIAM Journal on Matrix Analysis and Applications |year=2008 |volume=30 |issue=3 |pages=1033–1066 |language=en |doi=10.1137/070690729 |url-access=subscription}}</ref><ref>{{citation |first1=M.A.O. |last1=Vasilescu |first2=E. |last2=Kim |first3=X.S. |last3=Zeng |title=CausalX: Causal eXplanations and Block Multilinear Factor Analysis |work=Conference Proc. of the 2020 25th International Conference on Pattern Recognition (ICPR 2020) |year=2021 |pages=10736–10743 |doi=10.1109/ICPR48806.2021.9412780 |arxiv=2102.12853 |isbn=978-1-7281-8808-9 |s2cid=232046205}}</ref><ref name="Vasilescu2019" /><ref>{{cite book |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos |title=Proceedings of the Web Conference 2020 |chapter=Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs |date=2020-04-20 |chapter-url=https://dl.acm.org/doi/abs/10.1145/3366423.3380129 |language=en |___location=Taipei, Taiwan |publisher=ACM |pages=452–462 |doi=10.1145/3366423.3380129 |isbn=978-1-4503-7023-3 |s2cid=212745714}}</ref>
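
For concreteness, the first of these, the [[tensor rank decomposition]] (also called the CP decomposition), expresses a third-order tensor as a sum of ''R'' outer products of vectors. The following NumPy sketch builds such a tensor from its factors; the dimensions, rank, and factor matrices are arbitrary placeholders chosen only for illustration.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative dimensions and rank; real data would fix these.
I, J, K, R = 4, 5, 6, 2
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))  # factor matrix for mode 1
B = rng.standard_normal((J, R))  # factor matrix for mode 2
C = rng.standard_normal((K, R))  # factor matrix for mode 3

# CP model: X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(X.shape)  # (4, 5, 6); X has rank at most R by construction
</syntaxhighlight>

Recovering the factors from a given tensor is the decomposition problem proper, typically solved by alternating least squares (for example, in libraries such as TensorLy).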
==Preliminary Definitions and Notation==
This section introduces basic notation and operations that are widely used in the field; the table below summarizes the symbols, and a short numerical sketch of the operations follows it.<ref name="ektagujral" />
 
{| class="wikitable"
! Symbols !! Definition
|-
| <math>a,\ {\bf a},\ {\bf a}^T,\ {\bf A},\ {\mathcal A}</math> || scalar, column vector, row vector, matrix, tensor
|-
| <math>{\bf a} = \operatorname{vec}(\cdot)</math> || vectorization of a matrix or a tensor
|-
| <math>{\bf A}_{[m]}</math> || mode-''m'' matricization of tensor <math>{\mathcal A}</math>
|-
| <math>\times_m</math> || mode-''m'' product
|}
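
As a minimal sketch, these operations can be realized in NumPy as follows; the unfolding convention used here (mode ''m'' moved to the first axis before reshaping) is one common choice, and other references order the remaining modes differently.

<syntaxhighlight lang="python">
import numpy as np

A = np.arange(24, dtype=float).reshape(2, 3, 4)  # a 2 x 3 x 4 tensor

# vec(.): vectorization stacks all entries into a single vector
a = A.reshape(-1)                                # shape (24,)

def unfold(T, m):
    """Mode-m matricization: rows indexed by mode m."""
    return np.moveaxis(T, m, 0).reshape(T.shape[m], -1)

def mode_m_product(T, U, m):
    """Mode-m product of tensor T with matrix U of shape (J, I_m)."""
    return np.moveaxis(np.tensordot(U, T, axes=(1, m)), 0, m)

U = np.ones((5, 3))          # maps mode 1 from size 3 to size 5
B = mode_m_product(A, U, 1)  # shape (2, 5, 4)

# Defining property: unfolding the product equals U times the unfolding
assert np.allclose(unfold(B, 1), U @ unfold(A, 1))
</syntaxhighlight>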
==Introduction==
A multi-way graph with K perspectives is a collection of K matrices <math>X_1, X_2, \ldots, X_K</math> with dimensions I × J (where I, J are the number of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. In order to avoid overloading the term "dimension", we call an I × J × K tensor a three-"mode" tensor, where a "mode" is one of the indices used to address the tensor.
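
A minimal NumPy sketch of this construction, using small random adjacency matrices as placeholders for real graph perspectives:

<syntaxhighlight lang="python">
import numpy as np

I = J = 10  # number of nodes
K = 3       # number of perspectives (views)
rng = np.random.default_rng(1)

# K views of the graph, each an I x J adjacency matrix (placeholder data)
views = [rng.integers(0, 2, size=(I, J)) for _ in range(K)]

# Stack the views along a third mode to form an I x J x K tensor
X = np.stack(views, axis=2)
print(X.shape)  # (10, 10, 3)

# The k-th frontal slice recovers the k-th view
assert np.array_equal(X[:, :, 0], views[0])
</syntaxhighlight>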
==References==
{{Reflist}}