{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition''' is any scheme for expressing a [[Tensor (machine learning)|"data tensor"]] (M-way array) as a sequence of elementary operations acting on other, often simpler tensors.<ref name=VasilescuDSP>{{cite journal|first1=MAO|last1=Vasilescu|first2=D|last2=Terzopoulos|title=Multilinear (tensor) image synthesis, analysis, and recognition [exploratory dsp]|journal=IEEE Signal Processing Magazine|date=2007 |volume=24|issue=6|pages=118–123|doi=10.1109/MSP.2007.906024 |bibcode=2007ISPM...24R.118V }}</ref><ref>{{Cite journal |last1=Kolda |first1=Tamara G. |last2=Bader |first2=Brett W. |date=2009-08-06 |title=Tensor Decompositions and Applications |url=http://epubs.siam.org/doi/10.1137/07070111X |journal=SIAM Review |language=en |volume=51 |issue=3 |pages=455–500 |doi=10.1137/07070111X |bibcode=2009SIAMR..51..455K |s2cid=16074195 |issn=0036-1445|url-access=subscription }}</ref>
[[Tensors]] are generalizations of matrices to higher dimensions (or rather to higher orders, i.e. a higher number of indices) and can consequently be treated as multidimensional fields.<ref name="VasilescuDSP"/><ref>{{Cite arXiv |last1=Rabanser |first1=Stephan |last2=Shchur |first2=Oleksandr |last3=Günnemann |first3=Stephan |date=2017 |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |class=stat.ML |eprint=1711.10781}}</ref>
The main tensor decompositions are:
* [[Tensor rank decomposition]];
* [[Higher-order singular value decomposition]];<ref>{{Cite book
|first1 = M.A.O. |last1 = Vasilescu |first2 = D.
|last2 = Terzopoulos
|url = http://www.cs.toronto.edu/~maov/tensorfaces/Springer%20ECCV%202002_files/eccv02proceeding_23500447.pdf
|title = Multilinear Analysis of Image Ensembles: TensorFaces
|series = Lecture Notes in Computer Science; (Presented at Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark) |publisher = Springer, Berlin, Heidelberg
|volume = 2350
|doi = 10.1007/3-540-47969-4_30
|isbn = 978-3-540-43745-1
|year = 2002
|access-date = 2023-03-19
|archive-url = https://web.archive.org/web/20221229090931/http://www.cs.toronto.edu/~maov/tensorfaces/Springer%20ECCV%202002_files/eccv02proceeding_23500447.pdf
|archive-date = 2022-12-29
|url-status = dead
}}</ref>
* [[Tucker decomposition]];
* [[matrix product state]]s and operators, also known as tensor trains;
* [[Online Tensor Decompositions]];
* [[hierarchical Tucker decomposition]];<ref name=Vasilescu2019>{{cite conference |first1=M.A.O.|last1=Vasilescu|first2=E.|last2=Kim|date=2019|title=Compositional Hierarchical Tensor Factorization: Representing Hierarchical Intrinsic and Extrinsic Causal Factors|conference=In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD’19): Tensor Methods for Emerging Data Science Challenges |eprint=1911.04180 }}</ref>
* [[block term decomposition]]<ref>{{Cite journal |last=De Lathauwer|first=Lieven |title=Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness |url=http://epubs.siam.org/doi/10.1137/070690729 |journal=SIAM Journal on Matrix Analysis and Applications |year=2008 |volume=30 |issue=3 |pages=1033–1066 |language=en |doi=10.1137/070690729|url-access=subscription }}</ref><ref>{{citation|first1=M.A.O.|last1=Vasilescu|first2=E.|last2=Kim|first3=X.S.|last3=Zeng|title=CausalX: Causal eXplanations and Block Multilinear Factor Analysis |work=Conference Proc. of the 2020 25th International Conference on Pattern Recognition (ICPR 2020)|year=2021 |pages=10736–10743 |doi=10.1109/ICPR48806.2021.9412780 |arxiv=2102.12853 |isbn=978-1-7281-8808-9 |s2cid=232046205 }}</ref><ref name="Vasilescu2019" /><ref>{{cite book |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos |title=Proceedings of the Web Conference 2020 |chapter=Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs |date=2020-04-20 |chapter-url=https://dl.acm.org/doi/10.1145/3366423.3380129 |language=en |___location=Taipei Taiwan |publisher=ACM |pages=452–462 |doi=10.1145/3366423.3380129 |isbn=978-1-4503-7023-3|s2cid=212745714 }}</ref>
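As a minimal numerical sketch of the idea shared by these schemes, the following Python/NumPy code computes a higher-order SVD of a small three-mode tensor: each factor matrix comes from the SVD of a mode-n unfolding, and the core tensor is obtained by projecting onto those factors. The helper names <code>unfold</code> and <code>hosvd</code> are illustrative, not part of any standard library.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-n fibers as columns of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor):
    """Higher-order SVD: one orthogonal factor per mode, from the SVD of
    each unfolding, plus the core obtained by projecting onto them."""
    factors = [np.linalg.svd(unfold(tensor, n), full_matrices=False)[0]
               for n in range(tensor.ndim)]
    core = tensor
    for n, U in enumerate(factors):
        # Multiply the running core by U^T along mode n.
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, factors

X = np.random.rand(4, 5, 6)
core, factors = hosvd(X)

# Reconstruct by multiplying the core with each factor along its mode;
# with untruncated factors the reconstruction is exact.
Y = core
for n, U in enumerate(factors):
    Y = np.moveaxis(np.tensordot(U, np.moveaxis(Y, n, 0), axes=1), 0, n)
assert np.allclose(X, Y)
```

Truncating the columns of each factor matrix turns the same sketch into a (Tucker-style) low-multilinear-rank approximation.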
==Notation==
This section introduces basic notations and operations that are widely used in the field.
==Introduction==
A multi-way graph with K perspectives is a collection of K matrices <math>X_1, X_2, \ldots, X_K</math>, each of dimension I × J (where I and J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term “dimension”, we call an I × J × K tensor a three-“mode” tensor, where the “modes” are the indices used to index the tensor.
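The construction above can be sketched in a few lines of NumPy: stacking the K perspective matrices along a third axis yields the three-mode tensor, and slicing along that axis recovers each perspective. The sizes below are hypothetical, chosen only for illustration.

```python
import numpy as np

I, J, K = 4, 4, 3  # I = J nodes, K perspectives (illustrative sizes)

# One I x J matrix per perspective of the multi-way graph.
matrices = [np.random.rand(I, J) for _ in range(K)]

# Stack along a third axis to form the I x J x K three-mode tensor.
X = np.stack(matrices, axis=2)
assert X.shape == (I, J, K)

# Slicing along the third mode recovers the k-th perspective.
assert np.array_equal(X[:, :, 1], matrices[1])
```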
==References==
{{Reflist}}