{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition''' <ref name="Sidiropoulos">{{cite web |last1=Sidiropoulos |first1=Nicholas D. |title=Tensor Decomposition for Signal Processing and Machine Learning |url=https://ieeexplore.ieee.org/abstract/document/7891546 |publisher=IEEE Transactions on Signal Processing}}</ref><ref>{{cite journal |last1=Kolda |first1=T. G. |title=Tensor Decompositions and Applications |journal=SIAM Review |doi=10.1137/07070111X}}</ref> is any scheme for expressing a [[tensor]] as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some [[matrix decomposition]]s.<ref>{{Cite journal|date=2013-05-01|title=General tensor decomposition, moment matrices and applications|url=https://www.sciencedirect.com/science/article/pii/S0747717112001290|journal=Journal of Symbolic Computation|language=en|volume=52|pages=51–71|doi=10.1016/j.jsc.2012.05.012|issn=0747-7171|arxiv=1105.1229|last1=Bernardi |first1=A. |last2=Brachat |first2=J. |last3=Comon |first3=P. |last4=Mourrain |first4=B. |s2cid=14181289 }}</ref>
 
Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional arrays.<ref>{{cite web |last1=Rabanser |first1=Stephan |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |url=https://arxiv.org/pdf/1711.10781.pdf}}</ref>
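For example, a third-order tensor can be stored as a three-dimensional array, and the simplest building block used by most decompositions is a rank-1 tensor, i.e. the outer product of one vector per mode. The following minimal sketch (in NumPy; the variable names are illustrative only, not taken from the cited sources) constructs such an elementary tensor:

<syntaxhighlight lang="python">
import numpy as np

a = np.array([1.0, 2.0])        # mode-1 factor (length I)
b = np.array([0.0, 1.0, 2.0])   # mode-2 factor (length J)
c = np.array([3.0, 4.0])        # mode-3 factor (length K)

# T[i, j, k] = a[i] * b[j] * c[k] -- an elementary (rank-1) tensor
T = np.einsum('i,j,k->ijk', a, b, c)
print(T.shape)  # (2, 3, 2): a 2 x 3 x 2 multidimensional array
</syntaxhighlight>

A tensor decomposition expresses a general tensor in terms of such simpler pieces, for instance as a sum of several rank-1 terms.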
The main tensor decompositions are:
* [[Tensor rank decomposition]]<ref>{{cite web |last1=Papalexakis |first1=Evangelos E. |title=Automatic unsupervised tensor mining with quality assessment |url=https://epubs.siam.org/doi/abs/10.1137/1.9781611974348.80|doi=10.1137/1.9781611974348.80}}</ref>;
* [[Higher-order singular value decomposition]] (a short numerical sketch follows this list);
* [[Tucker decomposition]];
* [[matrix product state]]s and operators, also known as tensor trains;
* [[Online Tensor Decompositions]]<ref>{{cite journal |last1=Zhou |first1=Shuo |last2=Vinh |first2=Nguyen Xuan |last3=Bailey |first3=James |last4=Jia |first4=Yunzhe |last5=Davidson |first5=Ian |title=Accelerating Online CP Decompositions for Higher Order Tensors |journal=Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining |date=13 August 2016 |pages=1375–1384 |doi=10.1145/2939672.2939763}}</ref><ref>{{cite journal |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos E. |title=SamBaTen: Sampling-based Batch Incremental Tensor Decomposition |journal=Proceedings of the 2018 SIAM International Conference on Data Mining |date=7 May 2018 |doi=10.1137/1.9781611975321}}</ref><ref>{{cite journal |last1=Gujral |first1=Ekta |last2=Papalexakis |first2=Evangelos E. |title=OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors |journal=2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA) |date=9 October 2020 |doi=10.1109/DSAA49011.2020.00029}}</ref><ref>{{cite web |last1=Gujral |first1=Ekta |title=Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition |date=2022 |url=https://arxiv.org/abs/2210.04404}}</ref>;
* [[hierarchical Tucker decomposition]]; and
* [[block term decomposition]]<ref>{{cite web |last1=Lathauwer |first1=Lieven De |title=Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness |url=https://epubs.siam.org/doi/abs/10.1137/070690729 |doi=10.1137/070690729}}</ref><ref>{{cite journal |last1=Gujral |first1=Ekta |last2=Pasricha |first2=Ravdeep |last3=Papalexakis |first3=Evangelos |title=Beyond Rank-1: Discovering Rich Community Structure in Multi-aspect Graphs |journal=Proceedings of The Web Conference 2020 |date=20 April 2020 |pages=452–462 |url=https://dl.acm.org/doi/abs/10.1145/3366423.3380129 |doi=10.1145/3366423.3380129}}</ref>.
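
As an illustration of the decompositions listed above, the higher-order singular value decomposition of a small third-order tensor can be sketched in NumPy by taking the left singular vectors of each mode-''n'' unfolding and contracting them against the data tensor. The code below is a minimal sketch under that description, not an implementation taken from the cited works; the helper name <code>unfold</code> is chosen only for readability.

<syntaxhighlight lang="python">
import numpy as np

T = np.random.rand(4, 5, 6)          # example data tensor

def unfold(X, mode):
    # Mode-n unfolding: move mode n to the front, then flatten the rest.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

# Factor matrices: left singular vectors of each mode-n unfolding.
U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(3)]

# Core tensor: contract T with the transposed factor matrices along each mode.
G = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])

# Reconstruction: T = G x_1 U[0] x_2 U[1] x_3 U[2]
T_hat = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
print(np.allclose(T, T_hat))         # True, since the full multilinear rank is kept
</syntaxhighlight>

Truncating the columns of each factor matrix gives a lower multilinear-rank (Tucker) approximation of the tensor rather than an exact reconstruction.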
 
==Preliminary definitions and notation==
This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used in this article can be found in the following table.<ref>{{cite web |last1=Gujral |first1=Ekta |title=Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition |date=2022 |url=https://arxiv.org/abs/2210.04404}}</ref>
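
One operation that typically appears alongside such notation is the mode-''n'' (tensor-times-matrix) product. The following minimal sketch (in NumPy; the helper name <code>mode_n_product</code> is illustrative and not taken from the cited source) shows one way it can be computed:

<syntaxhighlight lang="python">
import numpy as np

def mode_n_product(X, M, mode):
    # (X x_n M): contract M's columns with axis `mode` of X,
    # then move the new axis back into position `mode`.
    return np.moveaxis(np.tensordot(M, X, axes=(1, mode)), 0, mode)

X = np.random.rand(3, 4, 5)
M = np.random.rand(7, 4)           # acts on mode 1 (size 4 -> 7)
Y = mode_n_product(X, M, 1)
print(Y.shape)                     # (3, 7, 5)
</syntaxhighlight>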
 
{| class="wikitable"
==References==
{{Reflist}}
 
[[Category:Tensors]]
 
 
{{linear-algebra-stub}}