{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition'''<ref name="Sidiropoulos">{{cite journal |last1=Sidiropoulos |first1=Nicholas D. |title=Tensor Decomposition for Signal Processing and Machine Learning |journal=IEEE Transactions on Signal Processing |url=https://ieeexplore.ieee.org/abstract/document/7891546}}</ref><ref>{{cite journal |last1=Kolda |first1=T. G. |title=Tensor Decompositions and Applications |journal=SIAM Review |doi=10.1137/07070111X}}</ref> is any scheme for expressing a [[tensor]] as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize some [[matrix decomposition]]s.<ref>{{Cite journal|date=2013-05-01|title=General tensor decomposition, moment matrices and applications|url=https://www.sciencedirect.com/science/article/pii/S0747717112001290|journal=Journal of Symbolic Computation|language=en|volume=52|pages=51–71|doi=10.1016/j.jsc.2012.05.012|issn=0747-7171|arxiv=1105.1229|last1=Bernardi |first1=A. |last2=Brachat |first2=J. |last3=Comon |first3=P. |last4=Mourrain |first4=B. |s2cid=14181289 }}</ref>
Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional arrays.<ref>{{cite arXiv |last1=Rabanser |first1=Stephan |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |eprint=1711.10781}}</ref>
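As a minimal illustration of this view (an editorial sketch using NumPy; the variable names and sizes are chosen only for this example), a matrix is a 2-mode array while a tensor simply carries more indices:

<syntaxhighlight lang="python">
import numpy as np

# A matrix is a 2-way array; a 3-way tensor adds one more index.
matrix = np.arange(6).reshape(2, 3)       # shape (I, J) = (2, 3)
tensor = np.arange(24).reshape(2, 3, 4)   # shape (I, J, K) = (2, 3, 4)

print(matrix.ndim)      # 2 -- two indices address an entry
print(tensor.ndim)      # 3 -- three indices address an entry
print(tensor[1, 2, 3])  # 23, a single scalar entry
</syntaxhighlight>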
Notable tensor decompositions include (an illustrative sketch follows this list):
* [[hierarchical Tucker decomposition]]; and
* [[block term decomposition]].<ref>{{cite journal |last1=De Lathauwer |first1=Lieven |title=Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness |journal=SIAM Journal on Matrix Analysis and Applications |url=https://epubs.siam.org/doi/abs/10.1137/070690729}}</ref><ref>{{cite web |last1=Gujral |first1=Ekta |title=Beyond rank-1: Discovering rich community structure in multi-aspect graphs |url=https://dl.acm.org/doi/abs/10.1145/3366423.3380129}}</ref>
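For instance, the tensor rank (CP) decomposition writes a 3-mode tensor as a sum of <math>R</math> outer products of vectors. The sketch below (an editorial illustration using NumPy; the sizes, rank, and variable names are arbitrary) reconstructs a tensor from three factor matrices:

<syntaxhighlight lang="python">
import numpy as np

I, J, K, R = 4, 5, 6, 3   # mode sizes and rank, chosen arbitrarily

# One factor matrix per mode, each with R columns.
A = np.random.rand(I, R)
B = np.random.rand(J, R)
C = np.random.rand(K, R)

# CP model: X[i, j, k] = sum over r of A[i, r] * B[j, r] * C[k, r],
# i.e. a sum of R rank-1 (outer-product) tensors.
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(X.shape)  # (4, 5, 6)
</syntaxhighlight>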
==Preliminary definitions and notation==
This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used in this article is given in the following table.<ref>{{cite arXiv |last1=Gujral |first1=Ekta |title=Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition |eprint=2210.04404}}</ref>
{| class="wikitable"
|+ Table of symbols and their description.
|-
! Symbols!! Definition
|-
| <math>\mathbf{A},\ \mathbf{a},\ a</math> || Matrix, column vector, scalar
|-
| <math>\mathbb{R}</math> || Set of real numbers
|-
| <math>\operatorname{vec}(\cdot)</math> || Vectorization operator
|}
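For example, the vectorization operator <math>\operatorname{vec}(\cdot)</math> is conventionally taken to stack the columns of a matrix into a single column vector. A minimal sketch, assuming that column-major convention (using NumPy):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# vec(A) stacks the columns of A on top of each other.
vec_A = A.flatten(order='F')  # 'F' = column-major (Fortran) order
print(vec_A)                  # [1 3 2 4]
</syntaxhighlight>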
==Introduction==
A multi-view graph with <math>K</math> views is a collection of <math>K</math> matrices <math>X_1, X_2, \dots, X_K</math> with dimensions <math>I \times J</math> (where <math>I</math>, <math>J</math> are the numbers of nodes). This collection of matrices is naturally represented as a tensor <math>X</math> of size <math>I \times J \times K</math>. In order to avoid overloading the term "dimension", we call an <math>I \times J \times K</math> tensor a three-"mode" tensor, where the "modes" are the number of indices used to index the tensor.
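A minimal sketch of this construction (an editorial illustration using NumPy; the adjacency matrices here are random placeholders) stacks the <math>K</math> view matrices along a third mode:

<syntaxhighlight lang="python">
import numpy as np

I, J, K = 5, 5, 3  # I = J = number of nodes, K = number of views (illustrative sizes)

# One I x J adjacency matrix per view of the multi-view graph.
views = [np.random.randint(0, 2, size=(I, J)) for _ in range(K)]

# Stacking the K matrices along a third axis yields an I x J x K tensor.
X = np.stack(views, axis=-1)
print(X.shape)  # (5, 5, 3)
</syntaxhighlight>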
==References==
{{Reflist}}