{{Short description|Process in algebra}}
{{Refimprove|date=June 2021}}
In [[multilinear algebra]], a '''tensor decomposition''' is any scheme for expressing a "data tensor" (M-way array) as a sequence of elementary operations acting on other, often simpler tensors.<ref name=VasilescuDSP>{{cite journal|first1=MAO|last1=Vasilescu|first2=D|last2=Terzopoulos|title=Multilinear (tensor) image synthesis, analysis, and recognition [exploratory dsp]|journal=IEEE Signal Processing Magazine|volume=24|issue=6|pages=118–123}}</ref><ref>{{Cite journal |last1=Kolda |first1=Tamara G. |last2=Bader |first2=Brett W. |date=2009-08-06 |title=Tensor Decompositions and Applications |url=http://epubs.siam.org/doi/10.1137/07070111X |journal=SIAM Review |language=en |volume=51 |issue=3 |pages=455–500 |doi=10.1137/07070111X |bibcode=2009SIAMR..51..455K |s2cid=16074195 |issn=0036-1445}}</ref>
[[Tensors]] generalize matrices to higher orders (i.e., to arrays with more than two dimensions) and can consequently be treated as multidimensional fields.<ref name="VasilescuDSP"/><ref>{{Cite arXiv |last1=Rabanser |first1=Stephan |last2=Shchur |first2=Oleksandr |last3=Günnemann |first3=Stephan |date=2017 |title=Introduction to Tensor Decompositions and their Applications in Machine Learning |class=stat.ML |eprint=1711.10781}}</ref>
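The idea can be sketched in the simplest case: a rank-1 order-3 tensor is the outer product of three vectors, so it is fully described by its factor vectors rather than by all of its entries. The following NumPy sketch (an illustration, not part of the article's sources) builds such a tensor and checks that its mode-1 unfolding has matrix rank 1:

```python
import numpy as np

# Factor vectors of an assumed rank-1 tensor (example values).
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0])

# Rank-1 order-3 data tensor: T[i, j, k] = a[i] * b[j] * c[k].
# This is the simplest instance of a CP (CANDECOMP/PARAFAC) decomposition.
T = np.einsum('i,j,k->ijk', a, b, c)

# The 2x3x2 tensor has 12 entries but is determined by 2 + 3 + 2 numbers.
# Its mode-1 unfolding (flattening along the first axis) has matrix rank 1.
T_unfolded = T.reshape(T.shape[0], -1)
print(T.shape)                              # (2, 3, 2)
print(np.linalg.matrix_rank(T_unfolded))    # 1
```

Higher-rank decompositions express a general tensor as a sum of such rank-1 terms, which is the scheme formalized by the CP decomposition.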