Tensor: Difference between revisions

 
===Machine learning===
The properties of [[Tensor (machine learning)|tensors]], especially [[tensor decomposition]], have enabled their use in [[machine learning]] to embed higher dimensional data in [[artificial neural networks]]. This notion of tensor differs significantly from that in other areas of mathematics and physics, in the sense that a tensor is usually regarded as a numerical quantity in a fixed basis, and the dimension of the spaces along the different axes of the tensor need not be the same. Among the most applicable tensor decompositions are CP,<ref>{{Cite journal |last=Zhou |first=Mingyi |last2=Liu |first2=Yipeng |last3=Long |first3=Zhen |last4=Chen |first4=Longxi |last5=Zhu |first5=Ce |date=April 2019 |title=Tensor rank learning in CP decomposition via convolutional neural network |url=https://linkinghub.elsevier.com/retrieve/pii/S0923596518302741 |journal=Signal Processing: Image Communication |series=Tensor Image Processing |volume=73 |pages=12–21 |doi=10.1016/j.image.2018.03.017 |issn=0923-5965}}</ref> Tucker,<ref>{{Cite journal |last=Liu |first=Ye |last2=Ng |first2=Michael K. |date=April 2022 |title=Deep neural network compression by Tucker decomposition with nonlinear response |url=https://linkinghub.elsevier.com/retrieve/pii/S0950705122000326 |journal=Knowledge-Based Systems |volume=241 |pages=108171 |doi=10.1016/j.knosys.2022.108171 |issn=0950-7051}}</ref> Tensor-Train,<ref>{{Cite journal |last=Oseledets |first=I. V. 
|date=January 2011 |title=Tensor-Train Decomposition |url=http://epubs.siam.org/doi/10.1137/090752286 |journal=SIAM Journal on Scientific Computing |language=en |volume=33 |issue=5 |pages=2295–2317 |doi=10.1137/090752286 |issn=1064-8275}}</ref> Hierarchical Tucker,<ref>{{Cite journal |last=Fonał |first=Krzysztof |last2=Zdunek |first2=Rafał |date=July 2021 |title=Fast hierarchical tucker decomposition with single-mode preservation and tensor subspace analysis for feature extraction from augmented multimodal data |url=https://linkinghub.elsevier.com/retrieve/pii/S0925231221003453 |journal=Neurocomputing |volume=445 |pages=231–243 |doi=10.1016/j.neucom.2021.02.087 |issn=0925-2312}}</ref> Tensor-Ring,<ref>{{Cite journal |last=Wang |first=Wei |last2=Sun |first2=Guoqiang |last3=Zhao |first3=Siwen |last4=Li |first4=Yujun |last5=Zhao |first5=Jianli |date=May 2023 |title=Tensor Ring decomposition for context-aware recommendation |url=https://linkinghub.elsevier.com/retrieve/pii/S0957417423000349 |journal=Expert Systems with Applications |volume=217 |pages=119533 |doi=10.1016/j.eswa.2023.119533 |issn=0957-4174}}</ref> Block term,<ref>{{Cite journal |last=Lai |first=Yujing |last2=Chen |first2=Chuan |last3=Zheng |first3=Zibin |last4=Zhang |first4=Yangqing |date=September 2022 |title=Block term decomposition with distinct time granularities for temporal knowledge graph completion |url=https://linkinghub.elsevier.com/retrieve/pii/S0957417422004511 |journal=Expert Systems with Applications |volume=201 |pages=117036 |doi=10.1016/j.eswa.2022.117036 |issn=0957-4174}}</ref> and ADA-Tucker.<ref>{{Cite journal |last=Zhong |first=Zhisheng |last2=Wei |first2=Fangyin |last3=Lin |first3=Zhouchen |last4=Zhang |first4=Chao |date=February 2019 |title=ADA-Tucker: Compressing deep neural networks via adaptive dimension adjustment tucker decomposition |url=https://linkinghub.elsevier.com/retrieve/pii/S0893608018303010 |journal=Neural Networks |volume=110 |pages=104–115 
|doi=10.1016/j.neunet.2018.10.016 |issn=0893-6080}}</ref> All of these tensor decomposition approaches find application in convolutional neural networks.<ref>{{Cite journal |last=Abdulkadirov |first=Ruslan |last2=Lyakhov |first2=Pavel |last3=Butusov |first3=Denis |last4=Nagornov |first4=Nikolay |last5=Reznikov |first5=Dmitry |last6=Bobrov |first6=Anatoly |last7=Kalita |first7=Diana |date=March 2025 |title=Enhancing Unmanned Aerial Vehicle Object Detection via Tensor Decompositions and Positive–Negative Momentum Optimizers |url=https://www.mdpi.com/2227-7390/13/5/828 |journal=Mathematics |language=en |volume=13 |issue=5 |pages=828 |doi=10.3390/math13050828 |issn=2227-7390}}</ref> Besides the usual tensor product, some researchers have developed machine learning models based on the tensor t-product.
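As an illustration of the simplest of these, a rank-''R'' CP (CANDECOMP/PARAFAC) decomposition represents a three-way tensor as a sum of ''R'' outer products of factor-matrix columns. A minimal NumPy sketch (the mode sizes, rank, and random factors below are chosen arbitrarily for illustration, not taken from the cited works):

```python
import numpy as np

# Rank-R CP reconstruction: T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
# Shapes and rank are illustrative only.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2            # mode sizes need not be equal; R is the CP rank
A = rng.standard_normal((I, R))    # mode-1 factor matrix
B = rng.standard_normal((J, R))    # mode-2 factor matrix
C = rng.standard_normal((K, R))    # mode-3 factor matrix

# einsum sums over the shared column index r, yielding an I x J x K tensor
T = np.einsum('ir,jr,kr->ijk', A, B, C)
assert T.shape == (I, J, K)
```

Fitting such a decomposition to a given tensor (rather than reconstructing from known factors, as here) is what the learning algorithms in the cited works address.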
 
== Generalizations ==