CUR matrix approximation

==Tensor==
Tensor-CURT decomposition<ref>{{cite arXiv|title=Relative Error Tensor Low Rank Approximation|eprint=1704.08246|arxiv = 1704.08246|last1=Song|first1=Zhao|last2=Woodruff|first2=David P.|last3=Zhong|first3=Peilin|class=cs.DS|year=2017}}</ref>
is a generalization of matrix-CUR decomposition. Formally, a CURT tensor approximation of a tensor ''A'' consists of three matrices ''C'', ''R'', ''T'' and a (core-)tensor ''U'' such that ''C'' is made from columns of ''A'', ''R'' is made from rows of ''A'', ''T'' is made from tubes of ''A'', and the product ''U(C,R,T)'' (whose <math>i,j,l</math>-th entry is <math>\sum_{i',j',l'}U_{i',j',l'}C_{i,i'}R_{j,j'}T_{l,l'}</math>) closely approximates ''A''. Usually the CURT is chosen to be a [[Rank (linear algebra)|rank]]-''k'' approximation, which means that ''C'' contains ''k'' columns of ''A'', ''R'' contains ''k'' rows of ''A'', ''T'' contains ''k'' tubes of ''A'', and ''U'' is a ''k''-by-''k''-by-''k'' (core-)tensor.
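As a minimal illustration of the CURT product defined above, the following NumPy sketch forms ''C'', ''R'' and ''T'' from columns, rows and tubes of a small random tensor and evaluates ''U(C,R,T)'' with <code>einsum</code>. The variable names, tensor sizes and the particular column/row/tube indices are assumptions for the example only; actual CURT algorithms select them by randomized sampling rather than arbitrarily.

```python
import numpy as np

# Hypothetical example of the CURT product U(C, R, T).
# The index choices below are arbitrary; real CURT algorithms
# select columns, rows and tubes by sampling/optimization.
rng = np.random.default_rng(0)
n1, n2, n3, k = 4, 5, 6, 2
A = rng.standard_normal((n1, n2, n3))

C = A[:, :k, 0]       # k columns A[:, j, 0] of A, shape (n1, k)
R = A[0, :, :k]       # k rows    A[0, :, l] of A, shape (n2, k)
T = A[:k, 0, :].T     # k tubes   A[i, 0, :] of A, shape (n3, k)
U = rng.standard_normal((k, k, k))   # k-by-k-by-k core tensor (random here)

# (i, j, l)-th entry: sum over i', j', l' of
# U[i', j', l'] * C[i, i'] * R[j, j'] * T[l, l']
approx = np.einsum('abc,ia,jb,lc->ijl', U, C, R, T)
print(approx.shape)   # (4, 5, 6), same shape as A
```

The <code>einsum</code> subscript string encodes exactly the triple sum from the definition, so the result has the same shape as ''A'' and can be compared against it entrywise.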