Symmetric tensor

We then construct Sym(''V'') as the [[direct sum of vector spaces|direct sum]] of Sym<sup>''k''</sup>(''V'') for ''k'' = 0,1,2,…
:<math>\operatorname{Sym}(V)= \bigoplus_{k=0}^\infty \operatorname{Sym}^k(V).</math>
 
==Examples==
Many [[material properties]] and [[field (physics)|fields]] used in physics and engineering can be represented as symmetric tensor fields; for example, [[stress (physics)|stress]], [[strain tensor|strain]], and [[anisotropic]] [[Electrical resistivity and conductivity|conductivity]]. In [[diffusion MRI]], symmetric tensors are also often used to describe diffusion in the brain or other parts of the body.
 
Ellipsoids are examples of [[algebraic varieties]]; and so, for general rank, symmetric tensors, in the guise of [[homogeneous polynomial]]s, are used to define [[projective varieties]], and are often studied as such.
 
==Symmetric part of a tensor==
:<math>v_1\odot v_2\odot\cdots\odot v_r := \frac{1}{r!}\sum_{\sigma\in\mathfrak{S}_r} v_{\sigma(1)}\otimes v_{\sigma(2)}\otimes\cdots\otimes v_{\sigma(r)}.</math>
 
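The symmetrization in this formula can be illustrated numerically. The following is a minimal NumPy sketch (the function name <code>symmetrize</code> is our own choice, not standard notation):

```python
import itertools

import numpy as np

def symmetrize(T):
    """Average an order-r tensor over all r! permutations of its indices,
    i.e. Sym(T) = (1/r!) * sum over sigma of the index-permuted tensor."""
    r = T.ndim
    perms = list(itertools.permutations(range(r)))
    return sum(np.transpose(T, axes=p) for p in perms) / len(perms)

# The symmetric product v1 ⊙ v2 is the symmetrization of the
# tensor (outer) product v1 ⊗ v2.
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
S = symmetrize(np.multiply.outer(v1, v2))
# S equals its own transpose, i.e. it lies in Sym^2(V).
```

Here <code>np.multiply.outer</code> realizes the tensor product of vectors, and averaging over all index permutations realizes the sum over <math>\mathfrak{S}_r</math> divided by <math>r!</math>.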
In general we can turn Sym(''V'') into an [[algebra]] by defining the commutative and associative product '<math>\odot</math>'<ref name="Kostrikin1997">{{cite book
| last1 = Kostrikin | first1 = Alexei I.
| last2 = Manin | first2 = Iurii Ivanovich
| pages = 276–279
| isbn = 9056990497
| ref = Kostrikin1997
}}</ref>. Given two tensors ''T''<sub>1</sub>&isin;Sym<sup>''k''<sub>1</sub></sup>(''V'') and ''T''<sub>2</sub>&isin;Sym<sup>''k''<sub>2</sub></sup>(''V''), we use the symmetrization operator to define:
:<math>T_1\odot T_2 = \operatorname{Sym}(T_1\otimes T_2)\quad\left(\in\operatorname{Sym}^{k_1+k_2}(V)\right).</math>
It can be verified (as is done by [[#Kostrikin1997|Kostrikin and Manin]]<ref name="Kostrikin1997"></ref>) that the resulting product is in fact commutative and associative. In some cases the operator is not written at all: ''T''<sub>1</sub>''T''<sub>2</sub> = ''T''<sub>1</sub><math>\odot</math>''T''<sub>2</sub>.
 
In some cases an exponential notation is used:
:<math>v^{\odot k}=\underbrace{v\odot v\odot\cdots\odot v}_{k \text{ times}}.</math>
Or simply:
:<math>v^k=\underbrace{v\,v\,\cdots\,v}_{k \text{ times}}=\underbrace{v\odot v\odot\cdots\odot v}_{k \text{ times}}.</math>
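The commutativity and associativity of the symmetric product can also be checked numerically on random tensors. A sketch, assuming the helper names <code>symmetrize</code> and <code>odot</code> (both our own):

```python
import itertools

import numpy as np

def symmetrize(T):
    """Average an order-r tensor over all r! index permutations."""
    r = T.ndim
    perms = list(itertools.permutations(range(r)))
    return sum(np.transpose(T, axes=p) for p in perms) / len(perms)

def odot(T1, T2):
    """Symmetric product: T1 ⊙ T2 = Sym(T1 ⊗ T2)."""
    return symmetrize(np.multiply.outer(T1, T2))

rng = np.random.default_rng(0)
a = symmetrize(rng.standard_normal((3, 3)))  # an element of Sym^2(V)
b = rng.standard_normal(3)                   # an element of Sym^1(V) = V
# a ⊙ b and b ⊙ a are both order-3 symmetric tensors and coincide,
# and (a ⊙ b) ⊙ b equals a ⊙ (b ⊙ b).
```

This is only a numerical check on one example, not a proof; the proof is given by Kostrikin and Manin as cited above.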
 
==Decomposition==
In full analogy with the theory of [[symmetric matrix|symmetric matrices]], a (real) symmetric tensor of order 2 can be "diagonalized". More precisely, for any tensor ''T''&nbsp;&isin;&nbsp;Sym<sup>2</sup>(''V''), there is an integer ''r'' and non-zero vectors ''v''<sub>1</sub>,...,''v''<sub>''r''</sub>&nbsp;&isin;&nbsp;''V'' such that
:<math>T = \sum_{i=1}^r \pm v_i\otimes v_i.</math>
This is [[Sylvester's law of inertia]]. The minimum number ''r'' for which such a decomposition is possible is the rank of ''T''. The vectors appearing in this minimal expression are the ''[[principal axes]]'' of the tensor, and generally have an important physical meaning. For example, the principal axes of the [[inertia tensor]] define the [[Poinsot's ellipsoid]] representing the moment of inertia.
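For a concrete instance of this diagonalization, one can obtain the vectors ''v''<sub>''i''</sub> from an eigendecomposition by absorbing the square root of each eigenvalue's magnitude into the corresponding eigenvector. A minimal NumPy sketch (the example matrix is our own):

```python
import numpy as np

# An order-2 real symmetric tensor, represented as a symmetric matrix.
T = np.array([[2.0, 1.0],
              [1.0, -1.0]])

# Eigendecomposition: T = sum_i lam_i * u_i u_i^T.
lam, U = np.linalg.eigh(T)

# Absorb sqrt(|lam_i|) into each eigenvector, keeping only the sign,
# which gives T = sum_i ± v_i ⊗ v_i as in Sylvester's law of inertia.
signs = np.sign(lam)
V = U * np.sqrt(np.abs(lam))  # columns are the vectors v_i
T_rec = sum(s * np.outer(V[:, i], V[:, i]) for i, s in enumerate(signs))
```

The number of nonzero eigenvalues gives the rank ''r'', and the signs recover the signature appearing in the decomposition.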
 
For symmetric tensors of arbitrary order ''k'', decompositions
:<math>T = \sum_{i=1}^r \lambda_i \, \underbrace{v_i\otimes v_i\otimes\cdots\otimes v_i}_{k\text{ times}}</math>
are also possible. The minimum number ''r'' for which such a decomposition is possible is the ''symmetric'' [[Tensor_(intrinsic_definition)#Tensor_rank|rank]] of ''T''.<ref name="Comon2008">{{Cite doi|10.1137/060661569}}</ref> For second-order tensors this corresponds to the rank of the matrix representing the tensor in any basis, and it is well known that the maximum rank is equal to the dimension of the underlying vector space. However, for higher orders this need not hold: the rank can be higher than the dimension of the underlying vector space. The [[higher-order singular value decomposition]] of a symmetric tensor is a special decomposition of this form<ref name="Comon2008"/> (often called the [[CP decomposition|canonical decomposition]]).
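A tensor of this form is straightforward to construct numerically: each term is a ''k''-fold tensor power of a single vector, so the sum is automatically symmetric. A sketch, assuming the hypothetical helper name <code>sym_rank_one</code>:

```python
import itertools

import numpy as np

def sym_rank_one(v, k):
    """The k-fold tensor power v ⊗ v ⊗ ... ⊗ v, which is symmetric."""
    T = v
    for _ in range(k - 1):
        T = np.multiply.outer(T, v)
    return T

# A symmetric order-3 tensor built from r = 2 weighted rank-one terms,
# T = 1.5 * v1^{⊗3} - 0.5 * v2^{⊗3}.
v1, v2 = np.array([1.0, 2.0]), np.array([0.0, 1.0])
T = 1.5 * sym_rank_one(v1, 3) - 0.5 * sym_rank_one(v2, 3)
# T is invariant under every permutation of its three indices.
```

Finding the minimal ''r'' for a given higher-order tensor (the symmetric rank) is a much harder problem than constructing such a sum; this example only illustrates the form of the decomposition.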
 
==See also==
==References==
* {{citation|first = Nicolas|last=Bourbaki|authorlink=Nicolas Bourbaki | title = Elements of mathematics, Algebra I| publisher = Springer-Verlag | year = 1989|isbn=3-540-64243-9}}.
* {{citation|first = Nicolas|last=Bourbaki|authorlink=Nicolas Bourbaki | title = Elements of mathematics, Algebra II| publisher = Springer-Verlag | year = 1990|isbn=3-540-19375-8}}.
* {{Citation | last1=Greub | first1=Werner Hildbert | title=Multilinear algebra | publisher=Springer-Verlag New York, Inc., New York | series=Die Grundlehren der Mathematischen Wissenschaften, Band 136 | id={{MathSciNet | id = 0224623}} | year=1967}}.
* {{Citation | last1=Sternberg | first1=Shlomo | author1-link=Shlomo Sternberg | title=Lectures on differential geometry | publisher=Chelsea | ___location=New York | isbn=978-0-8284-0316-0 | year=1983}}.