Tensor: Difference between revisions

Line 88:
{{Main|Multilinear map}}
A downside to the definition of a tensor using the multidimensional array approach is that it is not apparent from the definition that the defined object is indeed basis independent, as is expected from an intrinsically geometric object. Although it is possible to show that transformation laws indeed ensure independence from the basis, sometimes a more intrinsic definition is preferred. One approach that is common in [[differential geometry]] is to define tensors relative to a fixed (finite-dimensional) vector space ''V'', which is usually taken to be a particular vector space of some geometrical significance like the [[tangent space]] to a manifold.<ref>{{citation|last=Lee|first=John|title=Introduction to smooth manifolds|url={{google books |plainurl=y |id=4sGuQgAACAAJ|page=173}}|page=173|year=2000|publisher=Springer|isbn=978-0-387-95495-0}}</ref> In this approach, a type {{nowrap|(''p'', ''q'')}} tensor ''T'' is defined as a [[multilinear map]],
:<math> T: \underbrace{V^* \times\dots\times V^*}_{p \text{ copies}} \times \underbrace{ V \times\dots\times V}_{q \text{ copies}} \rightarrow \mathbb{R}, </math>
 
where ''V''<sup>∗</sup> is the corresponding [[dual space]] of covectors, and the map is required to be linear in each of its arguments. The above assumes ''V'' is a vector space over the [[real number]]s, {{tmath|\R}}. More generally, ''V'' can be taken over any [[Field (mathematics)|field]] ''F'' (e.g. the [[complex number]]s), with ''F'' replacing {{tmath|\R}} as the codomain of the multilinear maps.
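As an illustrative sketch (not part of the article), the multilinear-map definition can be made concrete in plain Python for a type {{nowrap|(1, 1)}} tensor on ''V'' = '''R'''<sup>2</sup>: the map takes one covector and one vector to a real number, and is linear in each slot separately. The matrix <code>M</code> and all helper names below are hypothetical choices for the example.

```python
# A type-(1, 1) tensor T : V* x V -> R on V = R^2, realized as
# T(omega, v) = omega(M v) for an arbitrary illustrative matrix M.
M = [[2.0, 1.0],
     [0.0, 3.0]]

def mat_vec(M, v):
    # Ordinary matrix-vector product.
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def T(omega, v):
    # omega is a covector, given by its coefficients in the dual basis;
    # applying it to M v yields a single real number.
    Mv = mat_vec(M, v)
    return sum(omega[i] * Mv[i] for i in range(len(omega)))

# Multilinearity check: T is linear in the vector slot
# (the covector slot can be checked the same way).
omega, v, w = [1.0, -1.0], [4.0, 5.0], [0.5, 2.0]
a, b = 3.0, -2.0
lhs = T(omega, [a * v[i] + b * w[i] for i in range(2)])
rhs = a * T(omega, v) + b * T(omega, w)
print(abs(lhs - rhs) < 1e-12)  # linearity in the vector argument
```

Higher types {{nowrap|(''p'', ''q'')}} follow the same pattern with ''p'' covector arguments and ''q'' vector arguments.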
Line 388:
 
===Machine learning===
{{Main|Tensor (machine learning)}}
The properties of [[Tensor (machine learning)|tensors]], especially [[tensor decomposition]], have enabled their use in [[machine learning]] to embed higher dimensional data in [[artificial neural networks]]. This notion of tensor differs significantly from that in other areas of mathematics and physics, in the sense that a tensor is the same thing as a multidimensional array. Abstractly, a tensor belongs to a tensor product of spaces, each of which has a fixed basis, and the dimensions of the factor spaces can be different. Thus, an example of a tensor in this context is a rectangular matrix. Just as a rectangular matrix has two axes, a horizontal and a vertical axis, to indicate the position of each entry, a more general tensor has as many axes as there are factors in the tensor product to which it belongs, and an entry of the tensor is referred to by a tuple of integers. The various axes have different dimensions in general.
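A minimal sketch (not part of the article) of this array view of a tensor, in plain Python: an order-3 tensor whose axes have different lengths, with each entry addressed by a tuple of integers, one index per axis. The shape and helper names are illustrative choices.

```python
# A tensor in the machine-learning sense: a multidimensional array.
# Here an order-3 tensor with axis lengths 2, 3, and 4 (the axes need
# not have equal dimensions); a rectangular matrix is the order-2 case.

def zeros(shape):
    # Recursively build a nested list of the given shape, filled with 0.0.
    if not shape:
        return 0.0
    return [zeros(shape[1:]) for _ in range(shape[0])]

def get(tensor, index):
    # Address one entry by a tuple of integers, one index per axis.
    for i in index:
        tensor = tensor[i]
    return tensor

t = zeros((2, 3, 4))
t[1][2][3] = 7.0
print(get(t, (1, 2, 3)))                 # -> 7.0
print(len(t), len(t[0]), len(t[0][0]))   # axis lengths: 2 3 4
```

Libraries such as NumPy, PyTorch, and TensorFlow provide exactly this structure as a first-class object, with the tuple of axis lengths called the tensor's ''shape''.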
 
== Generalizations ==