In [[mathematics]], a '''logarithm of a matrix''' is another [[matrix (mathematics)|matrix]] such that the [[matrix exponential]] of the latter matrix equals the original matrix. It is thus a generalization of the scalar [[logarithm]] and in some sense an [[inverse function]] of the [[matrix exponential]]. Not all matrices have a logarithm, and those matrices that do have a logarithm may have more than one. The study of logarithms of matrices leads to [[Lie theory]], since when a matrix has a logarithm then it is an element of a [[Lie group]] and the logarithm is the corresponding element of the vector space of the [[Lie algebra]].
== Definition ==
The [[Matrix_exponential|exponential of a matrix]] ''A'' is defined by
: <math>e^{A} \equiv \sum_{n=0}^{\infty} \frac{A^{n}}{n!}</math>.
Given a matrix ''B'', another matrix ''A'' is said to be a '''matrix logarithm''' of ''B'' if {{math|''e''<sup>''A''</sup> {{=}} ''B''}}.
Because the exponential function is not [[bijective]] for [[complex number]]s (e.g. <math>e^{\pi i} = e^{3 \pi i} = -1</math>), numbers can have multiple complex logarithms, and as a consequence of this, some matrices may have more than one logarithm, as explained below. If the matrix logarithm of <math>B</math> exists and is unique, then it is written as <math>\log B,</math> in which case <math>e^{\log B} = B.</math>
== Power series expression ==
If ''B'' is sufficiently close to the identity matrix, then a logarithm of ''B'' may be computed by means of the following power series:
:<math>\log(B) = \log(I + (B - I)) = \sum_{k=1}^{\infty} \frac{(-1)^{k + 1}}{k} (B - I)^k = (B - I) - \frac{(B - I)^2}{2} + \frac{(B - I)^3}{3} - \cdots,</math>
which can be rewritten as
:<math>\log(B) = -\sum_{k=1}^{\infty} \frac{(I - B)^k}{k}.</math>
Specifically, if <math>\left\|B-I\right\|<1</math>, then the preceding series converges and <math>e^{\log(B)}=B</math>.<ref>{{harvnb|Hall|2015}} Theorem 2.8</ref>
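As an illustration (not part of the original exposition), the truncated power series can be implemented directly in Python with NumPy; the function name and term count are illustrative, and convergence is only guaranteed when <math>\left\|B-I\right\|<1</math>:

```python
import numpy as np

def matrix_log_series(B, terms=50):
    """Approximate log(B) via the series log(B) = -sum_{k>=1} (I - B)^k / k.

    Converges when ||B - I|| < 1 in a submultiplicative norm.
    """
    n = B.shape[0]
    I = np.eye(n)
    X = I - B            # series variable
    power = np.eye(n)    # running power (I - B)^k
    log_B = np.zeros((n, n))
    for k in range(1, terms + 1):
        power = power @ X
        log_B -= power / k
    return log_B

# Upper-triangular example: B - I is nilpotent, so the series is exact.
B = np.array([[1.0, 0.1],
              [0.0, 1.0]])
print(matrix_log_series(B))  # [[0. , 0.1], [0. , 0. ]]
```

Because <math>(B-I)^2 = 0</math> here, only the first term of the series survives.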
== Example: Logarithm of rotations in the plane ==
The rotations in the plane give a simple example. A rotation of angle ''α'' around the origin is represented by the 2×2-matrix
: <math> A =
\begin{pmatrix}
\cos(\alpha) & -\sin(\alpha) \\
\sin(\alpha) & \cos(\alpha) \\
\end{pmatrix}.
</math>
For any integer ''n'', the matrix
: <math>
B_n=(\alpha+2\pi n)
\begin{pmatrix}
0 & -1 \\
1 & 0 \\
\end{pmatrix},
</math>
is a logarithm of ''A'':
: <math> \log(A) = B_n \quad \Leftrightarrow \quad e^{B_n} = A .</math>
{{Collapse top|title=Proof}}
<math> e^{B_n} = \sum_{k=0}^\infty{1 \over k!}B_n^k ,</math> where

<math> (B_n)^0 = I_2 ,</math>

<math>
(B_n)^1 =
(\alpha+2\pi n)\begin{pmatrix}
0 & -1 \\
+1 & 0\\
\end{pmatrix},
</math>
<math>
(B_n)^2=
(\alpha+2\pi n)^2\begin{pmatrix}
-1 & 0 \\
0 & -1 \\
\end{pmatrix},
</math>
<math>
(B_n)^3=
(\alpha+2\pi n)^3\begin{pmatrix}
0 & +1 \\
-1 & 0\\
\end{pmatrix},
</math>
<math>
(B_n)^4=
(\alpha+2\pi n)^4~I_2 ,
</math>
and so on; the powers of <math>B_n</math> cycle through these four matrix forms. Grouping the even and odd terms of the exponential series gives
<math>
\sum_{k=0}^\infty{1 \over k!}B_n^k =\begin{pmatrix}
\sum_{k=0}^\infty{(-1)^k \over (2k)!}(\alpha+2\pi n)^{2k} & -\sum_{k=0}^\infty{(-1)^k \over (2k+1)!}(\alpha+2\pi n)^{2k+1} \\
\sum_{k=0}^\infty{(-1)^k \over (2k+1)!}(\alpha+2\pi n)^{2k+1} & \sum_{k=0}^\infty{(-1)^k \over (2k)!}(\alpha+2\pi n)^{2k} \\
\end{pmatrix} =\begin{pmatrix}
\cos(\alpha) & -\sin(\alpha) \\
\sin(\alpha) & \cos(\alpha) \\
\end{pmatrix} =A~.
</math>
qed.
{{Collapse bottom}}
Thus, the matrix ''A'' has infinitely many logarithms. This corresponds to the fact that the rotation angle is only determined up to multiples of 2''π''.
In the language of Lie theory, the rotation matrices ''A'' are elements of the Lie group [[circle group|SO(2)]]. The corresponding logarithms ''B'' are elements of the Lie algebra so(2), which consists of all [[skew-symmetric matrix|skew-symmetric matrices]]. The matrix
: <math>
\begin{pmatrix}
0 & 1 \\
-1 & 0 \\
\end{pmatrix}
</math>
is a generator of the [[Lie algebra]] so(2).
== Existence ==
The question of whether a matrix has a logarithm has the easiest answer when considered in the complex setting. A complex matrix has a logarithm [[if and only if]] it is [[invertible matrix|invertible]].<ref>{{harvtxt|Higham|2008}}, Theorem 1.27</ref> The logarithm is not unique, but if a matrix has no negative real [[eigenvalue]]s, then there is a unique logarithm whose eigenvalues all lie in the strip <math> \{z \in \mathbb{C} \mid -\pi < \operatorname{Im}(z) < \pi \}</math>; this logarithm is known as the ''principal logarithm''.
The answer is more involved in the real setting. A real matrix has a real logarithm if and only if it is invertible and each [[Jordan block]] belonging to a negative eigenvalue occurs an even number of times.<ref>{{harvtxt|Culver|1966}}</ref> If an invertible real matrix does not satisfy the condition with the Jordan blocks, then it has only non-real logarithms. This can already be seen in the scalar case: no branch of the logarithm can be real at −1. The existence of real matrix logarithms of real 2 × 2 matrices is considered in a later section.
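For an invertible matrix with no negative real eigenvalues, the principal logarithm can be computed numerically; the following sketch uses SciPy's <code>scipy.linalg.logm</code> (which returns a matrix logarithm, the principal one in this well-behaved case) on a plane rotation:

```python
import numpy as np
from scipy.linalg import expm, logm

theta = 0.5
# Rotation by theta: eigenvalues e^{±i theta}, no negative real eigenvalues,
# so a unique principal (and here real) logarithm exists.
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

L = logm(A)
print(np.allclose(expm(L), A))                              # defining property e^{log A} = A
print(np.allclose(L, theta * np.array([[0., -1.],
                                       [1., 0.]])))         # theta times the so(2) generator
```

By contrast, <code>logm</code> applied to a matrix with negative real eigenvalues returns a complex result, matching the existence discussion above.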
== Properties ==
If ''A'' and ''B'' are both [[positive-definite matrices]], then
: <math>\operatorname{tr}{\log{(AB)}} = \operatorname{tr}{\log{(A)}} + \operatorname{tr}{\log{(B)}}.</math>
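Since <math>\operatorname{tr}\log M = \log\det M</math> for positive-definite <math>M</math>, this identity can be checked numerically without computing any matrix logarithm; a sketch in NumPy (helper name illustrative):

```python
import numpy as np

def tr_log(M):
    """tr(log M) for positive-definite M, via the sum of the logs of its
    (real, positive) eigenvalues -- equal to log det M."""
    return np.sum(np.log(np.linalg.eigvalsh(M)))

# Two positive-definite matrices that do not commute.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[3.0, 0.5], [0.5, 1.0]])

# AB is generally not symmetric, but it is similar to the positive-definite
# matrix A^{1/2} B A^{1/2}, so its eigenvalues are real and positive.
tr_log_AB = np.sum(np.log(np.linalg.eigvals(A @ B).real))
print(np.isclose(tr_log_AB, tr_log(A) + tr_log(B)))  # True
```

The identity reduces to the multiplicativity of the determinant, <math>\det(AB) = \det(A)\det(B)</math>.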
Suppose that ''A'' and ''B'' commute, meaning that ''AB'' = ''BA''. Then
: <math>\log{(AB)} = \log{(A)}+\log{(B)}</math>
if and only if <math>\operatorname{arg}(\mu_j) + \operatorname{arg}(\nu_j) \in (- \pi, \pi]</math>, where <math>\mu_j</math> is an [[eigenvalue]] of <math>A</math> and <math>\nu_j</math> is the corresponding [[eigenvalue]] of <math>B</math>.<ref>{{cite journal |last1=Aprahamian |first1=Mary |last2=Higham |first2=Nicholas J. |title=The Matrix Unwinding Function, with an Application to Computing the Matrix Exponential |journal=SIAM Journal on Matrix Analysis and Applications |year=2014 |volume=35 |issue=1 |page=97 |doi=10.1137/130920137}}</ref>
: <math> \log{(A^{-1})} = -\log{(A)}.</math>
Similarly, for non-commuting <math>A</math> and <math>B</math>, one can show that<ref>[https://www.ias.edu/sites/default/files/sns/files/1-matrixlog_tex(1).pdf Unpublished memo] by S Adler (IAS)</ref>
: <math>\log{(A+tB)} = \log{(A)} + t\int_0^\infty dz ~\frac{I}{A+zI} B \frac{I}{A+zI} + O(t^2).</math>
More generally, a series expansion of <math>\log{(A+tB)}</math> in powers of <math>t</math> can be obtained using the integral definition of the logarithm
: <math>\log{(X + \lambda I)} - \log{(X)} = \int_0^\lambda dz \frac{I}{X + zI},</math>
applied to both <math>X=A</math> and <math>X=A+tB</math> in the limit <math>\lambda\rightarrow\infty</math>.
== Further example: Logarithm of rotations in 3D space ==
A rotation {{mvar|R}} ∈ SO(3) in '''R'''<sup>3</sup> is given by a 3×3 [[orthogonal matrix]].

The logarithm of such a rotation matrix {{mvar|R}} can be readily computed from the antisymmetric part of [[Rodrigues' rotation formula]], explicitly in [[axis–angle representation|axis–angle]] form. It yields the logarithm of minimal [[Frobenius norm]], but fails when {{mvar|R}} has eigenvalues equal to −1, where this is not unique.
Further note that, given rotation matrices ''A'' and ''B'',
: <math> d_g(A,B) := \| \log(A^\top B)\|_F </math>
is the geodesic distance on the 3D manifold of rotation matrices.
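A minimal sketch of both formulas in NumPy (function names illustrative); the logarithm is recovered from the antisymmetric part of {{mvar|R}}, assuming the rotation angle lies strictly between 0 and π, where the formula is well defined:

```python
import numpy as np

def so3_log(R):
    """log of R in SO(3) from the antisymmetric part of R (Rodrigues).

    Valid for rotation angles in (0, pi); degenerates at 0 and pi.
    """
    theta = np.arccos((np.trace(R) - 1.0) / 2.0)   # rotation angle
    return theta / (2.0 * np.sin(theta)) * (R - R.T)

def geodesic_distance(A, B):
    """d_g(A, B) = || log(A^T B) ||_F on the manifold SO(3)."""
    return np.linalg.norm(so3_log(A.T @ B), 'fro')

def rot_z(t):
    """Rotation by angle t about the z-axis."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# log of a rotation by 0.5 about z is 0.5 times the corresponding generator.
print(so3_log(rot_z(0.5)))
```

For two rotations about the same axis differing by angle <math>\varphi</math>, the distance evaluates to <math>\sqrt{2}\,\varphi</math>, the Frobenius norm of <math>\varphi</math> times the skew-symmetric generator.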
== Calculating the logarithm of a diagonalizable matrix ==
A method for finding log ''A'' for a diagonalizable matrix ''A'' is the following:
: Find the matrix ''V'' of [[eigenvector]]s of ''A'' (each column of ''V'' is an eigenvector of ''A'').
: Find the [[matrix inverse|inverse]] ''V''<sup>−1</sup> of ''V''.
: Let
:: <math> A' = V^{-1} A V. </math>
: Then ''A''′ will be a diagonal matrix whose diagonal elements are eigenvalues of ''A''.
: Replace each diagonal element of ''A''′ by its (natural) logarithm in order to obtain <math> \log A' </math>.
: Then
:: <math> \log A = V ( \log A' ) V^{-1}. </math>
That the logarithm of ''A'' might be a complex matrix even if ''A'' is real then follows from the fact that a matrix with real and positive entries might nevertheless have negative or even complex eigenvalues (this is true for example for [[rotation matrix|rotation matrices]]). The non-uniqueness of the logarithm of a matrix follows from the non-uniqueness of the logarithm of a complex number.
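The steps above can be sketched in NumPy (the function name is illustrative, and the principal branch of the complex logarithm is assumed for each eigenvalue):

```python
import numpy as np

def logm_diagonalizable(A):
    """Matrix logarithm via eigendecomposition, following the steps above.

    A = V A' V^{-1} with A' = diag(eigenvalues), so
    log A = V diag(log of each eigenvalue) V^{-1}.
    The result may be complex even for real A, since real matrices can
    have negative or complex eigenvalues.
    """
    eigvals, V = np.linalg.eig(A)  # columns of V are eigenvectors of A
    log_diag = np.diag(np.log(eigvals.astype(complex)))
    return V @ log_diag @ np.linalg.inv(V)
```

For a diagonal matrix this reduces to taking the logarithm of each diagonal entry, e.g. <code>logm_diagonalizable(np.diag([2.0, 3.0]))</code> gives <code>diag(log 2, log 3)</code>.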
== The logarithm of a non-diagonalizable matrix ==
The algorithm illustrated above does not work for non-diagonalizable matrices, such as
: <math>\begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}. </math>
For such matrices one needs to find its [[Jordan normal form|Jordan decomposition]] and, rather than computing the logarithm of diagonal entries as above, one would calculate the logarithm of the [[Jordan matrix|Jordan block]]s.
The latter is accomplished by noticing that one can write a Jordan block as
: <math>B=\begin{pmatrix}
\lambda & 1 & 0 & 0 & \cdots & 0 \\
0 & \lambda & 1 & 0 & \cdots & 0 \\
0 & 0 & \lambda & 1 & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \ddots & \vdots \\
0 & 0 & 0 & 0 & \lambda & 1 \\
0 & 0 & 0 & 0 & 0 & \lambda \\
\end{pmatrix}
= \lambda \left( I + K \right),
</math>
where ''K'' is a matrix with <math>\lambda^{-1}</math> on the superdiagonal and zeros elsewhere; in particular, ''K'' is [[nilpotent matrix|nilpotent]]. (The scalar ''λ'' is nonzero because ''B'' is assumed invertible.)
Then, by the [[Mercator series]]
: <math> \log (1+x)=x-\frac{x^2}{2}+\frac{x^3}{3}-\frac{x^4}{4}+\cdots</math>
one gets
: <math>\log B=\log \big(\lambda(I+K)\big)=\log (\lambda I) +\log (I+K)= (\log \lambda) I + K-\frac{K^2}{2}+\frac{K^3}{3}-\frac{K^4}{4}+\cdots </math>
This [[series (mathematics)|series]] has a finite number of terms (''K''<sup>''m''</sup> is zero if ''m'' is equal to or greater than the dimension of ''K''), and so its sum is well-defined.

'''Example.''' Using this approach, one finds
: <math>\log \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}
=\begin{bmatrix}0 & 1\\ 0 & 0\end{bmatrix},</math>
which can be verified by plugging the right-hand side into the matrix exponential:
<math display="block">
\exp \begin{bmatrix}0 & 1\\ 0 & 0\end{bmatrix}
= I
+ \begin{bmatrix}0 & 1\\ 0 & 0\end{bmatrix}
+ \frac{1}{2}\underbrace{\begin{bmatrix}0 & 1\\ 0 & 0\end{bmatrix}^2}_{=0} + \cdots
= \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}.
</math>
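Because the series terminates, the logarithm of a Jordan block can be computed exactly; a NumPy sketch (function name illustrative):

```python
import numpy as np

def log_jordan_block(lam, m):
    """log of the m-by-m Jordan block B = lam*I + N, N ones on the superdiagonal.

    Writes B = lam*(I + K) with K = N/lam nilpotent (K^m = 0), so the
    Mercator series for log(I + K) terminates after m - 1 terms.
    """
    I = np.eye(m)
    K = np.eye(m, k=1) / lam          # lam^{-1} on the superdiagonal
    log_B = np.log(lam) * I
    power = I.copy()
    for j in range(1, m):             # K^m = 0: finitely many terms
        power = power @ K
        log_B += (-1) ** (j + 1) * power / j
    return log_B

print(log_jordan_block(1.0, 2))   # [[0., 1.], [0., 0.]], matching the example
```

For λ = 1 and size 2 this reproduces the worked example above.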
== A functional analysis perspective ==
A square matrix represents a [[linear operator]] on the [[Euclidean space]] '''R'''<sup>''n''</sup>, where ''n'' is the dimension of the matrix. Since such a space is finite-dimensional, this operator is actually [[bounded operator|bounded]].
Using the tools of [[holomorphic functional calculus]], given a [[holomorphic function]] ''f'' defined on an [[open set]] in the [[complex plane]] and a bounded linear operator ''T'', one can calculate ''f''(''T'') as long as ''f'' is defined on the [[spectrum of an operator|spectrum]] of ''T''.
The function ''f''(''z'') = log ''z'' can be defined on any [[simply connected]] open set in the complex plane not containing the origin, and it is holomorphic on such a ___domain. This implies that one can define ln ''T'' as long as the spectrum of ''T'' does not contain the origin and there is a path going from the origin to infinity not crossing the spectrum of ''T'' (e.g., if the spectrum of ''T'' is a circle with the origin inside of it, it is impossible to define ln ''T'').
The spectrum of a linear operator on '''R'''<sup>''n''</sup> is the set of eigenvalues of its matrix, and so is a finite set. As long as the origin is not in the spectrum (the matrix is invertible), the path condition from the previous paragraph is satisfied, and ln ''T'' is well-defined. The non-uniqueness of the matrix logarithm follows from the fact that one can choose more than one branch of the logarithm which is defined on the set of eigenvalues of a matrix.
== A Lie group theory perspective ==
In the theory of [[Lie group]]s, there is an [[exponential map (Lie theory)|exponential map]] from a [[Lie algebra]] <math>\mathfrak{g}</math> to the corresponding Lie group ''G''
: <math> \exp : \mathfrak{g} \rightarrow G. </math>
Note that the exponential map is a local diffeomorphism between a neighborhood ''U'' of the zero matrix <math> \underline{0} \in \mathfrak{g}</math> and a neighborhood ''V'' of the identity matrix <math>\underline{1}\in G</math>.<ref>{{harvnb|Hall|2015}} Theorem 3.42</ref>
Thus the (matrix) logarithm is well-defined as a map,
: <math> \log: G\supset V \rightarrow U\subset \mathfrak{g}.</math>
An important corollary of [[Jacobi's formula]] then is
: <math>\log (\det(A)) = \mathrm{tr}(\log A)~. </math>
== Constraints in the 2 × 2 case ==
If a 2 × 2 real matrix has a negative [[determinant]], it has no real logarithm. Note first that any 2 × 2 real matrix can be considered one of the three types of the complex number ''z'' = ''x'' + ''y'' ''ε'', where ''ε''<sup>2</sup> ∈ { −1, 0, +1 }. This ''z'' is a point on a complex subplane of the ring of matrices.
The case where the determinant is negative only arises in a plane with ''ε''<sup>2</sup> = +1, that is, a [[split-complex number]] plane. Only one quarter of this plane is the image of the exponential map, so the logarithm is only defined on that quadrant.
For example, let ''a'' = log 2 ; then cosh ''a'' = 5/4 and sinh ''a'' = 3/4.
For matrices, this means that
: <math>A=\exp \begin{pmatrix}0 & a \\ a & 0 \end{pmatrix} =
\begin{pmatrix}\cosh a & \sinh a \\ \sinh a & \cosh a \end{pmatrix} =
\begin{pmatrix}1.25 & 0.75\\ 0.75 & 1.25 \end{pmatrix}</math>.
So this last matrix has logarithm
: <math>\log A = \begin{pmatrix}0 & \log 2 \\ \log 2 & 0 \end{pmatrix}</math>.
These matrices, however, do not have a logarithm:
: <math>\begin{pmatrix}3/4 & 5/4 \\ 5/4 & 3/4 \end{pmatrix},\
\begin{pmatrix}-3/4 & -5/4 \\ -5/4 & -3/4\end{pmatrix}, \
\begin{pmatrix}-5/4 & -3/4\\ -3/4 & -5/4 \end{pmatrix}</math>.
They represent the three other conjugates by the four-group of the matrix above that does have a logarithm.
A non-singular 2 × 2 matrix does not necessarily have a logarithm, but it is conjugate by the four-group to a matrix that does have a logarithm.
It also follows that, for example, a [[Square root of a 2 by 2 matrix|square root of this matrix]] ''A'' is obtainable directly from exponentiating (log ''A'')/2,
: <math>\sqrt{A}= \begin{pmatrix}\cosh ((\log 2)/2) & \sinh ((\log 2)/2) \\ \sinh ((\log 2)/2) & \cosh ((\log 2)/2) \end{pmatrix} =
\begin{pmatrix}1.06 & 0.35\\ 0.35 & 1.06 \end{pmatrix} ~. </math>
For a richer example, start with a [[Pythagorean triple]] (''p,q,r'')
and let {{math|''a'' {{=}} log(''p'' + ''r'') − log ''q''}}. Then
: <math>e^a = \frac {p + r} {q} = \cosh a + \sinh a</math>.
Now
: <math>\exp \begin{pmatrix}0 & a \\ a & 0 \end{pmatrix} =
\begin{pmatrix}r/q & p/q \\ p/q & r/q \end{pmatrix}</math>.
Thus
: <math>\tfrac{1}{q}\begin{pmatrix}r & p \\ p & r \end{pmatrix}</math>
has the logarithm matrix
: <math>\begin{pmatrix}0 & a \\ a & 0 \end{pmatrix}</math> ,
where {{math| ''a'' {{=}} log(''p'' + ''r'') − log ''q''}}.
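The Pythagorean-triple construction is easy to verify numerically; a sketch using SciPy for the matrix exponential and logarithm (the triple (3, 4, 5) is used for concreteness):

```python
import numpy as np
from scipy.linalg import expm, logm

# Pythagorean triple (p, q, r) with p^2 + q^2 = r^2.
p, q, r = 3, 4, 5
a = np.log(p + r) - np.log(q)     # here a = log 2

M = np.array([[r / q, p / q],
              [p / q, r / q]])    # = (1/q) [[r, p], [p, r]]
L = np.array([[0.0, a],
              [a, 0.0]])

# exp([[0, a], [a, 0]]) = [[cosh a, sinh a], [sinh a, cosh a]] = M,
# and M is symmetric positive-definite, so logm recovers L.
print(np.allclose(expm(L), M))
print(np.allclose(logm(M), L))
```

Since ''M'' is symmetric with positive eigenvalues (''p'' + ''r'')/''q'' and (''r'' − ''p'')/''q'', its principal logarithm is real and equals the hyperbolic-angle matrix ''L''.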
== See also ==
* [[Matrix function]]
* [[Square root of a matrix]]
* [[Matrix exponential]]
* [[Baker–Campbell–Hausdorff formula]]
* [[Derivative of the exponential map]]
== Notes ==
{{reflist}}
== References ==
* {{citation|last=Culver|first=Walter J.|title=On the existence and uniqueness of the real logarithm of a matrix|journal=Proceedings of the American Mathematical Society|year=1966|volume=17|issue=5|pages=1146–1151}}
* {{citation|year=2015|first=Brian C.|last=Hall|title=Lie Groups, Lie Algebras, and Representations An Elementary Introduction|edition=2nd|series=Graduate Texts in Mathematics|volume=222|publisher=Springer|isbn= 978-3319134666}}
* {{citation|first=Nicholas|last=Higham|title=Functions of Matrices. Theory and Computation|publisher=[[Society for Industrial and Applied Mathematics|SIAM]]|year=2008|isbn=978-0-89871-646-7}}
* {{citation|last=Gantmacher|first=Felix R.|title=The Theory of Matrices|volume=1|publisher=Chelsea|___location=New York|year=1959|pages=239–241}}
* {{cite journal
 | last=Engø | first=Kenth
 | year=2001
 | title=On the BCH-formula in '''so'''(3)
 | journal=BIT Numerical Mathematics
 | volume=41
 | issue=3
 | pages=629–632
 | s2cid=126053191
 }}