In [[mathematics]], the '''quadratic eigenvalue problem'''<ref>F. Tisseur and K. Meerbergen, The quadratic eigenvalue problem, SIAM Rev., 43 (2001), pp. 235–286.</ref> ('''QEP''') is to find [[scalar (mathematics)|scalar]] [[eigenvalue]]s <math>\lambda\,</math>, left [[eigenvector]]s <math>y\,</math> and right eigenvectors <math>x\,</math> such that
 
:<math> Q(\lambda)x = 0 ~ \text{ and } ~ y^\ast Q(\lambda) = 0,\, </math>
 
where <math>Q(\lambda)=\lambda^2 M + \lambda C + K</math>, with matrix coefficients <math>M, \, C, \, K \in \mathbb{C}^{n \times n}</math>, and we require that <math>M\,\neq 0</math> (so that we have a nonzero leading coefficient). There are <math>2n\,</math> eigenvalues, which may be finite or infinite, and possibly zero. This is a special case of a [[nonlinear eigenproblem]]. <math>Q(\lambda)</math> is also known as a quadratic [[matrix polynomial]].
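In the scalar case <math>n = 1</math> the QEP reduces to an ordinary quadratic equation, so its <math>2n = 2</math> eigenvalues are simply polynomial roots. A minimal sketch in Python (illustrative coefficient values; NumPy assumed):

```python
import numpy as np

# Illustrative n = 1 case: Q(lam) = m*lam**2 + c*lam + k, so the
# 2n = 2 eigenvalues are the roots of a scalar quadratic.
m, c, k = 1.0, 3.0, 2.0            # leading coefficient m must be nonzero
eigenvalues = np.roots([m, c, k])  # roots of m*x**2 + c*x + k
print(np.sort(eigenvalues))        # [-2. -1.]
```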
 
==Spectral theory==
 
A QEP is said to be ''regular'' if <math>\det (Q(\lambda)) \not \equiv 0</math>. The coefficient of the <math>\lambda^{2n}</math> term in <math>\det(Q(\lambda))</math> is <math>\det(M)</math>, implying that the QEP is regular if <math>M</math> is nonsingular.
 
Eigenvalues at infinity and eigenvalues at 0 may be exchanged by considering the reversed polynomial, <math> \lambda^2 Q(\lambda^{-1}) = \lambda^2 K + \lambda C + M </math>. As there are <math> 2n</math> eigenvectors in an <math>n</math>-dimensional space, the eigenvectors cannot all be linearly independent, let alone orthogonal. It is possible to have the same eigenvector attached to different eigenvalues.
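The reciprocal relationship between the eigenvalues of <math>Q</math> and those of its reversed polynomial can be checked directly in the scalar case. A sketch in Python (illustrative values; NumPy assumed):

```python
import numpy as np

# Scalar (n = 1) illustration: the eigenvalues of the reversed polynomial
# lam**2 * Q(1/lam) = k*lam**2 + c*lam + m are the reciprocals of the
# (finite, nonzero) eigenvalues of Q(lam) = m*lam**2 + c*lam + k.
m, c, k = 1.0, 3.0, 2.0
lam_forward  = np.sort(np.roots([m, c, k]))   # eigenvalues of Q
lam_reversed = np.sort(np.roots([k, c, m]))   # eigenvalues of reversed polynomial
print(np.allclose(np.sort(1.0 / lam_reversed), lam_forward))  # True
```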
 
==Applications==
=== Systems of differential equations ===
Quadratic eigenvalue problems arise naturally in the solution of systems of second order [[Linear differential equation|linear differential equations]] without forcing:

:<math> M q''(t) + C q'(t) + K q(t) = 0, </math>

where <math> q(t) \in \mathbb{R}^n </math> and <math> M, C, K \in \mathbb{R}^{n\times n}</math>. If all quadratic eigenvalues of <math> Q(\lambda) = \lambda^2 M + \lambda C + K </math> are distinct, then the solution can be written in terms of the quadratic eigenvalues and right quadratic eigenvectors as

:<math>
q(t) = \sum_{j=1}^{2n} \alpha_j x_j e^{\lambda_j t} = X e^{\Lambda t} \alpha,
</math>

where <math>\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_{2n}) \in \mathbb{C}^{2n \times 2n} </math> is the diagonal matrix of quadratic eigenvalues, <math> X = [x_1, \ldots, x_{2n}] \in \mathbb{C}^{n \times 2n} </math> collects the <math> 2n</math> right quadratic eigenvectors, and <math> \alpha = [\alpha_1, \ldots, \alpha_{2n}]^\top \in \mathbb{C}^{2n}</math> is a parameter vector determined from the initial conditions on <math> q</math> and <math> q'</math>.
[[Stability theory]] for linear systems can now be applied, as the behavior of a solution depends explicitly on the (quadratic) eigenvalues.
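The eigen-expansion can be verified numerically. A sketch in Python (NumPy assumed; illustrative random data with <math>M = I</math>, so the quadratic eigenpairs come straight from a <math>2n \times 2n</math> companion matrix):

```python
import numpy as np

# Check that q(t) = X exp(Lambda t) alpha solves M q'' + C q' + K q = 0
# (illustrative random data; M = I keeps the companion form simple).
rng = np.random.default_rng(0)
n = 2
M = np.eye(n)
C = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))

# With M = I, the eigenpairs of [[0, I], [-K, -C]] give z = [x; lam*x]
# with (lam**2 * M + lam * C + K) x = 0.
A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
lam, Z = np.linalg.eig(A)
X = Z[:n, :]                      # right quadratic eigenvectors x_j

# alpha from the initial conditions: [X; X*Lambda] alpha = [q(0); q'(0)]
q0, q0p = rng.standard_normal(n), rng.standard_normal(n)
alpha = np.linalg.solve(np.vstack([X, X * lam]), np.concatenate([q0, q0p]))

def q(t, d=0):
    """d-th derivative of q(t) = sum_j alpha_j x_j exp(lam_j t)."""
    return (X * (lam**d * np.exp(lam * t) * alpha)).sum(axis=1)

t = 0.7
residual = M @ q(t, 2) + C @ q(t, 1) + K @ q(t, 0)
print(np.max(np.abs(residual)) < 1e-10)   # True: the expansion solves the ODE
```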
 
=== Finite element methods ===
 
A QEP can result in part of the dynamic analysis of structures [[Discretization|discretized]] by the [[finite element method]]. In this case the quadratic, <math>Q(\lambda)\,</math> has the form <math>Q(\lambda)=\lambda^2 M + \lambda C + K\,</math>, where <math>M\,</math> is the [[mass matrix]], <math>C\,</math> is the [[damping matrix]] and <math>K\,</math> is the [[stiffness matrix]].
Other applications include vibro-acoustics and [[fluid dynamics]].
 
==Methods of solution==
 
Direct methods for solving the standard or [[Generalized eigenvalue problem|generalized eigenvalue problems]] <math> Ax = \lambda x</math> and <math> Ax = \lambda B x </math>
are based on transforming the problem to [[Schur form|Schur]] or [[Schur decomposition#Generalized Schur decomposition|Generalized Schur]] form. However, there is no analogous form for quadratic matrix polynomials.
One approach is to transform the quadratic [[matrix polynomial]] to a linear [[matrix pencil]] (<math> A-\lambda B</math>), and solve a generalized
eigenvalue problem. Once eigenvalues and eigenvectors of the linear problem have been determined, eigenvectors and eigenvalues of the quadratic can be determined.
 
The most common linearization is the first [[Companion matrix|companion]] linearization
:<math>
L1(\lambda) =
\begin{bmatrix}
0 & N \\
-K & -C
\end{bmatrix}
-
\lambda\begin{bmatrix}
N & 0 \\
0 & M
\end{bmatrix},
</math>
where <math>N</math> is any nonsingular <math>n</math>-by-<math>n</math> matrix, with corresponding eigenvector
:<math>
z =
\begin{bmatrix}
x \\
\lambda x
\end{bmatrix}.
</math>
For convenience, one often takes <math>N</math> to be the <math>n\times n</math> [[identity matrix]]. We solve <math> L1(\lambda) z = 0 </math> for <math> \lambda </math> and <math>z</math>, for example by computing the Generalized Schur form. We can then
take the first <math>n</math> components of <math>z</math> as the eigenvector <math>x</math> of the original quadratic <math>Q(\lambda)</math>.
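This procedure can be sketched numerically. A sketch in Python (NumPy assumed; illustrative random data with <math>N</math> taken as the identity; for simplicity the pencil is reduced to a standard eigenproblem by inverting one coefficient, whereas robust solvers would use the QZ, i.e. generalized Schur, algorithm):

```python
import numpy as np

# First companion linearization L1(lam) = A - lam*B with N = I:
#   A = [[0, I], [-K, -C]],  B = [[I, 0], [0, M]].
rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n))   # almost surely nonsingular
C = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))
I = np.eye(n)
Zero = np.zeros((n, n))

A = np.block([[Zero, I], [-K, -C]])
B = np.block([[I, Zero], [Zero, M]])
lam, Z = np.linalg.eig(np.linalg.solve(B, A))   # solves A z = lam B z

# The first n components of each z recover the quadratic eigenvector x.
ok = all(
    np.linalg.norm((lam[j]**2 * M + lam[j] * C + K) @ Z[:n, j])
    < 1e-8 * (1 + abs(lam[j])**2)
    for j in range(2 * n)
)
print(ok)   # True: each pair (lam, x) satisfies Q(lam) x = 0
```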
 
Another common linearization is given by
:<math>
L2(\lambda)= \begin{bmatrix}
-K & 0 \\
0 & N
\end{bmatrix}
-
\lambda\begin{bmatrix}
C & M \\
N & 0
\end{bmatrix}.
</math>
 
When either <math>A</math> or <math>B</math> in the pencil <math>A-\lambda B</math> is a [[Hamiltonian matrix]] and the other is a [[skew-Hamiltonian matrix]], the following linearizations can be used:
:<math>
L3(\lambda)= \begin{bmatrix}
K & 0 \\
C & K
\end{bmatrix}
-
\lambda\begin{bmatrix}
0 & K \\
-M & 0
\end{bmatrix}.
</math>
:<math>
L4(\lambda)= \begin{bmatrix}
0 & -K \\
M & 0
\end{bmatrix}
-
\lambda\begin{bmatrix}
M & C \\
0 & M
\end{bmatrix}.
</math>
{{mathapplied-stub}}