Eigenvalue problems

Eigenvalue problems are well known from linear algebra and arise in many scientific fields. In structural engineering, vibration analysis involves solving eigenvalue problems to obtain the natural frequencies of a system. In the stability analysis of dynamical systems, the sign of the eigenvalues indicates whether the system will converge to a stable point or not.

In electronic structure calculations the governing Schrödinger equation also constitutes an eigenvalue problem, and its eigenvalues correspond to the energy levels of the quantum mechanical system.

From a mathematical point of view, a generalized eigenvalue problem is given by

\[A \mathbf{v} = \lambda B \mathbf{v},\]

where \(A, B \in \mathbb{C}^{n \times n}\) are matrices, \(\mathbf{v} \in \mathbb{C}^{n \times 1}\) is a vector, and \(\lambda \in \mathbb{C}\) is a scalar. By inverting the matrix \(B\), or by using a suitable matrix factorization, the problem can be transformed into a standard eigenvalue problem of the form

\[A \mathbf{v} = \lambda \mathbf{v}.\]
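As a minimal sketch of this reduction (the SciPy routines and the random test matrices are illustrative assumptions, not part of the text), one can compare the eigenvalues of \(B^{-1} A\) with those returned by a direct generalized solver:

```python
# Sketch: reduce A v = lambda B v to the standard problem (B^{-1} A) v = lambda v.
# The matrices are random test data; in practice one factorizes B (e.g. LU or
# Cholesky) instead of forming its inverse explicitly.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonal shift keeps B invertible here

lam_std = np.linalg.eigvals(np.linalg.inv(B) @ A)  # standard form B^{-1} A
lam_gen, _ = eig(A, B)                             # direct generalized solver

print(np.sort_complex(lam_std))
print(np.sort_complex(lam_gen))  # both agree up to rounding and ordering
```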

The vector \(\mathbf{v}\) is called an eigenvector to the eigenvalue \(\lambda\). Intuitively, eigenvectors are exactly those vectors that are not rotated by the linear transformation represented by the matrix \(A\), but only scaled by a factor of \(\lambda\). By definition, \(\mathbf{v} = \mathbf{0}\) is not an eigenvector.
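For instance, with

\[A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}, \qquad \mathbf{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix},\]

one finds \(A \mathbf{v} = (3, 3)^T = 3 \mathbf{v}\), so \(\mathbf{v}\) is an eigenvector to the eigenvalue \(\lambda = 3\).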

In theory, eigenvalues and eigenvectors could be computed by solving the homogeneous system of linear equations

\[(A - \lambda I) \mathbf{v} = \mathbf{0}.\]

Since the trivial solution \(\mathbf{v} = \mathbf{0}\) is excluded by definition, this system has a nontrivial solution only if \(\chi_A(\lambda) := \det(A - \lambda I) = 0\). \(\chi_A\) is called the characteristic polynomial of \(A\), and its roots are the eigenvalues \(\lambda\). However, by the Abel–Ruffini theorem there is no explicit formula for the roots of general polynomials of degree five or higher.
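To illustrate (a toy sketch, not a recommended method; the \(2 \times 2\) matrix is an arbitrary example), the eigenvalues of a small matrix can be obtained as the roots of its characteristic polynomial and checked against a standard solver:

```python
# Sketch: eigenvalues via the characteristic polynomial. Viable only for
# tiny matrices; root finding on polynomial coefficients is numerically
# ill-conditioned for larger n.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)      # coefficients of the characteristic polynomial of A
lam = np.roots(coeffs)   # its roots are the eigenvalues (here: 3 and 1)

print(np.sort(lam.real))
print(np.sort(np.linalg.eigvals(A).real))  # reference: LAPACK-based solver
```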

In scientific applications, eigenvalues and eigenvectors need to be computed for matrices whose size is in the tens of thousands or even in the millions.

Thus, numerical methods commonly transform the matrix \(A\) into a simpler form from which it is known how to compute the eigenvalues and eigenvectors.

For example, by performing a series of orthogonal transformations (e.g. Householder transformations or Givens rotations), represented by a matrix \(Q\), a symmetric matrix can be transformed into a tridiagonal matrix \(T\) (a general matrix is reduced to Hessenberg form instead) as follows

\[T = Q A Q^T.\]

Since orthogonal similarity transformations preserve the eigenvalues, the eigenvalues of the tridiagonal matrix \(T\) are exactly those of \(A\); they can then be found by divide-and-conquer or bisection methods.
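A minimal sketch of this two-stage approach, assuming SciPy is available (the routines and the random symmetric test matrix are illustrative choices, not prescribed by the text):

```python
# Sketch: reduce a symmetric matrix to tridiagonal form via orthogonal
# similarity transformations, then solve the tridiagonal problem with a
# dedicated LAPACK-backed routine.
import numpy as np
from scipy.linalg import hessenberg, eigvalsh_tridiagonal

rng = np.random.default_rng(1)
n = 6
M = rng.standard_normal((n, n))
A = (M + M.T) / 2  # symmetric test matrix

# For symmetric A, Hessenberg reduction yields a tridiagonal T with A = Q T Q^T.
T, Q = hessenberg(A, calc_q=True)

d = np.diag(T)       # main diagonal of T
e = np.diag(T, k=1)  # first superdiagonal of T

# Tridiagonal eigensolver; lapack_driver='stebz' would select bisection.
lam = eigvalsh_tridiagonal(d, e)

print(lam)
print(np.linalg.eigvalsh(A))  # reference: dense symmetric solver
```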

A very important application of eigenvalues and eigenvectors is matrix diagonalization. If a linearly independent set of \(n\) eigenvectors can be found, then these eigenvectors form an eigenbasis of the underlying vector space.

In this case the eigenvectors can be assembled as the columns of a matrix \(E\) such that

\[D = E^{-1} A E,\]

where \(D\) is a diagonal matrix containing the eigenvalues of \(A\).
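As a short sketch (the \(2 \times 2\) matrix is an arbitrary illustrative choice), this diagonalization can be carried out numerically:

```python
# Sketch: diagonalize A using its eigenvector matrix E, i.e. D = E^{-1} A E.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, E = np.linalg.eig(A)     # columns of E are eigenvectors of A
D = np.linalg.inv(E) @ A @ E  # similar to A, diagonal up to rounding

print(np.round(D, 12))  # diagonal entries are the eigenvalues 5 and 2
print(lam)
```

Note that for a real symmetric matrix the eigenvectors can be chosen orthonormal, so \(E\) is orthogonal and \(E^{-1} = E^T\) can be used instead of an explicit inverse.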

Related algorithm(s): Arnoldi's method