Eigenvalue problems are well-known problems from linear algebra that arise in many scientific fields. In structural engineering, vibration analysis involves solving eigenvalue problems to obtain the natural frequencies of a system. In the stability analysis of dynamical systems, the sign of the eigenvalues indicates whether the system will converge to a stable point or not.

In electronic structure calculations the governing Schrödinger equation also constitutes an eigenvalue problem, and its eigenvalues correspond to the energy levels of the quantum mechanical system.

From a mathematical point of view, a *generalized* eigenvalue problem is given by

$$A x = \lambda B x,$$

where $A, B \in \mathbb{C}^{n \times n}$ are matrices, $x \in \mathbb{C}^n$ is a vector and $\lambda \in \mathbb{C}$ is a scalar. By inverting the matrix $B$ or using a suitable matrix factorization, the problem can be transformed into a *standard* eigenvalue problem of the form

$$A x = \lambda x,$$

where in the inversion case $B^{-1} A$ takes the role of $A$.
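This reduction can be sketched with NumPy; the matrices `A` and `B` below are made-up examples, and the explicit inverse is only for illustration (a factorization of $B$, e.g. a Cholesky factorization, is preferred in practice):

```python
import numpy as np

# Hypothetical matrices for a generalized problem A x = lambda B x
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[4.0, 0.0],
              [0.0, 2.0]])

# Reduce to a standard problem: B^{-1} A x = lambda x
C = np.linalg.inv(B) @ A
eigvals = np.linalg.eigvals(C)

# Each computed eigenvalue satisfies det(A - lambda B) = 0
residuals = [abs(np.linalg.det(A - lam * B)) for lam in eigvals]
print(max(residuals))  # close to zero
```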

The vector $x$ is called an *eigenvector* to the *eigenvalue* $\lambda$. Intuitively, eigenvectors represent those vectors that do not get rotated by the linear transformation represented by the matrix $A$ but are only scaled by a factor of $\lambda$. Per definition, $x = 0$ is **not** an eigenvector.
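The scaling intuition is easy to check numerically; the matrix and vectors below are made-up examples:

```python
import numpy as np

# A hypothetical matrix that scales the first axis by 2 and the second by 3
A = np.diag([2.0, 3.0])

v = np.array([1.0, 0.0])   # eigenvector: A v = 2 v, the direction is unchanged
w = np.array([1.0, 1.0])   # not an eigenvector: A w points in a new direction

print(A @ v)   # [2. 0.] -- a multiple of v (eigenvalue 2)
print(A @ w)   # [2. 3.] -- not a multiple of w
```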

In theory, eigenvalues and eigenvectors could be computed by solving the homogeneous system of linear equations

$$(A - \lambda I) x = 0.$$

Since the trivial solution $x = 0$ is excluded per definition, this system is only solvable if $\det(A - \lambda I) = 0$. The determinant $\det(A - \lambda I)$ is called the *characteristic polynomial* and its roots are the eigenvalues $\lambda_i$. However, already for polynomials of degree 5 or higher there is no explicit formula for the roots (Abel–Ruffini theorem), so eigenvalues must in general be computed numerically.
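For a tiny matrix this textbook route can be reproduced directly; the matrix below is a made-up example, shown only for illustration (root finding on the characteristic polynomial is numerically ill-conditioned and is not how practical eigensolvers work):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # made-up 2x2 example

# Coefficients of the characteristic polynomial det(A - lambda I):
# here lambda^2 - 4 lambda + 3
coeffs = np.poly(A)

# Its roots are exactly the eigenvalues of A (1 and 3 in this example)
roots = np.sort(np.roots(coeffs).real)

print(roots)                               # approximately [1. 3.]
print(np.sort(np.linalg.eigvals(A).real))  # the same values
```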

In scientific applications, eigenvalues and eigenvectors need to be computed for matrices with sizes in the tens of thousands or even in the millions. Thus, numerical methods commonly transform the matrix into a simpler form from which the eigenvalues and eigenvectors can be computed.

For example, by performing a series of orthogonal transformations (e.g. Householder transformations or Givens rotations) represented by a matrix $Q$, a symmetric matrix $A$ can be transformed into a tridiagonal matrix $T$ as follows:

$$T = Q^T A Q.$$
The eigenvalues of the tridiagonal matrix can then be found by divide-and-conquer or bisection methods.
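This two-stage approach can be sketched with SciPy; the random symmetric matrix is an assumption for illustration, and `scipy.linalg.hessenberg` yields a tridiagonal matrix here only because the input is symmetric:

```python
import numpy as np
from scipy.linalg import hessenberg, eigh_tridiagonal

rng = np.random.default_rng(42)
M = rng.standard_normal((6, 6))
A = (M + M.T) / 2                  # random symmetric test matrix

# Stage 1: orthogonal reduction T = Q^T A Q; tridiagonal since A is symmetric
T, Q = hessenberg(A, calc_q=True)
d = np.diag(T)                     # main diagonal of T
e = np.diag(T, k=1)                # off-diagonal of T

# Stage 2: eigenvalues of the tridiagonal matrix via a dedicated LAPACK solver
w = eigh_tridiagonal(d, e, eigvals_only=True)

print(np.allclose(np.sort(w), np.sort(np.linalg.eigvalsh(A))))  # True
```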

A very important application of eigenvalues and eigenvectors is matrix
diagonalization. If a linearly independent system of eigenvectors can be
found, then these eigenvectors form an *eigenbasis* of the underlying vector
space.

In this case the eigenvectors can be assembled into a matrix $V$ such that

$$V^{-1} A V = D,$$

where $D$ is a diagonal matrix containing the eigenvalues of $A$.
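A short NumPy check of this factorization; the matrix is a made-up diagonalizable example:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # made-up example with eigenvalues 5 and 2

lam, V = np.linalg.eig(A)        # columns of V are the eigenvectors
D = np.diag(lam)

# The eigenvectors are linearly independent, so V is invertible and
# V^{-1} A V = D holds
print(np.allclose(np.linalg.inv(V) @ A @ V, D))       # True

# Equivalently A = V D V^{-1}, which makes e.g. matrix powers cheap:
A3 = V @ np.diag(lam**3) @ np.linalg.inv(V)
print(np.allclose(A3, np.linalg.matrix_power(A, 3)))  # True
```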

Related algorithms: Arnoldi's method