Preconditioned conjugate gradient method

This is an old revision of this page, as edited by 212.143.214.50 (talk) at 18:59, 22 October 2007.

The conjugate gradient method is a numerical algorithm that solves a system of linear equations

    A x = b,

where A is symmetric positive definite. If the matrix A is ill-conditioned, i.e. it has a large condition number κ(A), it is often useful to use a preconditioning matrix M that is chosen such that κ(M⁻¹A) < κ(A), and solve the system

    M⁻¹A x = M⁻¹b

instead.
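The iteration this describes can be sketched as follows; this is a minimal illustration, assuming A is a dense NumPy array and the preconditioner is supplied as a function M_inv that applies M⁻¹ to a vector (the function and parameter names are not from the article):

```python
import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-8, max_iter=1000):
    """Preconditioned conjugate gradient sketch for symmetric positive
    definite A. M_inv(r) applies the inverse of the preconditioner M."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x              # initial residual
    z = M_inv(r)               # preconditioned residual
    p = z.copy()               # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)  # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        beta = rz_new / rz     # conjugacy correction
        p = z + beta * p
        rz = rz_new
    return x
```

Passing `M_inv = lambda r: r` recovers the unpreconditioned conjugate gradient method.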

To save computational time it is important to use a symmetric preconditioner. This can be the Jacobi, symmetric Gauss-Seidel, or symmetric successive over-relaxation (SSOR) preconditioner.
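As an illustration of one of these symmetric choices, the symmetric Gauss-Seidel preconditioner can be written M = (D + L) D⁻¹ (D + L)ᵀ, where A = L + D + Lᵀ is the split into strictly lower, diagonal, and strictly upper parts. A sketch of applying M⁻¹, using dense triangular solves for simplicity (the function name is an assumption, not from the article):

```python
import numpy as np

def sgs_preconditioner(A):
    """Return a function applying M^{-1} for the symmetric Gauss-Seidel
    preconditioner M = (D + L) D^{-1} (D + L)^T, with A = L + D + L^T."""
    diag = np.diag(A)
    lower = np.tril(A)  # D + L, lower triangular

    def apply(r):
        y = np.linalg.solve(lower, r)            # forward solve: (D + L) y = r
        return np.linalg.solve(lower.T, diag * y)  # back solve: (D + L)^T z = D y

    return apply
```

A production implementation would exploit the triangular structure with dedicated forward/back substitution rather than general solves.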

The simplest preconditioner is a diagonal matrix that has just the diagonal elements of A. This is known as Jacobi preconditioning or diagonal scaling. Since diagonal matrices are trivial to invert and cheap to store in memory, a diagonal preconditioner is a good starting point. More sophisticated choices must trade off the reduction in the condition number κ(M⁻¹A), and hence faster convergence, against the time spent applying M⁻¹ in each iteration.
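The effect of diagonal scaling on the condition number can be seen directly with the symmetric form D^{-1/2} A D^{-1/2}, which keeps the scaled matrix symmetric positive definite; the matrix below is a made-up ill-conditioned example, not one from the article:

```python
import numpy as np

# Made-up SPD matrix with widely varying diagonal entries (ill-conditioned).
A = np.diag([1.0, 100.0, 10000.0]) + 0.1 * np.ones((3, 3))

# Symmetric Jacobi scaling: D^{-1/2} A D^{-1/2} has unit diagonal.
d = np.sqrt(np.diag(A))
A_scaled = A / np.outer(d, d)

print(np.linalg.cond(A))         # large condition number
print(np.linalg.cond(A_scaled))  # close to 1 after diagonal scaling
```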