Conjugate gradient squared method
 
== Background ==
A system of linear equations <math>A{\bold x} = {\bold b}</math> consists of a known [[Matrix (mathematics)|matrix]] <math>A</math> and a known [[Vector (mathematics)|vector]] <math>{\bold b}</math>. To solve the system is to find the value of the unknown vector <math>{\bold x}</math>. A direct method of solution is to compute the inverse of the matrix <math>A</math> and then calculate <math>\bold x = A^{-1}\bold b</math>. However, computing the inverse is computationally expensive for large systems, so iterative methods are commonly used instead. An iterative method begins with an initial guess <math>\bold x^{(0)}</math> and improves the guess on each iteration. Once the difference between successive guesses is sufficiently small, the method is said to have converged to a solution.
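
As an illustration of the iterative idea only (not of the conjugate gradient squared method itself), the following minimal sketch solves a small system with [[Jacobi method|Jacobi iteration]], stopping when successive guesses agree to within a tolerance. It assumes NumPy is available; the names <code>jacobi_solve</code>, <code>tol</code> and <code>max_iter</code> are illustrative choices, not part of any standard API.

<syntaxhighlight lang="python">
import numpy as np

def jacobi_solve(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b iteratively with Jacobi iteration.

    Stops once successive guesses differ by less than `tol`.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    D = np.diag(A)             # diagonal entries of A
    R = A - np.diagflat(D)     # off-diagonal part of A
    for _ in range(max_iter):
        x_new = (b - R @ x) / D                # one Jacobi update
        if np.linalg.norm(x_new - x) < tol:    # successive guesses close enough
            return x_new
        x = x_new
    return x

# Example: a small diagonally dominant system, for which Jacobi converges.
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([1.0, 2.0])
x = jacobi_solve(A, b)
print(x, A @ x)   # A @ x should be close to b
</syntaxhighlight>

The conjugate gradient squared method follows the same pattern of refining a guess until a convergence criterion is met, but constructs its updates from Krylov subspace information rather than a simple matrix splitting.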
 
== The algorithm ==