Conjugate gradient squared method
# Choose an initial guess <math>{\bold x}_0</math>
# <math>{\bold r}_0 = {\bold b} - A{\bold x}_0</math>
# Choose <math>\tilde{\bold r}_0 = {\bold r}_0</math>
# For <math>i = 1, 2, 3, \dots</math> do:
## <math>\rho_{i-1} = \tilde{\bold r}^T_{i-1}{\bold r}_{i-1}</math>
## If <math>\rho_{i-1} = 0</math>, the method fails.
## If <math>i=1</math>:
### <math>{\bold p}_1 = {\bold u}_1 = {\bold r}_0</math>
## Else:
### <math>\beta_{i-1} = \rho_{i-1}/\rho_{i-2}</math>
### <math>{\bold u}_i = {\bold r}_{i-1} + \beta_{i-1}{\bold q}_{i-1}</math>
### <math>{\bold p}_i = {\bold u}_i + \beta_{i-1}({\bold q}_{i-1} + \beta_{i-1}{\bold p}_{i-1})</math>
## Solve <math>M\hat {\bold p} = {\bold p}_i</math>, where <math>M</math> is a pre-conditioner.
## <math>\hat {\bold v} = A\hat {\bold p}</math>
## <math>\alpha_i = \rho_{i-1} / \tilde {\bold r}^T \hat {\bold v}</math>
## <math>{\bold q}_i = {\bold u}_i - \alpha_i\hat {\bold v}</math>
## Solve <math>M\hat {\bold u} = {\bold u}_i + {\bold q}_i</math>
## <math>{\bold x}_i = {\bold x}_{i-1} + \alpha_i \hat {\bold u}</math>
## <math>\hat {\bold q} = A\hat {\bold u}</math>
## <math>{\bold r}_i = {\bold r}_{i-1} - \alpha_i\hat {\bold q}</math>
## Check for convergence: if the residual <math>{\bold r}_i</math> is sufficiently small, end the loop and return <math>{\bold x}_i</math>
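The steps above can be sketched in NumPy as follows. This is an illustrative translation of the pseudocode, not a production solver (a tested implementation is available as <code>scipy.sparse.linalg.cgs</code>); the function name, the identity default for <math>M</math>, and the relative-residual stopping test are choices made here for the example.

```python
import numpy as np

def cgs(A, b, x0=None, M=None, tol=1e-8, max_iter=1000):
    """Preconditioned conjugate gradient squared (illustrative sketch).

    A : (n, n) array, possibly nonsymmetric.
    M : pre-conditioner matrix; identity (no preconditioning) by default.
    Variable names mirror the pseudocode above.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)  # initial guess x_0
    if M is None:
        M = np.eye(n)                       # assumed default: no preconditioning
    r = b - A @ x                           # r_0 = b - A x_0
    r_tilde = r.copy()                      # shadow residual, r~_0 = r_0
    rho_prev = 1.0
    q = np.zeros(n)
    p = np.zeros(n)
    for i in range(1, max_iter + 1):
        rho = r_tilde @ r                   # rho_{i-1} = r~^T r_{i-1}
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:
            u = r.copy()                    # u_1 = r_0
            p = u.copy()                    # p_1 = u_1
        else:
            beta = rho / rho_prev           # beta_{i-1} = rho_{i-1} / rho_{i-2}
            u = r + beta * q
            p = u + beta * (q + beta * p)
        p_hat = np.linalg.solve(M, p)       # solve M p^ = p_i
        v_hat = A @ p_hat
        alpha = rho / (r_tilde @ v_hat)     # alpha_i = rho_{i-1} / r~^T v^
        q = u - alpha * v_hat
        u_hat = np.linalg.solve(M, u + q)   # solve M u^ = u_i + q_i
        x = x + alpha * u_hat
        r = r - alpha * (A @ u_hat)         # r_i = r_{i-1} - alpha_i A u^
        if np.linalg.norm(r) < tol * np.linalg.norm(b):  # convergence check
            return x, i
        rho_prev = rho
    return x, max_iter
```

Note that, unlike the biconjugate gradient method, this loop applies <math>A</math> but never <math>A^T</math>, which is the practical appeal of CGS when only a routine for products with <math>A</math> is available.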