Conjugate gradient squared method: Difference between revisions

The algorithm is as follows:<ref>{{cite book|author1=R. Barrett|author2=M. Berry|author3=T. F. Chan|author4=J. Demmel|author5=J. Donato|author6=J. Dongarra|author7=V. Eijkhout|author8=R. Pozo|author9=C. Romine|author10=H. Van der Vorst|title=Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd Edition|publisher=SIAM|year=1994|url=https://netlib.org/linalg/html_templates/Templates.html}}</ref>
 
# Choose an initial guess <math>{\bold x}^{(0)}</math>
# Compute the residual <math>{\bold r}^{(0)} = {\bold b} - A{\bold x}^{(0)}</math>
# Choose <math>\tilde {\bold r} = {\bold r}^{(0)}</math>
# For <math>i = 1, 2, 3, \dots</math> do:
## <math>\rho^{(i-1)} = \tilde {\bold r}^T{\bold r}^{(i-1)}</math>
## If <math>\rho^{(i-1)} = 0</math>, the method fails.
## If <math>i=1</math>:
### <math>{\bold p}^{(1)} = {\bold u}^{(1)} = {\bold r}^{(0)}</math>
## Else:
### <math>\beta^{(i-1)} = \rho^{(i-1)}/\rho^{(i-2)}</math>
### <math>{\bold u}^{(i)} = {\bold r}^{(i-1)} + \beta^{(i-1)}{\bold q}^{(i-1)}</math>
### <math>{\bold p}^{(i)} = {\bold u}^{(i)} + \beta^{(i-1)}({\bold q}^{(i-1)} + \beta^{(i-1)}{\bold p}^{(i-1)})</math>
## Solve <math>M\hat {\bold p} = {\bold p}^{(i)}</math>, where <math>M</math> is a pre-conditioner.
## <math>\hat {\bold v} = A\hat {\bold p}</math>
## <math>\alpha^{(i)} = \rho^{(i-1)} / \tilde {\bold r}^T \hat {\bold v}</math>
## <math>{\bold q}^{(i)} = {\bold u}^{(i)} - \alpha^{(i)}\hat {\bold v}</math>
## Solve <math>M\hat {\bold u} = {\bold u}^{(i)} + {\bold q}^{(i)}</math>
## <math>{\bold x}^{(i)} = {\bold x}^{(i-1)} + \alpha^{(i)} \hat {\bold u}</math>
## <math>\hat {\bold q} = A\hat {\bold u}</math>
## <math>{\bold r}^{(i)} = {\bold r}^{(i-1)} - \alpha^{(i)}\hat {\bold q}</math>
## Check for convergence: if there is convergence, end the loop and return the result
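As a sketch, the steps above translate directly into Python with NumPy; the function name <code>cgs</code>, the callback <code>M_solve</code> (which applies the pre-conditioner inverse <math>M^{-1}</math>), and the stopping tolerance are illustrative choices, not part of the source.

```python
import numpy as np

def cgs(A, b, x0=None, M_solve=None, tol=1e-10, max_iter=1000):
    """Conjugate Gradient Squared, following the loop structure above.

    M_solve(v) should return M^{-1} v; if None, no pre-conditioning
    is applied (M = I).  Names here are illustrative.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    if M_solve is None:
        M_solve = lambda v: v
    r = b - A @ x                 # r^(0) = b - A x^(0)
    r_tilde = r.copy()            # shadow residual, chosen as r^(0)
    rho_prev = 1.0
    u = p = q = np.zeros(n)
    for i in range(1, max_iter + 1):
        rho = r_tilde @ r         # rho^(i-1) = r~^T r^(i-1)
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:
            u = r.copy()
            p = u.copy()
        else:
            beta = rho / rho_prev             # beta^(i-1)
            u = r + beta * q
            p = u + beta * (q + beta * p)
        p_hat = M_solve(p)                    # solve M p_hat = p^(i)
        v_hat = A @ p_hat
        alpha = rho / (r_tilde @ v_hat)       # alpha^(i)
        q = u - alpha * v_hat
        u_hat = M_solve(u + q)                # solve M u_hat = u^(i) + q^(i)
        x = x + alpha * u_hat
        r = r - alpha * (A @ u_hat)           # r^(i) = r^(i-1) - alpha q_hat
        rho_prev = rho
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
    return x
```

Unlike BiCG, the transpose <math>A^T</math> is never applied, which is the practical appeal of CGS; for a small nonsymmetric system such as <code>A = [[4, 1], [2, 3]]</code> the loop converges in a handful of iterations.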