Conjugate gradient squared method: Difference between revisions

# Compute the residual <math>r_0 = b - Ax_0</math>
# Choose another residual <math>\tilde r_0 = r_0</math>
# Repeat the following for <math>i = 1, 2, 3, \dots</math>, until convergence is reached or a maximum number of iterations is exceeded:
## <math>\rho_{i-1} = \tilde r^T r_{i-1}</math>
## If <math>\rho_{i-1} = 0</math>, the method fails.
## If <math>i=1</math>:
### <math>p_1 = u_1 = r_0</math>
## Else:
### <math>\beta_{i-1} = \rho_{i-1}/\rho_{i-2}</math>
### <math>u_i = r_{i-1} + \beta_{i-1}q_{i-1}</math>
<!-- To be completed -->
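The steps above can be sketched in NumPy. This is a minimal, unpreconditioned sketch that fills in the remaining steps of the standard CGS iteration (the update <math>p_i = u_i + \beta_{i-1}(q_{i-1} + \beta_{i-1} p_{i-1})</math>, the step length <math>\alpha_i</math>, and the solution and residual updates), which the source list has not yet reached; names such as `cgs` and the stopping test are illustrative choices, not the article's notation.

```python
import numpy as np

def cgs(A, b, x0=None, tol=1e-8, max_iter=500):
    """Sketch of the conjugate gradient squared method (no preconditioner)."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x                 # r_0 = b - A x_0
    r_tilde = r.copy()            # choose another residual, r~_0 = r_0
    rho_prev = 1.0
    u = np.zeros(n)
    q = np.zeros(n)
    p = np.zeros(n)
    for i in range(1, max_iter + 1):
        rho = r_tilde @ r         # rho_{i-1} = r~^T r_{i-1}
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0, the method fails")
        if i == 1:
            u = r.copy()          # u_1 = r_0
            p = u.copy()          # p_1 = u_1
        else:
            beta = rho / rho_prev             # beta_{i-1} = rho_{i-1}/rho_{i-2}
            u = r + beta * q                  # u_i = r_{i-1} + beta_{i-1} q_{i-1}
            p = u + beta * (q + beta * p)     # p_i = u_i + beta_{i-1}(q_{i-1} + beta_{i-1} p_{i-1})
        v = A @ p
        alpha = rho / (r_tilde @ v)           # alpha_i = rho_{i-1} / (r~^T A p_i)
        q = u - alpha * v                     # q_i = u_i - alpha_i A p_i
        uq = u + q
        x = x + alpha * uq                    # x_i = x_{i-1} + alpha_i (u_i + q_i)
        r = r - alpha * (A @ uq)              # r_i = r_{i-1} - alpha_i A (u_i + q_i)
        rho_prev = rho
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
    return x
```

Unlike plain conjugate gradients, CGS does not require <math>A</math> to be symmetric, only that the shadow residual <math>\tilde r_0</math> is not orthogonal to the Krylov space; the sketch uses the common choice <math>\tilde r_0 = r_0</math> from step 2.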