# Iterative methods


## Revision as of 20:43, 14 December 2005

To solve a system of linear equations, we seek the solution to the problem:

$Ax = Q$

After k iterations we obtain an approximation to the solution:

$Ax^{(k)} = Q - r^{(k)}$

where $r^{(k)}$ is the residual after k iterations.
Defining

$\varepsilon ^{(k)} = x - x^{(k)}$

as the difference between the exact and approximate solutions, we obtain:

$A\varepsilon ^{(k)} = r^{(k)}$

The purpose of the iterations is to drive this residual, and hence the error, to zero.
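The error–residual identity above can be checked numerically. The sketch below uses a small hypothetical system (the matrix and right-hand side are illustrative choices, not from the article) and verifies that $A\varepsilon^{(k)} = r^{(k)}$ holds for an arbitrary approximate solution:

```python
import numpy as np

# Hypothetical 3x3 system A x = Q, following the article's notation.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
Q = np.array([1.0, 2.0, 3.0])
x_exact = np.linalg.solve(A, Q)

# A rough iterate x_k (here simply a perturbed exact solution, for illustration).
x_k = x_exact + np.array([0.1, -0.05, 0.02])

r_k = Q - A @ x_k      # residual after k iterations: A x_k = Q - r_k
eps_k = x_exact - x_k  # error: eps_k = x - x_k

# The identity A eps_k = r_k links the (unknown) error to the (computable) residual.
assert np.allclose(A @ eps_k, r_k)
```

This is why iterative methods monitor the residual: the error itself is unknown, but the residual is computable and is related to the error through $A$.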

### Stationary Iterative Methods

Iterative methods that can be expressed in the simple form

$x^{(k+1)} = Bx^{(k)} + c$

where neither B nor c depends upon the iteration count k, are called stationary iterative methods. Some of the stationary iterative methods are

1. Jacobi method
2. Gauss-Seidel method
3. Successive Overrelaxation (SOR) method
4. Symmetric Successive Overrelaxation (SSOR) method

The convergence of such iterative methods can be investigated using the fixed-point theorem: the iteration converges for any initial guess if the spectral radius of B is less than one.
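The Jacobi method, the simplest of the list above, can be written directly in the fixed-point form $x^{(k+1)} = Bx^{(k)} + c$ with $B = I - D^{-1}A$ and $c = D^{-1}Q$, where $D$ is the diagonal of $A$. A minimal sketch, assuming a small diagonally dominant test matrix (so that the iteration is known to converge):

```python
import numpy as np

# Hypothetical diagonally dominant system A x = Q (illustrative values).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
Q = np.array([1.0, 2.0, 3.0])

D = np.diag(np.diag(A))        # diagonal part of A
B = np.linalg.solve(D, D - A)  # Jacobi iteration matrix: B = I - D^{-1} A
c = np.linalg.solve(D, Q)      # c = D^{-1} Q

x = np.zeros_like(Q)
for _ in range(50):
    x = B @ x + c              # stationary: B and c never change across iterations

assert np.allclose(A @ x, Q, atol=1e-8)
```

Gauss–Seidel and SOR fit the same template with different splittings of $A$, and hence different iteration matrices $B$.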

### Nonstationary Iterative Methods

When B or c changes during the iterations, the method is called a nonstationary iterative method. Typically, the iteration coefficients are computed by taking inner products of residuals or other vectors arising from the iterative method.

Some examples are:

1. Conjugate Gradient Method (CG)
2. MINRES and SYMMLQ
3. Generalized Minimal Residual (GMRES)
4. BiConjugate Gradient (BiCG)
5. Quasi-Minimal Residual (QMR)
6. Conjugate Gradient Squared Method (CGS)
7. BiConjugate Gradient Stabilized (Bi-CGSTAB)
8. Chebyshev Iteration
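The Conjugate Gradient method, the first entry in the list, illustrates why these methods are nonstationary: the step length and search-direction coefficients are recomputed from inner products of residuals at every iteration. A minimal sketch for a symmetric positive-definite system (the matrix and right-hand side below are illustrative, and `conjugate_gradient` is a name chosen here, not from the article):

```python
import numpy as np

def conjugate_gradient(A, Q, tol=1e-10, max_iter=100):
    """Minimal CG sketch for a symmetric positive-definite matrix A."""
    x = np.zeros_like(Q)
    r = Q - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # step length from inner products: changes every iteration
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs     # direction-update coefficient, also recomputed each step
        p = r + beta * p
        rs = rs_new
    return x

# Illustrative 2x2 SPD system; in exact arithmetic CG converges in two iterations.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
Q = np.array([1.0, 2.0])
x = conjugate_gradient(A, Q)
assert np.allclose(A @ x, Q)
```

Because `alpha` and `beta` depend on the current residual, the effective B and c of the iteration change at every step, in contrast to the stationary methods above.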

Return to Numerical Methods