# Tridiagonal matrix algorithm - TDMA (Thomas algorithm)

## Introduction

The tridiagonal matrix algorithm (TDMA), also known as the Thomas algorithm, is a simplified form of Gaussian elimination that can be used to solve tridiagonal systems of equations. A tridiagonal system may be written as

$a_i x_{i - 1} + b_i x_i + c_i x_{i + 1} = d_i, \qquad i = 1, \ldots, n,$

where $a_1 = 0$ and $c_n = 0$. In matrix form, this system is written as

$\left[ \begin{matrix} {b_1} & {c_1} & { } & { } & { 0 } \\ {a_2} & {b_2} & {c_2} & { } & { } \\ { } & {a_3} & {b_3} & \cdot & { } \\ { } & { } & \cdot & \cdot & {c_{n-1}}\\ { 0 } & { } & { } & {a_n} & {b_n}\\ \end{matrix} \right] \left[ \begin{matrix} {x_1 } \\ {x_2 } \\ \cdot \\ \cdot \\ {x_n } \\ \end{matrix} \right] = \left[ \begin{matrix} {d_1 } \\ {d_2 } \\ \cdot \\ \cdot \\ {d_n } \\ \end{matrix} \right].$

For such systems, the solution can be obtained in $O(n)$ operations instead of the $O(n^3)$ required by full Gaussian elimination. A first sweep eliminates the $a_i$'s, and then an (abbreviated) backward substitution produces the solution. Such matrices commonly arise from the discretization of 1D problems (e.g. the 1D Poisson problem).
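As a concrete illustration (not part of the original article), discretizing the 1D Poisson problem $-u'' = f$ with central differences on $n$ interior points of spacing $h$, with Dirichlet boundary conditions, yields constant bands. A minimal sketch in Python (the helper name `poisson_1d_bands` is hypothetical):

```python
def poisson_1d_bands(n, h):
    """Tridiagonal bands for -u'' = f discretized with central differences
    on n interior points of spacing h (Dirichlet boundary conditions).
    a is the sub-diagonal, b the main diagonal, c the super-diagonal;
    a[0] and c[-1] are unused, per the convention a_1 = c_n = 0."""
    a = [-1.0 / h**2] * n  # sub-diagonal
    b = [ 2.0 / h**2] * n  # main diagonal
    c = [-1.0 / h**2] * n  # super-diagonal
    return a, b, c
```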

## Algorithm

The following algorithm performs the TDMA, overwriting the original coefficient arrays. If this is not desirable, copy the original arrays beforehand.

Forward elimination phase:

- for $k = 2, \ldots, n$ do
  - $m = {a_k \over b_{k - 1}}$
  - $b_k = b_k - m c_{k - 1}$
  - $d_k = d_k - m d_{k - 1}$
- end loop ($k$)

Backward substitution phase:

- $x_n = {d_n \over b_n}$
- for $k = n - 1, \ldots, 1$ do
  - $x_k = {d_k - c_k x_{k + 1} \over b_k}$
- end loop ($k$)
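The two phases above can be sketched directly in Python (the function name `thomas` is our choice; unlike the pseudocode, this version copies its inputs rather than overwriting the caller's arrays):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, main diagonal b,
    super-diagonal c, and right-hand side d (a[0] and c[-1] are unused,
    per the convention a_1 = c_n = 0).  Returns the solution x."""
    n = len(d)
    b, d = list(b), list(d)  # work on copies
    # Forward elimination: remove the sub-diagonal entries a_k.
    for k in range(1, n):
        m = a[k] / b[k - 1]
        b[k] -= m * c[k - 1]
        d[k] -= m * d[k - 1]
    # Backward substitution.
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for k in range(n - 2, -1, -1):
        x[k] = (d[k] - c[k] * x[k + 1]) / b[k]
    return x
```

For example, the symmetric system with bands $a = c = 1$, $b = 2$ and right-hand side $d = (3, 4, 3)$ has the solution $x = (1, 1, 1)$.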

## Assessments

Because no pivoting is performed, this algorithm is reliable only when the matrix is diagonally dominant, which is to say

$\left| b_i \right| > \left| a_i \right| + \left| c_i \right|, \qquad i \in \{1, \ldots, n\}.$
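This condition can be checked in a few lines (a sketch; the boundary entries $a_1$ and $c_n$ are treated as zero, per the convention in the Introduction):

```python
def is_diagonally_dominant(a, b, c):
    """Return True if |b_i| > |a_i| + |c_i| holds for every row,
    taking a[0] and c[-1] as zero."""
    n = len(b)
    for i in range(n):
        sub = abs(a[i]) if i > 0 else 0.0
        sup = abs(c[i]) if i < n - 1 else 0.0
        if abs(b[i]) <= sub + sup:
            return False
    return True
```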

## Variants

In some situations, particularly those involving periodic boundary conditions, a slightly perturbed form of the tridiagonal system may need to be solved:

$a_1 x_{n} + b_1 x_1 + c_1 x_2 = d_1,$
$a_i x_{i - 1} + b_i x_i + c_i x_{i + 1} = d_i,\, i = 2,\ldots,n-1$
$a_n x_{n-1} + b_n x_n + c_n x_1 = d_n.$

In matrix form, this is

$\left[ \begin{matrix} {b_1} & {c_1} & { } & { } & {a_1} \\ {a_2} & {b_2} & {c_2} & { } & { } \\ { } & {a_3} & {b_3} & \cdot & { } \\ { } & { } & \cdot & \cdot & {c_{n-1}}\\ {c_n} & { } & { } & {a_n} & {b_n}\\ \end{matrix} \right] \left[ \begin{matrix} {x_1 } \\ {x_2 } \\ \cdot \\ \cdot \\ {x_n } \\ \end{matrix} \right] = \left[ \begin{matrix} {d_1 } \\ {d_2 } \\ \cdot \\ \cdot \\ {d_n } \\ \end{matrix} \right].$

In this case, we can make use of the Sherman-Morrison formula to avoid the additional operations of Gaussian elimination and still use the Thomas algorithm. We are now solving a problem of the form

$(A' + uv^T)x = d$

where

$u^T = [-b_1\ \ 0\ \cdots\ 0\ \ c_n], \qquad v^T = [1\ \ 0\ \cdots\ 0\ \ {-a_1/b_1}].$

$A'$ is a slightly different tridiagonal system from the one above, and the solution to the perturbed system is obtained by solving the two auxiliary systems

$A'y = d, \qquad A'q = u$

and then computing $x$ as

$x = y - \frac{v^T y}{1 + v^T q}\, q.$
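Putting the pieces together, here is a sketch of the periodic solver (function names are ours; it assumes $b_1 \neq 0$, since the choice $\gamma = -b_1$ divides by $b_1$, and includes its own copy of the Thomas solver so it is self-contained):

```python
def solve_tridiag(a, b, c, d):
    """Thomas algorithm as in the Algorithm section; copies its inputs."""
    n = len(d)
    b, d = list(b), list(d)
    for k in range(1, n):
        m = a[k] / b[k - 1]
        b[k] -= m * c[k - 1]
        d[k] -= m * d[k - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for k in range(n - 2, -1, -1):
        x[k] = (d[k] - c[k] * x[k + 1]) / b[k]
    return x

def solve_cyclic(a, b, c, d):
    """Solve the periodic tridiagonal system via Sherman-Morrison, with
    u = [-b_1, 0, ..., 0, c_n] and v = [1, 0, ..., 0, -a_1/b_1] as above.
    Here a[0] and c[-1] hold the corner entries a_1 and c_n."""
    n = len(d)
    gamma = -b[0]
    # A' = A - u v^T differs from A only in two diagonal entries,
    # and its corner entries vanish, so it is truly tridiagonal.
    bp = list(b)
    bp[0] = b[0] - gamma                    # = 2 b_1
    bp[-1] = b[-1] - c[-1] * a[0] / gamma   # = b_n + c_n a_1 / b_1
    u = [0.0] * n
    u[0], u[-1] = gamma, c[-1]
    y = solve_tridiag(a, bp, c, d)          # A' y = d
    q = solve_tridiag(a, bp, c, u)          # A' q = u
    # v^T x picks out x_1 - (a_1/b_1) x_n.
    vy = y[0] - (a[0] / b[0]) * y[-1]
    vq = q[0] - (a[0] / b[0]) * q[-1]
    factor = vy / (1.0 + vq)
    return [y[i] - factor * q[i] for i in range(n)]
```

Only two tridiagonal solves are needed, so the cost remains $O(n)$.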

In other situations, the system of equations may be block tridiagonal, with smaller submatrices arranged as the individual elements in the above matrix system. Simplified forms of Gaussian elimination have been developed for these situations.

## References

1. Thomas, L.H. (1949), *Elliptic Problems in Linear Differential Equations over a Network*, Watson Sci. Comput. Lab Report, Columbia University, New York.
2. Conte, S.D., and de Boor, C. (1972), *Elementary Numerical Analysis*, McGraw-Hill, New York.
