# Introduction to turbulence/Statistical analysis/Estimation from a finite number of realizations

### From CFD-Wiki


## Revision as of 11:01, 8 June 2006

## Estimators for averaged quantities

Since there can never be an infinite number of realizations from which ensemble averages (and probability densities) can be computed, it is essential to ask: *How many realizations are enough?* The answer to this question must be sought by looking at the statistical properties of estimators based on a finite number of realizations. There are two questions which must be answered. The first one is:

- Is the expected value (or mean value) of the estimator equal to the true ensemble mean? Or in other words, is the estimator *unbiased?*

The second question is:

- Does the difference between the value of the estimator and that of the true mean decrease as the number of realizations increases? Or in other words, does the estimator *converge* in a statistical sense (or converge in probability)? Figure 2.9 illustrates the problems which can arise.

## Bias and convergence of estimators

A procedure for answering these questions will be illustrated by considering a simple **estimator** for the mean, the arithmetic mean considered above, <math>X_{N}</math>. For <math>N</math> independent realizations, <math>x_{n}, n = 1,2,\dots,N</math>, where <math>N</math> is finite, <math>X_{N}</math> is given by:

<table width="100%"><tr><td>
:<math>
X_{N} \equiv \frac{1}{N} \sum^{N}_{n=1} x_{n}
</math>
</td><td width="5%">(2)</td></tr></table>
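As a numerical sketch of this estimator (the Gaussian distribution, its parameters, and the sample count below are illustrative assumptions, not part of the original text), the arithmetic mean over a finite set of realizations can be computed as:

```python
import random

def estimate_mean(realizations):
    """Arithmetic-mean estimator X_N = (1/N) * sum of x_n."""
    return sum(realizations) / len(realizations)

# N = 10 independent realizations of an assumed random variable
# with true ensemble mean X = 5.0
random.seed(1)
samples = [random.gauss(5.0, 2.0) for _ in range(10)]
x_n_estimate = estimate_mean(samples)
# x_n_estimate scatters around 5.0; it is itself a random variable
```

Each run of the "experiment" produces a different value of the estimator, which is exactly why its bias and convergence must be examined.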

Now, as we observed in our simple coin-flipping experiment, since the <math>x_{n}</math> are random, so must be the value of the estimator <math>X_{N}</math>. For the estimator to be *unbiased*, the mean value of <math>X_{N}</math> must be the true ensemble mean, <math>X</math>, i.e.

<table width="100%"><tr><td>
:<math>
\left\langle X_{N} \right\rangle = X
</math>
</td><td width="5%">(2)</td></tr></table>

It is easy to see that, since the operations of averaging and adding commute,

<table width="100%"><tr><td>
:<math>
\begin{matrix}
\left\langle X_{N} \right\rangle & = & \left\langle \frac{1}{N} \sum^{N}_{n=1} x_{n} \right\rangle \\
& = & \frac{1}{N} \sum^{N}_{n=1} \left\langle x_{n} \right\rangle \\
& = & \frac{1}{N} NX = X \\
\end{matrix}
</math>
</td><td width="5%">(2)</td></tr></table>

(Note that the expected value of each <math>x_{n}</math> is just <math>X</math>, since the <math>x_{n}</math> are assumed identically distributed.) Thus <math>X_{N}</math> is, in fact, an *unbiased estimator for the mean*.
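The unbiasedness result can be checked numerically: averaging the estimator itself over many independent repetitions of the whole experiment approximates <math>\left\langle X_{N} \right\rangle</math>. A minimal sketch, assuming Gaussian realizations with a known true mean (all parameters below are illustrative):

```python
import random

random.seed(0)
TRUE_MEAN = 3.0
N = 8           # realizations per estimate (deliberately small and finite)
TRIALS = 20000  # independent repetitions of the whole experiment

# Each trial produces one value of the estimator X_N
estimates = []
for _ in range(TRIALS):
    samples = [random.gauss(TRUE_MEAN, 1.0) for _ in range(N)]
    estimates.append(sum(samples) / N)

# The average of X_N over many trials approximates <X_N>,
# which should equal TRUE_MEAN even though each X_N scatters around it
mean_of_estimator = sum(estimates) / TRIALS
```

Individual estimates vary considerably for small <math>N</math>, yet their average sits close to the true mean, which is precisely what "unbiased" asserts.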

The question of *convergence* of the estimator can be addressed by defining the square of the **variability of the estimator**, say <math>\epsilon^{2}_{X_{N}}</math>, to be:

<table width="100%"><tr><td>
:<math>
\epsilon^{2}_{X_{N}}\equiv \frac{var \left\{ X_{N} \right\} }{X^{2}} = \frac{\left\langle \left( X_{N}- X \right)^{2} \right\rangle }{X^{2}}
</math>
</td><td width="5%">(2)</td></tr></table>
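This definition can also be explored numerically. The sketch below (distribution, true mean, and trial counts are illustrative assumptions) estimates <math>\epsilon^{2}_{X_{N}}</math> for two values of <math>N</math>, and shows the variability shrinking as the number of realizations grows:

```python
import random

random.seed(2)
TRUE_MEAN = 4.0
TRIALS = 5000  # repetitions used to approximate the ensemble average

def variability_squared(n, trials=TRIALS):
    """Approximate eps^2 = <(X_N - X)^2> / X^2 by repeating the experiment."""
    sq_err = 0.0
    for _ in range(trials):
        x_n = sum(random.gauss(TRUE_MEAN, 1.0) for _ in range(n)) / n
        sq_err += (x_n - TRUE_MEAN) ** 2
    return (sq_err / trials) / TRUE_MEAN ** 2

eps2_small = variability_squared(4)   # few realizations: large variability
eps2_large = variability_squared(64)  # many realizations: small variability
# eps2 decreases roughly like 1/N for independent realizations
```

For independent, identically distributed realizations the variability falls off like <math>1/N</math>, which is the statistical sense in which the estimator converges.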