# Introduction to turbulence/Statistical analysis/Estimation from a finite number of realizations

## Estimators for averaged quantities

Since there can never be an infinite number of realizations from which ensemble averages (and probability densities) can be computed, it is essential to ask: How many realizations are enough? The answer to this question must be sought by looking at the statistical properties of estimators based on a finite number of realizations. There are two questions which must be answered. The first one is:

• Is the expected value (or mean value) of the estimator equal to the true ensemble mean? Or in other words, is the estimator unbiased?

The second question is

• Does the difference between the value of the estimator and that of the true mean decrease as the number of realizations increases? Or in other words, does the estimator converge in a statistical sense (or converge in probability)? Figure 2.9 illustrates the problems which can arise.

## Bias and convergence of estimators

A procedure for answering these questions will be illustrated by considering a simple estimator for the mean, the arithmetic mean considered above, $X_{N}$. For $N$ independent realizations $x_{n}, n=1,2,...,N$ where $N$ is finite, $X_{N}$ is given by:

 $X_{N}=\frac{1}{N}\sum^{N}_{n=1} x_{n}$
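This estimator can be sketched directly in code. The following is a minimal illustration, not part of the original text: the names `estimate_mean` and the Gaussian samples with assumed true mean $X = 3.0$ are hypothetical choices made for the example.

```python
import random

def estimate_mean(realizations):
    """Arithmetic-mean estimator X_N = (1/N) * sum of x_n."""
    return sum(realizations) / len(realizations)

# N independent realizations x_n of a random variable whose true
# ensemble mean is X = 3.0 (an assumed value for illustration).
random.seed(1)
samples = [random.gauss(3.0, 1.0) for _ in range(1000)]

# X_N is itself random: it lands near X but not exactly on it.
print(estimate_mean(samples))
```

Running this for different seeds shows that $X_{N}$ scatters about the true mean, which is exactly why the questions of bias and convergence below must be asked.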

Now, as we observed in our simple coin-flipping experiment, since the $x_{n}$ are random, so must be the value of the estimator $X_{N}$. For the estimator to be unbiased, the mean value of $X_{N}$ must be the true ensemble mean, $X$, i.e.

 $\left\langle X_{N} \right\rangle = X$

It is easy to see that since the operations of averaging and adding commute,

 $\begin{matrix} \left\langle X_{N} \right\rangle & = & \left\langle \frac{1}{N} \sum^{N}_{n=1} x_{n} \right\rangle \\ & = & \frac{1}{N} \sum^{N}_{n=1} \left\langle x_{n} \right\rangle \\ & = & \frac{1}{N} NX = X \\ \end{matrix}$

(Note that the expected value of each $x_{n}$ is just $X$ since the $x_{n}$ are assumed identically distributed). Thus $X_{N}$ is, in fact, an unbiased estimator for the mean.
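The unbiasedness argument above can be checked numerically: averaging the estimator $X_{N}$ itself over many independent experiments should recover the true mean even when $N$ is small. This sketch assumes Gaussian realizations with a hypothetical true mean of $2.0$; the names are illustrative, not from the text.

```python
import random

def sample_mean(xs):
    """Arithmetic-mean estimator X_N for one experiment."""
    return sum(xs) / len(xs)

random.seed(0)
true_mean, N, trials = 2.0, 10, 20000

# Repeat the whole N-sample experiment many times and average the
# resulting estimates; for an unbiased estimator this average
# approaches the true ensemble mean X regardless of how small N is.
estimates = [sample_mean([random.gauss(true_mean, 1.0) for _ in range(N)])
             for _ in range(trials)]
print(sum(estimates) / len(estimates))
```

Note that any single $X_{N}$ with $N=10$ still scatters widely; unbiasedness only says the scatter is centered on $X$, which is why convergence must be examined separately below.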

The question of convergence of the estimator can be addressed by defining the square of variability of the estimator, say $\epsilon^{2}_{X_{N}}$, to be:

 $\epsilon^{2}_{X_{N}}\equiv \frac{\mathrm{var} \left\{ X_{N} \right\} }{X^{2}} = \frac{\left\langle \left( X_{N}- X \right)^{2} \right\rangle }{X^{2}}$
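The variability $\epsilon^{2}_{X_{N}}$ can also be estimated by Monte Carlo: compute $X_{N}$ many times and average $(X_{N}-X)^{2}/X^{2}$. The sketch below (with assumed Gaussian realizations, true mean $2.0$, and standard deviation $1.0$, none of which come from the text) shows the variability shrinking as $N$ grows, which is the convergence being defined here.

```python
import random

def variability_sq(true_mean, sigma, N, trials, rng):
    """Monte-Carlo estimate of eps^2 = <(X_N - X)^2> / X^2."""
    total = 0.0
    for _ in range(trials):
        x_n = sum(rng.gauss(true_mean, sigma) for _ in range(N)) / N
        total += (x_n - true_mean) ** 2
    return total / trials / true_mean ** 2

rng = random.Random(42)
for N in (10, 100, 1000):
    # eps^2 drops roughly as 1/N, so each row should be about
    # a tenth of the previous one.
    print(N, variability_sq(2.0, 1.0, N, 2000, rng))
```

For independent samples the exact result is $\epsilon^{2}_{X_{N}} = \mathrm{var}\{x\}/(N X^{2})$, so the printed values should sit near $1/(4N)$ for these assumed parameters.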