# Introduction to turbulence/Statistical analysis/Estimation from a finite number of realizations

## Estimators for averaged quantities

Since there can never be an infinite number of realizations from which ensemble averages (and probability densities) can be computed, it is essential to ask: How many realizations are enough? The answer to this question must be sought by looking at the statistical properties of estimators based on a finite number of realizations. There are two questions which must be answered. The first one is:

• Is the expected value (or mean value) of the estimator equal to the true ensemble mean? Or in other words, is the estimator unbiased?

The second question is:

• Does the difference between the value of the estimator and the true mean decrease as the number of realizations increases? Or in other words, does the estimator converge in a statistical sense (or converge in probability)? Figure 2.9 illustrates the problems which can arise.
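The convergence question can be probed numerically. The following sketch is not from the original text: it assumes a Gaussian random variable with a known mean and draws repeated finite-N estimates, so that the scatter of the estimator can be seen to shrink as N grows.

```python
import random

# Hypothetical illustration: draw realizations of a random variable with a
# known true mean, and watch how the arithmetic-mean estimator scatters
# around that mean for different numbers of realizations N.
random.seed(1)

TRUE_MEAN = 3.0  # assumed true ensemble mean for this sketch

def estimate_mean(n):
    """Arithmetic-mean estimator X_N from n independent realizations."""
    samples = [random.gauss(TRUE_MEAN, 1.0) for _ in range(n)]
    return sum(samples) / n

for n in (10, 100, 10000):
    # Repeat the whole experiment 50 times and measure the scatter
    # (max minus min) of the 50 resulting estimates.
    estimates = [estimate_mean(n) for _ in range(50)]
    spread = max(estimates) - min(estimates)
    print(f"N = {n:6d}: spread of 50 estimates = {spread:.4f}")
```

The spread decreases roughly as $1/\sqrt{N}$, which is the statistical convergence the second question asks about.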

## Bias and convergence of estimators

A procedure for answering these questions will be illustrated by considering a simple estimator for the mean, the arithmetic mean considered above, $X_{N}$. For $N$ independent realizations $x_{n}, n=1,2,...,N$ where $N$ is finite, $X_{N}$ is given by:

 $X_{N}=\frac{1}{N}\sum^{N}_{n=1} x_{n}$ (2)
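As a concrete illustration of equation (2) — a sketch not taken from the text, in which the zero-mean Gaussian variable is an assumption — two independent sets of $N$ realizations generally yield two different values of the estimator:

```python
import random

# Sketch: with a finite number of realizations, the estimator X_N is itself
# a random variable. Two independent batches of N realizations of the same
# (assumed) zero-mean Gaussian variable give different values of X_N.
random.seed(42)

N = 100
x_first  = [random.gauss(0.0, 1.0) for _ in range(N)]
x_second = [random.gauss(0.0, 1.0) for _ in range(N)]

X_N_first  = sum(x_first) / N   # X_N = (1/N) * sum of x_n, as in equation (2)
X_N_second = sum(x_second) / N

# The two estimates differ from each other, and both generally miss the
# true mean (here 0) by a small random amount.
print(X_N_first, X_N_second)
```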

Now, as we observed in our simple coin-flipping experiment, since the $x_{n}$ are random, so must be the value of the estimator $X_{N}$. For the estimator to be unbiased, the mean value of $X_{N}$ must be the true ensemble mean, $X$, i.e.

 $\left\langle X_{N} \right\rangle = X$ (3)

It is easy to see that since the operations of averaging and adding commute,

 $\begin{matrix} \left\langle X_{N} \right\rangle & = & \left\langle \frac{1}{N} \sum^{N}_{n=1} x_{n} \right\rangle \\ & = & \frac{1}{N} \sum^{N}_{n=1} \left\langle x_{n} \right\rangle \\ & = & \frac{1}{N} NX = X \\ \end{matrix}$ (4)
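This result can be checked numerically. The sketch below is an assumption-laden illustration (Gaussian distribution and the particular values of $X$, $N$, and $M$ are chosen for the example, not taken from the text): averaging many independent values of the estimator $X_{N}$ recovers the true mean $X$, consistent with the derivation above, even though each individual $X_{N}$ misses it.

```python
import random

# Numerical check of unbiasedness: the average of many independent values
# of the finite-N estimator X_N should recover the true ensemble mean X.
random.seed(0)

X = 2.0    # assumed true ensemble mean
N = 25     # realizations per estimate (finite and fixed)
M = 20000  # number of independent estimates to average

def X_N():
    """One value of the arithmetic-mean estimator from N realizations."""
    return sum(random.gauss(X, 1.0) for _ in range(N)) / N

mean_of_estimator = sum(X_N() for _ in range(M)) / M
print(f"<X_N> ~= {mean_of_estimator:.3f} (true mean X = {X})")
```

Note that each single estimate still scatters about $X$ with standard deviation $1/\sqrt{N}$; unbiasedness only says the scatter is centered on $X$.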