
Introduction to turbulence/Statistical analysis/Estimation from a finite number of realizations


Estimators for averaged quantities

Since there can never be an infinite number of realizations from which ensemble averages (and probability densities) can be computed, it is essential to ask: How many realizations are enough? The answer to this question must be sought by looking at the statistical properties of estimators based on a finite number of realizations. There are two questions which must be answered. The first one is:

  • Is the expected value (or mean value) of the estimator equal to the true ensemble mean? Or in other words, is the estimator unbiased?

The second question is:

  • Does the difference between the value of the estimator and that of the true mean decrease as the number of realizations increases? Or in other words, does the estimator converge in a statistical sense (or converge in probability)? Figure 2.9 illustrates the problems which can arise.

Bias and convergence of estimators

A procedure for answering these questions will be illustrated by considering a simple estimator for the mean, the arithmetic mean considered above, X_{N}. For N independent realizations x_{n}, n=1,2,...,N, where N is finite, X_{N} is given by:

X_{N}=\frac{1}{N}\sum^{N}_{n=1} x_{n}

Figure 2.9 not uploaded yet
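
As a concrete illustration, here is a minimal Python sketch of this estimator. The ensemble is an assumption made purely for illustration (Gaussian with true mean X = 10 and standard deviation \sigma_{x} = 2; none of these values come from the text):

    import numpy as np

    # Hypothetical ensemble: N independent realizations x_n of a random variable.
    # The "true" statistics below are illustrative assumptions only.
    rng = np.random.default_rng(0)
    N = 100
    x = rng.normal(loc=10.0, scale=2.0, size=N)

    # Arithmetic-mean estimator X_N = (1/N) * sum_{n=1}^{N} x_n
    X_N = x.sum() / N
    print(X_N)   # close to, but generally not equal to, the true mean X = 10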

Now, as we observed in our simple coin-flipping experiment, since the x_{n} are random, so must be the value of the estimator X_{N}. For the estimator to be unbiased, the mean value of X_{N} must be the true ensemble mean, X, i.e.

\left\langle X_{N} \right\rangle = X

It is easy to see that, since the operations of averaging and adding commute,

    
\begin{matrix}
\left\langle X_{N} \right\rangle & = & \left\langle \frac{1}{N} \sum^{N}_{n=1} x_{n} \right\rangle \\
& = & \frac{1}{N} \sum^{N}_{n=1} \left\langle  x_{n} \right\rangle \\
& = & \frac{1}{N} NX = X \\
\end{matrix}

(Note that the expected value of each x_{n} is just X, since the x_{n} are assumed identically distributed.) Thus X_{N} is, in fact, an unbiased estimator for the mean.
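
This result is easy to check numerically: repeat the whole N-sample experiment many times and average the resulting values of X_{N}. A minimal sketch, using the same illustrative Gaussian ensemble as above:

    import numpy as np

    # Illustrative assumed ensemble: true mean X = 10, sigma_x = 2.
    rng = np.random.default_rng(1)
    X_true, sigma_x, N = 10.0, 2.0, 100

    # Each row is one independent N-sample experiment, so averaging X_N
    # over the rows approximates the expected value <X_N>.
    experiments = rng.normal(X_true, sigma_x, size=(100_000, N))
    X_N_values = experiments.mean(axis=1)
    print(X_N_values.mean())   # ~10.0, consistent with <X_N> = X (no bias)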

The question of convergence of the estimator can be addressed by defining the square of the variability of the estimator, say \epsilon^{2}_{X_{N}}, to be:

    
\epsilon^{2}_{X_{N}}\equiv \frac{var \left\{ X_{N} \right\} }{X^{2}} = \frac{\left\langle  \left( X_{N}- X \right)^{2} \right\rangle }{X^{2}}

Now we want to examine what happens to \epsilon_{X_{N}} as the number of realizations increases. For the estimator to converge, it is clear that \epsilon_{X_{N}} should decrease as the number of samples increases. Obviously, we need to examine the variance of X_{N} first. It is given by:

    
\begin{matrix}
var \left\{ X_{N} \right\} & = & \left\langle \left( X_{N} - X \right)^{2} \right\rangle \\
& = & \left\langle \left[ \frac{1}{N} \sum^{N}_{n=1} \left( x_{n} - X \right) \right]^{2} \right\rangle \\
\end{matrix}

since \left\langle X_{N} \right\rangle = X from the equation for \langle X_{N} \rangle above. Using the fact that the operations of averaging and summation commute, the squared summation can be expanded as follows:

    
\begin{matrix}
\left\langle \left[ \frac{1}{N} \sum^{N}_{n=1} \left( x_{n} - X \right) \right]^{2} \right\rangle & = & \frac{1}{N^{2}} \sum^{N}_{n=1} \sum^{N}_{m=1} \left\langle \left( x_{n} - X \right) \left( x_{m} - X \right) \right\rangle \\
& = & \frac{1}{N^{2}} \sum^{N}_{n=1} \left\langle \left( x_{n} - X \right)^{2} \right\rangle \\
& = & \frac{1}{N} var \left\{ x \right\} \\
\end{matrix}

where the next to last step follows from the fact that the x_{n} are assumed to be statistically independent samples (and hence uncorrelated), and the last step from the definition of the variance. It follows immediately by substitution into the equation for \epsilon^{2}_{X_{N}} above that the square of the variability of the estimator, X_{N}, is given by:

    
\begin{matrix}
\epsilon^{2}_{X_{N}}& =& \frac{1}{N}\frac{var\left\{x\right\}}{X^{2}} \\
& = & \frac{1}{N} \left[ \frac{\sigma_{x}}{X} \right]^{2} \\ 
\end{matrix}

Thus the variability of the estimator depends inversely on the number of independent realizations, N, and linearly on the relative fluctuation level of the random variable itself, \sigma_{x}/X. Obviously, if the relative fluctuation level is zero (because the quantity being measured is constant and there are no measurement errors), then a single measurement will suffice. On the other hand, as soon as there is any fluctuation in x itself, the greater the fluctuation relative to the mean of x, \left\langle x \right\rangle = X, the more independent samples it will take to achieve a specified accuracy.
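
The 1/N behavior can be verified numerically by estimating \epsilon^{2}_{X_{N}} from many repeated experiments and comparing it with (1/N)\left(\sigma_{x}/X\right)^{2}. A sketch, again under the illustrative Gaussian assumptions used above:

    import numpy as np

    # Illustrative assumed ensemble: true mean X = 10, sigma_x = 2.
    rng = np.random.default_rng(2)
    X_true, sigma_x = 10.0, 2.0

    for N in (10, 100, 1000):
        # 10,000 independent N-sample experiments; X_N for each one
        X_N = rng.normal(X_true, sigma_x, size=(10_000, N)).mean(axis=1)
        eps_sq = X_N.var() / X_true**2                 # measured epsilon^2
        eps_sq_theory = (sigma_x / X_true)**2 / N      # (1/N)(sigma_x/X)^2
        print(N, eps_sq, eps_sq_theory)                # the two columns agree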

Example: In a given ensemble the relative fluctuation level is 12% (i.e. \sigma_{x}/ X = 0.12). What is the fewest number of independent samples that must be acquired to measure the mean value to within 1%?

Answer: Using the equation for \epsilon^{2}_{X_{N}} above, and taking \epsilon_{X_{N}}=0.01, it follows that:

    
\left(0.01 \right)^{2} = \frac{1}{N}\left(0.12 \right)^{2}

or N \geq 144.
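
The same arithmetic can be done in general by solving the equation for \epsilon^{2}_{X_{N}} for N, which gives N \geq \left( \frac{\sigma_{x}/X}{\epsilon_{X_{N}}} \right)^{2}. A trivial check in Python:

    import math

    # Values from the example above; the formula follows from solving
    # eps^2 = (1/N)(sigma_x/X)^2 for N.
    rel_fluct = 0.12   # sigma_x / X
    eps = 0.01         # required variability of the estimator
    N_min = math.ceil((rel_fluct / eps) ** 2)
    print(N_min)       # 144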

Credits

This text was based on "Lectures in Turbulence for the 21st Century" by Professor William K. George, Professor of Turbulence, Chalmers University of Technology, Gothenburg, Sweden.
