
Introduction to turbulence/Statistical analysis/Generalization to the estimator of any quantity


Similar relations can be formed for the estimator of any function of the random variable, say f(x). For example, an estimator for the average of f based on N realizations is given by:

    
F_{N} \equiv \frac{1}{N}\sum^{N}_{n=1} f_{n}

where f_{n}\equiv f(x_{n}). It is straightforward to show that this estimator is unbiased, and its variability (squared) is given by:

    
\epsilon^{2}_{F_{N}} = \frac{1}{N} \frac{var \left\{ f \left( x \right) \right\}}{\left\langle f \left( x \right) \right\rangle^{2}}
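As a quick numerical sanity check of this relation, the sketch below compares the empirical variability of F_{N} with the prediction above. The particular choices f(x) = x^2 and a Gaussian distribution for x are assumptions made only for the illustration, not part of the text:

    import numpy as np

    rng = np.random.default_rng(0)

    N = 100        # number of samples in each estimate F_N
    M = 20000      # number of independent realizations of F_N
    mu, sigma = 1.0, 0.5   # illustrative distribution for x

    def f(x):
        # illustrative choice of f; any function of x could be used
        return x**2

    # M independent realizations of the estimator F_N = (1/N) sum f(x_n)
    x = rng.normal(mu, sigma, size=(M, N))
    F_N = f(x).mean(axis=1)

    # empirical variability (squared) of the estimator
    eps2_empirical = F_N.var() / F_N.mean()**2

    # prediction (1/N) var{f(x)} / <f(x)>^2, using a large reference sample
    x_ref = rng.normal(mu, sigma, size=10**7)
    eps2_predicted = f(x_ref).var() / (N * f(x_ref).mean()**2)

    print("empirical eps^2_F_N :", eps2_empirical)
    print("predicted eps^2_F_N :", eps2_predicted)

For these illustrative parameters the two values should agree to within a few percent, reflecting the 1/N decay of the estimator's variability.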

Example: Suppose it is desired to estimate the variability of an estimator for the variance based on a finite number of samples as:

    
var_{N} \left\{ x \right\} \equiv \frac{1}{N} \sum^{N}_{n=1} \left( x_{n} - X \right)^{2}

(Note that this estimator is not really very useful since it presumes that the mean value, X, is known, whereas in fact usually only X_{N} is obtainable).

Answer

Let f = \left( x-X \right)^{2} in the definition of F_{N} above, so that F_{N} = var_{N}\left\{ x \right\}, \left\langle f \right\rangle = var \left\{ x \right\}, and var \left\{ f \right\} = var \left\{ \left( x-X \right)^{2} - var \left[ x \right] \right\}. Substituting these into the expression for the variability then gives:

    
\epsilon^{2}_{F_{N}} = \frac{1}{N} \frac{var \left\{ \left( x-X \right)^{2} - var \left[ x \right] \right\}}{\left( var \left\{ x \right\} \right)^{2}}

This is easiest to understand if we first expand only the numerator. Since var\left[ x \right] is simply a constant, subtracting it does not change the variance, so the numerator is just the variance of \left( x-X \right)^{2} itself; i.e., its mean square minus the square of its mean:

    
var \left\{ \left( x-X \right)^{2} - var \left[ x \right] \right\} = \left\langle \left( x-X \right)^{4} \right\rangle - \left[ var \left\{ x \right\} \right]^{2}

Thus

    
\epsilon^{2}_{var_{N}} = \frac{1}{N} \left[ \frac{\left\langle \left( x-X \right)^{4} \right\rangle}{\left[ var \left\{ x \right\} \right]^{2}} - 1 \right]

Obviously, to proceed further we need to know how the fourth central moment relates to the second central moment. As noted earlier, in general this is not known. If, however, it is reasonable to assume that x is a Gaussian distributed random variable, we know from section 2.3.4 that the kurtosis is 3. Then for Gaussian distributed random variables,

    
\epsilon^{2}_{var_{N}} = \frac{2}{N}

Thus the number of independent data required to produce the same level of convergence for an estimate of the variance of a Gaussian distributed random variable is \sqrt{2} times that for the mean. It is easy to show that the higher the moment, the greater the amount of data required.
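The 2/N result for Gaussian variables is easy to check numerically. The sketch below (a minimal illustration; the mean, variance, N, and number of realizations are arbitrary choices) builds many independent realizations of var_{N}\left\{ x \right\} from Gaussian samples, using the true mean X as the estimator above presumes, and compares the variability with 2/N:

    import numpy as np

    rng = np.random.default_rng(1)

    N = 50         # samples in each variance estimate
    M = 50000      # independent realizations of var_N{x}
    X, sigma = 0.0, 2.0   # true mean and standard deviation (illustrative)

    # M independent realizations of var_N{x} = (1/N) sum (x_n - X)^2,
    # with the true mean X assumed known, as in the estimator above
    x = rng.normal(X, sigma, size=(M, N))
    var_N = ((x - X)**2).mean(axis=1)

    # empirical variability (squared) of the variance estimator
    eps2_empirical = var_N.var() / var_N.mean()**2

    print("empirical eps^2_var_N :", eps2_empirical)
    print("2/N                   :", 2.0 / N)

For these choices the empirical value should come out close to 2/N = 0.04.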

As noted earlier, turbulence problems are not usually Gaussian, and in fact values of the kurtosis substantially greater than 3 are commonly encountered, especially for the moments of differentiated quantities. Clearly the non-Gaussian nature of random variables can affect the planning of experiments, since substantially greater amounts of data can be required to achieve the necessary statistical accuracy.
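To see how this affects the amount of data required, the earlier result \epsilon^{2}_{var_{N}} = \frac{1}{N}\left[ K - 1 \right], where K \equiv \left\langle \left( x-X \right)^{4} \right\rangle / \left[ var \left\{ x \right\} \right]^{2} is the kurtosis, can be inverted to give the number of independent samples needed for a prescribed variability (the kurtosis value of 11 used below is purely illustrative, not a value from the text):

    N = \frac{K - 1}{\epsilon^{2}_{var_{N}}}

A 1% variability \left( \epsilon_{var_{N}} = 0.01 \right) therefore requires N = 20,000 independent samples when K = 3 (Gaussian), but N = 100,000 if, say, K = 11.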
