Introduction to turbulence/Statistical analysis/Multivariate random variables
Joint pdfs and joint moments
Often it is important to consider more than one random variable at a time. For example, in turbulence the three components of the velocity vector are interrelated and must be considered together. In addition to the marginal (or single variable) statistical moments already considered, it is necessary to consider the joint statistical moments.
For example if <math>u</math> and <math>v</math> are two random variables, there are three second-order moments which can be defined: <math>\left\langle u^{2} \right\rangle</math>, <math>\left\langle v^{2} \right\rangle</math>, and <math>\left\langle uv \right\rangle</math>. The product moment <math>\left\langle uv \right\rangle</math> is called the cross-correlation or cross-covariance. The moments <math>\left\langle u^{2} \right\rangle</math> and <math>\left\langle v^{2} \right\rangle</math> are referred to as the covariances, or just simply the variances. Sometimes <math>\left\langle uv \right\rangle</math> is also referred to as the correlation.
In a manner similar to that used to build up the probability density function from its measurable counterpart, the histogram, a joint probability density function (or jpdf), <math>B_{uv}</math>, can be built up from the joint histogram. Figure 2.5 illustrates several examples of jpdf's which have different cross correlations. For convenience the fluctuating variables <math>u'</math> and <math>v'</math> can be defined as

<math>u' = u - U</math>

<math>v' = v - V</math>
where as before capital letters are used to represent the mean values. Clearly the fluctuating quantities <math>u'</math> and <math>v'</math> are random variables with zero mean.
A positive value of <math>\left\langle u'v' \right\rangle</math> indicates that <math>u'</math> and <math>v'</math> tend to vary together. A negative value indicates that when one variable is increasing the other tends to be decreasing. A zero value of <math>\left\langle u'v' \right\rangle</math> indicates that there is no correlation between <math>u'</math> and <math>v'</math>. As will be seen below, it does not mean that they are statistically independent.
It is sometimes more convenient to deal with values of the cross-variances which have been normalized by the appropriate variances. Thus the correlation coefficient is defined as:

<math>
\rho_{uv} \equiv \frac{\left\langle u'v' \right\rangle}{\left[ \left\langle u'^{2} \right\rangle \left\langle v'^{2} \right\rangle \right]^{1/2}}
</math>
Figure 2.5 not uploaded yet
The correlation coefficient is bounded by plus or minus one, the former representing perfect correlation and the latter perfect anti-correlation.
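As a quick numerical illustration (the function name and the synthetic signals below are assumptions made for this sketch, not part of the original text), the correlation coefficient can be estimated directly from sampled data by forming the fluctuations and normalizing the cross-variance by the two variances:

<pre>
import numpy as np

# Estimate the correlation coefficient rho_uv from two sampled signals.
def correlation_coefficient(u, v):
    up = u - np.mean(u)          # fluctuation u' = u - U
    vp = v - np.mean(v)          # fluctuation v' = v - V
    return np.mean(up * vp) / np.sqrt(np.mean(up**2) * np.mean(vp**2))

# Example with two partially correlated signals (illustrative construction).
rng = np.random.default_rng(0)
u = rng.normal(size=100_000)
v = 0.5 * u + rng.normal(size=100_000)   # correlated with u
print(correlation_coefficient(u, v))     # about 0.45 for this construction
</pre>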
As with the single-variable pdf, there are certain conditions the joint probability density function must satisfy. If <math>B_{uv}\left( c_{1}, c_{2} \right)</math> indicates the jpdf of the random variables <math>u</math> and <math>v</math>, then:
- Property 1: <math>B_{uv}\left( c_{1}, c_{2} \right) > 0</math>, always
- Property 2: <math>Prob \left\{ c_{1} < u < c_{1} + dc_{1} , c_{2} < v < c_{2} + dc_{2} \right\} = B_{uv}\left( c_{1}, c_{2} \right) dc_{1} dc_{2}</math>
- Property 3: <math>\int^{\infty}_{-\infty} \int^{\infty}_{-\infty} B_{uv}\left( c_{1}, c_{2} \right) dc_{1} dc_{2} = 1</math>
- Property 4: <math>\int^{\infty}_{-\infty} B_{uv}\left( c_{1}, c_{2} \right) dc_{2} = B_{u}\left( c_{1} \right)</math>, where <math>B_{u}</math> is a function of <math>c_{1}</math> only
- Property 5: <math>\int^{\infty}_{-\infty} B_{uv}\left( c_{1}, c_{2} \right) dc_{1} = B_{v}\left( c_{2} \right)</math>, where <math>B_{v}</math> is a function of <math>c_{2}</math> only
The functions <math>B_{u}</math> and <math>B_{v}</math> are called the marginal probability density functions and they are simply the single variable pdf's defined earlier. The subscript is used to indicate which variable is left after the others are integrated out. Note that <math>B_{u}\left( c_{1} \right)</math> is not the same as <math>B_{uv}\left( c_{1}, 0 \right)</math>. The latter is only a slice through the <math>c_{2}</math>-axis, while the marginal distribution is weighted by the integral of the distribution of the other variable. Figure 2.6 illustrates these differences.
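To make the connection with the joint histogram concrete, the following sketch (the variable names, sample size and bin count are illustrative assumptions) estimates the jpdf from samples and checks Properties 3 and 4 numerically:

<pre>
import numpy as np

# Build a normalized joint histogram as a discrete estimate of B_uv(c1, c2),
# then recover the marginal B_u(c1) by integrating out c2.
rng = np.random.default_rng(1)
u = rng.normal(size=200_000)
v = 0.7 * u + rng.normal(size=200_000)

H, c1_edges, c2_edges = np.histogram2d(u, v, bins=60, density=True)
dc1 = c1_edges[1] - c1_edges[0]
dc2 = c2_edges[1] - c2_edges[0]

print(H.sum() * dc1 * dc2)        # ~ 1.0  (Property 3)
B_u = H.sum(axis=1) * dc2         # integrate over c2  (Property 4)
print(B_u.sum() * dc1)            # ~ 1.0, i.e. a valid marginal pdf
</pre>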
If the joint probability density function is known, the joint moments of all orders can be determined. Thus the <math>m,n</math>-th joint moment is

<math>
\left\langle \left( u - U \right)^{m} \left( v - V \right)^{n} \right\rangle = \int^{\infty}_{-\infty} \int^{\infty}_{-\infty} \left( c_{1} - U \right)^{m} \left( c_{2} - V \right)^{n} B_{uv}\left( c_{1}, c_{2} \right) dc_{1} dc_{2}
</math>
Figure 2.6 not uploaded yet
In the preceding discussions, only two random variables have been considered. The definitions, however, can easily be generalized to accommodate any number of random variables. In addition, the joint statistics of a single random variable at different times or at different points in space could be considered. This will be done later when stationary and homogeneous random processes are considered.
The bi-variate normal (or Gaussian) distribution
If <math>u</math> and <math>v</math> are normally distributed random variables with standard deviations given by <math>\sigma_{u}</math> and <math>\sigma_{v}</math> respectively, and with correlation coefficient <math>\rho_{uv}</math>, then their joint probability density function is given by

<math>
B_{uvG}\left( c_{1}, c_{2} \right) = \frac{1}{2 \pi \sigma_{u} \sigma_{v} \left( 1 - \rho_{uv}^{2} \right)^{1/2}} \exp \left\{ - \frac{1}{2 \left( 1 - \rho_{uv}^{2} \right)} \left[ \frac{\left( c_{1} - U \right)^{2}}{\sigma^{2}_{u}} - 2 \rho_{uv} \frac{\left( c_{1} - U \right) \left( c_{2} - V \right)}{\sigma_{u} \sigma_{v}} + \frac{\left( c_{2} - V \right)^{2}}{\sigma^{2}_{v}} \right] \right\}
</math>
This distribution is plotted in Figure 2.7 for several values of <math>\rho_{uv}</math> where <math>u</math> and <math>v</math> are assumed to be identically distributed (i.e., <math>\left\langle u^{2} \right\rangle = \left\langle v^{2} \right\rangle</math>).
It is straightforward to show (by completing the square and integrating) that this yields the single variable Gaussian distribution for the marginal distributions. It is also possible to write a multivariate Gaussian probability density function for any number of random variables.
Figure 2.7 not uploaded yet
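A quick numerical check of the marginal property is sketched below (the grid sizes and parameter values are arbitrary choices for illustration): the bivariate Gaussian jpdf is evaluated on a grid and <math>c_{2}</math> is integrated out, recovering the single-variable Gaussian for <math>B_{u}\left( c_{1} \right)</math>.

<pre>
import numpy as np

# Evaluate the bivariate Gaussian jpdf on a grid and integrate out c2.
U, V = 0.0, 0.0
sig_u, sig_v, rho = 1.0, 2.0, 0.6

c1 = np.linspace(-8.0, 8.0, 401)
c2 = np.linspace(-16.0, 16.0, 801)
C1, C2 = np.meshgrid(c1, c2, indexing="ij")

norm = 2.0 * np.pi * sig_u * sig_v * np.sqrt(1.0 - rho**2)
quad = ((C1 - U)**2 / sig_u**2
        - 2.0 * rho * (C1 - U) * (C2 - V) / (sig_u * sig_v)
        + (C2 - V)**2 / sig_v**2)
B_uv = np.exp(-quad / (2.0 * (1.0 - rho**2))) / norm

dc2 = c2[1] - c2[0]
B_u_numerical = B_uv.sum(axis=1) * dc2                     # integrate out c2
B_u_exact = np.exp(-(c1 - U)**2 / (2.0 * sig_u**2)) / (np.sqrt(2.0 * np.pi) * sig_u)
print(np.max(np.abs(B_u_numerical - B_u_exact)))           # small: marginal is the 1-D Gaussian
</pre>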
Statistical independence and lack of correlation
Definition: Statistical Independence Two random variables are said to be statistically independent if their joint probability density is equal to the product of their marginal probability density functions. That is,

<math>
B_{uv}\left( c_{1}, c_{2} \right) = B_{u}\left( c_{1} \right) B_{v}\left( c_{2} \right)
</math>
It is easy to see that statistical independence implies a complete lack of correlation; i.e., <math>\rho_{uv} \equiv 0</math>. From the definition of the cross-correlation,

<math>
\begin{matrix}
\left\langle u'v' \right\rangle & = & \int^{\infty}_{-\infty} \int^{\infty}_{-\infty} \left( c_{1} - U \right) \left( c_{2} - V \right) B_{uv}\left( c_{1}, c_{2} \right) dc_{1} dc_{2} \\
& = & \int^{\infty}_{-\infty} \left( c_{1} - U \right) B_{u}\left( c_{1} \right) dc_{1} \int^{\infty}_{-\infty} \left( c_{2} - V \right) B_{v}\left( c_{2} \right) dc_{2} \\
& = & 0 \\
\end{matrix}
</math>

where we have used the equation for <math>B_{uv}\left( c_{1}, c_{2} \right)</math> above, since the first central moments are zero by definition.
It is important to note that the inverse is not true - lack of correlation does not imply statistical independence! To see this consider two identically distributed random variables, <math>u'</math> and <math>v'</math>, which have zero means and non-zero correlation <math>\left\langle u'v' \right\rangle</math>. From these two correlated random variables two other random variables, <math>x</math> and <math>y</math>, can be formed as

<math>x = u' + v'</math>

<math>y = u' - v'</math>
Clearly <math>x</math> and <math>y</math> are not statistically independent. They are, however, uncorrelated because:

<math>
\begin{matrix}
\left\langle xy \right\rangle & = & \left\langle \left( u' + v' \right) \left( u' - v' \right) \right\rangle \\
& = & \left\langle u'^{2} \right\rangle + \left\langle u'v' \right\rangle - \left\langle u'v' \right\rangle - \left\langle v'^{2} \right\rangle \\
& = & 0 \\
\end{matrix}
</math>
since <math>u'</math> and <math>v'</math> are identically distributed (and as a consequence <math>\left\langle u'^{2} \right\rangle = \left\langle v'^{2} \right\rangle</math>).
Figure 2.8 illustrates the change of variables carried out above. The jpdf resulting from the transformation is symmetric about both axes, thereby eliminating the correlation. The transformation, however, does not ensure that the distribution is separable, i.e., that <math>B_{x,y}\left( a_{1}, a_{2} \right) = B_{x}\left( a_{1} \right) B_{y}\left( a_{2} \right)</math>, as required for statistical independence.
Figure 2.8 not uploaded yet
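A short numerical check of this example is sketched below. The particular construction of <math>u'</math> and <math>v'</math> from three uniform variables is an assumption made only for the sketch; a non-Gaussian choice is deliberate, since jointly Gaussian variables that are uncorrelated would in fact be independent. The product moment <math>\left\langle xy \right\rangle</math> vanishes, while a higher joint moment exposes the remaining dependence:

<pre>
import numpy as np

# Two identically distributed, correlated, zero-mean variables u' and v',
# combined as x = u' + v' and y = u' - v': uncorrelated but not independent.
rng = np.random.default_rng(2)
n = 500_000
w0, w1, w2 = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(3, n))  # non-Gaussian, unit variance

up = w0 + w1          # u' and v' share w0, so <u'v'> = var(w0) > 0
vp = w0 + w2          # identically distributed with u'

x = up + vp
y = up - vp

print(np.mean(x * y))                                        # ~ 0: uncorrelated
# Independence would require <x^2 y^2> = <x^2><y^2>; here they differ:
print(np.mean(x**2 * y**2), np.mean(x**2) * np.mean(y**2))   # ~ 9.6 versus ~ 12
</pre>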
Credits
This text was based on "Lectures in Turbulence for the 21st Century" by Professor William K. George, Professor of Turbulence, Chalmers University of Technology, Gothenburg, Sweden.