# Probability density function


## Revision as of 13:37, 19 October 2005

Stochastic methods use distribution functions to describe the fluctuating scalars in a turbulent field.

The distribution function $F_\phi(\Phi)$ of a scalar $\phi$ is the probability $p$ of finding a value of $\phi < \Phi$

$F_\phi(\Phi) = p(\phi < \Phi)$

The probability of finding $\phi$ in the range $(\Phi_1, \Phi_2)$ is

$p(\Phi_1 <\phi < \Phi_2) = F_\phi(\Phi_2)-F_\phi(\Phi_1)$

The probability density function (PDF) is

$P(\Phi)= \frac{d F_\phi(\Phi)} {d \Phi}$

where $P(\Phi) d\Phi$ is the probability of $\phi$ being in the range $(\Phi,\Phi+d\Phi)$. It follows that

$\int P(\Phi) d \Phi = 1$

where the integral is taken over all possible values of $\phi$; $\Phi$ denotes the sample space of the scalar variable $\phi$. The PDF of any stochastic variable depends a priori on space and time.

$P(\Phi;x,t)$

For clarity of notation, the space and time dependence is dropped: $P(\Phi) \equiv P(\Phi;x,t)$
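The normalization constraint above can be checked numerically with a histogram-based estimate of the PDF. The sketch below is plain Python; the standard-normal sample, the bin count, and the sample-space window are arbitrary assumptions for illustration.

```python
import random

random.seed(0)
# Assumed toy data: 100k samples of a standard normal scalar phi.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Histogram-based estimate of P(Phi) on the sample-space window [-5, 5).
nbins, lo, hi = 50, -5.0, 5.0
dphi = (hi - lo) / nbins
counts = [0] * nbins
for s in samples:
    if lo <= s < hi:
        counts[int((s - lo) / dphi)] += 1
pdf = [c / (len(samples) * dphi) for c in counts]  # density, not raw counts

# The integral of P(Phi) over the sample space should be (approximately) one.
total = sum(p * dphi for p in pdf)
print(total)
```

Because the window covers essentially the whole sample space of this variable, the discrete sum closely approximates the unit integral.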

From the PDF of a variable, one can define its $n$th moment as

$\overline{\phi^n} = \int \phi^n P(\Phi) d \Phi$

The $n = 1$ case is called the "mean"

$\overline{\phi} = \int \phi P(\Phi) d \Phi$

Similarly the mean of a function can be obtained as

$\overline{f} = \int f(\phi) P(\Phi) d \Phi$

The second central moment is called the "variance"

$\overline{\phi'^2} = \int (\phi-\overline{\phi})^2 P(\Phi) d \Phi$
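The mean and variance definitions can be evaluated under the same histogram approximation of the PDF. In this sketch the Gaussian parameters (mean 2.0, standard deviation 0.5), the bin count, and the window are assumptions chosen for illustration.

```python
import random

random.seed(1)
# Assumed toy data: Gaussian scalar with mean 2.0 and std 0.5.
samples = [random.gauss(2.0, 0.5) for _ in range(200_000)]

# Histogram-based PDF estimate on [-1, 5).
nbins, lo, hi = 80, -1.0, 5.0
dphi = (hi - lo) / nbins
counts = [0] * nbins
for s in samples:
    if lo <= s < hi:
        counts[int((s - lo) / dphi)] += 1
pdf = [c / (len(samples) * dphi) for c in counts]
mids = [lo + (i + 0.5) * dphi for i in range(nbins)]  # bin midpoints

# Mean: first moment of the PDF.
mean = sum(m * p * dphi for m, p in zip(mids, pdf))
# Variance: second central moment.
var = sum((m - mean) ** 2 * p * dphi for m, p in zip(mids, pdf))
print(mean, var)
```

With enough samples the histogram moments converge to the prescribed mean (2.0) and variance (0.25), up to sampling and binning error.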

For two (or more) variables, a joint PDF of $\phi$ and $\psi$ is defined

$P(\Phi,\Psi;x,t) \equiv P (\Phi,\Psi)$

where $\Phi \mbox{ and } \Psi$ form the phase space for $\phi \mbox{ and } \psi$. The marginal PDFs are obtained by integration over the sample space of one variable.

$P(\Phi) = \int P(\Phi,\Psi) d\Psi$

For two variables the correlation is given by

$\overline{\phi' \psi'} = \int (\phi-\overline{\phi}) (\psi-\overline{\psi}) P(\Phi,\Psi) d \Phi d\Psi$

This term often appears in the averaged Navier-Stokes equations of turbulent flows (with the velocity components $u$ and $v$) and is unclosed.
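Marginalization and the correlation integral can both be illustrated on a discretized joint PDF. The sketch below is a plain-Python assumption-laden toy: $\psi$ is constructed as $\phi$ plus independent noise, so the true correlation $\overline{\phi'\psi'}$ equals the variance of $\phi$, which is 1.

```python
import random

random.seed(2)
n = 100_000
# Assumed toy pair: psi = phi + independent unit-variance noise.
phi = [random.gauss(0.0, 1.0) for _ in range(n)]
psi = [p + random.gauss(0.0, 1.0) for p in phi]

# Joint histogram estimate of P(Phi, Psi) on [-5, 5)^2.
nb, lo, hi = 40, -5.0, 5.0
d = (hi - lo) / nb
joint = [[0.0] * nb for _ in range(nb)]
for x, y in zip(phi, psi):
    if lo <= x < hi and lo <= y < hi:
        joint[int((x - lo) / d)][int((y - lo) / d)] += 1.0 / (n * d * d)

# Marginal PDF of phi: integrate the joint PDF over Psi (and vice versa).
marg_phi = [sum(row) * d for row in joint]
marg_psi = [sum(joint[i][j] for i in range(nb)) * d for j in range(nb)]
print(sum(p * d for p in marg_phi))  # normalization check, ~1

# Correlation: integral of the centered product against the joint PDF.
mids = [lo + (i + 0.5) * d for i in range(nb)]
mphi = sum(m * p * d for m, p in zip(mids, marg_phi))
mpsi = sum(m * p * d for m, p in zip(mids, marg_psi))
corr = sum((mids[i] - mphi) * (mids[j] - mpsi) * joint[i][j] * d * d
           for i in range(nb) for j in range(nb))
print(corr)
```

For this construction the covariance of $\phi$ and $\psi$ is the variance of $\phi$, so the discrete correlation integral should come out near 1.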

Using Bayes' theorem, a joint PDF can be expressed as

$P(\Phi,\Psi) = P(\Phi|\Psi) P(\Psi)$

where $P(\Phi|\Psi)$ is the conditional PDF.

The conditional average of a scalar can be expressed as a function of the conditional PDF

$\langle \phi | \Psi \rangle = \int \phi P(\Phi|\Psi) d \Phi$

and the mean value of a scalar can be expressed

$\overline{\phi} = \int \langle \phi | \Psi \rangle P(\Psi) d \Psi$

This identity (the law of total expectation) holds whether or not $\phi$ and $\psi$ are correlated.
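The recovery of the unconditional mean from conditional averages can be verified numerically. The sketch bins samples on the sample space of $\psi$, computes the conditional average of $\phi$ in each bin, and averages those against the discrete weight of each bin; the linear dependence of $\phi$ on $\psi$ is an assumed toy model.

```python
import random

random.seed(3)
n = 200_000
# Assumed toy model: phi depends linearly on psi plus independent noise.
psi = [random.gauss(0.0, 1.0) for _ in range(n)]
phi = [2.0 * y + random.gauss(0.0, 0.3) for y in psi]

# Bin on the sample space of psi.
nb, lo, hi = 40, -4.0, 4.0
d = (hi - lo) / nb
sums = [0.0] * nb
counts = [0] * nb
for x, y in zip(phi, psi):
    if lo <= y < hi:
        b = int((y - lo) / d)
        sums[b] += x
        counts[b] += 1
inside = sum(counts)

# Conditional average of phi in each psi-bin ...
cond_mean = [sums[b] / counts[b] if counts[b] else 0.0 for b in range(nb)]
# ... averaged against the discrete P(Psi) dPsi recovers the mean of phi.
weights = [counts[b] / inside for b in range(nb)]
mean_total = sum(cm * w for cm, w in zip(cond_mean, weights))
direct_mean = sum(phi) / n
print(mean_total, direct_mean)
```

The two estimates agree closely, which is the discrete analogue of integrating the conditional average against $P(\Psi)$.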

If two variables are statistically independent, then they are uncorrelated and their joint PDF can be expressed as the product of their marginal PDFs.

$P(\Phi,\Psi)= P(\Phi) P(\Psi)$
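The factorization of the joint PDF can be checked cell by cell on a coarse joint histogram. In the sketch, the two independent standard normals and the 8-by-8 grid are assumptions; the grid is kept coarse so every cell is well populated.

```python
import random

random.seed(4)
n = 200_000
# Two independent standard normal scalars.
phi = [random.gauss(0.0, 1.0) for _ in range(n)]
psi = [random.gauss(0.0, 1.0) for _ in range(n)]

# Coarse joint histogram on the window [-2, 2)^2.
nb, lo, hi = 8, -2.0, 2.0
d = (hi - lo) / nb
joint = [[0] * nb for _ in range(nb)]
inside = 0
for x, y in zip(phi, psi):
    if lo <= x < hi and lo <= y < hi:
        joint[int((x - lo) / d)][int((y - lo) / d)] += 1
        inside += 1

# Joint and marginal densities, normalized over the window.
pj = [[joint[i][j] / (inside * d * d) for j in range(nb)] for i in range(nb)]
pphi = [sum(pj[i][j] for j in range(nb)) * d for i in range(nb)]
ppsi = [sum(pj[i][j] for i in range(nb)) * d for j in range(nb)]

# For independent variables, P(Phi, Psi) ~ P(Phi) P(Psi) in every cell.
max_err = max(abs(pj[i][j] - pphi[i] * ppsi[j])
              for i in range(nb) for j in range(nb))
print(max_err)
```

The largest cell-wise deviation between the joint density and the product of marginals stays at the level of sampling noise, consistent with the factorization.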