CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Help to make LES simulation faster (https://www.cfd-online.com/Forums/openfoam-solving/127033-help-make-les-simulation-faster.html)

Pj. December 2, 2013 04:54

Help to make LES simulation faster
 
Hi everybody,

I'm trying to run a wind tunnel simulation with an LES turbulence model in pisoFoam. The domain is made of 17M cells: a channel with some roughness blocks in the first 3/4 of its length and a building model in the last quarter.

Currently I'm solving with a time step of 0.0001 s (10,000 Hz). This is done to keep maxCo below one; right now my maxCo is ~0.88.
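
(For reference: the cell Courant number is Co = |U|*deltaT/deltaX, so maxCo is driven by the smallest and fastest cells, while meanCo reflects the bulk of the mesh.)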

The problem is that the meanCo is something like 0.007 and the simulation is running very slowly. With 252 CPUs it's taking more or less 2.9 s per iteration, which means it takes 1 day to solve 1.25 seconds of physical time.

Since I need to simulate at least 2-3 minutes and I can't use more CPUs, what can I do?
I was thinking of increasing the time step, since meanCo is far smaller than 1, but this would push maxCo above 1 in some places. If those places are in the roughness region (where I don't really care if the solution is wrong in some small portions), do you think the solution will still converge?

How can I find out "where" the Courant number is bigger than 1?

Do you have ideas other than increasing the time step? Maybe using pimpleFoam?

Thanks a lot. Regards,
Luca

Bernhard December 2, 2013 05:34

It is a bit difficult to give any advice. Could you maybe share your fvSchemes and a snippet of the log file?

Quote:

How can i know "where" the Courant is bigger than 1?
Use the Co utility.
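
For example (assuming your OpenFOAM version ships the Co post-processing utility; option names may differ between versions):

Code:

# run from the case directory; writes a volScalarField "Co" for each
# selected saved time, which you can then threshold (Co > 1) in paraView
Co -latestTime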

haakon December 2, 2013 07:22

A small time step and high resolution are part of the nature of LES; they're unavoidable. You can of course adjust the mesh in problematic areas and perhaps end up with a slightly larger time step, but in the end you cannot overcome the fact that LES is computationally intensive by nature. If you can only afford a RANS simulation, stick to that. Perhaps you can use a RANS model to generate a physically sane initial condition and save some time?
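
For instance, one possible workflow (the case name "ransCase" is just a placeholder; with identical meshes and boundary conditions you can add -consistent):

Code:

# 1. run a steady RANS (e.g. simpleFoam) to convergence in ../ransCase
# 2. from the LES case directory, map its fields as the initial condition
mapFields ../ransCase -sourceTime latestTime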

Lieven December 2, 2013 07:55

I'm with haakon on this one. LES is by definition expensive, so don't expect to be able to do it cheaply. If you find a way to do this, it will probably make you a very rich man ;-). I would even recommend further reducing your time step so that maxCo is also significantly smaller than 1.0 (for time-integration accuracy). I would not switch to pimpleFoam, since under-relaxation is quite unphysical in an LES context.

As I see it, there are two things you can do:
1. Adjust your mesh. If you can't afford a 17M-cell simulation, just don't run one. A converged solution on a 4M-cell mesh is probably better than a halfway-converged solution on 17M cells.
2. Use a cheaper LES turbulence model if you can; e.g. dynamic models are more expensive to compute than the classical Smagorinsky model (see the LESProperties sketch below).

Using a RANS solution to compute an initial field could help speed up convergence, but this is certainly not guaranteed.
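
For point 2, a minimal sketch of what selecting the classical model looks like in constant/LESProperties (OpenFOAM 2.x-style keywords; check your version):

Code:

LESModel        Smagorinsky;    // classical model, cheaper than dynamic variants
delta           cubeRootVol;    // filter width from the local cell volume
printCoeffs     on;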

Cheers,

L

Pj. December 2, 2013 08:34

Hi Bernhard. Thank you for reading.

Here is my fvSchemes:
Code:

ddtSchemes
{
    default        backward;
}

gradSchemes
{
    default        Gauss linear;
    grad(p)        Gauss linear;
    grad(U)        Gauss linear;
}

divSchemes
{
    default        none;
    div(phi,U)      Gauss linear;
    div(phi,k)      Gauss limitedLinear 1;
    div(phi,B)      Gauss limitedLinear 1;
    div(phi,nuTilda) Gauss limitedLinear 1;
    div(B)          Gauss linear;
    div((nuEff*dev(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default        none;
    laplacian(nuEff,U) Gauss linear corrected;
    laplacian((1|A(U)),p) Gauss linear corrected;
    laplacian(DkEff,k) Gauss linear corrected;
    laplacian(DBEff,B) Gauss linear corrected;
    laplacian(DnuTildaEff,nuTilda) Gauss linear corrected;
}

interpolationSchemes
{
    default        linear;
    interpolate(U)  linear;
}

snGradSchemes
{
    default        corrected;
}

fluxRequired
{
    default        no;
    p              ;
}

and a piece of the log

Code:

Time = 1.2372

Courant Number mean: 0.00771093 max: 0.898045
DILUPBiCG:  Solving for Ux, Initial residual = 4.38996e-05, Final residual = 8.81599e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.0012287, Final residual = 2.04545e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.000869898, Final residual = 1.5147e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.00549978, Final residual = 0.000262121, No Iterations 3
time step continuity errors : sum local = 2.52331e-10, global = -5.10701e-14, cumulative = 2.4203e-09
DICPCG:  Solving for p, Initial residual = 0.000364781, Final residual = 9.84024e-07, No Iterations 97
time step continuity errors : sum local = 9.47303e-13, global = -4.74805e-14, cumulative = 2.42025e-09
ExecutionTime = 84873 s  ClockTime = 85168 s

forceCoeffs output:
    Cm    = 2.37554
    Cd    = 39.8978
    Cl    = 64.6208
    Cl(f) = 34.6859
    Cl(r) = 29.9349

Time = 1.23725

Courant Number mean: 0.00771094 max: 0.89373
DILUPBiCG:  Solving for Ux, Initial residual = 4.39007e-05, Final residual = 8.81516e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.00122869, Final residual = 2.04562e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.0008699, Final residual = 1.51441e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.0055296, Final residual = 0.000264223, No Iterations 3
time step continuity errors : sum local = 2.54366e-10, global = -5.4272e-14, cumulative = 2.4202e-09
DICPCG:  Solving for p, Initial residual = 0.000368202, Final residual = 9.85297e-07, No Iterations 115
time step continuity errors : sum local = 9.48546e-13, global = -5.09298e-14, cumulative = 2.42015e-09
ExecutionTime = 84875.5 s  ClockTime = 85170 s

forceCoeffs output:
    Cm    = 2.38226
    Cd    = 39.8857
    Cl    = 64.572
    Cl(f) = 34.6683
    Cl(r) = 29.9038

Time = 1.2373

Courant Number mean: 0.00771095 max: 0.889406
DILUPBiCG:  Solving for Ux, Initial residual = 4.39019e-05, Final residual = 8.81608e-09, No Iterations 1
DILUPBiCG:  Solving for Uy, Initial residual = 0.00122868, Final residual = 2.04589e-07, No Iterations 1
DILUPBiCG:  Solving for Uz, Initial residual = 0.000869913, Final residual = 1.51427e-07, No Iterations 1
DICPCG:  Solving for p, Initial residual = 0.00552333, Final residual = 0.000261467, No Iterations 3
time step continuity errors : sum local = 2.51721e-10, global = -5.30145e-14, cumulative = 2.4201e-09
DICPCG:  Solving for p, Initial residual = 0.00036565, Final residual = 9.99106e-07, No Iterations 94
time step continuity errors : sum local = 9.61889e-13, global = -5.46456e-14, cumulative = 2.42004e-09
ExecutionTime = 84877.9 s  ClockTime = 85172 s

forceCoeffs output:
    Cm    = 2.36583
    Cd    = 39.8604
    Cl    = 64.5751
    Cl(f) = 34.6534
    Cl(r) = 29.9217

At the moment I'm looking into pimpleFoam to get a more robust solution with bigger time steps, but I can't find a proper description of how to set it up, so I'm more or less trying without a clue.

Thank you very much

Bernhard December 2, 2013 09:01

Where I said fvSchemes I meant fvSolution, excuse me. Did you ever try solving the pressure equation using the GAMG method? On 100-200 CPUs it is generally said to be more efficient at solving the pressure equation than PCG. You are using more processors, but it might be worth experimenting a bit with these settings.
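
A minimal sketch of what that could look like for p in fvSolution (the tuning values are just common starting points, not tested for your case; pFinal would get the same settings with relTol 0):

Code:

p
{
    solver          GAMG;
    smoother        GaussSeidel;
    tolerance       1e-06;
    relTol          0.05;
    cacheAgglomeration true;
    agglomerator    faceAreaPair;   // pairwise agglomeration based on face areas
    nCellsInCoarsestLevel 100;      // stop coarsening around this many cells
    mergeLevels     1;
}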

Pj. December 2, 2013 09:15

I know LES is computationally expensive; I don't expect to make it quick and easy. I was just asking if there is a way to make it a little faster. Maybe with some improvements I could manage to solve the case two or four times faster. That would mean a lot of gain, even if the computation stays very expensive.

I will try to initialise the case with a RANS solution or with a coarser mesh.
I will also try switching to GAMG and see if I find any improvement.

As for the LES model, I already use Smagorinsky, so I can't switch to a cheaper one.

Lastly, there are many papers suggesting that RANS is not so good in my field of study (wind flow around a low-rise building), so I can't switch to that.

PS: this is my fvSolution:
Code:

solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance      1e-06;
        relTol          0.05;
    }

    pFinal
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance      1e-06;
        relTol          0;
    }

    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance      1e-05;
        relTol          0;
    }

    k
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance      1e-05;
        relTol          0;
    }

    nuTilda
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance      1e-05;
        relTol          0;
    }
}

PISO
{
    nCorrectors    2;
    nNonOrthogonalCorrectors 0;
    pRefCell        0;
    pRefValue      0;
}


Lieven December 2, 2013 10:40

Hi PJ,

Two remarks about the fvSolution:
1. You should set the relTol of p to 0 as well.
2. Are you sure nNonOrthogonalCorrectors = 0 is enough for your mesh? If you have a fully orthogonal mesh you can leave it at 0; if not, 0 is most likely too little...

Cheers,

L

Bernhard December 3, 2013 02:07

Quote:

Originally Posted by Lieven (Post 464419)
1. You should set the relTol of p to 0 as well.

This does not make sense to me. Why would you want to solve an intermediate pressure to full convergence?

Lieven December 3, 2013 03:41

Oh sorry, you're fully correct, Bernhard!
I wasn't paying attention to the pFinal entry :-D (the solver I'm using doesn't have it).

Cheers,

Lieven

Pj. December 3, 2013 05:03

Thanks everybody for your kind help.
Now I'm running some tests on a smaller 2M-cell case to benchmark the solutions you proposed.

I know that LES is expensive; I'm not asking to run it on my laptop in a few hours. But in my lab we have already run 15-20M-cell cases that solved about 10-15 seconds of physical time per day, so in a week we had our 1-2 minute simulation. I could accept running this simulation in 2 weeks, but not in 8.

The problem with this case is that I have some very small cells near the model, and there the Co number is 100 times bigger than everywhere else. To keep Co < 1 there, I have to solve with a time step 100 times smaller, and this slows down the simulation a lot.

I was looking for the best way to solve this. Making a coarser mesh is of course an option, but it's not good for our purpose, so I was wondering whether a better solution exists.
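
One thing I found is that pimpleFoam supports an adjustable time step driven by the Courant number via controlDict, so the solver would only take the tiny steps when it actually needs them. Something like this (the values are placeholders I still need to test):

Code:

adjustTimeStep  yes;      // let the solver adapt deltaT every step
maxCo           0.9;      // target maximum Courant number
maxDeltaT       0.001;    // upper bound on the time step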

Thank you very much. I will post the results of the benchmark as soon as I have them.

RodriguezFatz December 4, 2013 04:27

Hi, you mentioned that you use LES because RANS doesn't work. Did you try the SAS model?

lonewanderer December 9, 2019 06:33

How can we convert an LES simulation to a RANS simulation?

cryabroad December 10, 2019 02:38

If other people have used the same number of cells as you and their cases run way faster than yours, I don't think that means much; it really depends on the details of the mesh. You mentioned something about very small cells near the model (I suppose you mean the walls of a building or something similar): do you really need that refined a mesh there? What is the yPlus value? Obviously in LES you want a very refined mesh, especially near the wall, but if the wall regions are not that important, maybe a wall model is enough. Typically, walls are crucial for internal flows, but may not be that important for external flows. My experience is that the Spalding wall model (nutUSpaldingWallFunction) works really well.
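
As an illustration, the boundary entry in 0/nut would look something like this (the patch name "buildingWalls" is just a placeholder for your wall patches):

Code:

buildingWalls
{
    type            nutUSpaldingWallFunction;   // Spalding's law-of-the-wall for nut
    value           uniform 0;
}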

I always initiate my LES from an unsteady RANS simulation. For the unsteady RANS I use very large time steps, and because of this I sometimes have to switch my time scheme to first order (if not, it easily diverges). Not sure if that's the correct way of doing things, though. Note that sometimes this makes your case run faster and sometimes it does not; again, it depends on the actual problem.
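
Concretely, for the initializing unsteady RANS run I just change ddtSchemes in fvSchemes to something like:

Code:

ddtSchemes
{
    default         Euler;   // first order: more dissipative, but robust with large time steps
}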

Santiago December 10, 2019 04:13

Quote:

Originally Posted by Pj. (Post 464401)
I know LES is computationally expensive; I don't expect to make it quick and easy. I was just asking if there is a way to make it a little faster. Maybe with some improvements I could manage to solve the case two or four times faster. That would mean a lot of gain, even if the computation stays very expensive.

For the inversion of symmetric matrices (e.g. the pressure Poisson equation) you can gain a lot by using multigrid (GAMG) instead of PCG. It might be less robust, but since you are doing Smagorinsky LES (correctly, I assume) that will not be a problem, because you have a structured grid anyway.

