Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Lag while running DES with pimpleFoam


Old   January 29, 2013, 10:47
Default Lag while running DES with pimpleFoam
  #1
New Member
 
David Bergman
Join Date: Jun 2012
Posts: 4
Rep Power: 13
dabe is on a distinguished road
Hello fellow Foamers!

I was wondering if anyone has experienced a sort of "lag" upon reaching the nuTilda calculation step while running a DDES simulation with pimpleFoam?

I'm running my case in parallel on 128 CPUs with approx. 50k cells on each. All other variables seem to be calculated almost instantly, but when it reaches nuTilda it seems to freeze or lag for a second or two and then continue. And it only does one iteration on nuTilda, so it shouldn't take that long, right? This lag pretty much doubles my simulation time, which is really not good since the run is already quite long.

Anyone experienced this or something similar? Any known solutions on how to prevent this?

I suspect there could be something wrong with my fvSolution or fvSchemes setup so I'm posting these below.

Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.0.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSchemes;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

ddtSchemes
{
    default         CrankNicholson 0.5;
}

gradSchemes
{
    default         Gauss linear;
    grad(p)         Gauss linear;
    grad(U)         Gauss linear;
}

divSchemes   // filteredLinear(2) is best according to Eugene de Villiers
{
    default         none;
    div(phi,U)      Gauss limitedLinearV 1;
    div(phi,nuTilda) Gauss upwind;
    div((nuEff*dev(T(grad(U))))) Gauss linear;
}

laplacianSchemes
{
    default         none;
    laplacian(nuEff,U) Gauss linear corrected; //limited 0.5;
    laplacian((1|A(U)),p) Gauss linear corrected; // limited 0.5;
    laplacian(DnuTildaEff,nuTilda) Gauss linear corrected; // limited 0.5;
    laplacian(1,p)  Gauss linear corrected; // limited 0.5;
}

interpolationSchemes
{
    default         linear;
    interpolate(U)  linear;
}

snGradSchemes
{
    default         corrected;
}

fluxRequired
{
    default         no;
    p               ;
}


// ************************************************************************* //
Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.0.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSolution;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

solvers
{
    p
    {
        solver          GAMG;
        tolerance       1e-04;
        //relTol          0.01;
        smoother        GaussSeidel;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;

//         solver          PCG;
//         preconditioner  FDIC;
//         tolerance       1e-04;
//         //relTol          0.1;
// 	maxIter		500;
    }

    pFinal
    {
        solver          GAMG;
        tolerance       1e-05;
        //relTol          0.01;
        smoother        GaussSeidel;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;

//         solver          PCG;
//         preconditioner  FDIC;
//         tolerance       1e-05;
//         //relTol          0.1;
// 	maxIter		500;
    }

    "(U|nuTilda)"
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-06;
        //relTol          0.1;
    }

    "(U|nuTilda)Final"
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-06;
        //relTol          0.1;
    }
}

// PISO
// {
//     nCorrectors     2;
//     nNonOrthogonalCorrectors 0;
//     pRefCell        0;
//     pRefValue       0;
// }

PIMPLE
{
  nOuterCorrectors 2;
    nCorrectors     2;
    nNonOrthogonalCorrectors 0;
    pRefCell        0;
    pRefValue       0;
}

relaxationFactors
{
    "p.*"               0.3;
    "U.*"               0.5;
    "nuTilda.*"         0.5;
}


// ************************************************************************* //
Thanks!

Old   January 30, 2013, 05:06
Default
  #2
David Bergman
Edit: I just tried running the case with the turbulence model turned off. The problem was still present and seems to occur between the timesteps.

Old   January 30, 2013, 14:34
Default
  #3
Senior Member
 
Vieri Abolaffio
Join Date: Jul 2010
Location: Always on the move.
Posts: 308
Rep Power: 16
sail is on a distinguished road
Maybe you set the controlDict to save every iteration, and the lag you are seeing is the time taken to write the files to disk (which is slow compared to RAM)? Or it could be an issue due to latencies in your network fabric. Can you try running the case on 32 or 64 cores and see if the speedup is actually good? In my experience, 50k cells per core is starting to be on the low side.
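For reference, the write settings to check live in system/controlDict. A sketch of the relevant entries (the interval value here is just a placeholder, not taken from the case above):

Code:
// system/controlDict (excerpt)
writeControl    timeStep;   // write based on a fixed number of timesteps
writeInterval   100;        // every 100 steps; a writeInterval of 1 would write
                            // every single step and could cause a per-step
                            // I/O stall when running on many cores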
__________________
http://www.leadingedge.it/
Naval architecture and CFD consultancy

Old   January 31, 2013, 03:09
Default
  #4
David Bergman
Hello Vieri and thanks for your reply,

The controlDict is unfortunately not set to save every iteration, but at a fairly large interval. The cluster I'm running on is equipped with InfiniBand, so there shouldn't be any network latencies, I think.

I've tried running the case decomposed onto 64, 72, 100 and also 144 cores. They all behave pretty much like the 128-core case, except that the 64- and 72-core configurations improve the time per timestep by approx. 1 s, down to ~3 seconds per timestep compared to ~4 seconds with the other configurations. The problem is that, out of those 4 seconds, approximately 3 seconds are due to the lag, while the U and p calculations are carried out really smoothly. For the 64- and 72-core cases the lag seems to be smaller, but it still constitutes the major part of the time taken for one timestep.

I thought this could be coupled either to the SA turbulence model or to the DDES model, so I tried running the case in URANS with both kOmegaSST and SA as turbulence models. Both of them run really smoothly and scale very well, which means there is no problem with the SA model itself. This brings me to the thought: could there be a scaling problem with DDES or LES in OpenFOAM?

Any thoughts?

Old   January 31, 2013, 10:22
Default
  #5
David Bergman
Problem solved!

The lag was caused by using vanDriest as the delta. After switching to cubeRootVol, the lag completely disappeared.

However, should using vanDriest really be so computationally expensive?
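For anyone who finds this thread later: in 2.0.x the delta selection sits in constant/LESProperties. A sketch of the change described above (entry names follow the standard 2.0.x LES dictionaries; your coefficients may differ):

Code:
// constant/LESProperties (excerpt)
delta           cubeRootVol;    // purely local: delta = deltaCoeff * V^(1/3)

cubeRootVolCoeffs
{
    deltaCoeff      1;
}

// previously:
// delta           vanDriest;   // needs wall-distance / y+ information,
//                              // which is much more expensive to evaluate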
