CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Divergence in parallel running with large number of processors (https://www.cfd-online.com/Forums/openfoam-solving/248820-divergence-parallel-running-large-number-processors.html)

nguyenhung97 April 4, 2023 02:06

Divergence in parallel running with large number of processors
 
2 Attachment(s)
Hi guys,

I am having trouble running my case with a larger number of processors. The case first ran successfully with 36 processors, but when I increased the number of processors to 72 it diverged, even though the time step, fvSchemes, fvSolution, etc. were unchanged.

As you can see in the attachments, at the same simulation time of 0.45316 s the results are completely different. Does anyone know why this happens? I used the scotch decomposition method. Thank you very much.
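
The decomposition setup described above would correspond to a decomposeParDict along these lines; the actual file is not shown in the thread, so the entries below are only a sketch of the scotch setup mentioned:
Code:

// Sketch of a decomposeParDict for the scotch decomposition mentioned above
// (not taken from the original case; values are illustrative).
numberOfSubdomains  72;      // 36 for the run that converged
method              scotch;  // graph-based decomposition, no manual weights needed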

The fvSchemes and fvSolution are set up as follows:
fvSchemes
Code:

ddtSchemes
{
    default        backward; //Euler;
}

gradSchemes
{
    default        Gauss linear;//Gauss linear;
    grad(U)        Gauss linear;
}

divSchemes
{
    default            none;

    div(phi,U)          Gauss LUST grad(U); //linear;
    div((nuEff*dev2(T(grad(U)))))  Gauss linear; //linear;

}

laplacianSchemes
{
    default        Gauss linear limited corrected 0.333;//Gauss linear corrected;
}

interpolationSchemes
{
    default        linear;
}

snGradSchemes
{
    default        limited corrected 0.333;//corrected;
}

wallDist
{
    method meshWave;
}

fvSolution
Code:

solvers
{
    p
    {
        solver          GAMG;
        tolerance      1e-5;
        relTol          0.01;
        smoother        GaussSeidel;
    }

    pFinal
    {
        $p;
        smoother        DICGaussSeidel;
        tolerance      1e-05;
        relTol          0;
    }

    "(U|k|epsilon|omega|R|nut|nuTilda)"
    {
        solver          smoothSolver;
        smoother        symGaussSeidel;
        tolerance      1e-05;
        relTol          0;
    }

    "(U|k|omega|nut|nuTilda)Final"
    {
        $U;
        tolerance      1e-05;
        relTol          0;
    }
}

PISO
{
    nCorrectors    5; //1; number of pressure correctors
    nNonOrthogonalCorrectors 2; //2;
    nOuterCorrectors 100; // outer (momentum-pressure) correctors

    innerCorrectorResidualControl
    {
        p
        {
            relTol      0;

            // If this initial tolerance is reached, leave
            tolerance  1e-5;
        }

        U
        {
            relTol      0;

            // If this initial tolerance is reached, leave
            tolerance  1e-5;
        }
    }

    residualControl
    {
        p      1e-5;
        U      1e-5;
        "(nut|k|epsilon|omega|f|v2)" 1e-5;
    }
}

relaxationFactors
{
    p              0.5;
    U              0.5;
}

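In newer OpenFOAM versions the relaxation factors are usually grouped under fields and equations sub-dictionaries; an equivalent form of the entry above, keeping the same values, would be (sketch only):
Code:

relaxationFactors
{
    fields
    {
        p               0.5;   // under-relaxation of the pressure field
    }
    equations
    {
        U               0.5;   // under-relaxation of the momentum equation
    }
}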

dlahaye April 4, 2023 02:26

Possibly an issue with the accuracy of the linear system solves.

Try tolerance 1e-12; and relTol 1e-7; for all fields.
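
Applied to the pressure entry of the fvSolution above, this suggestion would read roughly as follows; the same tolerance/relTol pair would also go into the other solver entries:
Code:

p
{
    solver          GAMG;
    smoother        GaussSeidel;
    tolerance       1e-12;   // tighter absolute tolerance
    relTol          1e-07;   // tighter relative tolerance
}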

nguyenhung97 April 10, 2023 01:48

1 Attachment(s)
Quote:

Originally Posted by dlahaye (Post 847541)
Possibly an issue with the accuracy of the linear system solves.

Try tolerance 1e-12; and relTol 1e-7; for all fields.

Thank you for your suggestion, and sorry for the late reply; the simulation is quite large and took some time to run.

Unfortunately, it still diverged after following your suggestion; the Courant number is still very high (see the attached file).

Do you have any other ideas? Thank you.

quarkz August 28, 2023 10:27

Hi, I have a similar problem: the run diverged around the starting point. What I do is run with a small number of processors for a short interval, e.g. 2e-5 s, and then restart from there with a larger number of processors. FYI, I am running an incompressible case with zero pressure gradient at all boundaries, so I have to specify a reference location where p = 0.

Similarly, I have also experienced the opposite case:
https://www.cfd-online.com/Forums/op...cs-number.html
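
The pressure reference quarkz describes is normally set in fvSolution; a minimal sketch, assuming a pimpleFoam/pisoFoam-style incompressible case (the cell index is only illustrative):
Code:

PIMPLE  // or PISO, depending on the solver
{
    // ... corrector settings as above ...
    pRefCell        0;   // index of the cell used as the pressure reference
    pRefValue       0;   // reference pressure value at that cell
}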

Anton21 September 8, 2023 14:02

Maybe using a smaller time step would help.
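
In controlDict this amounts to lowering deltaT or, for a pimpleFoam-style transient run, letting the time step adjust against a Courant limit; a sketch with illustrative values only:
Code:

deltaT          2.5e-06;   // smaller fixed time step (illustrative value)

adjustTimeStep  yes;       // alternatively, let the solver adjust deltaT
maxCo           0.5;       // Courant number limit
maxDeltaT       1e-04;     // upper bound on the adjusted time step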

