Divergence in parallel running with large number of processors

April 4, 2023, 02:06
Divergence in parallel running with large number of processors
  #1
New Member
 
Hung
Join Date: Dec 2021
Posts: 3
Hi guys,

I have trouble running my case with a larger number of processors. My case first succeeded with 36 processors. However, when I increased the number of processors to 72, it diverged with the same time step, fvSchemes, fvSolution, etc.

As you can see in the attachments, at the same simulation time of 0.45316 s the results are totally different. Does anyone know why this is the case? I used the scotch decomposition method. Thank you very much.
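
For reference, the decomposition setup was essentially just the following (a minimal sketch; only the processor count and the scotch method are stated in the post, the rest is standard decomposeParDict boilerplate):
Code:
// system/decomposeParDict (sketch)
numberOfSubdomains 72;      // 36 for the run that converged
method             scotch;  // scotch needs no per-direction input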

The fvSchemes and fvSolution are set up as follows:
fvSchemes
Code:
ddtSchemes
{
    default         backward; //Euler;
}

gradSchemes
{
    default         Gauss linear;//Gauss linear;
    grad(U)         Gauss linear;
}

divSchemes
{
    default             none;

    div(phi,U)          Gauss LUST grad(U); //linear;
    div((nuEff*dev2(T(grad(U)))))  Gauss linear; //linear;

}

laplacianSchemes
{
    default         Gauss linear limited corrected 0.333;//Gauss linear corrected;
}

interpolationSchemes
{
    default         linear;
}

snGradSchemes
{
    default         limited corrected 0.333;//corrected;
}

wallDist
{
    method meshWave;
}
fvSolution
Code:
solvers
{
    p
    {
        solver          GAMG;
        tolerance       1e-5;
        relTol          0.01;
        smoother        GaussSeidel;
    }

    pFinal
    {
        $p;
        smoother        DICGaussSeidel;
        tolerance       1e-05;
        relTol          0;
    }

    "(U|k|epsilon|omega|R|nut|nuTilda)"
    {
        solver          smoothSolver;
        smoother        symGaussSeidel;
        tolerance       1e-05;
        relTol          0;
    }

    "(U|k|omega|nut|nuTilda)Final"
    {
        $U;
        tolerance       1e-05;
        relTol          0;
    }
}

PISO
{
    nCorrectors     5;              // pressure correctors (was 1)
    nNonOrthogonalCorrectors 2;
    nOuterCorrectors 100;           // outer (momentum) correctors

    innerCorrectorResidualControl
    {
        p
        {
            relTol      0;

            // If this initial tolerance is reached, leave
            tolerance   1e-5;
        }

        U
        {
            relTol      0;

            // If this initial tolerance is reached, leave
            tolerance   1e-5;
        }
    }

    residualControl
    {
        p       1e-5;
        U       1e-5;
        "(nut|k|epsilon|omega|f|v2)" 1e-5;
    }
}

relaxationFactors
{
    p               0.5;
    U               0.5;
}
Attached Images: Capture.PNG, capture1.PNG

April 4, 2023, 02:26
  #2
Senior Member
 
Domenico Lahaye
Join Date: Dec 2013
Posts: 736
Blog Entries: 1
Possibly an issue with the linear system solve accuracy.

Try tolerance 1e-12; and relTol 1e-7; for all fields.
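
Applied to the pressure entry from the fvSolution above, that suggestion would look roughly like this (a sketch; only tolerance and relTol change, the rest is as posted):
Code:
p
{
    solver          GAMG;
    smoother        GaussSeidel;
    tolerance       1e-12;   // absolute residual target
    relTol          1e-07;   // relative reduction per solve
}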
dlahaye is offline   Reply With Quote

April 10, 2023, 01:48
  #3
New Member
 
Hung
Join Date: Dec 2021
Posts: 3
Quote:
Originally Posted by dlahaye View Post
Possibly an issue with the linear system solve accuracy.

Try tolerance 1e-12; and relTol 1e-7; for all fields.
Thank you for your suggestions, and sorry for the late reply; the simulation is quite large and took some time to run.

Unfortunately, it still didn't work after following your suggestions. The Courant number is still very high (see the attached file).

Do you have any other ideas? Thank you.
Attached Images: Capture.PNG

August 28, 2023, 10:27
  #4
Senior Member
 
TWB
Join Date: Mar 2009
Posts: 402
Hi, I have a similar problem - the run diverged near the starting point. What I do is run with a small number of processors for a short interval, e.g. up to 2e-5 s, and then restart from there with a larger number of processors. FYI, I am running an incompressible case with the pressure gradient at all boundaries equal to zero, so I have to specify a reference location with p = 0.
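
A command-line sketch of that restart workflow (the solver name, processor counts and times are placeholders, not taken from the posts):
Code:
# run a short interval on a small decomposition
decomposePar                        # numberOfSubdomains set to e.g. 8
mpirun -np 8 pimpleFoam -parallel   # stop after e.g. 2e-5 s

# reconstruct the latest fields onto a single mesh
reconstructPar -latestTime

# redecompose onto more processors and restart
# (edit numberOfSubdomains in system/decomposeParDict, e.g. 8 -> 72,
#  and set startFrom latestTime in system/controlDict)
decomposePar -force
mpirun -np 72 pimpleFoam -parallel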

Similarly, I have also experienced the opposite case:
Problem diverges or not depending on procs number

September 8, 2023, 14:02
  #5
New Member
 
Antonio Lau
Join Date: Dec 2022
Posts: 3
Maybe using a smaller time step would help.
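
One way to do that (a sketch; the values are illustrative, not from the posts) is either a smaller fixed deltaT or Courant-limited adjustable time stepping in system/controlDict:
Code:
// system/controlDict (excerpt)
deltaT          1e-06;   // smaller fixed step, or use the entries below
adjustTimeStep  yes;     // let the solver shrink/grow the step
maxCo           0.5;     // target maximum Courant number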
