CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
Forum: OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
Thread: PIMPLE failing without any warning or any abnormal (https://www.cfd-online.com/Forums/openfoam-solving/230706-pimple-failing-without-any-warning-any-abnormal.html)

sjohn2 October 2, 2020 01:39

PIMPLE failing without any warning or any abnormal
 
PIMPLE is failing without any warning and I am not sure what the reason is. When I re-run the case it fails at a later time. Any hints? Is it the GAMG solver?

PHP Code:

PIMPLE: Iteration 1
smoothSolver:  Solving for U.airx, Initial residual = 2.52488e-05, Final residual = 1.39026e-10, No Iterations 1
smoothSolver:  Solving for U.airy, Initial residual = 2.78076e-05, Final residual = 1.50724e-10, No Iterations 1
smoothSolver:  Solving for U.airz, Initial residual = 2.89627e-05, Final residual = 1.52544e-10, No Iterations 1
GAMG:  Solving for p, Initial residual = 1.13934e-05, Final residual = 7.98699e-08, No Iterations 3
GAMG:  Solving for p, Initial residual = 1.81298e-06, Final residual = 1.40565e-08, No Iterations 4
GAMG:  Solving for p, Initial residual = 4.10222e-07, Final residual = 3.58867e-09, No Iterations 8
time step continuity errors : sum local = 6.74827e-16, global = -1.49887e-17, cumulative = 2.22457e-14
PIMPLE: Iteration 2
smoothSolver:  Solving for U.airx, Initial residual = 4.27308e-11, Final residual = 4.27308e-11, No Iterations 0
smoothSolver:  Solving for U.airy, Initial residual = 3.6203e-11, Final residual = 3.6203e-11, No Iterations 0
smoothSolver:  Solving for U.airz, Initial residual = 4.9941e-11, Final residual = 4.9941e-11, No Iterations 0
GAMG:  Solving for p, Initial residual = 1.22209e-07, Final residual = 8.59521e-09, No Iterations 1
GAMG:  Solving for p, Initial residual = 1.99213e-08, Final residual = 4.38494e-09, No Iterations 1
[31] #0   Foam::error::printStack(Foam::Ostream&) at ??:?
[31] #1   Foam::sigSegv::sigHandler(int) at ??:?
[31] #2   ? in "/cvmfs/soft.computecanada.ca/nix/store/63pk88rnmkzjblpxydvrmskkc8ci7cx6-glibc-2.24/lib/libc.so.6"
[31] #3   ? in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libopen-pal.so.40"
[31] #4   opal_progress in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libopen-pal.so.40"
[31] #5   ompi_request_default_wait in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libmpi.so.40"
[31] #6   ompi_coll_base_sendrecv_actual in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libmpi.so.40"
[31] #7   ompi_coll_base_allreduce_intra_recursivedoubling in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libmpi.so.40"
[31] #8   PMPI_Allreduce in "/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/Compiler/intel2018.3/openmpi/3.1.2/lib/libmpi.so.40"
[31] #9   Foam::reduce(double&, Foam::sumOp<double> const&, int, int) at ??:?
[31] #10  Foam::PCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[31] #11  Foam::GAMGSolver::solveCoarsestLevel(Foam::Field<double>&, Foam::Field<double> const&) const at ??:?
[31] #12  Foam::GAMGSolver::Vcycle(Foam::PtrList<Foam::lduMatrix::smoother> const&, Foam::Field<double>&, Foam::Field<double> const&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::PtrList<Foam::Field<double> >&, Foam::PtrList<Foam::Field<double> >&, unsigned char) const at ??:?
[31] #13  Foam::GAMGSolver::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[31] #14  Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) at ??:?
[31] #15  Foam::fvMatrix<double>::solve(Foam::dictionary const&) at ??:?
[31] #16  Foam::fvMatrix<double>::solve() at ??:?
[31] #17  ? at ??:?
[31] #18  __libc_start_main in "/cvmfs/soft.computecanada.ca/nix/store/63pk88rnmkzjblpxydvrmskkc8ci7cx6-glibc-2.24/lib/libc.so.6"
[31] #19  ? at /tmp/nix-build-glibc-2.24.drv-0/glibc-2.24/csu/../sysdeps/x86_64/start.S:122


mAlletto October 3, 2020 10:36

Looks like some MPI error. Did you try to run it on one node?
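
To keep the whole job on a single node you only need to shrink the decomposition to fit that node. A minimal sketch of the relevant part of system/decomposeParDict, assuming for illustration a 32-core node (set numberOfSubdomains to whatever your node actually has):

PHP Code:

// system/decomposeParDict (sketch): size the decomposition to one node
numberOfSubdomains 32;          // e.g. 32 ranks on a 32-core node

method          scotch;         // scotch needs no manual direction splits

Then decomposePar and something like mpirun -np 32 <your solver> -parallel on one node would at least rule out the inter-node interconnect.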

sjohn2 October 4, 2020 20:24

Unfortunately running in serial is not an option; it takes forever.
I instead solved the pressure equation with PBiCGStab and the problem went away (a sketch of such an entry is below the GAMG settings).

Can you check whether my GAMG settings are right? The problem always seems to arise while solving on the coarsest grid.

PHP Code:

solver          GAMG;           // very efficient multigrid solver
tolerance       1e-07;          // solver finishes if either the absolute
relTol          0.001;          // tolerance or the relative tolerance
                                // is reached
minIter         3;              // minimum number of iterations
maxIter         300;            // limit on the number of iterations
smoother        DIC;            // smoother setting for GAMG
nPreSweeps      1;              // 1 for pd, set to 0 for all others!
nPostSweeps     2;              // 2 is fine
nFinestSweeps   2;              // 2 is fine
scaleCorrection true;           // true is fine
directSolveCoarsestLevel false; // false is fine
cacheAgglomeration on;          // on is fine; set to off if dynamic
                                // mesh refinement is used!
nCellsInCoarsestLevel 2400;     // 500 is fine,
                                // otherwise sqrt(number of cells)
agglomerator    faceAreaPair;   // faceAreaPair is fine
mergeLevels     1;              // 1 is fine
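
For comparison, a PBiCGStab entry for p would look roughly like this. This is only a sketch, not necessarily the exact settings I ran; the tolerances simply mirror the GAMG ones above and are not tuned:

PHP Code:

p
{
    solver          PBiCGStab;
    preconditioner  DIC;        // DIC suits the symmetric pressure matrix
    tolerance       1e-07;      // placeholder, copied from the GAMG entry
    relTol          0.001;
    minIter         3;
}

Given that the backtrace dies inside the coarsest-level PCG solve, another untested idea would be a much smaller nCellsInCoarsestLevel, closer to the usual sqrt(number of cells) guideline, so the coarsest-level solve does fewer global reductions.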


mAlletto October 5, 2020 03:16

I'm not an expert on the settings of the GAMG solver; I use the ones from the motorBike tutorial. Playing around with the parameters and seeing what happens is also an option.
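
From memory, the tutorial's pressure entry is roughly the following; check the tutorial shipped with your own OpenFOAM version for the exact values:

PHP Code:

p
{
    solver          GAMG;
    smoother        GaussSeidel;   // tutorial uses GaussSeidel, not DIC
    tolerance       1e-07;         // approximate, from memory
    relTol          0.01;
}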

