Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Crash with renumberMesh and pimpleFoam LES

December 4, 2019, 15:31   #1
jmt (Julian)
Member | Join Date: Sep 2019 | Posts: 32
I am running an LES case with a 17-million-cell mesh from blockMesh, using the pimpleFoam solver in OpenFOAM 5.0.

The case takes a large number of CPU hours, so I am looking for ways to speed up the calculation.

One of these is the renumberMesh utility. I am encountering a crash in the pressure solver when I use renumberMesh in parallel, and I am not sure whether the two are related, so I ran one case with
* renumberMesh in serial before decomposePar

and one case with
* renumberMesh in parallel after decomposePar. That is,

1. Generate the mesh with blockMesh.
2. Decompose the mesh for 180 processors with decomposePar.
3. Run renumberMesh in parallel on 180 processors: renumberMesh -overwrite -parallel (the SLURM submission script handles the parallel launch).
4. Run the simulation on 180 processors with pimpleFoam.
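For reference, the two orderings I tested can be sketched as follows. This is a sketch of my workflow, not a recommendation; mpirun is shown in place of the actual SLURM launcher, and 180 ranks matches my decomposeParDict:

```shell
# Ordering A: renumber the full mesh in serial, then decompose.
blockMesh
renumberMesh -overwrite            # serial renumbering of the whole mesh
decomposePar                       # split into 180 subdomains per decomposeParDict
mpirun -np 180 pimpleFoam -parallel

# Ordering B (the crashing case): decompose first, then renumber
# each processor mesh in parallel.
blockMesh
decomposePar
mpirun -np 180 renumberMesh -overwrite -parallel
mpirun -np 180 pimpleFoam -parallel
```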

The simulation runs fine for a while: the residuals and Courant number look reasonable. Then it crashes with the log excerpt below.

I have also attached my fvSolution dictionary below. Thank you for any advice or input.

Code:
Courant Number mean: 0.01953872075 max: 0.5964071291
Time = 0.027228

PIMPLE: iteration 1
[0] Starting inflgen, time = 8023 s
[0] The number of vortons is 8032
[0] Finishing inflgen, time = 8023 s
smoothSolver:  Solving for Ux, Initial residual = 0.0005660971091, Final residual = 3.365824579e-06, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 0.001857616441, Final residual = 1.248576987e-05, No Iterations 1
smoothSolver:  Solving for Uz, Initial residual = 0.001845039049, Final residual = 1.264188109e-05, No Iterations 1
GAMG:  Solving for p, Initial residual = 0.02449620851, Final residual = 4.450573519e-07, No Iterations 13
time step continuity errors : sum local = 7.999202759e-12, global = -2.499062109e-14, cumulative = 2.316567643e-12
PIMPLE: iteration 2
smoothSolver:  Solving for Ux, Initial residual = 1.32914688e-05, Final residual = 1.25106515e-07, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 4.68221212e-05, Final residual = 5.277816827e-07, No Iterations 1
smoothSolver:  Solving for Uz, Initial residual = 4.765156009e-05, Final residual = 5.373488844e-07, No Iterations 1
GAMG:  Solving for p, Initial residual = 0.03288688617, Final residual = 6.918512333e-07, No Iterations 13
time step continuity errors : sum local = 1.243177916e-11, global = 3.739872997e-15, cumulative = 2.320307516e-12
PIMPLE: iteration 3
smoothSolver:  Solving for Ux, Initial residual = 4.273234509e-06, Final residual = 4.73847033e-08, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 1.546522661e-05, Final residual = 1.842282608e-07, No Iterations 1
smoothSolver:  Solving for Uz, Initial residual = 1.610282976e-05, Final residual = 1.97009005e-07, No Iterations 1
[9] #0  Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
A process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          [[49135,0],9] (PID 54348)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
 in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #1  Foam::sigSegv::sigHandler(int) in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #2  ? in "/lib64/libc.so.6"
[9] #3  ? at btl_vader_component.c:0
[9] #4  opal_progress in "/apps2/openmpi/3.1.0-gcc/lib/libopen-pal.so.40"
[9] #5  ompi_sync_wait_mt in "/apps2/openmpi/3.1.0-gcc/lib/libopen-pal.so.40"
[9] #6  ompi_request_default_wait in "/apps2/openmpi/3.1.0-gcc/lib/libmpi.so.40"
[9] #7  ompi_coll_base_sendrecv_actual in "/apps2/openmpi/3.1.0-gcc/lib/libmpi.so.40"
[9] #8  ompi_coll_base_allreduce_intra_recursivedoubling in "/apps2/openmpi/3.1.0-gcc/lib/libmpi.so.40"
[9] #9  MPI_Allreduce in "/apps2/openmpi/3.1.0-gcc/lib/libmpi.so.40"
[9] #10  void Foam::allReduce<double, Foam::sumOp<double> >(double&, int, ompi_datatype_t*, ompi_op_t*, Foam::sumOp<double> const&, int, int) in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so"
[9] #11  Foam::reduce(double&, Foam::sumOp<double> const&, int, int) in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so"
[9] #12  Foam::PCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #13  Foam::GAMGSolver::solveCoarsestLevel(Foam::Field<double>&, Foam::Field<double> const&) const in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #14  Foam::GAMGSolver::Vcycle(Foam::PtrList<Foam::lduMatrix::smoother> const&, Foam::Field<double>&, Foam::Field<double> const&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::PtrList<Foam::Field<double> >&, Foam::PtrList<Foam::Field<double> >&, unsigned char) const in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #15  Foam::GAMGSolver::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
[9] #16  Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so"
[9] #17  Foam::fvMatrix<double>::solve(Foam::dictionary const&) in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/bin/pimpleFoam"
[9] #18  ? in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/bin/pimpleFoam"
[9] #19  __libc_start_main in "/lib64/libc.so.6"
[9] #20  ? in "/home/user/OpenFOAM/OpenFOAM-5.x/platforms/linux64GccDPInt32Opt/bin/pimpleFoam"
[cn374:54348] *** Process received signal ***
[cn374:54348] Signal: Segmentation fault (11)
[cn374:54348] Signal code:  (-6)
[cn374:54348] Failing at address: 0x9bbee0000d44c
fvSolution:

Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  5                                     |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSolution;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

solvers
{
    "(p|rho)"
    {
        solver          GAMG;
        tolerance       1e-6;
        relTol          0.01;
        smoother        GaussSeidel;
    }

    "(p|rho)Final"
    {
        $p;
        relTol          0;
    }

    "(U|e|k|nuTilda)"
    {
        solver          smoothSolver;
        smoother        symGaussSeidel;
        tolerance       1e-6;
        relTol          0.01;
    }

    "(U|e|k|nuTilda)Final"
    {
        $U;
        relTol          0;
    }
}

PIMPLE
{
    momentumPredictor yes;
    nOuterCorrectors 3;
    nCorrectors     2;
    nNonOrthogonalCorrectors 0;
}

relaxationFactors
{
    equations
    {
        ".*"  1;
    }
}

// ************************************************************************* //