Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Programming & Development

Issues with OpenFOAM on an HPC with GCC+MVAPICH2.

October 18, 2020, 15:48   #1
Prakriti
New Member
Join Date: Jun 2018
Posts: 6
Hi all,

I compiled OpenFOAM-7 on an HPC that provides GNU/GCC compilers but not OpenMPI. I had initially compiled it with ICC/IntelMPI, following the pointers provided in this post: Compilation Error with OpenFOAM-5.x using Intel Compilers

However, with that combination (ICC+IntelMPI), I started running into errors such as simulations stopping at random time steps.

So I tried to reinstall OpenFOAM with GCC+OpenMPI, compiling OpenMPI myself in my local allocation on the HPC. This worked for OpenFOAM-8, but I could not get OpenFOAM-7 to compile that way (unrecognized MPI).

Finally, I compiled OpenFOAM-7 with GCC+MVAPICH2 (already present on the HPC), and OpenFOAM compiled without errors.
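For reference, this is roughly how I pointed OpenFOAM at the system MVAPICH2 using the SYSTEMMPI option (the MPI_ROOT path below is taken from the error messages further down; on another cluster, check `module show mvapich2` for the right location):

```shell
# Sketch: building OpenFOAM-7 against an existing MVAPICH2 via SYSTEMMPI.
# MPI_ROOT is illustrative for this cluster -- adjust to your installation.
export WM_MPLIB=SYSTEMMPI
export MPI_ROOT=/opt/apps/gcc9_1/mvapich2-x/2.3
export MPI_ARCH_FLAGS="-DMPICH_SKIP_MPICXX"
export MPI_ARCH_INC="-I$MPI_ROOT/include"
export MPI_ARCH_LIBS="-L$MPI_ROOT/lib64 -lmpi"
# Then source OpenFOAM-7/etc/bashrc and run ./Allwmake as usual.
```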

However, when I ran my simulations last night, they stopped at random time steps with error messages of two kinds:


1) First kind:

diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
PIMPLE: Iteration 1
DILUPBiCGStab: Solving for Ux, Initial residual = 2.38762e-08, Final residual = 1.04175e-16, No Iterations 1
DILUPBiCGStab: Solving for Uy, Initial residual = 2.38762e-08, Final residual = 1.04175e-16, No Iterations 1
DILUPBiCGStab: Solving for Uz, Initial residual = 2.38762e-08, Final residual = 1.04175e-16, No Iterations 1
DILUPBiCGStab: Solving for O2, Initial residual = 0.000388346, Final residual = 3.19766e-18, No Iterations 1
DILUPBiCGStab: Solving for He, Initial residual = 8.82241e-17, Final residual = 8.82241e-17, No Iterations 0
[27] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[27] #1 Foam::sigFpe::sigHandler(int) at ??:?
[27] #2 ? in "/lib64/libc.so.6"
[27] #3 double Foam::sumProd<double>(Foam::UList<double> const&, Foam::UList<double> const&) at ??:?
[27] #4 Foam::PBiCGStab::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[27] #5 Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) at ??:?
[27] #6 Foam::fvMatrix<double>::solve(Foam::dictionary const&) at ??:?
[27] #7 Foam::fvMatrix<double>::solve() at ??:?
[27] #8 ? at ??:?
[27] #9 __libc_start_main in "/lib64/libc.so.6"
[27] #10 ? at ??:?
[c188-051.frontera.tacc.utexas.edu:mpi_rank_27][error_sighandler] Caught error: Floating point exception (signal 8)

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 155593 RUNNING AT c188-051
= EXIT CODE: 8
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Floating point exception (signal 8)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions



2) Second kind:


diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
PIMPLE: Iteration 1
DILUPBiCGStab: Solving for Ux, Initial residual = 0.00180794, Final residual = 1.78746e-10, No Iterations 1
DILUPBiCGStab: Solving for Uy, Initial residual = 0.0026315, Final residual = 2.52956e-10, No Iterations 1
DILUPBiCGStab: Solving for Uz, Initial residual = 0.0202497, Final residual = 1.56385e-09, No Iterations 1
DILUPBiCGStab: Solving for FSD, Initial residual = 3.02246e-05, Final residual = 4.04184e-11, No Iterations 1
DILUPBiCGStab: Solving for O2, Initial residual = 6.59176e-06, Final residual = 5.15039e-09, No Iterations 1
DILUPBiCGStab: Solving for H2O, Initial residual = 6.63204e-06, Final residual = 5.16573e-09, No Iterations 1
DILUPBiCGStab: Solving for CH4, Initial residual = 2.88619e-06, Final residual = 9.02189e-09, No Iterations 1
DILUPBiCGStab: Solving for CO2, Initial residual = 6.63204e-06, Final residual = 5.16573e-09, No Iterations 1
Radiation solver iter: 0
GAMG: Solving for ILambda_0_0, Initial residual = 0.0494203, Final residual = 0.00193215, No Iterations 1
GAMG: Solving for ILambda_1_0, Initial residual = 0.0356651, Final residual = 0.00106093, No Iterations 1
[0] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[0] #1 Foam::sigFpe::sigHandler(int) at ??:?
[0] #2 ? in "/lib64/libc.so.6"
[0] #3 MPIR_SUM in "/opt/apps/gcc9_1/mvapich2-x/2.3/lib64/libmpi.so.12"
[0] #4 MPIR_Allreduce_socket_aware_two_level_MV2 in "/opt/apps/gcc9_1/mvapich2-x/2.3/lib64/libmpi.so.12"
[0] #5 MPIR_Allreduce_index_tuned_intra_MV2 in "/opt/apps/gcc9_1/mvapich2-x/2.3/lib64/libmpi.so.12"
[0] #6 MPIR_Allreduce_impl in "/opt/apps/gcc9_1/mvapich2-x/2.3/lib64/libmpi.so.12"
[0] #7 MPI_Allreduce in "/opt/apps/gcc9_1/mvapich2-x/2.3/lib64/libmpi.so.12"
[0] #8 void Foam::allReduce<double, Foam::sumOp<double> >(double&, int, int, int, Foam::sumOp<double> const&, int, int) at ??:?
[0] #9 Foam::reduce(double&, Foam::sumOp<double> const&, int, int) at ??:?
[0] #10 Foam::PBiCGStab::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[0] #11 Foam::GAMGSolver::solveCoarsestLevel(Foam::Field<double>&, Foam::Field<double> const&) const at ??:?
[0] #12 Foam::GAMGSolver::Vcycle(Foam::PtrList<Foam::lduMatrix::smoother> const&, Foam::Field<double>&, Foam::Field<double> const&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::PtrList<Foam::Field<double> >&, Foam::PtrList<Foam::Field<double> >&, unsigned char) const at ??:?
[0] #13 Foam::GAMGSolver::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
[0] #14 Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) at ??:?
[0] #15 Foam::fvMatrix<double>::solve(Foam::dictionary const&) at ??:?
[0] #16 Foam::fvMatrix<double>::solve(Foam::word const&) at ??:?
[0] #17 Foam::radiationModels::radiativeIntensityRay::correct() at ??:?
[0] #18 Foam::radiationModels::fvDOM::calculate() at ??:?
[0] #19 Foam::radiationModel::correct() at ??:?
[0] #20 ? at ??:?
[0] #21 __libc_start_main in "/lib64/libc.so.6"
[0] #22 ? at ??:?
[c168-114.frontera.tacc.utexas.edu:mpi_rank_0][error_sighandler] Caught error: Floating point exception (signal 8)

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 225163 RUNNING AT c168-114
= EXIT CODE: 8
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Floating point exception (signal 8)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions





Could these errors be attributable to an OpenFOAM compilation issue? I'm stuck, so any help would be appreciated.
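In case it helps with diagnosis: the sigFpe frames in both traces mean OpenFOAM's floating-point trapping caught an invalid operation (e.g. a NaN or a zero denominator) inside the linear solver. These are the standard OpenFOAM knobs I understand can narrow this down (env var names are standard OpenFOAM mechanisms; this is a sketch, not something I have fully verified on this cluster):

```shell
# Debugging sketch using standard OpenFOAM environment variables.
# 1) Initialise allocated fields with NaNs so use-before-write is caught at once:
export FOAM_SETNAN=true
# 2) Or disable FPE trapping to see whether the run continues (this only masks
#    the underlying problem, it does not fix it):
export FOAM_SIGFPE=false
# 3) Rebuild in Debug mode so the "at ??:?" stack frames gain file:line info:
export WM_COMPILE_OPTION=Debug
# Then re-source etc/bashrc and rerun ./Allwmake before repeating the case.
```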


Also, I would like some help with compiling OpenMPI locally in my HPC user allocation, so that I can build OpenFOAM against it.
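For context, what I attempted for the local OpenMPI build looked roughly like this (the version number and install prefix below are placeholders, not necessarily what I used):

```shell
# Sketch: building OpenMPI into a user prefix (version/prefix are placeholders).
wget https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.5.tar.gz
tar -xzf openmpi-4.0.5.tar.gz && cd openmpi-4.0.5
./configure --prefix=$HOME/opt/openmpi-4.0.5 CC=gcc CXX=g++
make -j 8 all && make install
# Make the new MPI visible before building OpenFOAM:
export PATH=$HOME/opt/openmpi-4.0.5/bin:$PATH
export LD_LIBRARY_PATH=$HOME/opt/openmpi-4.0.5/lib:$LD_LIBRARY_PATH
# OpenFOAM can then pick it up with WM_MPLIB=SYSTEMOPENMPI,
# which queries `mpicc --showme` for compile/link flags.
```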

Last edited by Prakriti; October 22, 2020 at 19:56.