
Problem with MPI?

May 27, 2010, 07:47   #1
Problem with MPI?

John Wang (cwang5)
Member | Join Date: Mar 2009 | Location: Singapore | Posts: 71
Hi guys,

I was trying to run a parallel simulation with a custom-made solver when I received this error message from MPI:

Code:
[1] #0  Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
 An MPI process has executed an operation involving a call to the 
 "fork()" system call to create a child process.  Open MPI is currently 
 operating in a condition that could result in memory corruption or 
 other system errors; your MPI job may hang, crash, or produce silent 
 data corruption.  The use of fork() (or system() or other calls that 
 create child processes) is strongly discouraged. 
  
 The process that invoked fork was: 
  
   Local host:          compute173 (PID 704) 
   MPI_COMM_WORLD rank: 1 
  
 If you are *absolutely sure* that your application will successfully 
 and correctly survive a call to fork(), you may disable this warning 
 by setting the mpi_warn_on_fork MCA parameter to 0. 
 -------------------------------------------------------------------------- 
  in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so" 
 [1] #1  Foam::sigFpe::sigFpeHandler(int) in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so" 
 [1] #2  __restore_rt at sigaction.c:0 
 [1] #3  Foam::divide(Foam::Field<double>&, Foam::UList<double> const&, Foam::UList<double> const&) in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so" 
 [1] #4  void Foam::divide<Foam::fvPatchField, Foam::volMesh>(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&) in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleTransportModels.so" 
 [1] #5  Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > Foam::operator/<Foam::fvPatchField, Foam::volMesh>(Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > const&, Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > const&) in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleTransportModels.so" 
 [1] #6  Foam::incompressible::RASModels::kOmegaSST::F2() const in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleRASModels.so" 
 [1] #7  Foam::incompressible::RASModels::kOmegaSST::correct() in "/home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleRASModels.so" 
 [1] #8  main in "/home/m080031/OpenFOAM/m080031-1.6.x/applications/bin/linux64GccDPOpt/flapFoam" 
 [1] #9  __libc_start_main in "/lib64/libc.so.6" 
 [1] #10  Foam::regIOobject::writeObject(Foam::IOstream::streamFormat, Foam::IOstream::versionNumber, Foam::IOstream::compressionType) const in "/home/m080031/OpenFOAM/m080031-1.6.x/applications/bin/linux64GccDPOpt/flapFoam" 
 [compute173:00704] *** Process received signal *** 
 [compute173:00704] Signal: Floating point exception (8) 
 [compute173:00704] Signal code:  (-6) 
 [compute173:00704] Failing at address: 0x2742000002c0 
 [compute173:00704] [ 0] /lib64/libc.so.6 [0x3e1b030280] 
 [compute173:00704] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3e1b030215] 
 [compute173:00704] [ 2] /lib64/libc.so.6 [0x3e1b030280] 
 [compute173:00704] [ 3] /home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so(_ZN4Foam6divideERNS_5FieldIdEERKNS_5UListIdEES6_+0xc1) [0x2b0d224220a1] 
 [compute173:00704] [ 4] /home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleTransportModels.so(_ZN4Foam6divideINS_12fvPatchFieldENS_7volMeshEEEvRNS_14GeometricFieldIdT_T0_EERKS6_S9_+0xd9) [0x2b0d208fb8e9] 
 [compute173:00704] [ 5] /home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleTransportModels.so(_ZN4FoamdvINS_12fvPatchFieldENS_7volMeshEEENS_3tmpINS_14GeometricFieldIdT_T0_EEEERKS8_SA_+0x2d8) [0x2b0d208fcc48] 
 [compute173:00704] [ 6] /home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleRASModels.so(_ZNK4Foam14incompressible9RASModels9kOmegaSST2F2Ev+0x162) [0x2b0d20c166a2] 
 [compute173:00704] [ 7] /home/m080031/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libincompressibleRASModels.so(_ZN4Foam14incompressible9RASModels9kOmegaSST7correctEv+0x1874) [0x2b0d20c1b8f4] 
 [compute173:00704] [ 8] /home/m080031/OpenFOAM/m080031-1.6.x/applications/bin/linux64GccDPOpt/flapFoam [0x41e8dd] 
 [compute173:00704] [ 9] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3e1b01d974] 
 [compute173:00704] [10] /home/m080031/OpenFOAM/m080031-1.6.x/applications/bin/linux64GccDPOpt/flapFoam(_ZNK4Foam11regIOobject11writeObjectENS_8IOstream12streamFormatENS1_13versionNumberENS1_15compressionTypeE+0xc9) [0x4199e9] 
 [compute173:00704] *** End of error message ***
The custom solver runs fine in serial, and the current simulation ran correctly for about an hour before the error showed up. Does anyone know what might have caused this "fork" in the MPI operation? Thanks.

John

July 12, 2010, 09:50   #2
Same problem

Stefanie Schiffer (StSchiff)
New Member | Join Date: Mar 2010 | Location: Cologne, Germany | Posts: 26
Hey John,

I received the same error message. The funny thing is, I have another case like this running and it hasn't stopped yet. The two cases differ only in the choice of delta for my LES simulation with the dynSmagorinsky model: for the case that is still running I chose delta "smooth", and for the one that stopped I chose delta "vanDriest". Other than that the cases are identical.
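For reference, this is roughly what the delta selection looks like in my constant/LESProperties (the coefficients here are the ones from the standard channel tutorial, not necessarily my exact values):

Code:
LESModel        dynSmagorinsky;

delta           vanDriest;      // the case that crashed
// delta        smooth;         // the case that is still running

vanDriestCoeffs
{
    delta           cubeRootVol;
    cubeRootVolCoeffs
    {
        deltaCoeff      1;
    }
    Aplus           26;
    Cdelta          0.158;
}

smoothCoeffs
{
    delta           cubeRootVol;
    cubeRootVolCoeffs
    {
        deltaCoeff      1;
    }
    maxDeltaRatio   1.1;
}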

Did you manage to find out what caused the error?

Stefanie

July 12, 2010, 10:12   #3

John Wang (cwang5)
Member | Join Date: Mar 2009 | Location: Singapore | Posts: 71
Hi Stefanie,

I think the cause of the error was that I was using sumMag instead of gSum in one of the calculations. In parallel this produced a division by zero, which triggered the "floating point exception" and crashed the solver; the crashed solver in turn caused MPI to print that error message.
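To illustrate (this is just a sketch of the idea, not my actual flapFoam code, and the patch name and force value are made up): the g-prefixed reductions such as gSum do a parallel reduce over all processors, while the plain ones like sumMag only see the local subdomain, so a denominator that is perfectly fine in serial can end up zero on some ranks after decomposition:

Code:
// Hypothetical normalisation over a patch; "flap" and force are placeholders.
const label patchI = mesh.boundaryMesh().findPatchID("flap");
const scalarField& magSf = mesh.boundary()[patchI].magSf();

// Local reduction: returns 0 on any processor whose subdomain
// contains no faces of this patch after the mesh is decomposed.
scalar areaLocal = sumMag(magSf);

// Global reduction: summed over all processors, identical on every rank.
scalar areaGlobal = gSum(magSf);

scalar force = 1.0;                          // placeholder value
scalar coeff = force/(areaGlobal + SMALL);   // safe in serial and parallel
// scalar coeff = force/areaLocal;           // can raise SIGFPE in parallel

In serial both reductions give the same number, which is why the solver only blew up once the case was decomposed.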

Hope that helps.

John

July 12, 2010, 10:38   #4

Stefanie Schiffer (StSchiff)
New Member | Join Date: Mar 2010 | Location: Cologne, Germany | Posts: 26
Hey John,

Thanks for the fast reply. It looks like my problem is similar. In the calcDelta() function in vanDriestDelta.C there is a division that must have caused the floating point exception. The smooth delta uses a different formula, so I guess no division by zero occurs there; that is why that case is still running without any trouble. I will have to rethink my boundary condition values for nuSgs for this case.
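If anyone hits the same thing, the workaround I have in mind is simply to keep the field that ends up in the denominator away from zero, along these lines (just a sketch of the idea with made-up names, not the actual vanDriestDelta.C code):

Code:
// Floor a viscosity-like field before it is used in a division,
// so a zero boundary value for nuSgs cannot produce a 0/0.
dimensionedScalar nuSgsMin("nuSgsMin", nuSgs.dimensions(), SMALL);

volScalarField nuSgsSafe = max(nuSgs, nuSgsMin);

// Any later division by nuSgsSafe can no longer trap with a
// floating point exception, whatever the decomposition looks like.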

Stefanie
