Home > Forums > OpenFOAM Native Meshers: snappyHexMesh and Others

troubles with sHM and parallel

August 30, 2012, 17:00   #1
troubles with sHM and parallel
Tobias Holzmann (Tobi), Senior Member
Join Date: Oct 2010 | Location: Leoben (Austria) | Posts: 1,087
Hey guys,

I have been using sHM for about a year, and since last week I get the following error when running sHM in parallel:

Code:
Create mesh for time = 1

[1] 
[1] 
[1] --> FOAM FATAL ERROR: 
[1] read failed
[1] 
[1]     From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1]     in file UIPread.C at line 114.
[1] 
FOAM parallel run aborting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL ERROR: 
[2] read failed
[2] 
[2]     From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[2]     in file UIPread.C at line 114.
[2] 
FOAM parallel run aborting
[2] 
Read mesh in = 0.12 s
[2] #0  [1] #0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1  Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1  Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2  Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2  Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[1] #3  Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[2] #3  Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #4  Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #4  Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #5  Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #5  Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #6  Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #6  Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #7  Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #7  Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[1] #8   in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[2] #8  

[1]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[1] #9  __libc_start_main[2]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[2] #9  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #10   in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #10  

[1]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]  in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 13823 on
node cfd exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[cfd:13821] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[cfd:13821] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I compiled another sHM version but removed it again. Is it possible that I broke my 2.1.x version of sHM?

Or what else could cause this error?
Thanks for reading and helping.

PS: Running the solvers in parallel works fine.
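Since a leftover user build is one plausible explanation for sHM misbehaving while the stock solvers still run, a quick check is whether a stale copy earlier on the PATH shadows the stock binary. The sketch below only simulates that situation with dummy scripts in temporary directories (the real directories would be something like $FOAM_USER_APPBIN and $FOAM_APPBIN, but those names are just the usual convention, not taken from this thread):

```shell
# Simulate PATH shadowing: a user-compiled copy of snappyHexMesh placed in a
# directory that comes first on the PATH wins over the stock binary.
userbin=$(mktemp -d)   # stand-in for a user bin dir such as $FOAM_USER_APPBIN
sysbin=$(mktemp -d)    # stand-in for the stock bin dir such as $FOAM_APPBIN
printf '#!/bin/sh\necho user-copy\n'  > "$userbin/snappyHexMesh"
printf '#!/bin/sh\necho stock-copy\n' > "$sysbin/snappyHexMesh"
chmod +x "$userbin/snappyHexMesh" "$sysbin/snappyHexMesh"

env PATH="$userbin:$sysbin" command -v snappyHexMesh   # shows the shadowing path
env PATH="$userbin:$sysbin" snappyHexMesh              # prints "user-copy"

rm -f "$userbin/snappyHexMesh"                         # removing the stale copy...
env PATH="$userbin:$sysbin" snappyHexMesh              # ...prints "stock-copy" again
```

On a real install, `which snappyHexMesh` after sourcing the OpenFOAM environment tells you which copy actually runs.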

Tobi

August 30, 2012, 17:54   #2
Tobias Holzmann (Tobi), Senior Member
Okay, it seems that more than just sHM is broken. I'll recompile OpenFOAM.
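For reference, the rebuild would look roughly like this. This is only a sketch of the usual procedure, not verified against this particular install; it assumes the stock OpenFOAM-2.1.x source layout and a sourced environment (i.e. $WM_PROJECT_DIR is set):

```shell
# Rough sketch of rebuilding the libraries and then snappyHexMesh itself.
cd "$WM_PROJECT_DIR/src" && ./Allwmake        # rebuild the core libraries first
cd "$WM_PROJECT_DIR/applications/utilities/mesh/generation/snappyHexMesh"
wclean                                        # drop stale object files
wmake                                         # recompile the sHM binary
```

A full `./Allwmake` from $WM_PROJECT_DIR rebuilds everything if the damage is wider than sHM.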
