Tobi
August 30, 2012 17:00
Troubles with sHM in parallel
Hey guys,
I have been using sHM for about a year, and since last week I get the following error when running sHM in parallel:
Code:
Create mesh for time = 1
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] read failed
[1]
[1] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1] in file UIPread.C at line 114.
[1]
FOAM parallel run aborting
[1]
[2]
[2]
[2] --> FOAM FATAL ERROR:
[2] read failed
[2]
[2] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[2] in file UIPread.C at line 114.
[2]
FOAM parallel run aborting
[2]
Read mesh in = 0.12 s
[2] #0 [1] #0 Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1 Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1 Foam::error::abort() in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[1] #3 Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[2] #3 Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #4 Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #4 Foam::IOdictionary::readFile(bool) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #5 Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #5 Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #6 Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #6 Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #7 Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #7 Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[1] #8 in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[2] #8
[1] in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[1] #9 __libc_start_main[2] in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[2] #9 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #10 in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #10
[1] in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2] in "/home/shorty/OpenFOAM/OpenFOAM-2.1.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 13823 on
node cfd exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[cfd:13821] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[cfd:13821] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
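For reference, this is roughly the workflow I use; the core count and the extra flags are only illustrative and may not match my exact command line:
Code:
blockMesh                                          # background mesh
decomposePar                                       # decompose the case (system/decomposeParDict)
mpirun -np 4 snappyHexMesh -parallel -overwrite    # this is where the error appears
reconstructParMesh -constant                       # merge the processor meshes again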
I compiled another sHM version, but removed it again. Is it possible that I broke my 2.1.x version of sHM in the process?
Or what else could be causing this error?
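In case it helps with the diagnosis, this is roughly what I would check to see whether my own compiled copy is still shadowing the release binary, and how I would rebuild the stock sHM (the source path assumes the standard 2.1.x tree):
Code:
# a leftover copy in $FOAM_USER_APPBIN would be found before the release binary
which snappyHexMesh
# rebuild the stock utility from the 2.1.x sources
cd $WM_PROJECT_DIR/applications/utilities/mesh/generation/snappyHexMesh
wclean
wmake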
Thanks for reading and helping :)
PS: Running solvers in parallel still works fine.
Tobi