CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (http://www.cfd-online.com/Forums/openfoam-solving/)
-   -   OpenFOAM 2.0.0. and 2.0.1 doesn't work in parallel mode (http://www.cfd-online.com/Forums/openfoam-solving/92294-openfoam-2-0-0-2-0-1-doesnt-work-parallel-mode.html)

rv82 September 9, 2011 04:39

OpenFOAM 2.0.0 and 2.0.1 don't work in parallel mode
 
Hi all! Please help me. When I try to run pisoFoam in parallel with
Code:

mpirun -np 4 pisoFoam -parallel

I get the following message:
Code:

Build  : 2.0.0-a317a4e7cd55
Exec  : interFoam -parallel
Date  : Sep 09 2011
Time  : 15:24:24
Host  : 5101-5
PID    : 27833
Case  : /home/roman/OpenFOAM/roman-2.0.1/run/damBreak
nProcs : 4
Slaves :
3
(
5101-5.27834
5101-5.27835
5101-5.27836
)

Pstream initialized with:
    floatTransfer    : 0
    nProcsSimpleSum  : 0
    commsType        : nonBlocking
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] ill defined primitiveEntry starting at keyword 'slaves' on line 0 and ending at line 3
[0]
[0] file: IStringStream.sourceFile at line 3.
[0]
[0]    From function primitiveEntry::readEntry(const dictionary&, Istream&)
[0]    in file db/dictionary/primitiveEntry/primitiveEntryIO.C at line 165.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 27833 on
node 5101-5 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

When I try to run other solvers (for example, interFoam from the tutorials), I get the same message.
:(

Can you please tell me what this means?
Everything worked fine until I reinstalled Linux :(

romant September 12, 2011 02:53

did you decompose the mesh before running it in parallel?

rv82 September 12, 2011 09:38

Quote:

Originally Posted by romant (Post 323728)
did you decompose the mesh before running it in parallel?

Of course. The number of "parts" corresponds to the number of processor cores.

PS. I tried to run OpenFOAM on an older version of Linux and got the same error, although it worked fine in August of this year.
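
For reference, the usual workflow: the case is first split with decomposePar, whose subdomain count in system/decomposeParDict must match the -np argument passed to mpirun. A minimal dictionary for the four ranks used here might look like the fragment below (a sketch only; the simple method and its coefficients are illustrative assumptions, not taken from this thread):

```
// system/decomposeParDict (fragment)
numberOfSubdomains 4;       // must match mpirun -np 4

method          simple;     // illustrative choice of decomposition method

simpleCoeffs
{
    n           (2 2 1);    // 2 x 2 x 1 split of the domain
    delta       0.001;
}
```

After decomposePar has written the processor0..processor3 directories, the solver is launched with mpirun -np 4 interFoam -parallel.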

rv82 October 3, 2011 10:47

The problem was found! Thanks to the developers of OpenFOAM.
OpenFOAM doesn't like my machine's hostname -- 5101-5. I changed the name to a5101_5 and OpenFOAM started successfully!
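
The failing "slaves" entry in the log (5101-5.27834) is the hostname plus PID, which the dictionary parser apparently could not read as a keyword. A quick way to screen a candidate name before renaming the machine (an assumption based on this thread: names beginning with a digit are what trip OpenFOAM 2.0.x; check_hostname is a hypothetical helper, not an OpenFOAM tool):

```shell
# check_hostname: flag hostnames that begin with a digit, the pattern
# that broke OpenFOAM 2.0.x dictionary parsing in this thread.
# (hypothetical helper -- not part of OpenFOAM)
check_hostname() {
  case "$1" in
    [0-9]*) echo "bad: '$1' starts with a digit" ;;
    *)      echo "ok: '$1'" ;;
  esac
}

check_hostname "5101-5"    # the old name
check_hostname "a5101_5"   # the new name
```

The name itself is changed in the usual places for the distribution (e.g. /etc/hostname and /etc/hosts on most Linux systems), followed by a reboot or re-login.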

