CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM (https://www.cfd-online.com/Forums/openfoam/)
-   -   OpenFOAM 2.0.0 in parallel (https://www.cfd-online.com/Forums/openfoam/90379-openfoam-2-0-0-paralell.html)

andrea.pasquali July 8, 2011 05:40

OpenFOAM 2.0.0 in parallel
 
Hi,
I'm trying out the new OpenFOAM 2.0.0.
I found that when I try to run in parallel, it crashes:
Quote:

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[0]
[0]
[0] [cluster-exec1:17247] *** An error occurred in MPI_Recv
[cluster-exec1:17247] *** on communicator MPI_COMM_WORLD
[cluster-exec1:17247] *** MPI_ERR_TRUNCATE: message truncated
[cluster-exec1:17247] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--> FOAM FATAL IO ERROR:
[0] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
[0]
[0] file: IOstream at line 0.
[0]
[0] From function IOstream::fatalCheck(const char*) const
[0] in file db/IOstreams/IOstreams/IOstream.C at line 114.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 6 with PID 17247 on
node cluster-exec1 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
Has anyone else run into this error?
I also ran "foamInstallationTest" and everything seems fine (I attached the log file).
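
For reference, this is how I normally launch a parallel run; the solver name and core count below are placeholders, not the exact settings of my case:

Code:

# decompose the case according to system/decomposeParDict
decomposePar

# launch the solver in parallel (solver and core count are placeholders)
mpirun -np 8 simpleFoam -parallel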

Thanks for any help

Andrea

Tag July 8, 2011 05:56

I run dieselFoam in parallel on 6 processors without any problems...

wyldckat July 9, 2011 05:55

Greetings to all!

@Andrea: read these posts and the links therein:
Another thing that would be useful to know is whether you are using your cluster's MPI or the Open MPI that comes with OpenFOAM's ThirdParty. If you are using your cluster's MPI, it would also help to know which one it is!
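
If you are not sure, the following should tell you (assuming the OpenFOAM environment is sourced; the --version flag is the Open MPI one):

Code:

# MPI implementation OpenFOAM was compiled against
echo $WM_MPLIB

# mpirun actually found on the PATH, and its version
which mpirun
mpirun --version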

Best regards,
Bruno

Aurelien Thinat July 28, 2011 11:12

Hi everyone,

I'm facing the same problem, but with snappyHexMesh in parallel. The other applications (simpleFoam, interFoam, LTSInterFoam, ...) work fine in parallel. Here is my problem: http://www.cfd-online.com/Forums/ope...tml#post317890
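
For reference, I launch it more or less like this (the core count here is illustrative, not necessarily what I used):

Code:

# decompose, mesh in parallel, then reconstruct the mesh
decomposePar
mpirun -np 4 snappyHexMesh -parallel -overwrite
reconstructParMesh -constant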


Did you find a solution to your problem, Andrea?

