CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Programming & Development (https://www.cfd-online.com/Forums/openfoam-programming-development/)
-   -   Parallel Simulation Issues (https://www.cfd-online.com/Forums/openfoam-programming-development/172751-parallel-simulation-issues.html)

demolaoye June 4, 2016 22:26

Parallel Simulation Issues
 
Hello, OpenFOAM users.

I am trying to run a simulation in parallel, but each time I get this error:

"
[rho00.local:27980] *** An error occurred in MPI_Recv
[rho00.local:27980] *** on communicator MPI_COMM_WORLD
[rho00.local:27980] *** MPI_ERR_TRUNCATE: message truncated
[rho00.local:27980] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 27980 on
node rho00.local exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[rho00.local:27976] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[rho00.local:27976] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
"

I would appreciate tips on what to add or tweak to resolve this issue. Note that the case runs fine when executed on a single processor.
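
For reference, this is roughly how the case is launched (a sketch only; four subdomains and simpleFoam are placeholders for my actual decomposition and solver):

Code:

decomposePar
mpirun -np 4 simpleFoam -parallel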

Thank you.

--
Oluwafemi

a.weber June 8, 2016 07:42

That's very little information... but as a first step you can always try the debugging tool:

Code:

mpirunDebug -np <n> <application> -parallel


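For example, with four cores and a solver such as simpleFoam (both placeholders, substitute your own):

Code:

mpirunDebug -np 4 simpleFoam -parallel

mpirunDebug ships with OpenFOAM (look under $WM_PROJECT_DIR/bin or bin/tools, depending on the version) and should prompt for a run method such as gdb, so you can see exactly where rank 1 fails.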