CFD Online Discussion Forums


Chris Lucas February 8, 2011 13:00

new Solver won't run parallel

I have modified rhoPisoFoam a bit, e.g. changed the energy equation (see link below). Now the variable htot is used instead of h.

The solver runs fine when I use one processor with one core, but when I want to run a case in parallel, the simulation crashes after the htot iteration with the following error:

[morgoth:15364] *** on communicator MPI_COMM_WORLD
[morgoth:15364] *** MPI_ERR_TRUNCATE: message truncated
[morgoth:15364] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
mpirun has exited due to process rank 1 with PID 15364 on
node morgoth exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

Do I have to make any changes to Open MPI or OpenFOAM to make the new solver run in parallel?

Thanks for the help,

wyldckat February 8, 2011 19:26

Greetings Christian,

Note to other readers: if you know better than what I wrote, please also respond!

I'm not completely familiar with the way OpenFOAM handles MPI, but I believe that it should be as easy as one of the following:
  • Creating the new field with an option for parallel use.
  • Adding the variable to the list of parallelizable variables.
  • Performing some operations on the variable in a dedicated zone in the code for parallel operations.
Check the header files that are included inside the main function: look at how the other variables are created and whether there is a modifier somewhere that marks them as parallelizable... Also look for any special handling of the other variables.
Best regards,

braennstroem April 16, 2011 10:29

were you able to fix this problem? I sometimes get this error message with standard solvers as well, but do not know the reason for it!?
I do not think that you need to update some list of fields...


Btw., off-topic: another problem I had, with NFS stales, is fixed now; the RAID controller was the bad guy.

Chris Lucas April 18, 2011 02:49


if I remember correctly, the problem was related to the command "correctBoundaryConditions()" which I used on a dummy field in my solver.
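
For readers who hit the same error: a minimal sketch of the kind of pattern that can cause this. All names below (htot as a dummy work field, the local condition) are illustrative guesses, not Christian's actual code, and the fragment only compiles inside an OpenFOAM solver. The key point is that correctBoundaryConditions() evaluates coupled (processor) patches, which exchanges data over MPI; if the ranks do not all make matching calls, the send/receive pairs mismatch and Open MPI can abort with MPI_ERR_TRUNCATE.

```cpp
// Hypothetical reconstruction -- not the actual solver code.
// A dummy field, e.g. created in createFields.H:

volScalarField htot                 // illustrative work field
(
    IOobject
    (
        "htot",
        runTime.timeName(),
        mesh,
        IOobject::NO_READ,
        IOobject::AUTO_WRITE
    ),
    h + 0.5*magSqr(U)               // initialised from existing fields
);

// Problematic pattern: calling correctBoundaryConditions() only under a
// condition that can evaluate differently on different processors. On
// processor patches this call communicates via MPI, so if one rank skips
// it the message pattern no longer matches across ranks:
//
//     if (someLocalCondition)      // may be true on rank 0 only!
//     {
//         htot.correctBoundaryConditions();
//     }
//
// Safer pattern: every rank executes the call unconditionally (or the
// condition is reduced across all processors first):

htot.correctBoundaryConditions();
```

This is only one plausible failure mode; inconsistent field construction across processors (e.g. a field created on some ranks but not others) can produce the same truncated-message abort.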


Linse January 10, 2012 11:30


Originally Posted by Chris Lucas (Post 303991)

if I remember correctly, the problem was related to the command "correctBoundaryConditions()" which I used on a dummy field in my solver.


Hello Christian!
You mentioned a dummy field in your solver, and at the moment I am looking for some kind of a dummy field or a dummy function myself. Can you share the code for that specific field with me? Maybe it already is the solution to my problems...
