CFD Online Discussion Forums


bruce August 10, 2009 02:31

OpenFOAM v1.6 & OpenMPI & functionObjects
Hello All,

I am moving my previous post from "OpenFOAM Running / Solving / CFD" since it seems to be a bug.

Even though I first noticed this bug in my own test case, it also appears in a tutorial case of OpenFOAM, for example:

CASE: tutorials/incompressible/simpleFoam/pitzDaily
COMMAND: mpirun -np 4 simpleFoam -parallel

I am running a case in parallel with functionObjects in controlDict in OpenFOAM version 1.6.
I use the precompiled OpenFOAM-1.6 and OpenMPI packages.

- The case runs fine on a single processor with functionObjects.
- The case runs fine in parallel on multiple processors without functionObjects.

The problem is that the case does not run with functionObjects in parallel; I get a run-time error.

Here is the content of the functionObject entry I use in the controlDict file:

Code:

        type fieldMinMax;
        functionObjectLibs ("");
        log yes;
        outputControl  timeStep;
        outputInterval 1;
        mode magnitude;
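
For reference, a fieldMinMax entry normally sits inside a named sub-dictionary of the functions block in controlDict. A minimal sketch of the usual surrounding structure (the sub-dictionary name minMaxU, the library name, and the fields list are illustrative assumptions, not taken from the post above, which leaves the library list empty):

```
functions
{
    minMaxU                         // sub-dictionary name; illustrative
    {
        type               fieldMinMax;
        // the post above uses an empty list here; the usual entry is:
        functionObjectLibs ("libfieldFunctionObjects.so");
        log                yes;
        outputControl      timeStep;
        outputInterval     1;
        mode               magnitude;
        fields             (U);     // fields to monitor; illustrative
    }
}
```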

Here is the error output from the simpleFoam solver:
Time = 1

smoothSolver: Solving for Ux, Initial residual = 0.000858153, Final residual = 4.80409e-05, No Iterations 4
smoothSolver: Solving for Uy, Initial residual = 0.00247583, Final residual = 0.000145901, No Iterations 4
smoothSolver: Solving for Uz, Initial residual = 0.00376188, Final residual = 0.000214772, No Iterations 4
GAMG: Solving for p, Initial residual = 0.140115, Final residual = 0.0044083, No Iterations 2
time step continuity errors : sum local = 0.0024423, global = -1.95703e-05, cumulative = -1.95703e-05
smoothSolver: Solving for omega, Initial residual = 0.000519947, Final residual = 2.3265e-05, No Iterations 3
smoothSolver: Solving for k, Initial residual = 0.00221736, Final residual = 9.98441e-05, No Iterations 3
ExecutionTime = 46.25 s ClockTime = 47 s

[cfd4:17702] *** An error occurred in MPI_Recv
[cfd4:17702] *** on communicator MPI_COMM_WORLD
[cfd4:17702] *** MPI_ERR_TRUNCATE: message truncated
[cfd4:17702] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
mpirun has exited due to process rank 0 with PID 17702 on
node cfd4 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).


mattijs August 10, 2009 06:32

I pushed a fix to 1.6.x.

Thanks for reporting


bruce August 10, 2009 07:46

Thanks mattijs,

It works !


mathieu July 29, 2010 17:02


It seems that the bug reported by Bruce is also present in OpenFOAM-1.5-dev (rev. 1758). In short, I get the same error message with a parallel case that uses functionObjects. The case runs fine in serial. Some more details:

- The simulation stops at the end of the second timestep (even though all residuals look good).

- The bug appeared when I switched the linear solvers (for "p" and "cellDisplacement") from PCG to GAMG.

- I started the simulation with PCG, waited a few timesteps, then switched to GAMG (without stopping the simulation), and the simulation seems to run correctly... for now...

- There is no problem with the parallel simulation if I don't use a functionObject (which is, of course, not an interesting option).
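
For context, the PCG-to-GAMG switch described above is made in the system/fvSolution dictionary. A sketch of the two variants for the pressure equation (the preconditioner, smoother, and tolerance values are illustrative, not taken from this thread):

```
p
{
    // variant 1: preconditioned conjugate gradient
    // solver          PCG;
    // preconditioner  DIC;

    // variant 2: geometric-agglomerated algebraic multigrid
    solver          GAMG;
    smoother        GaussSeidel;
    agglomerator    faceAreaPair;
    nCellsInCoarsestLevel 10;
    mergeLevels     1;

    tolerance       1e-06;
    relTol          0.01;
}
```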



mathieu July 30, 2010 00:34

Hi again,

Please disregard my last post: it seems the problem is in a home-made functionObject. Sorry about that.


lakeat December 16, 2011 13:19

Hi, I am getting the same error. Could you please recall what the reason for that error was?

I am using a home-made functionObject too.

Thanks a lot.

lakeat December 16, 2011 15:14

OK, I think MPI_BUFFER_SIZE is not large enough for my case.
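
OpenFOAM sizes its attached MPI transfer buffer from the MPI_BUFFER_SIZE environment variable (a default is set in etc/settings.sh of the installation). A sketch of raising it before relaunching a parallel run; the value 200000000 is illustrative only:

```shell
# Raise OpenFOAM's MPI buffer before launching the parallel run.
# The variable is read by OpenFOAM's Pstream layer at startup;
# the value below (in bytes) is illustrative, not a recommendation.
export MPI_BUFFER_SIZE=200000000
echo "MPI_BUFFER_SIZE=$MPI_BUFFER_SIZE"
# then relaunch, e.g.:
# mpirun -np 4 simpleFoam -parallel
```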

lakeat December 16, 2011 15:37

When I decrease Nz, the number of grid points in the spanwise direction, everything works fine.

I am very confused.
