CFD Online Discussion Forums


davide_c March 22, 2012 10:54

MPI issues with my own solver
Hi everybody!

I'm not sure whether this is a running problem, a programming problem, or a bug, so I'm posting here and hoping that's OK.

I wrote my own solver (for coupled fluid-dynamic and electrical problems) which works fine in serial runs with around 10^5 mesh cells. Now that I want to use it for a non-2D case, the mesh has grown to about 1.5 million cells, so I switched to parallel runs.

I first tested in parallel with the classic damBreak tutorial, which works fine, and then switched to my own case.

I ran decomposePar on the domain - all fine - and then launched the solver in parallel, obtaining the following message:


[moon01-01:19649] *** An error occurred in MPI_Recv
[moon01-01:19649] *** on communicator MPI_COMM_WORLD
[moon01-01:19649] *** MPI_ERR_TRUNCATE: message truncated
[moon01-01:19649] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
mpirun has exited due to process rank 0 with PID 19649 on
node moon01-01 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
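
For context: MPI_ERR_TRUNCATE means a process posted a receive buffer smaller than the message that actually arrived, so MPI refuses to truncate the data and, under MPI_ERRORS_ARE_FATAL, aborts the whole job. A minimal stand-alone sketch that reproduces the same abort (a hypothetical example, not the solver's code; build with mpicxx and run with two ranks):

// Hypothetical example: rank 1 posts a receive buffer of 50 doubles while
// rank 0 sends 100, which aborts with MPI_ERR_TRUNCATE ("message truncated").
#include <mpi.h>
#include <vector>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        std::vector<double> sendBuf(100, 1.0);   // 100 values are sent ...
        MPI_Send(sendBuf.data(), 100, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        std::vector<double> recvBuf(50);         // ... into room for only 50
        MPI_Recv(recvBuf.data(), 50, MPI_DOUBLE, 0, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}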

The strange thing is that the error appears at exactly the same stage every time: I have a Newton iteration loop for a nonlinear problem, and the abort always happens after the 15th iteration, right before the 16th linear solve.

Does anyone have a hint about what's going on?
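
An abort at the same iteration on every run is typical of ranks drifting out of step: each processor sizes (or triggers) its communication from locally computed data, the local values happen to agree for the first iterations, and the first iteration at which they differ produces a receive buffer that is too small for the incoming message. A hypothetical stand-alone sketch of that pattern (the names, counts, and the iteration number are invented for illustration, not taken from the solver):

// Hypothetical sketch: the sender sizes its message from its own local count,
// the receiver sizes its buffer from *its* own local count. The counts agree
// until iteration 15, then drift apart -> MPI_ERR_TRUNCATE on rank 1.
#include <mpi.h>
#include <vector>

// Stand-in for a per-rank quantity (e.g. number of "active" interface cells)
// that should match across ranks but is computed from local data only.
static int activeCount(int rank, int iter)
{
    return 100 + ((rank == 0 && iter >= 15) ? 1 : 0);
}

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    for (int iter = 0; iter < 30; ++iter)
    {
        int n = activeCount(rank, iter);
        std::vector<double> buf(n, 0.0);

        if (rank == 0)
        {
            // Rank 0 sends n values based on its local count ...
            MPI_Send(buf.data(), n, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
        }
        else if (rank == 1)
        {
            // ... rank 1 receives into a buffer sized by its own local count.
            // At iter == 15 the incoming message is larger -> abort.
            MPI_Recv(buf.data(), n, MPI_DOUBLE, 0, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }
    }

    MPI_Finalize();
    return 0;
}

In an OpenFOAM solver the usual cure for this class of bug is to base any such count or convergence decision on a globally reduced quantity (e.g. via gSum or returnReduce), so that every processor takes the same branch and posts buffers of matching size.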

davide_c March 23, 2012 09:57

After further tests, I came to the conclusion that this is 99% likely something programming-related, so I posted another thread in the appropriate section... feel free to have a look there ;)
