CFD Online Discussion Forums


matteoL October 5, 2011 05:09

MPI communication wrong after use of gather/scatter
1 Attachment(s)
I have implemented my own version of a mesh motion solver that, when run in parallel, needs to exchange data with the other partitions.
To do so I have used the gather/scatter and IPstream/OPstream functions available in OpenFOAM.

The new library seems to work fine.

The problem is that now, when I run my code in parallel, the fluid solver doesn't work anymore!
The residuals of the U and p solvers are very high, and looking at the solution (see attached image) it seems there is no longer any communication between processors, since a steep, unphysical boundary layer appears where the different partitions touch each other.

The cube shown in the picture was decomposed into 3 subdomains with the simple (1 1 3) scheme. The correct result should be a small value everywhere except on the top (cavity problem, first time step).

Any idea how I could have managed to mess up the MPI communication in the linear solvers? I haven't touched them at all, and in my own library I have used only OpenFOAM MPI functions (i.e. gather/scatter/Pstream).
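For reference, this is roughly the gather/scatter pattern I am using. It is only a minimal sketch (it compiles only inside an OpenFOAM build, and `myLocalData` is a made-up name for the per-partition field being exchanged), but it shows the Pstream calls involved:

```cpp
// Sketch: share one scalarField per processor with every other
// processor using OpenFOAM's Pstream gather/scatter helpers.
#include "Pstream.H"
#include "List.H"
#include "scalarField.H"

using namespace Foam;

void exchangeData(const scalarField& myLocalData)
{
    if (!Pstream::parRun())
    {
        return;   // nothing to exchange in a serial run
    }

    // One slot per processor; fill in our own contribution.
    List<scalarField> allData(Pstream::nProcs());
    allData[Pstream::myProcNo()] = myLocalData;

    // Gather every processor's entry onto the master, then
    // broadcast the complete list back out to all processors.
    Pstream::gatherList(allData);
    Pstream::scatterList(allData);

    // At this point every rank holds every partition's data.
}
```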


matteoL October 10, 2011 13:19

Gather/scatter is not the problem... mesh.update() is the trigger.
After some more analysis, I have discovered that the problem is not linked at all to gather/scatter...

In fact I have no problem if I use my mesh motion solver, as long as I don't call mesh.update().

mesh.update() calls my own versions of curPoints() and solve().

Even if I set solve() to do nothing and curPoints() just to return the current mesh points without modifying them, I still get the problem described above in parallel.

How can this be possible? What else is called in mesh.update() that could create such strange behaviour?
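To make the "do nothing" test concrete, this is roughly what such a stripped-down motion solver looks like. It is a sketch only: the exact motionSolver base-class constructor signature varies between OpenFOAM versions, and the class name here is invented:

```cpp
// Sketch: a "no-op" motion solver whose solve() does nothing and
// whose curPoints() returns the current mesh points unchanged.
#include "motionSolver.H"
#include "polyMesh.H"
#include "pointField.H"

namespace Foam
{

class noOpMotionSolver
:
    public motionSolver
{
public:

    noOpMotionSolver(const polyMesh& mesh)
    :
        motionSolver(mesh)
    {}

    //- Return the current mesh points, unmodified
    virtual tmp<pointField> curPoints() const
    {
        return tmp<pointField>(new pointField(mesh().points()));
    }

    //- Do nothing
    virtual void solve()
    {}
};

} // End namespace Foam
```

Even with this solver, mesh.update() still marks the mesh as moving, which changes how the flux fields are treated downstream, so the problem may lie in that machinery rather than in the points themselves.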

Thanks again,
