CFD Online Discussion Forums

CFD Online Discussion Forums > OpenFOAM Bugs > Parallel Mesh Motion Error 1.5-dev

rlc138 October 15, 2009 22:33

Parallel Mesh Motion Error 1.5-dev

I am having trouble moving my dynamic mesh in parallel. It seems that vertices shared by more than two processors are not moved correctly. It works fine in serial mode. I am using the laplaceFaceDecomposition solver.

I have a simple test case with a fin in a rectangular flow field. I specify a velocity of the fin's patch only in the y-direction and solve using moveDynamicMesh. The figure below shows vertices on two processors in blue and vertices shared by three processors in red. The vertices in red are not being moved correctly. Has anyone else experienced this issue?

I will gladly share this test case.
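For readers setting up a similar case: the solver selections shown in the log later in this thread suggest a `constant/dynamicMeshDict` along the following lines. This is a minimal sketch only; entry keywords differ between OpenFOAM versions, and none of the values below are taken from the actual case.

```cpp
// constant/dynamicMeshDict -- minimal sketch matching the "Selecting ..."
// lines in the log below. Keyword names follow 1.5-dev conventions and
// should be checked against your installation's tutorials.
dynamicFvMesh   dynamicMotionSolverFvMesh;

solver          laplaceFaceDecomposition;

diffusivity     quadratic;
```

The patch motion itself (the y-direction velocity on the fin) would be prescribed in the boundary conditions of the motion field, not in this dictionary.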


hjasak October 16, 2009 05:30

There was some nonsense with point renumbering in 1.5 that I have merged into the -dev line but was never happy with. It is possible there is a code merge complication (=error=bug!) coming out of this.

May I ask a favour: could you please go back to OpenFOAM-1.4.1 and try the same motion problem in parallel. If that works OK (and it was tested to death...), I will be able to sort this out.

It would help if I could please get my hands on the case: would that be possible?

Help + thanks,


P.S. Apologies for the inconvenience: I would guess this has been driving you crazy for a while...

hjasak October 16, 2009 05:31

Sorry, I meant 1.4.1-dev


rlc138 October 16, 2009 19:04

Hi Hrv:

Thanks for the prompt response.

I have tried it with 1.4.1-dev and it also runs fine in serial but fails in parallel (see output below).

The case is attached. Any help you can provide will be greatly appreciated!


1.4.1-dev output:

Create mesh for time = 0
Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: laplaceFaceDecomposition
Selecting motion diffusivity: quadratic
Time = 1
PCG: Solving for motionUx, Initial residual = 0, Final residual = 0, No Iterations 0
PCG: Solving for motionUy, Initial residual = 0.177076, Final residual = 9.61766e-07, No Iterations 155
PCG: Solving for motionUz, Initial residual = 0, Final residual = 0, No Iterations 0
PCG: Solving for motionUx, Initial residual = 0, Final residual = 0, No Iterations 0
PCG: Solving for motionUy, Initial residual = 6.12771e-06, Final residual = 8.64646e-07, No Iterations 4
PCG: Solving for motionUz, Initial residual = 0, Final residual = 0, No Iterations 0
[1] --> FOAM FATAL ERROR : face 7 area does not match neighbour by 0.000291031 with tolerance 0.0001. Possible face ordering problem.
patch: procBoundary1to0 mesh face: 2317
[1] From function processorPolyPatch::calcGeometry()
[1] in file meshes/polyMesh/polyPatches/constraint/processor/processorPolyPatch.C at line 217.
FOAM parallel run exiting
[0] --> FOAM FATAL ERROR : face 66 area does not match neighbour by 0.0128847 with tolerance 0.0001. Possible face ordering problem.
patch: procBoundary0to6 mesh face: 2267

rlc138 October 16, 2009 19:17

I guess my file size was too large.

Try this (I did not zip it here because it was not downloading correctly):

Please let me know if there is anything else I can do to help debug this.

rlc138 October 21, 2009 09:42


Another user (markc) claims he had similar problems when the mesh did not have an even number of cells.
The case I posted has 6000 cells, but the decomposed meshes do not have even cell counts.


hjasak October 21, 2009 11:13

Hehe, amusing. No, it has nothing to do with the number of cells being odd (and the rabbit's foot is not involved either) :) Sorry, could not resist a joke.

This is to do with the update of globally shared points that happen to be on fixed-value boundaries. For some reason, they do not get updated properly: it looks like the calculation of globally shared points got messed up.

I will look into it further and report. Please remind me next week if you don't hear from me in the meantime (busy).


hjasak October 21, 2009 12:55

Your case works perfectly in 1.4.1-dev, i.e. the error is to do with the renumbering of vertices in parallel domains, which also breaks a lot of other things. I will take it out of 1.6-dev and revert to the correct code.

All you need to do in 1.4.1-dev is relax the check on parallel matching a little bit (OpenFOAM-1.4.1-dev/.OpenFOAM-1.4.1-dev/controlDict, under Tolerances) and all will be well.
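The relaxation Hrv describes would look roughly like the fragment below in the global controlDict. The entry name `processorMatchTol` is taken from later in this thread, and the value shown is an illustration only; check it against the Tolerances section shipped with your installation (the default matching tolerance in the error message above is 1e-4).

```cpp
// Fragment of the global controlDict (e.g. ~/.OpenFOAM-1.4.1-dev/controlDict).
// Relaxing processorMatchTol loosens the face-area check in
// processorPolyPatch::calcGeometry() that triggers the fatal error.
Tolerances
{
    processorMatchTol   1e-3;   // illustrative value; default is 1e-4
}
```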


rlc138 October 24, 2009 06:16

Great! Thanks Hrv.

...And I'm glad it had nothing to do with a rabbit's foot.


hjasak October 24, 2009 12:02

:) Actually, this one hurts and I will need to do something drastic to stop things like this from happening again. Thank you for finding it and apologies for the inconvenience.


rlc138 November 9, 2009 15:28


I have reverted to 1.4.1-dev and I can get solutions by relaxing the processorMatchTol, but the mesh matching at the interface is very ugly (see attached: proc0 in purple, proc1 in brown, from the same sample problem I sent you earlier). I have tried tightening the motionU solver convergence and tweaked a few other things, but I cannot get any improvement. Do you have any suggestions?


jhoepken August 10, 2010 08:57

Unfortunately, I'm currently having the same problems you had. I have a blockMesh grid, which runs just fine. I ran snappyHexMesh on that grid, without adding layers, in order to get a finer grid. If I apply the mesh motion (1.5-dev) to that refined snappy grid, some nodes seem not to be moved at all, and as a result I obtain highly skewed and non-orthogonal cells (see attached image).

As you've suggested, I've relaxed the processorMatchTol in etc/controlDict, but without any improvement. Do you have any advice?

rlc138 August 10, 2010 09:24

I ended up implementing my own mesh motion solver. I made it specific to the problem I was solving to keep it simple and robust.

jhoepken August 10, 2010 09:26

Thanks for your quick reply, but this is exactly what I am trying to avoid ;)

hchen May 7, 2015 10:40


I know it is an old post, but I would appreciate any suggestions on running dynamic mesh motion in parallel. I have opened a new thread here:
