CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM (http://www.cfd-online.com/Forums/openfoam/)
-   -   Parallel Moving Mesh Bug for Multi-patch Case (http://www.cfd-online.com/Forums/openfoam/64718-parallel-moving-mesh-bug-multi-patch-case.html)

albcem May 21, 2009 00:23

Parallel Moving Mesh Bug for Multi-patch Case
 
Hello all,

I am testing mesh motion for a simple case that consists of a rectangular block with six patches, one per face. I use the pointMotionU field (the point motion velocity) to drive the mesh motion.
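For reference, the mesh-motion setup in constant/dynamicMeshDict looks roughly like this — a sketch only, with keyword names from memory (exact keywords vary between OpenFOAM versions); the two selections match the "dynamicMotionSolverFvMesh" and "velocityLaplacian" lines in the log further down:

```
// constant/dynamicMeshDict (sketch -- exact keywords vary by version)
dynamicFvMesh   dynamicMotionSolverFvMesh;

solver          velocityLaplacian;   // Laplacian smoothing of pointMotionU

diffusivity     uniform;             // motion diffusivity model (assumed here)
```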

If I run a solver like "interDyMFoam" on a single processor, everything runs fine. However, if I decompose the case onto 2 or more processors and continue the run in parallel, I get errors similar to those below.

It seems mesh.update() does not correctly interpret the number of points to move on the patches once they are distributed across the processors. If I define the whole cube as a single patch and run in parallel, I do not run into any issues.
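The mismatch is visible directly in the decomposed field files: the patch entry in processorN/0.01/pointMotionU declares one size while its value list contains a different number of vectors. A rough way to confirm this is a stand-alone check like the one below — a hypothetical script, not part of OpenFOAM, assuming the usual "nonuniform List&lt;vector&gt; N ( ... );" file layout:

```python
import re

def declared_vs_actual(field_text):
    """Scan OpenFOAM-style field text for 'nonuniform List<vector>'
    entries and return (declared_count, actual_tuple_count) pairs."""
    pairs = []
    for m in re.finditer(r'List<vector>\s+(\d+)\s*\(([^;]*)\)\s*;', field_text):
        declared = int(m.group(1))                           # count written in the file
        actual = len(re.findall(r'\([^()]*\)', m.group(2)))  # vectors actually present
        pairs.append((declared, actual))
    return pairs

# Minimal example of the reported symptom: declared count off by one.
sample = """
Block_Side2
{
    type  fixedValue;
    value nonuniform List<vector> 3
    (
        (0 0 0)
        (0 0 0)
    );
}
"""
for declared, actual in declared_vs_actual(sample):
    if declared != actual:
        print("size %d is not equal to the given value of %d" % (actual, declared))
```

Running this over each processor directory's pointMotionU should flag exactly the patches named in the error messages below.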

Some confirmation of whether this is a genuine bug, and any insight into resolving it, would be appreciated.

Thanks

Cem

Exec : rasInterDyMFoam -parallel
Date : May 20 2009
Time : 19:59:13
Host :
PID : 8833
Case :
nProcs : 2
Slaves :
1
(
goldenhorn.8834
)

Pstream initialized with:
floatTransfer : 1
nProcsSimpleSum : 0
commsType : nonBlocking

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.01

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
[1] [0]
[0]
[0] size 834 is not equal to the given value of 835
[0]
[0] file: /MultiPatchCube/processor0/0.01/pointMotionU::Block_Side2 from line 3800 to line 3801.
[0]
[0] From function Field<Type>::Field(const word& keyword, const dictionary& dict, const label s)
[0] in file /home/cae/OpenFOAM/OpenFOAM-1.5/src/OpenFOAM/lnInclude/Field.C at line 224.
[0]
FOAM parallel run exiting
[0]
[goldenhorn.caebridge.com:08833] MPI_ABORT invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1

[1]
[1] size 889 is not equal to the given value of 890
[1]
[1] file: /MultiPatchCube/processor1/0.01/pointMotionU::Block_Bottom from line 1627 to line 1628.
[1]
[1] From function Field<Type>::Field(const word& keyword, const dictionary& dict, const label s)
[1] in file /home/cae/OpenFOAM/OpenFOAM-1.5/src/OpenFOAM/lnInclude/Field.C at line 224.
[1]
FOAM parallel run exiting
[1]
[goldenhorn:08834] MPI_ABORT invoked on rank 1 in communicator MPI_COMM_WORLD with errorcode 1
mpirun noticed that job rank 1 with PID 8834 on node goldenhorn exited on signal 123 (Unknown signal 123).

