
Parallel Moving Mesh Bug for Multi-patch Case

May 21, 2009, 00:23   #1
Cem Albukrek (albcem)
Member | Join Date: Mar 2009 | Posts: 50
Hello all,

I am testing mesh motion for a simple case consisting of a rectangular block with 6 patches, one per face. I use the point motion velocity field (pointMotionU) to drive the mesh motion.
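
The relevant parts of the setup look roughly like this. The exact keywords may differ slightly between versions, and the diffusivity entry and boundary values are only illustrative; the patch names Block_Side2 and Block_Bottom are the ones that appear in the error below:

Code:
// constant/dynamicMeshDict (sketch; diffusivity model is illustrative)
dynamicFvMesh       dynamicMotionSolverFvMesh;
motionSolverLibs    ("libfvMotionSolvers.so");
solver              velocityLaplacian;
diffusivity         uniform;

// 0/pointMotionU boundaryField (two of the six patches shown; values are placeholders)
boundaryField
{
    Block_Bottom
    {
        type            fixedValue;
        value           uniform (0 0 0);
    }
    Block_Side2
    {
        type            fixedValue;
        value           uniform (0 0 0.1);
    }
    // ... remaining four patches defined similarly
}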

If I run a solver such as interDyMFoam (rasInterDyMFoam in this case) on a single processor, everything runs correctly. However, if I decompose the case onto 2 or more processors and continue the run in parallel, I get errors like the ones below.
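
The decomposition itself is nothing unusual. It looks along these lines, where numberOfSubdomains matches the two-processor run in the log and the decomposition method and coefficients are illustrative:

Code:
// system/decomposeParDict (sketch; method and coefficients are placeholders)
numberOfSubdomains  2;
method              simple;

simpleCoeffs
{
    n               (2 1 1);
    delta           0.001;
}

After decomposePar, the parallel run is started with mpirun -np 2 rasInterDyMFoam -parallel, as shown in the log below.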

It seems that mesh.update() does not correctly determine the number of points to move on patches that are split across processors. If I define the whole block as a single patch and run in parallel, I do not run into any issues.
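
To make the failure mode concrete: each processor's pointMotionU file carries a list of values for every patch field, and when the field is read back the constructor checks the number of values against the number of patch points on the processor mesh. The following is only a simplified illustration of that kind of consistency check, not the actual Field.C code:

Code:
#include <cstdio>
#include <cstdlib>
#include <vector>

// Simplified stand-in for the size check that aborts the run:
// the number of values read from the decomposed patch field must
// equal the number of patch points the processor mesh expects.
void checkPatchFieldSize(const std::vector<double>& readValues,
                         std::size_t expectedPatchPoints,
                         const char* patchName)
{
    if (readValues.size() != expectedPatchPoints)
    {
        // Same shape as "size 834 is not equal to the given value of 835"
        std::fprintf(stderr,
            "size %zu is not equal to the given value of %zu on patch %s\n",
            readValues.size(), expectedPatchPoints, patchName);
        std::exit(1);
    }
}

In my case the value count in the decomposed pointMotionU on Block_Side2 and the point count of that patch on the processor0 mesh differ by one (834 vs 835), and similarly for Block_Bottom on processor1 (889 vs 890).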

Confirmation of whether this is a genuine bug, and any insight into resolving it, would be appreciated.

Thanks

Cem

Exec : rasInterDyMFoam -parallel
Date : May 20 2009
Time : 19:59:13
Host :
PID : 8833
Case :
nProcs : 2
Slaves :
1
(
goldenhorn.8834
)

Pstream initialized with:
floatTransfer : 1
nProcsSimpleSum : 0
commsType : nonBlocking

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.01

Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
[1] [0]
[0]
[0] size 834 is not equal to the given value of 835
[0]
[0] file: /MultiPatchCube/processor0/0.01/pointMotionU::Block_Side2 from line 3800 to line 3801.
[0]
[0] From function Field<Type>::Field(const word& keyword, const dictionary& dict, const label s)
[0] in file /home/cae/OpenFOAM/OpenFOAM-1.5/src/OpenFOAM/lnInclude/Field.C at line 224.
[0]
FOAM parallel run exiting
[0]
[goldenhorn.caebridge.com:08833] MPI_ABORT invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1

[1]
[1] size 889 is not equal to the given value of 890
[1]
[1] file: /MultiPatchCube/processor1/0.01/pointMotionU::Block_Bottom from line 1627 to line 1628.
[1]
[1] From function Field<Type>::Field(const word& keyword, const dictionary& dict, const label s)
[1] in file /home/cae/OpenFOAM/OpenFOAM-1.5/src/OpenFOAM/lnInclude/Field.C at line 224.
[1]
FOAM parallel run exiting
[1]
[goldenhorn:08834] MPI_ABORT invoked on rank 1 in communicator MPI_COMM_WORLD with errorcode 1
mpirun noticed that job rank 1 with PID 8834 on node goldenhorn exited on signal 123 (Unknown signal 123).

