Home > Forums > OpenFOAM Programming & Development

neighbour boundary has no faces in parallel


August 3, 2013, 23:52   #1
New Member

David
Join Date: Mar 2010
Location: Vancouver, Canada
Posts: 13
I have a multi-region domain, consisting of three blocks:

block_left, block_centre, block_right

There are the obvious exterior boundaries, plus the four interface patches between the blocks:

for block_left:
block_left_to_block_centre

for block_centre:
block_centre_to_block_left
block_centre_to_block_right

for block_right:
block_right_to_block_centre

I have a custom boundary condition coupling the blocks. It works perfectly in serial; in parallel, however, it stops working.

It appears that the issue is that some of the interface patches end up with no faces on a given processor after the decomposition.
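One way to avoid empty interface patches entirely is to constrain the decomposition so that all faces of each mapped patch land on a single processor. A sketch of a per-region decomposeParDict using the preservePatches keyword (the file path, subdomain count, and method here are assumptions — adapt them to your case, and check that your OpenFOAM version honours this keyword for mapped patches):

```
// system/block_r/decomposeParDict -- hypothetical sketch
numberOfSubdomains 2;

method          scotch;

// Keep all faces of these patches together on one processor, so the
// mapped interface never decomposes down to nFaces 0 locally.
preservePatches (block_r_to_block_c);
```

Even with such a constraint, a coupled boundary condition should still handle a zero-size local patch gracefully, since other decompositions may produce one.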

Here is a section of code from the boundary condition and the output from the solver log file.

** CODE **

int oldTag = UPstream::msgType();
UPstream::msgType() = oldTag + 1;

// Mapping information for this mapped patch (sample region/patch)
const mappedPatchBase& mpp =
    refCast<const mappedPatchBase>(this->patch().patch());

const polyMesh& neighbourMesh = mpp.sampleMesh();
const fvPatch& neighbourPatch =
    refCast<const fvMesh>(neighbourMesh)
        .boundary()[mpp.samplePolyPatch().index()];

const fvMesh& principalMesh = patch().boundaryMesh().mesh();

// NB: in parallel, Info only prints from the master process;
// use Pout to see the values on every processor.
Info<< tab << "******************************" << endl;
Info<< "PRINCIPAL NAME" << nl << principalMesh.name() << endl;
Info<< "NEIGHBOUR NAME" << nl << neighbourMesh.name() << endl;

Info<< "PRINCIPAL PATCH" << nl << this->patch().patch() << endl;
Info<< "NEIGHBOUR PATCH" << nl << neighbourPatch.patch() << endl;
Info<< tab << "******************************" << endl;

// Force a recalculation of mapping and schedule
const mapDistribute& distMap = mpp.map();
(void)distMap.schedule();

// Local (per-processor) unit normals; these fields are empty when the
// local slice of the patch has no faces after decomposition
vectorField pUnitNormal = this->patch().Sf()/this->patch().magSf();
vectorField nUnitNormal = neighbourPatch.Sf()/neighbourPatch.magSf();

Info<< "p Unit Normal" << nl << pUnitNormal << endl;
Info<< "n Unit Normal" << nl << nUnitNormal << endl;

**OUTPUT**

PRINCIPAL NAME
block_r
NEIGHBOUR NAME
block_c
PRINCIPAL PATCH
type mappedWall;
nFaces 0;
startFace 125;
sampleMode nearestPatchFace;
sampleRegion block_c;
samplePatch block_c_to_block_r;
offsetMode uniform;
offset (0 0 0);

NEIGHBOUR PATCH
type mappedWall;
nFaces 6;
startFace 119;
sampleMode nearestPatchFace;
sampleRegion block_r;
samplePatch block_r_to_block_c;
offsetMode uniform;
offset (0 0 0);

******************************
p Unit Normal
0()
n Unit Normal
6{(1 0 0)}


This behaviour alternates: on one processor the principal patch has an empty unit-normal field, on another the neighbour patch does. It correlates exactly with nFaces being 0 for that patch in the local decomposition.

Does anyone have some insight?

Your help is much appreciated.

PG
Tags
boundary, boundary conditions, normal, openfoam, parallel
