|
Averaging over neighbouring cells in parallel |
|
March 16, 2011, 07:09 |
Averaging over neighbouring cells in parallel
|
#1 |
New Member
John O'Sullivan
Join Date: Mar 2009
Location: Auckland, New Zealand
Posts: 7
Rep Power: 17
Hi everyone,
I'm having trouble working out how to access neighbouring cells when running my application in parallel. As part of my algorithm I need to filter/smooth the values of the Reynolds stresses based on the neighbouring cells. It works fine in serial, but in parallel it generates problematic results because the cellCells() call only returns the neighbours on the same processor. I've read lots of posts and found some info referring to Pstream and globalMesh, but I must admit I'm a novice at parallel programming, so any help would be great. I thought there might be a simple way to tell the application to look across the entire domain when doing the averaging?
Here's my code as it is:

forAll(mesh_.cells(), cellI)
{
    const labelList& nbrs = mesh_.cellCells()[cellI];

    // Start the volume-weighted sums with the cell's own contribution
    scalar volCell = mesh_.V()[cellI];
    scalar volume = volCell;
    scalar volRxx = volCell*R_[cellI].xx();
    scalar volRxy = volCell*R_[cellI].xy();
    scalar volRxz = volCell*R_[cellI].xz();
    scalar volRyy = volCell*R_[cellI].yy();
    scalar volRyz = volCell*R_[cellI].yz();
    scalar volRzz = volCell*R_[cellI].zz();

    // Add the contributions from the (on-processor) neighbours
    forAll(nbrs, nbrI)
    {
        scalar volNbr = mesh_.V()[nbrs[nbrI]];
        volume += volNbr;
        volRxx += volNbr*R_[nbrs[nbrI]].xx();
        volRxy += volNbr*R_[nbrs[nbrI]].xy();
        volRxz += volNbr*R_[nbrs[nbrI]].xz();
        volRyy += volNbr*R_[nbrs[nbrI]].yy();
        volRyz += volNbr*R_[nbrs[nbrI]].yz();
        volRzz += volNbr*R_[nbrs[nbrI]].zz();
    }

    Rbuffer[cellI].xx() = volRxx/volume;
    Rbuffer[cellI].xy() = volRxy/volume;
    Rbuffer[cellI].xz() = volRxz/volume;
    Rbuffer[cellI].yy() = volRyy/volume;
    Rbuffer[cellI].yz() = volRyz/volume;
    Rbuffer[cellI].zz() = volRzz/volume;
}

Thanks!
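For comparison, here is a minimal sketch of a parallel-safe variant of the same kind of smoothing built from face interpolation, assuming R_ is a volSymmTensorField registered on mesh_. fvc::interpolate and fvc::average handle the value exchange across coupled (processor) patches internally, so no explicit communication is needed; note, however, that the result is weighted by face area rather than by neighbour cell volume, so it is not identical to the loop above.

#include "fvCFD.H"

// Sketch only: interpolate R to the faces, then take the area-weighted
// average of the face values back to the cells. Both steps use the coupled
// patch machinery, so processor boundaries are handled automatically.
// Assumes R_ is a volSymmTensorField defined on mesh_.
volSymmTensorField Rsmooth
(
    "Rsmooth",
    fvc::average(fvc::interpolate(R_))
);

If this lives inside a library class rather than a solver, the specific fvc headers (e.g. fvcAverage.H and surfaceInterpolate.H) would be included instead of fvCFD.H.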
|
March 21, 2011, 23:28 |
|
#2 |
New Member
John O'Sullivan
Join Date: Mar 2009
Location: Auckland, New Zealand
Posts: 7
Rep Power: 17
No one with any ideas? I've spent a lot of time looking through the source code for something similar but haven't come up with anything yet.
Thanks!
|
April 11, 2011, 16:46 |
|
#3 |
Senior Member
Eugene de Villiers
Join Date: Mar 2009
Posts: 725
Rep Power: 21
All coupled boundaries of geometric fields have a member function called "patchNeighbourField". I think this should do what you need, provided the property you want is stored in a geometric field.
Otherwise you want to use syncTools, but that's a bit more complicated.
Eugene
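A rough sketch of the patchNeighbourField route is below; the names volumeSum, volRSum and cellVol are placeholders. volumeSum and volRSum stand for the per-cell accumulators built by the local cellCells() loop (a scalarField and a symmTensorField of size mesh_.nCells()), and cellVol is a volScalarField filled from mesh_.V() on which correctBoundaryConditions() has been called, so that the neighbouring cell volumes can be fetched across processor patches in the same way as R_.

#include "coupledFvPatchField.H"  // needed for the refCast, if not already included

// Sketch only: add the contributions from the cells on the far side of
// coupled (processor/cyclic) patches to the per-cell accumulators
// volumeSum and volRSum that the local cellCells() loop has already built.
forAll(mesh_.boundary(), patchI)
{
    const fvPatch& curPatch = mesh_.boundary()[patchI];

    if (curPatch.coupled())
    {
        // Cells on this side of the patch, one per patch face
        const labelUList& owners = curPatch.faceCells();

        // Values held by the cells on the other side of the patch
        const symmTensorField nbrR
        (
            refCast<const coupledFvPatchField<symmTensor>>
            (
                R_.boundaryField()[patchI]
            ).patchNeighbourField()
        );
        const scalarField nbrVol
        (
            refCast<const coupledFvPatchField<scalar>>
            (
                cellVol.boundaryField()[patchI]
            ).patchNeighbourField()
        );

        forAll(owners, faceI)
        {
            const label own = owners[faceI];

            volumeSum[own] += nbrVol[faceI];
            volRSum[own]   += nbrVol[faceI]*nbrR[faceI];
        }
    }
}

Dividing volRSum by volumeSum afterwards gives the smoothed stresses, exactly as in the serial loop. Note that this also pulls in values across cyclic patches, since those are coupled too; if that is not wanted, test for isA<processorFvPatch>(curPatch) instead of coupled().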
|
October 29, 2015, 08:39 |
|
#4 |
Senior Member
Artur
Join Date: May 2013
Location: Southampton, UK
Posts: 372
Rep Power: 20
Hi All,
Thanks, Eugene, for posting about patchNeighbourField(). Does anybody know whether the neighbour data is already stored on this process (Pstream::myProcNo()), or whether calling this method sends a request to the processor solving the other side of the domain? I got a bit entangled in the code and can't seem to figure this one out.
Thanks,
A
|
Tags |
neighbouring cells, parallel |