CFD Online Discussion Forums

-   OpenFOAM Native Meshers: snappyHexMesh and Others (http://www.cfd-online.com/Forums/openfoam-meshing-snappyhexmesh/)
-   -   SnappyHex on a decomposed case (http://www.cfd-online.com/Forums/openfoam-meshing-snappyhexmesh/114538-snappyhex-decomposed-case.html)

Pj. March 13, 2013 05:06

SnappyHex on a decomposed case (disappearing boundary conditions)
 
Hi everybody,

this is my first post, so I'll introduce myself: I'm an engineering student in the final year of the master's degree at the Politecnico di Milano.

I am having some trouble running snappyHexMesh in parallel.

I need to use both snappyHexMesh and decomposePar. If I run snappyHexMesh and then decomposePar, there is no problem: in the boundary conditions of p, U, etc. I've already inserted the XXX_patch0 entries that snappyHexMesh will create. So after I've "snapped" the mesh, I can decompose it and solve it.
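
For example, in my 0/p I have something like this before meshing (the spires_patch0 name comes from my geometry; zeroGradient is just the condition I use for pressure):

```
// 0/p, boundaryField section (excerpt; values are only from my case)
boundaryField
{
    spires_patch0       // patch that snappyHexMesh will create from the STL
    {
        type            zeroGradient;
    }
    // ... other patches from blockMesh ...
}
```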

Instead, if I want to run snappyHexMesh on multiple cores, I have to decompose the case first. But the files p, U, etc. in the processorX folders "lose" the XXX_patch0 boundary conditions. So when I run snappyHexMesh and then launch the case, it gives me the error

keyword spires_patch0 is undefined in dictionary "/[...]/B0fine/processor0/0/p::boundaryField"

while the keyword is in fact defined in

"/[...]/B0fine/0/p::boundaryField"

How can I tell decomposePar to keep those boundary conditions even if it doesn't see them in the blockMesh boundary file?

Thank you very much

hayes March 13, 2013 07:47

Hi Luca,

the problem is that after meshing with snappyHexMesh there is no information about your STL patches within the processor directories (processor0, processor1, ...) yet. So what you can do is reconstruct the mesh back into one master mesh, remove the processor folders and decompose again. With these commands it should run without any problems:

blockMesh
decomposePar
mpirun --hostfile YOURHOSTFILE -np ... snappyHexMesh -overwrite -parallel
reconstructParMesh -constant
rm -r processor*
decomposePar
mpirun --hostfile YOURHOSTFILE -np ... simpleFoam -parallel
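
Both decomposePar calls read system/decomposeParDict, and numberOfSubdomains there has to match the -np you pass to mpirun. A minimal example (the method and counts are only illustrative):

```
// system/decomposeParDict (illustrative values)
numberOfSubdomains 4;       // must match -np above

method          simple;

simpleCoeffs
{
    n           (2 2 1);    // 2 x 2 x 1 = 4 subdomains
    delta       0.001;
}
```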

Best regards,
Chris

Pj. March 13, 2013 10:02

I'll try this tomorrow, but I don't understand: if the boundary conditions on XXX_patch0 get lost during the first decomposePar, when do they get recovered in the process you showed me? Is it the "-constant" option of reconstructParMesh?

hayes March 13, 2013 13:46

The mesh that first gets decomposed is the one created by blockMesh, so no patches exist other than those specified in your boundary dictionary. Consequently, during decomposePar on this background mesh there is no need to copy any initial boundary information for patches that do not exist yet.

After snappyHexMesh the STL surface is included as a patch, but there is still no boundary information for the new patch in the decomposed 0 directories.
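
For example, after the parallel meshing each processor*/constant/polyMesh/boundary will contain a new entry roughly like this (the type and face counts are only illustrative):

```
// constant/polyMesh/boundary (excerpt, illustrative values)
spires_patch0
{
    type            wall;
    nFaces          1240;
    startFace       58211;
}
```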

Pj. March 14, 2013 00:04

Oh... so during decomposePar it checks whether the boundary conditions in 0/* are applicable; since it doesn't see any XXX_patchX, it discards those boundary conditions.

The second time, it reads the boundary conditions in 0/* again, and since those patches now exist it keeps them.

Am I correct?

Anyway I'm trying right now. I will see...

Pj. March 15, 2013 00:34

I did what you said and it worked perfectly.

I have to say I'm a bit surprised that decomposeParDict doesn't have an option to tell it to keep all the boundary conditions that appear in the 0 folder fields, even those that refer to patches that are not (yet) in the constant/polyMesh/boundary file.

That would avoid this whole workaround.

Anyway, the problem has been solved, so everything is fine.

Thank you very much for the very fast help,

Bye

