CFD Online Discussion Forums: Running in parallel
(https://www.cfd-online.com/Forums/openfoam-solving/112051-running-parallel.html)

Djub January 21, 2013 07:27

Running in parallel
 
Hi dear Foamers,

I am dealing with running a large case in parallel.
Previously, I performed the steps in this order:
Code:

blockMesh
snappyHexMesh
snappyHexMesh
decomposePar
mpirun -n 14 pimpleFoam -parallel

(I won't go into the copying of dictionaries and boundary conditions here.)
And it worked quite well. Nevertheless, I think it was not 100% correct, because, for example, ParaView is not able to read the decomposed case; I have to reconstruct it before plotting it.

Note: I run sHM twice because I want to snap the mesh to one object but not to the other. Thus, I use two STLs and two different sHMDicts.
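
For reference, a minimal sketch of one way to drive such a two-pass run, assuming the two dictionaries are kept under hypothetical names in system/ and selected with snappyHexMesh's -dict option:
Code:

# first pass: snap to the first object (dictionary names are assumptions)
snappyHexMesh -dict system/snappyHexMeshDict.object1 -overwrite
# second pass: mesh around the second object without snapping to it
snappyHexMesh -dict system/snappyHexMeshDict.object2 -overwrite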

Now I also want to use renumberMesh and to run sHM in parallel. Thus, my new steps are:
Code:

blockMesh
decomposePar
mpirun -n 14 snappyHexMesh -parallel -overwrite
mpirun -n 14 snappyHexMesh -parallel -overwrite
mpirun -n 14 renumberMesh -parallel -overwrite
mpirun -n 14 pimpleFoam -parallel
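
Note that decomposePar and the mpirun ... -parallel commands above assume a system/decomposeParDict along these lines (a minimal sketch; scotch is just one possible decomposition method):
Code:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  14;

method              scotch;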

First question: I found these steps in tutorials/incompressible/pisoFoam/les/motorBike/motorBike/. In this tutorial, they delete all the *level* files:
Code:

find . -type f -iname "*level*" -exec rm {} \;
Why? Is this necessary?

Second question: when I open the case in ParaView as a "decomposed case", I can see the boundaries between the 14 processor domains. Is this normal?
When I try reconstructPar -zeroTime, it crashes:
Code:

--> FOAM FATAL ERROR:
Size of maps does not correspond to size of mesh for processor 0
faceProcAddressing : 1023858 nFaces : 1028376
cellProcAddressing : 335957 nCell : 337705
boundaryProcAddressing : 10 nFaces : 11

When I try reconstructParMesh, it does not work on time 0; it needs the -constant option. After that, reconstructPar -zeroTime works fine, and ParaView works fine on the reconstructed case.
Can anyone explain this whole process to me? Why does it work on constant but not on time 0?
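
In other words, the sequence that ends up working here is: reconstruct the mesh first, then the fields:
Code:

# rebuild the undecomposed mesh from the processor meshes
reconstructParMesh -constant
# then reconstruct the time-0 fields onto that mesh
reconstructPar -zeroTime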

Do you need any more information to help me?

wyldckat January 23, 2013 17:43

Greetings Julien,

I don't have much time to go into details, so I'll be succinct:
  1. The command you showed in your first question removes files that would otherwise just get in the way of decomposition and reconstruction (see the note below on which files these are).
  2. It's simpler if you take a look at the cases shared here: http://code.google.com/p/bluecfd-sin...untimes202_211
I know there is a thread on this forum that discusses this topic in more detail, but I can't find it right now :(
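
For context, the *level* files are the mesh refinement information that snappyHexMesh writes next to the mesh; in a typical layout (an assumption; the exact set varies by version) they look like:
Code:

constant/polyMesh/cellLevel
constant/polyMesh/pointLevel
constant/polyMesh/refinementHistory
processor0/constant/polyMesh/cellLevel
...

Since each processor mesh carries its own copies, stale copies can conflict with a fresh decomposition or reconstruction, which is presumably why the motorBike tutorial deletes them first.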


Best regards,
Bruno

Djub January 24, 2013 05:07

Thanks Bruno,
I think I've understood a bit more about how it works. In the BlueCFD tutorial, you reconstruct the mesh (only the mesh) and then decompose again. That forces the base case to correspond to the decomposed case. Nice!
So I suppose I can copy my boundary conditions just before running the final decomposition? Only once, instead of once per processor? (A sketch of the full cycle follows after the commands below.)
That is:
Code:

cp -r 0.org/* 0/
before final decomposition, instead of
Code:

ls -d processor* | xargs -i cp -r 0.org/* ./{}/0/
after final decomposition.
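
Putting it together, a minimal sketch of the whole reconstruct-then-redecompose cycle (an assumption of how the pieces combine; 0.org holds the initial fields, and decomposePar -force overwrites the existing processor directories):
Code:

blockMesh
decomposePar
mpirun -n 14 snappyHexMesh -parallel -overwrite
mpirun -n 14 snappyHexMesh -parallel -overwrite
# bring the final mesh back to the undecomposed base case
reconstructParMesh -constant
# copy the initial fields once, into the base case only
cp -r 0.org/* 0/
# decompose again so the processor cases match the base case
decomposePar -force
mpirun -n 14 renumberMesh -parallel -overwrite
mpirun -n 14 pimpleFoam -parallel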

Thanks for this tutorial!

wyldckat January 24, 2013 16:01

Hi Julien,

Quote:

Originally Posted by Djub (Post 403729)
So I suppose I can copy my boundary conditions just before running the final decomposition? Only once, instead of once per processor?
That is:
Code:

cp -r 0.org/* 0/
before final decomposition, instead of
Code:

ls -d processor* | xargs -i cp -r 0.org/* ./{}/0/
after final decomposition.

:confused: Just in case those two weren't rhetorical questions, the answers are:
  1. Yes...
  2. And yes. :)
But like I said, I know there is a thread somewhere that goes into more detail... mmm, in fact there are two:
  1. http://www.cfd-online.com/Forums/ope...omposepar.html
  2. http://www.cfd-online.com/Forums/ope...ssorx-0-a.html
Best regards,
Bruno

