Parallel meshing with OP 2.0.0
I am currently testing the new snappyHexMesh in OF-2.0.0/OF-2.0.x (the feature edge handling): it works in serial mode but not in parallel.
So I deactivated the feature edge handling in the snappyHexMeshDict and retried in parallel, without more luck. The same case works fine in parallel with OF-1.7.x.
The error message is:
"--> FOAM FATAL ERROR:
From function UIPStream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber) in file UIPread.C at line 114
FOAM parallel run aborting"
Is this a known problem of OF-2.0.0, or is it specific to my case?
Thank you all,
Easy... snappyHexMesh needs a parallel decomposer... use ptscotch.
Create two decomposeParDicts, one for running and the other for meshing, or keep a single dict and comment/uncomment the method line, something like:
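A minimal sketch of what that single-dict approach looks like (the method entries are standard decomposeParDict keywords; numberOfSubdomains is a placeholder for your own case):

```
// system/decomposeParDict -- keep one method line commented, the other active

numberOfSubdomains 4;

method          scotch;      // use when decomposing for the solver run
// method       ptscotch;    // switch to this before running snappyHexMesh in parallel
```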
ptscotch is not available. Methods available are :
Then my case doesn't seem to be the problem: as I said, parallel snappyHexMesh works very well in OpenFOAM-1.7.x but not in 2.0.0/x.
So no... it's not that easy...
I'm using the approach I described, in OF-2.0.x, with success... To decompose for snappyHexMesh you have to use ptscotch; to decompose for the run you have to use scotch.
Verify that you have compiled all the related code in ThirdParty-2.0.x.
canesin@Privado-PC:~/OpenFOAM/canesin-2.0.x/run/naca0012-snappy$ head -n 200 log.snappyHexMesh\ -overwrite
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.x |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
Build : 2.0.x-90da72aa43cd
Exec : snappyHexMesh -overwrite -parallel
Date : Jul 28 2011
Time : 10:13:11
Host : Privado-PC
PID : 10117
Case : /home/canesin/OpenFOAM/canesin-2.0.x/run/naca0012-snappy
nProcs : 4
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create mesh for time = 0
Read mesh in = 0.03 s
Overall mesh bounding box : (-2 0 0.01) (5 2 0.26)
Relative tolerance : 1e-06
Absolute matching distance : 7.284401142e-06
Reading refinement surfaces.
Read refinement surfaces in = 0.35 s
Is there any "ptscotch" folder in your ThirdParty directory? I only have a "scotch_5.1.11" folder in the ThirdParty-2.0.x directory.
It is the "scotch_5.1.11" folder (ptscotch is built from the same sources).
Let me be a little clearer; I will try to spell out the process:
I understood that point. The problem is that ptscotch is not in the "Valid decompositionMethods" list, which is:
I'm trying to rebuild the scotch library.
Sorry, I believe it's my English that's not helping.
You will not decompose with ptscotch... you will decompose with scotch.
But before you run snappyHexMesh you will change the method to ptscotch.
That's because snappy uses it to generate the decomposition graph used to redistribute the mesh while meshing in parallel.
So ptscotch will not appear in decomposePar's valid decomposition method list, and that's OK.
Ok I got it.
I launched decomposePar with scotch.
Then I modified the decomposeParDict, changing the method to ptscotch.
And finally I launched snappyHexMesh under MPI.
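For reference, the sequence described above as shell commands (a sketch; assumes a 4-processor case and a sourced OpenFOAM-2.0.x environment):

```shell
# 1. decompose the background mesh
#    (system/decomposeParDict contains:  method scotch;)
decomposePar

# 2. edit system/decomposeParDict by hand and set:  method ptscotch;

# 3. run the mesher in parallel
mpirun -np 4 snappyHexMesh -overwrite -parallel
```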
It crashed: I got the same error message I wrote in my first post:
Right... It looks like a communication problem... I don't have the time now to look deeper into the problem...
But, as usual, a communication problem should be one of two things:
- a race condition
- a ghost/dangling pointer
Both can be worked around by changing the commsType in the global controlDict:
I have switched this option in the controlDict at ~/OpenFOAM-2.0.x/etc/controlDict.
But there is no change (except for the commsType line in the log, of course).
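For anyone following along, the entry being discussed sits in the OptimisationSwitches block of the global controlDict; a sketch of the edit (the list of alternative values is my reading of the option, not something stated in the thread):

```
// ~/OpenFOAM-2.0.x/etc/controlDict
OptimisationSwitches
{
    // one of: blocking, scheduled, nonBlocking
    commsType       blocking;
}
```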
Are you using a different MPI version?
mpirun --version :
-> mpirun (Open MPI) 1.5.3
which mpirun :
That's odd... but, as it is working here, try recompiling with SYSTEMOPENMPI. I'm using mpirun (Open MPI) 1.4.1.
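Switching to the system MPI is controlled by the WM_MPLIB variable in the OpenFOAM environment file; a hedged sketch of the steps (the Pstream rebuild path is from memory, so check your own tree):

```shell
# in ~/OpenFOAM-2.0.x/etc/bashrc (or a user prefs file), set:
#   export WM_MPLIB=SYSTEMOPENMPI    # instead of the ThirdParty OPENMPI

# then re-source the environment and rebuild the parallel layer:
source ~/OpenFOAM-2.0.x/etc/bashrc
cd ~/OpenFOAM-2.0.x/src/Pstream && ./Allwmake
```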
Hi Fabio and Aurelien,
The other day it was reported that OpenMPI 1.5.3 is pretty much a beta release and there are still some bugs in it: http://www.cfd-online.com/Forums/ope...s-openmpi.html
Nonetheless, Aurelien can you please try the incompressible/windSimpleFoam/turbineSiting tutorial? It's a practical example of using snappyHexMesh in parallel! (source of this information)
Simply run ./Allrun in that folder.
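Concretely (assuming $FOAM_TUTORIALS is set by the OpenFOAM environment, as usual):

```shell
cp -r $FOAM_TUTORIALS/incompressible/windSimpleFoam/turbineSiting .
cd turbineSiting
./Allrun    # decomposes, runs snappyHexMesh in parallel, then the solver
```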
I've just executed this tutorial with no problems and windSimpleFoam converged in 73 iterations after a 30s run! And I'm using OpenMPI 1.5.3 that comes with OpenFOAM.
Have you found a solution to your problem? I had exactly the same problem as you. I have no problem running the incompressible/windSimpleFoam/turbineSiting tutorial, and no problem running serial snappyHexMesh on my case.
I'm not sure what our models have in common. I even tried to increase the MPI buffer but still got the same problem.
Please let me know if you have already found the solution. Much appreciated!
I didn't find any solution yet. The turbineSiting tutorial works well for me too, but that tutorial doesn't use the feature edge handling option.
Also, I'm not running snappyHexMesh through "runApplication" or "runParallel"; I'm calling "mpirun ..." directly.
Those are the two points I am going to check (today or tomorrow). I'll let you know if I find anything new.
I found the clue already. You need to have initial conditions (a time = 0 folder) specified in order to run snappyHexMesh in parallel. Furthermore, you could use the hierarchical decomposition method if you have problems changing scotch to ptscotch.
Actually I have only just discovered this and have not yet gone deeper into fully mastering snappyHexMesh. For the time being, I am still investigating many ways to get a complete understanding of it. I hope this little finding leads you to investigate your problem further. I am not sure how the feature lines affect this, as I have not studied the details yet. Hopefully, with this clue, you can investigate this problem in parallel with me.
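For the hierarchical fallback mentioned above, a minimal decomposeParDict sketch (the coefficient values are illustrative, not from the thread):

```
numberOfSubdomains 4;

method          hierarchical;

hierarchicalCoeffs
{
    n           (2 2 1);   // nx*ny*nz must equal numberOfSubdomains
    delta       0.001;
    order       xyz;
}
```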
What do you mean by having the "initial condition specified"? In my case I have a full 0/ folder with k, omega, U, p and nut, and in each of these files I have specified the boundary conditions...
I mean you must have the initial conditions in the 0 folder. Since you mentioned you already have them set up there, I am not sure what the problem is. I will let you know if I find another clue.