|
[snappyHexMesh] Parallel meshing with OP 2.0.0 |
|
July 20, 2011, 04:06 |
Parallel meshing with OP 2.0.0
|
#1 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Good morning,
I am currently testing the new snappyHexMesh in OF-2.0.0/OF-2.0.x (the feature edge handling): it works in serial mode but not in parallel. So I deactivated the feature edge handling in snappyHexMeshDict and retried in parallel, without more luck. The same case works well in parallel with OF-1.7.x. The error message is: Code:
--> FOAM FATAL ERROR: read failed

    From function UIPStream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
    in file UIPread.C at line 114

FOAM parallel run aborting
Is this a well-known problem of OF-2.0.0, or just a problem with my case? Thank you all, Aurélien |
|
July 27, 2011, 20:59 |
|
#2 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
Easy: snappyHexMesh needs a parallel decomposition method, so use ptscotch for the meshing step.
Create two decomposeParDicts, one for running and one for meshing, and swap them in your script (or use a patch and change the comment), something like: Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 4;

//method          scotch;
method          ptscotch;

distributed     no;

roots           ( );
Code:
# Decompose, snappy the mesh, reconstruct
runApplication decomposePar
mv system/decomposeParDict system/decomposeParDict.run
mv system/decomposeParDict.pre system/decomposeParDict
runParallel 'snappyHexMesh -overwrite' 4
mv system/decomposeParDict system/decomposeParDict.pre
mv system/decomposeParDict.run system/decomposeParDict
runApplication reconstructPar |
|
July 28, 2011, 03:19 |
|
#3 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Hi,
ptscotch is not available. The available methods are:
- hierarchical
- manual
- metis
- multiLevel
- scotch
- simple
- structured
So my case doesn't seem to be the problem: as I said, parallel snappy works very well in OpenFOAM-1.7.x but not in 2.0.0/x. So no... it's not that easy... |
|
July 28, 2011, 08:56 |
|
#4 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
I'm using the approach I described, in OF-2.0.x, with success... To decompose for snappyHexMesh you have to use ptscotch; to decompose for the run you have to use scotch.
Verify that you have compiled all the related code in ThirdParty-2.0.x: Code:
canesin@Privado-PC:~/OpenFOAM/canesin-2.0.x/run/naca0012-snappy$ head -n 200 log.snappyHexMesh\ -overwrite
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.0.x                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.0.x-90da72aa43cd
Exec   : snappyHexMesh -overwrite -parallel
Date   : Jul 28 2011
Time   : 10:13:11
Host   : Privado-PC
PID    : 10117
Case   : /home/canesin/OpenFOAM/canesin-2.0.x/run/naca0012-snappy
nProcs : 4
Slaves : 3 ( Privado-PC.10118 Privado-PC.10119 Privado-PC.10120 )

Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Read mesh in = 0.03 s

Overall mesh bounding box  : (-2 0 0.01) (5 2 0.26)
Relative tolerance         : 1e-06
Absolute matching distance : 7.284401142e-06

Reading refinement surfaces.
Read refinement surfaces in = 0.35 s |
|
July 28, 2011, 09:30 |
|
#5 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Canesin,
Is there any "ptscotch" folder in your ThirdParty directory? I only have a "scotch_5.1.11" folder in the ThirdParty-2.0.x directory. |
|
July 28, 2011, 09:42 |
|
#6 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
It is the folder "scotch_5.1.11".
Let me be a little clearer and spell out the process: Code:
# Decompose, snappy the mesh, reconstruct
runApplication decomposePar                              # AT THIS MOMENT: method scotch
mv system/decomposeParDict system/decomposeParDict.run
mv system/decomposeParDict.pre system/decomposeParDict   # NOW: method ptscotch
runParallel 'snappyHexMesh -overwrite' 4
mv system/decomposeParDict system/decomposeParDict.pre
mv system/decomposeParDict.run system/decomposeParDict   # NOW AGAIN: method scotch
runApplication reconstructPar |
|
July 28, 2011, 09:47 |
|
#7 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
I understood that point. The problem is that ptscotch is not in the "Valid decompositionMethods" list, which is:
"7 ( hierarchical manual metis multiLevel scotch simple structured )"
I'm trying to rebuild the scotch library. |
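Rebuilding the ThirdParty tree (which provides both scotch and the MPI-aware ptscotch) can be sketched as below. This is only a sketch: it assumes a sourced OpenFOAM-2.0.x environment where $WM_THIRD_PARTY_DIR points at ThirdParty-2.0.x, and the skip message is illustrative, not from the thread.

```shell
# Rebuild the ThirdParty packages (including scotch/ptscotch) when an
# OpenFOAM environment is sourced; otherwise report and skip.
if [ -n "$WM_THIRD_PARTY_DIR" ] && [ -d "$WM_THIRD_PARTY_DIR" ]; then
    cd "$WM_THIRD_PARTY_DIR" && ./Allwmake
    status="built"
else
    status="skipped: OpenFOAM environment not sourced"
    echo "$status"
fi
```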
|
July 28, 2011, 09:57 |
|
#8 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
Aurelien Thinat,
Sorry, I believe it's my English that's not helping... You will not decompose with ptscotch; you will decompose with scotch. But before you run snappyHexMesh, you change the method to ptscotch. That's because snappy uses the method in decomposeParDict to build the decomposition graph and redistribute the mesh while meshing in parallel. So ptscotch not appearing in decomposePar's list of valid decomposition methods is OK. |
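The mv-based swap Canesin describes can also be done in place. Here is a minimal, self-contained sketch of the idea; the temporary directory, the two-line dictionary, and the exact sed pattern are illustrative (they assume the method entry sits on its own line), not taken from the thread.

```shell
# Illustrative decomposeParDict with only the entries that matter here;
# in a real case this file already exists under system/.
case_dir=$(mktemp -d)
mkdir -p "$case_dir/system"
printf 'numberOfSubdomains 4;\nmethod          scotch;\n' > "$case_dir/system/decomposeParDict"

# Before the parallel snappyHexMesh run: switch the method to ptscotch.
sed -i 's/^method .*/method          ptscotch;/' "$case_dir/system/decomposeParDict"
grep '^method' "$case_dir/system/decomposeParDict"   # -> method          ptscotch;

# After meshing (before the solver run): switch back to scotch.
sed -i 's/^method .*/method          scotch;/' "$case_dir/system/decomposeParDict"
grep '^method' "$case_dir/system/decomposeParDict"   # -> method          scotch;
```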
|
July 28, 2011, 10:10 |
|
#9 | |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
OK, I got it.
I launched decomposePar with scotch. Then I modified the decomposeParDict, changing the method to ptscotch. And finally I launched snappyHexMesh under MPI. It crashed: I got the same error message I wrote in my first post. |
|
July 28, 2011, 10:29 |
|
#10 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
Right... it looks like a communication problem. I don't have the time now to look deeper into it.
But, as usual, a communication problem tends to be one of two things:
- a race condition
- a stale ("ghost") pointer
Both can sometimes be worked around by changing the commsType setting in OpenFOAM's global etc/controlDict: Code:
commsType blocking; // if it is currently nonBlocking |
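For reference, this switch is set in OpenFOAM's global controlDict (~/OpenFOAM-2.0.x/etc/controlDict) rather than in snappyHexMeshDict. A sketch of the relevant fragment, with the section name as in the stock 2.0.x file and all other entries omitted:

```
OptimisationSwitches
{
    // Communication type: nonBlocking, scheduled or blocking
    commsType       blocking;
}
```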
|
July 28, 2011, 11:22 |
|
#11 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
I have switched this option in the controlDict in ~/OpenFOAM-2.0.x/etc/controlDict,
but there is no change (except the commsType line, of course). |
|
July 28, 2011, 11:40 |
|
#12 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
Are you using a different MPI version?
Post the output of: Code:
mpirun --version
which mpirun |
|
July 28, 2011, 12:03 |
|
#13 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
mpirun --version:
-> mpirun (Open MPI) 1.5.3
which mpirun:
-> /home/trap/OpenFOAM/ThirdParty-2.0.x/platforms/linux64Gcc/openmpi-1.5.3/bin/mpirun |
|
July 28, 2011, 14:39 |
|
#14 |
Member
Fábio César Canesin
Join Date: Mar 2010
Location: Florianópolis
Posts: 67
Rep Power: 16 |
That's odd... but, as it is working here, try recompiling with SYSTEMOPENMPI. I'm using mpirun (Open MPI) 1.4.1.
|
|
July 28, 2011, 15:19 |
|
#15 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Hi Fabio and Aurelien,
The other day it was reported that OpenMPI 1.5.3 is pretty much a beta release and there are still some bugs in it: http://www.cfd-online.com/Forums/ope...s-openmpi.html Nonetheless, Aurelien, can you please try the incompressible/windSimpleFoam/turbineSiting tutorial? It's a practical example of using snappyHexMesh in parallel! Simply run ./Allrun in that folder. I've just executed this tutorial with no problems, and windSimpleFoam converged in 73 iterations after a 30 s run. And I'm using the OpenMPI 1.5.3 that comes with OpenFOAM. Best regards, Bruno
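Running that tutorial can be scripted as below. A sketch only: it assumes a sourced OpenFOAM-2.0.x environment where the standard OpenFOAM bashrc sets $FOAM_TUTORIALS, and the skip message is illustrative.

```shell
# Run the parallel snappyHexMesh tutorial if an OpenFOAM environment
# is sourced; otherwise report and skip.
tut="$FOAM_TUTORIALS/incompressible/windSimpleFoam/turbineSiting"
if [ -n "$FOAM_TUTORIALS" ] && [ -d "$tut" ]; then
    cd "$tut" && ./Allrun
    status="tutorial run"
else
    status="skipped: OpenFOAM environment not sourced"
    echo "$status"
fi
```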
|
|
August 4, 2011, 04:34 |
|
#16 |
New Member
|
Hi Aurélien,
Have you found the solution to your problem? I had exactly the same problem as you. I have no problem running the incompressible/windSimpleFoam/turbineSiting tutorial, and no problem running serial snappyHexMesh on my case, so I am not sure what our models have in common. I even tried increasing the MPI buffer size, but I still get the same error. Please let me know if you have already found a solution. Much appreciated! Rgds, Leong |
|
August 4, 2011, 04:56 |
|
#17 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Hi Leong,
I haven't found any solution yet. The turbineSiting tutorial works well here too, but it doesn't use the feature edge handling option. Also, I'm not running snappyHexMesh through "runApplication" or "runParallel": I'm calling "mpirun ..." directly. Those are the two points I am going to check (today or tomorrow). I'll let you know if I find anything new. |
|
August 4, 2011, 23:58 |
|
#18 |
New Member
|
Hi Aurélien,
I have found a clue. You need to have initial conditions (a time = 0 folder) specified in order to run snappyHexMesh in parallel. Furthermore, you could use the hierarchical decomposition method if you have problems changing scotch to ptscotch. I have only just discovered this and haven't yet gone deeper into fully mastering snappyHexMesh; for the time being I am still investigating ways to get a complete understanding of it. I hope this little finding helps you investigate your problem further. I am not sure how the feature lines affect this, as I haven't studied the details yet. Hopefully with this clue you can investigate this problem in parallel with me. Leong |
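Leong's first point can be checked quickly after decomposePar. The sketch below builds a purely illustrative decomposed-case layout (the processorN directory names follow OpenFOAM's convention; everything else is made up for the example) and flags any processor directory missing its 0/ time folder.

```shell
# Illustrative layout: processor0 received its 0/ folder, processor1 did not.
case_dir=$(mktemp -d)
mkdir -p "$case_dir/processor0/0" "$case_dir/processor1"

# Flag processor directories with no 0/ time folder before running
# snappyHexMesh in parallel.
cd "$case_dir"
for d in processor*; do
    if [ ! -d "$d/0" ]; then
        echo "warning: $d has no 0/ time directory"
    fi
done
# -> warning: processor1 has no 0/ time directory
```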
|
August 5, 2011, 03:17 |
|
#19 |
Senior Member
Aurelien Thinat
Join Date: Jul 2010
Posts: 165
Rep Power: 16 |
Leong,
What do you mean by having the "initial condition specified"? In my case I have a full 0/ folder with k, omega, U, p and nut, and in each of these files I have specified the boundary conditions... |
|
August 5, 2011, 07:26 |
|
#20 |
New Member
|
Hi Aurelien,
I mean you must have the initial conditions in the 0 folder. Since you mentioned you already have them set up in the 0 folder, I am not sure what the problem is. I will let you know if I find another clue. Rgds, Leong |
|