[snappyHexMesh] Parallel meshing with OF 2.0.0

July 20, 2011, 04:06   #1
Aurelien Thinat (Senior Member)
Good morning,

I am currently testing the new snappyHexMesh in OF-2.0.0/OF-2.0.x (specifically the feature edge handling). It works in serial mode but not in parallel.

So I deactivated the feature edge handling in snappyHexMeshDict and retried in parallel, with no more luck. Yet the same case runs fine in parallel with OF-1.7.x.
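For reference, the feature edge handling I'm toggling is the features list under castellatedMeshControls in snappyHexMeshDict; a minimal sketch ("myGeometry.eMesh" is just a placeholder for my extracted edges):

Code:
castellatedMeshControls
{
    // ...
    features
    (
        {
            file    "myGeometry.eMesh"; // placeholder name
            level   1;
        }
    );
    // ...
}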

The error message is:
"--> FOAM FATAL ERROR:
read failed

From function UIPStream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber) in file UIPread.C at line 114
FOAM parallel run aborting"


Is this a known problem with OF-2.0.0, or is it specific to my case?

Thank you all,

Aurélien

July 27, 2011, 20:59   #2
Fábio César Canesin (Canesin, Member)
Easy: snappyHexMesh needs a parallel-aware decomposition method, so use ptscotch.

Create two decomposeParDicts, one for running and one for meshing, or patch the file and switch the commented method, something like:

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 4;

//method          scotch;
method          ptscotch;

distributed     no;

roots           ( );
Use ptscotch for snappy and scotch for the run, then move the files (or apply the patch), something like:

Code:
# Decompose, run snappyHexMesh in parallel, then reconstruct
runApplication decomposePar
mv system/decomposeParDict system/decomposeParDict.run
mv system/decomposeParDict.pre system/decomposeParDict
runParallel 'snappyHexMesh -overwrite' 4
mv system/decomposeParDict system/decomposeParDict.pre
mv system/decomposeParDict.run system/decomposeParDict
runApplication reconstructPar
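Note: runApplication and runParallel are the tutorial helper functions, so a standalone script has to source them first; a sketch, assuming the stock OF-2.0.x layout:

Code:
#!/bin/sh
# pull in runApplication / runParallel from the OpenFOAM tutorial helpers
. $WM_PROJECT_DIR/bin/tools/RunFunctions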

July 28, 2011, 03:19   #3
Aurelien Thinat (Senior Member)
Hi,

ptscotch is not available. The methods available are:
- hierarchical
- manual
- metis
- multiLevel
- scotch
- simple
- structured

So my case doesn't seem to be the problem: as I said, parallel snappy works very well in OpenFOAM-1.7.x but not in 2.0.0/x.

So no... it's not that easy...

July 28, 2011, 08:56   #4
Fábio César Canesin (Canesin, Member)
I'm using the approach I described, in OF-2.0.x, with success... To decompose for snappyHexMesh you have to use ptscotch; to decompose for the run you have to use scotch.

Verify that you have compiled all the related code in ThirdParty-2.0.x.


canesin@Privado-PC:~/OpenFOAM/canesin-2.0.x/run/naca0012-snappy$ head -n 200 log.snappyHexMesh\ -overwrite
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.0.x                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build : 2.0.x-90da72aa43cd
Exec : snappyHexMesh -overwrite -parallel
Date : Jul 28 2011
Time : 10:13:11
Host : Privado-PC
PID : 10117
Case : /home/canesin/OpenFOAM/canesin-2.0.x/run/naca0012-snappy
nProcs : 4
Slaves :
3
(
Privado-PC.10118
Privado-PC.10119
Privado-PC.10120
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Read mesh in = 0.03 s

Overall mesh bounding box : (-2 0 0.01) (5 2 0.26)
Relative tolerance : 1e-06
Absolute matching distance : 7.284401142e-06

Reading refinement surfaces.
Read refinement surfaces in = 0.35 s


July 28, 2011, 09:30   #5
Aurelien Thinat (Senior Member)
Canesin,

Is there any "ptscotch" folder in your ThirdParty directory ? I only have a "scotch_5.1.11" folder in the ThirdParty-2.0.x directory.

July 28, 2011, 09:42   #6
Fábio César Canesin (Canesin, Member)
It is the same folder, "scotch_5.1.11"; there is no separate ptscotch directory.

Let me be a little clearer and spell the process out:

Code:
# Decompose, run snappyHexMesh in parallel, then reconstruct
runApplication decomposePar                            # at this point the dict has: method scotch;
mv system/decomposeParDict system/decomposeParDict.run
mv system/decomposeParDict.pre system/decomposeParDict # now it is: method ptscotch;
runParallel 'snappyHexMesh -overwrite' 4
mv system/decomposeParDict system/decomposeParDict.pre
mv system/decomposeParDict.run system/decomposeParDict # now it is scotch again
runApplication reconstructPar

July 28, 2011, 09:47   #7
Aurelien Thinat (Senior Member)
I understood that point. The problem is that ptscotch is not in the "Valid decompositionMethods" list, which is:

"7
(
hierarchical
manual
metis
multiLevel
scotch
simple
structured
)"

I'm trying to rebuild the scotch library.

July 28, 2011, 09:57   #8
Fábio César Canesin (Canesin, Member)
Aurelien Thinat,

Sorry, I believe it's my English that isn't helping.

You will not decompose with ptscotch; you will decompose with scotch. But before you run snappyHexMesh, you change the method to ptscotch.

That's because snappy itself reads the dictionary while running in parallel and uses the method to redistribute the mesh between processors.

So ptscotch will not appear as a valid decomposition method for decomposePar, and that's OK.
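If you prefer not to move files around, an in-place swap with sed does the same job; a sketch, assuming the method line looks like the dict above:

Code:
# switch to ptscotch just for the parallel meshing step
sed -i 's/^method .*/method          ptscotch;/' system/decomposeParDict
runParallel 'snappyHexMesh -overwrite' 4
# and back to scotch for the solver run
sed -i 's/^method .*/method          scotch;/' system/decomposeParDict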

July 28, 2011, 10:10   #9
Aurelien Thinat (Senior Member)
OK, I got it.

I launched decomposePar with scotch. Then I modified decomposeParDict, changing the method to ptscotch. And finally I launched snappyHexMesh through mpirun.

It crashed: I got the same error message as in my first post:
Quote:
trap@cfd08:~/Aurelien/MeD-flange/Mesh-4> mpirun --hostfile system/machines -np 4 snappyHexMesh -parallel > log.snappyParallel &

[2]
[2] --> FOAM FATAL ERROR:
[2] read failed
[2]
[2] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[2] in file UIPread.C at line 114.
[2]
FOAM parallel run aborting
[2]
[2] #0 Foam::error::printStack(Foam::Ostream&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1 Foam::error::abort() in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber) in "/home/trap/OpenFOAM/Open[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] read failed
[1]
[1] From function UIPstream::UIPstream(const commsTypes, const int, DynamicList<char>&, streamFormat, versionNumber)
[1] in file UIPread.C at line 114.
[1]
FOAM parallel run aborting
[1]
FOAM-2.0.x/platforms/linux64GccDPOpt/lib/openmpi-1.5.3/libPstream.so"
[2] #3 Foam::IPstream::IPstream(Foam::UPstream::commsTypes, int, int, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber)[1] #0 Foam::error::printStack(Foam::Ostream&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #4 Foam::IOdictionary::readFile(bool) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #5 Foam::IOdictionary::IOdictionary(Foam::IOobject const&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #6 Foam::solution::solution(Foam::objectRegistry const&, Foam::fileName const&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #7 Foam::fvSolution::fvSolution(Foam::objectRegistry const&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[2] #8 Foam::fvMesh::fvMesh(Foam::IOobject const&) in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
[2] #9 main in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/bin/snappyHexMesh"
[2] #10 __libc_start_main in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1 Foam::error::abort() in "/lib64/libc.so.6"
[2] #11 _start at /usr/src/packages/BUILD/glibc-2.10.1/csu/../sysdeps/x86_64/elf/start.S:116
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
in "/home/trap/OpenFOAM/OpenFOAM-2.0.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #2 Foam::UIPstream::UIPstream(Foam::UPstream::commsTypes, int, Foam::DynamicList<char, 0u, 2u, 1u>&, int&, int, bool, Foam::IOstream::streamFormat, Foam::IOstream::versionNumber)--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 12679 on
node cfd08 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------


July 28, 2011, 10:29   #10
Fábio César Canesin (Canesin, Member)
Right... it looks like a communication problem. I don't have the time right now to look deeper into it.

But, as usual, a communication problem comes down to one of two things:

- a race condition
- a ghost (dangling) pointer

Both can sometimes be worked around by switching the commsType optimisation switch (a global setting, not one in snappyHexMeshDict):
Code:
commsType blocking; // if it is currently nonBlocking
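A sketch of where that switch lives, assuming the stock OpenFOAM-2.0.x layout (other entries elided):

Code:
// in $WM_PROJECT_DIR/etc/controlDict
OptimisationSwitches
{
    commsType       blocking; // default is nonBlocking
}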

July 28, 2011, 11:22   #11
Aurelien Thinat (Senior Member)
I have switched this option in the controlDict at ~/OpenFOAM-2.0.x/etc/controlDict.

But there is no change (except the commsType line itself, of course).

July 28, 2011, 11:40   #12
Fábio César Canesin (Canesin, Member)
Are you using a different MPI version? Post the output of:

$ mpirun --version
$ which mpirun

July 28, 2011, 12:03   #13
Aurelien Thinat (Senior Member)
mpirun --version :
-> mpirun (Open MPI) 1.5.3

which mpirun :
-> /home/trap/OpenFOAM/ThirdParty-2.0.x/platforms/linux64Gcc/openmpi-1.5.3/bin/mpirun

July 28, 2011, 14:39   #14
Fábio César Canesin (Canesin, Member)
That's odd... but, since it works here, try recompiling with SYSTEMOPENMPI. I'm using mpirun (Open MPI) 1.4.1.
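In practice that means selecting the system MPI before rebuilding; a sketch, assuming the stock etc/bashrc layout:

Code:
# in OpenFOAM-2.0.x/etc/bashrc, change the MPI selection:
export WM_MPLIB=SYSTEMOPENMPI
# then re-source the environment and rebuild the MPI layer, e.g.:
# cd $WM_PROJECT_DIR/src/Pstream && ./Allwmake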

July 28, 2011, 15:19   #15
Bruno Santos (wyldckat, Retired Super Moderator)
Hi Fabio and Aurelien,

The other day it was reported that OpenMPI 1.5.3 is pretty much a beta release and there are still some bugs in it: http://www.cfd-online.com/Forums/ope...s-openmpi.html

Nonetheless, Aurelien, can you please try the incompressible/windSimpleFoam/turbineSiting tutorial? It's a practical example of using snappyHexMesh in parallel!
Simply run ./Allrun in that folder.
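Something like this; $FOAM_TUTORIALS is set by the OpenFOAM environment:

Code:
cp -r $FOAM_TUTORIALS/incompressible/windSimpleFoam/turbineSiting .
cd turbineSiting
./Allrun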

I've just run this tutorial with no problems, and windSimpleFoam converged in 73 iterations after a 30 s run! And I'm using the OpenMPI 1.5.3 that comes with OpenFOAM.

Best regards,
Bruno

August 4, 2011, 04:34   #16
Leong (airfoil, New Member)
Hi Aurélien,
Have you found a solution to your problem? I had exactly the same problem as you. I have no problem running the incompressible/windSimpleFoam/turbineSiting tutorial, and no problem running snappyHexMesh in serial on my case.
I'm not sure what our models have in common. I even tried increasing the MPI buffer size, but the problem remains.
Please let me know if you have already found a solution. Much appreciated!

Rgds,
Leong

August 4, 2011, 04:56   #17
Aurelien Thinat (Senior Member)
Hi Leong,

I haven't found a solution yet. The turbineSiting tutorial works well for me too, but that tutorial doesn't use the feature edge handling option.

Also, I'm not running snappyHexMesh through "runApplication" or "runParallel"; I'm invoking "mpirun ..." directly.

Those are the two points I'm going to check (today or tomorrow). I'll let you know if I find anything new.
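For completeness, the direct call I'm using, which is roughly what runParallel wraps:

Code:
mpirun --hostfile system/machines -np 4 snappyHexMesh -parallel > log.snappyParallel 2>&1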

August 4, 2011, 23:58   #18
Leong (airfoil, New Member)
Hi Aurélien,
I have found a clue. You need initial conditions (a time = 0 folder) specified in order to run snappyHexMesh in parallel. Furthermore, you can use the hierarchical decomposition method if you are having trouble switching from scotch to ptscotch.
I have only just discovered this and haven't gone deeper into fully mastering snappyHexMesh yet; for the time being I am still investigating. I hope this little finding helps you investigate your problem further. I'm not sure how the feature lines affect this, as I haven't studied the details yet. Hopefully, with this clue, you can investigate the problem in parallel with me.
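A quick way to check both levels, a sketch (the field set is case-specific):

Code:
ls 0/            # e.g. U p k omega nut
ls processor0/0/ # decomposePar should have copied the fields here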

Leong

August 5, 2011, 03:17   #19
Aurelien Thinat (Senior Member)
Leong,

What do you mean by having the "initial conditions specified"? In my case I have a full 0/ folder with k, omega, U, p and nut, and boundary conditions specified in each of these files...

August 5, 2011, 07:26   #20
Leong (airfoil, New Member)
Hi Aurelien,
I mean you must have the initial conditions in the 0 folder. Since you mention you already have them set up there, I'm not sure what the problem is. I'll let you know if I find another clue.

Rgds,
Leong
