CFD Online Discussion Forums

-   OpenFOAM Running, Solving & CFD
-   -   About the problem of decomposePar utility

zou_mo July 16, 2005 23:24

I run my case on a cluster. When my case has a size of 100*100*200, it goes well. But once the size increases to 128*128*256, something unexpected occurs. When I use the decomposePar utility (which seems to be a single-node-only version?) to distribute my mesh to each processor, it reports that memory is insufficient and the run crashes.

I know that some software packages exist that can decompose a mesh in parallel. Can the decomposePar utility be modified to handle large decompositions? If so, I would like to do some work on decomposePar.
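For context, decomposePar reads its settings from system/decomposeParDict. A minimal sketch of such a dictionary is shown below; the subdomain count and the n vector are placeholders that would need to match your actual cluster, and the exact keywords may differ between OpenFOAM versions.

```
// system/decomposeParDict -- illustrative sketch only
numberOfSubdomains 8;      // total number of processor domains

method          simple;    // geometric decomposition along coordinate axes

simpleCoeffs
{
    n           (2 2 2);   // splits in x, y, z; product must equal numberOfSubdomains
    delta       0.001;     // cell skew tolerance
}
```

Note that decomposePar itself still runs on a single node regardless of these settings, which is the memory bottleneck discussed in this thread.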


mattijs July 17, 2005 08:59

decomposePar is not very memory efficient, so it will run out of memory (I guess you have a 32-bit machine, so only 2 GB of address space).
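The cell counts in the original question make the 32-bit limit plausible. As a rough back-of-envelope check (my own sketch; the 1 kB-per-cell figure is an assumption, not a measured OpenFOAM number):

```python
# Rough memory estimate for decomposePar on the two mesh sizes
# mentioned in this thread. bytes_per_cell is an assumed ballpark
# for mesh plus field data held in memory during decomposition.

def mesh_cells(nx, ny, nz):
    """Total cell count of a structured nx*ny*nz block mesh."""
    return nx * ny * nz

bytes_per_cell = 1024            # assumption: ~1 kB per cell
addr_space = 2 * 1024**3         # ~2 GiB usable in a 32-bit process

small = mesh_cells(100, 100, 200)   # 2,000,000 cells
large = mesh_cells(128, 128, 256)   # 4,194,304 cells

print(small * bytes_per_cell / 1024**3)  # ~1.9 GiB -- just fits
print(large * bytes_per_cell / 1024**3)  # 4.0 GiB -- exceeds 2 GiB
```

Under this assumption the smaller case squeezes under the 2 GiB ceiling while the larger one roughly doubles the footprint and blows past it, which matches the observed crash.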

It could be made a bit more efficient but that would only get you a bit further.

Making it run truly parallel would be much harder since even the mesh files themselves will be too big for a single machine.

The real solution would be to do everything in parallel, from mesh generation down to postprocessing.

(unfortunately blockMesh cannot do parallel mesh generation)

sampaio July 27, 2005 12:26

And what would be the recommended mesh generator for this task?

Would gambit do the job?
Would gmsh do the job?

I would still need to convert from gambit/gmsh/other format to foam, right? In that case, would I have the same memory problem when using, say, fluentMeshToFoam?

Thanks a lot.

mattijs July 27, 2005 18:18

I haven't heard of any parallel mesh generators out there.

My guess is that your best bet is to get a 64-bit machine for decomposePar, reconstructPar and postprocessing. OpenFOAM runs fine on 64-bit machines.
