CFD Online Discussion Forums
OpenFOAM Meshing & Mesh Conversion
[snappyHexMesh] SnappyHexMesh in parallel openmpi
(https://www.cfd-online.com/Forums/openfoam-meshing/61515-snappyhexmesh-parallel-openmpi.html)

wikstrom October 14, 2008 07:18

SnappyHexMesh in parallel openmpi
 
Lately I have run into the following problem several times. It is repeatable with the same case on two different hardware architectures and with both the icc and gcc compilers:

During a Shell refinement iteration (>1), an MPI error occurs:

[dagobah:01576] *** An error occurred in MPI_Bsend
[dagobah:01576] *** on communicator MPI_COMM_WORLD
[dagobah:01576] *** MPI_ERR_BUFFER: invalid buffer pointer
[dagobah:01576] *** MPI_ERRORS_ARE_FATAL (goodbye)


Here is the complete case: snappyHexMesh-coarse.tgz (attached)

To run:

blockMesh
decomposePar
foamJob -p -s snappyHexMesh
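
(For reference, foamJob -p takes the processor count from numberOfSubdomains in system/decomposeParDict; a roughly equivalent direct call, with -np 4 purely as an example, would be:)

# example only: match -np to numberOfSubdomains in system/decomposeParDict
mpirun -np 4 snappyHexMesh -parallel > log.snappyHexMesh 2>&1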


I do not know if this is to be regarded as a bug, or if it's only me...

Cheers
Niklas

niklas October 14, 2008 07:24

It's just you. :)

OK I also get that error.

Niklas
(Maybe it's a username issue.)

mattijs October 16, 2008 03:05

Have you tried 1.5.x? If it does not work in that one, please report it as a bug.

wikstrom October 17, 2008 04:35

I am running a recent pull of 1.5.x. Reporting it as a bug!

Thanks for the testing and the great suggestions, Niklas! I actually changed my IRL name to Bob the Builder and now everything works fine! :-)

Thanks Niklas and Mattijs

mou_mi October 21, 2008 17:38

Hi

I also face this error in a parallel snappyHexMesh run.

*** An error occurred in MPI_Bsend
*** on communicator MPI_COMM_WORLD
*** MPI_ERR_BUFFER: invalid buffer pointer
*** MPI_ERRORS_ARE_FATAL (goodbye)

Would you tell me how and where I can change my name according to what Niklas said?

Thank you
mou

schwarczi November 18, 2008 11:31

Hi,

I have the same problem that you described above, in connection with parallel meshing (blockMesh -> decomposePar -> snappyHexMesh, in package version 1.5).

My geometry was built up from several STL files, let's say 20. If I use only 19 parts, everything works fine and I have no problem. But when I use all 20, I keep getting that [MPI_ERRORS_ARE_FATAL...] message and snappyHexMesh crashes again and again.

I tried dividing the task between the processors in many different ways, with different memory settings, etc. I checked the user names, as you suggested, and ran the meshing process as different users and as root. Unfortunately, nothing has helped. I also double-checked the STL files and tried different combinations; the result is the same: whenever all of them are included, the meshing crashes.
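
(For reference, one way to go through the surfaces one by one is a loop along these lines; the constant/triSurface location is just the usual place and an assumption here:)

# example paths; adjust to where the surfaces actually live
for f in constant/triSurface/*.stl
do
    surfaceCheck $f > log.surfaceCheck.$(basename $f .stl) 2>&1
done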

Do you have any ideas? Is it a bug, or is the problem with MPI itself?


Thanks in advance,
Schwarczi

mattijs November 18, 2008 14:39

Make sure your MPI_BUFFER_SIZE is plenty big, 200000000 or larger. Also check on your nodes that you are not running out of memory altogether.
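
On 1.5.x the value is taken from the MPI_BUFFER_SIZE environment variable (etc/settings.sh only enforces a minimum default, if I remember correctly), so something along these lines before the run should do; the exact value is only an example:

# example value; anything comfortably larger than the default should work
export MPI_BUFFER_SIZE=200000000
foamJob -p -s snappyHexMesh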

schwarczi November 24, 2008 09:52

Mattijs,

Thank you very much, your advice was absolutely useful. Increasing [MPI_BUFFER_SIZE] solved the [MPI_ERRORS_ARE_FATAL...] problem described in my last post.

Thanks,
Sch.

