CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Bugs (http://www.cfd-online.com/Forums/openfoam-bugs/)
-   -   SnappyHexMesh OF-1.6-ext crashes on a parallel run (http://www.cfd-online.com/Forums/openfoam-bugs/82594-snappyhexmesh-1-6-ext-crashes-parallel-run.html)

norman1981 November 30, 2010 09:25

SnappyHexMesh OF-1.6-ext crashes on a parallel run
 
Dear all,

I got a crash running snappyHexMesh in parallel on 1.6-ext on one of my cases. I then ran snappyHexMesh in parallel on the motorBike tutorial to test the OF-1.6-ext snappyHexMesh version, and it crashed too, reporting these errors:
===
Truncating neighbour list at 12041 for backward compatibility

From function void polyMesh::initMesh()
in file meshes/polyMesh/polyMeshInitMesh.C at line 82
Truncating neighbour list at 12560 for backward compatibility

From function void polyMesh::initMesh()
in file meshes/polyMesh/polyMeshInitMesh.C at line 82
Truncating neighbour list at 12560 for backward compatibility

From function void polyMesh::initMesh()
in file meshes/polyMesh/polyMeshInitMesh.C at line 82
Truncating neighbour list at 12560 for backward compatibility
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Problem. oldPointI:5728 newPointI:-1
[0]
[0] From function fvMeshDistribute::mergeSharedPoints()
[0] in file directTopoChange/fvMeshDistribute/fvMeshDistribute.C at line 612.
[0]
FOAM parallel run aborting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] Problem. oldPointI:5432 newPointI:-1
[1]
[1] From function fvMeshDistribute::mergeSharedPoints()
[1] in file directTopoChange/fvMeshDistribute/fvMeshDistribute.C at line 612.
[1]
FOAM parallel run aborting
[1]
====

The same case ran well in parallel in 1.5-dev.
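For anyone trying to reproduce this, the usual parallel meshing workflow on the motorBike tutorial looks roughly like the sketch below. This is an illustration, not Norman's exact commands; the process count and the use of `reconstructParMesh` are assumptions, and utility behaviour differs between OpenFOAM versions.

```shell
# Sketch of a parallel snappyHexMesh run (OpenFOAM 1.6-ext era workflow).
# Assumes the case directory already contains a valid
# system/decomposeParDict and system/snappyHexMeshDict.
# The process count (4) is illustrative.

blockMesh                      # build the background hex mesh
decomposePar                   # split the case across processors
mpirun -np 4 snappyHexMesh -parallel -overwrite   # step that aborts on 1.6-ext
reconstructParMesh -constant   # merge processor meshes back together
```

The reported abort in fvMeshDistribute::mergeSharedPoints() happens during the load-balancing step inside the parallel snappyHexMesh run itself, before any reconstruction.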

Kind Regards

Norman

elvis November 30, 2010 10:47

Hi Norman,
there is a Bug Tracker made for the Extend version.
read http://www.extend-project.de/project...nd-bug-tracker

Your Bug Report should go to http://sourceforge.net/apps/mantisbt.../main_page.php

greets

elvis

norman1981 December 1, 2010 03:24

Quote:

Originally Posted by elvis (Post 285423)
Hi Norman,
there is a Bug Tracker made for the Extend version.
read http://www.extend-project.de/project...nd-bug-tracker

Your Bug Report should go to http://sourceforge.net/apps/mantisbt.../main_page.php

greets

elvis

Hi Elvis,

thank you. Yesterday I had a look at the SourceForge Mantis bug tracker before posting my message, but there is no "Core" (or similar) entry in the "Choose Project" combo box, so I thought this was still the right place to post core-related bugs :-)

Regards

Norman

aliqasemi November 25, 2011 08:55

Quote:

Originally Posted by norman1981 (Post 285405)
Dear all,

I got a crash running snappyHexMesh in parallel on 1.6-ext on one of my cases. [...]

The same case ran well in parallel in 1.5-dev.

Kind Regards

Norman

Dear Norman,

I am getting the same error. Have you found any fix or workaround for this? Or should I use another version of OpenFOAM?

Thanks in advance.

Ali

Ola Widlund November 29, 2011 04:34

Hi!

I think you would be better off having a fresh OF-2.0 installation as a complement to 1.6-ext. I don't think snappyHexMesh is worked on by anyone in the extend team, so you are actually using the 1.6 version of snappyHexMesh. There have been a lot of developments since then; see the release notes on www.openfoam.com.

Of course, nothing prevents you from using your generated mesh with solvers built with the 1.6-ext release.

/Ola

aliqasemi December 7, 2011 12:48

Quote:

Originally Posted by Ola Widlund (Post 333914)
Hi!

I think you would be better off having a fresh OF-2.0 installation as a complement to 1.6-ext. [...]

/Ola

Thank you Ola,

Unfortunately, at the moment OF-2.0.x does not produce good-quality meshes for me. Roughly speaking, I need a coarse mesh with no feature-edge handling. So it seems that I have to use sHM in OF-1.6-ext, in serial, for now.
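For context, a coarse-mesh setup with no explicit feature-edge handling corresponds roughly to a snappyHexMeshDict fragment like the one below. This is a sketch with illustrative values, not Ali's actual settings:

```
// Fragment of system/snappyHexMeshDict -- illustrative values only.
castellatedMeshControls
{
    features        ();       // no explicit feature-edge refinement
    resolveFeatureAngle 30;   // rely on implicit feature capture only
}

snapControls
{
    nSmoothPatch    3;
    nRelaxIter      5;
    // Note: 1.6-ext snapping has no explicit feature-snapping phase;
    // that machinery was added in later OpenFOAM releases.
}
```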

Ali

