CFD Online Discussion Forums

[snappyHexMesh] snappyHexMesh in parallel with cyclics (https://www.cfd-online.com/Forums/openfoam-meshing/89970-snappyhexmesh-parallel-cyclics.html)

tonyuprm June 27, 2011 16:47

snappyHexMesh in parallel with cyclics
 
Hi all,

I am trying to use snappyHexMesh in parallel to refine a region in a large domain. The domain has the following boundary conditions:

Code:

boundaryField
{
    recycle_1
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }

    inflow
    {
        type            fixedValue;
        value           uniform (8.0 0.0 0.0);
    }

    outflow
    {
        type            inletOutlet;
        inletValue      uniform (0.0 0.0 0.0);
        value           uniform (8.0 0.0 0.0);
    }

    recycle_2
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }
}

I am able to run blockMesh, decompose the domain with decomposePar using parMetis, and refine the mesh with snappyHexMesh in parallel. I am using the preservePatches option for my cyclic patches in decomposeParDict (a sketch of the relevant entries follows the error output below). checkMesh reports that the mesh is OK. After generating the mesh, however, I am not able to run my solver; I get the following errors:

Code:

using  64  processors
[38]
[38]
[38] --> FOAM FATAL IO ERROR:
[38] size 0 is not equal to the given value of 96
[38]
[38] file: /scratch/lmartine/grid_Resolution/turbineMesh/processor38/0/p::boundaryField::recycle_1 from line 26 to line 28.
[38]
[38]    From function Field<Type>::Field(const word& keyword, const dictionary&, const label)
[38]    in file /projects/nrel/apps/openfoam/src/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/Field.C at line 236.
[38]
FOAM parallel run exiting
[38]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 38 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 38 with PID 21329 on
node rr124 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
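
For reference, the decomposition settings described above look roughly like the fragment below. This is only a minimal sketch, not the attached decomposeParDict itself: numberOfSubdomains matches the 64-processor run, and any parMetis coefficients are left out.

Code:

// Fragment of a decomposeParDict along the lines described above
// (illustrative sketch; see the attached file for the real settings).
numberOfSubdomains  64;

method              parMetis;

// Keep owner and neighbour of the faces on these patches on the same
// processor (only meaningful for cyclic patches).
preservePatches     (recycle_1 recycle_2);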

Attached are my blockMeshDict, decomposeParDict and snappyHexMeshDict files.
I'm running OpenFOAM v1.7.1.


Thanks!

Tony

tonyuprm June 29, 2011 10:43

Hi all,

I was able to get it to work by running reconstructParMesh and then decomposePar again (sketched below). This is not an efficient fix, since these utilities will become a bottleneck as the grids get bigger. I found the temporary fix in this bug report:

Bug
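
In case it helps anyone else, the workaround boils down to merging the processor meshes back into a single mesh and then decomposing again. Roughly (a sketch only; the exact flags depend on where snappyHexMesh wrote the mesh, and the -constant option here is an assumption):

Code:

# Rebuild the single global mesh from the processor directories
# written by the parallel snappyHexMesh run.
reconstructParMesh -constant   # time option is an assumption; pick the
                               # time snappyHexMesh actually wrote to

# Remove the old decomposition before decomposing the new mesh
# (back it up first if you want to keep it).
rm -rf processor*

# Decompose again with the same decomposeParDict (preservePatches etc.)
decomposePar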

