snappyHexMesh in parallel with cyclics

June 27, 2011, 16:47   #1
Tony (tonyuprm)
Member | Join Date: Jun 2010 | Posts: 54
Hi all,

I am trying to use snappyHexMesh in parallel to refine a region in a large domain. The domain has the following boundary conditions:

Code:
boundaryField
{
    recycle_1
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }

    inflow
    {
        type            fixedValue;
        value           uniform (8.0 0.0 0.0);
    }

    outflow
    {
        type            inletOutlet;
        inletValue      uniform (0.0 0.0 0.0);
        value           uniform (8.0 0.0 0.0);
    }

    recycle_2
    {
        type            cyclic;
        value           uniform (8.0 0.0 0.0);
    }
}
I am able to run blockMesh, decompose the domain with decomposePar using parMetis, and refine the mesh with snappyHexMesh in parallel. I use the preservePatches option in decomposeParDict for my cyclic patches (see the sketch after the error output below). checkMesh reports that the mesh is OK. After meshing, however, I am not able to run my solver and I get the following errors:

Code:
using  64  processors
[38]
[38]
[38] --> FOAM FATAL IO ERROR:
[38] size 0 is not equal to the given value of 96
[38]
[38] file: /scratch/lmartine/grid_Resolution/turbineMesh/processor38/0/p::boundaryField::recycle_1 from line 26 to line 28.
[38]
[38]     From function Field<Type>::Field(const word& keyword, const dictionary&, const label)
[38]     in file /projects/nrel/apps/openfoam/src/OpenFOAM-1.7.1/src/OpenFOAM/lnInclude/Field.C at line 236.
[38]
FOAM parallel run exiting
[38]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 38 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 38 with PID 21329 on
node rr124 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
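For reference, a preservePatches setup along these lines goes in system/decomposeParDict (a sketch only, using the cyclic patch names from the boundary conditions above and the 64-processor count from the log; it is not the attached dictionary, and other entries are omitted):

Code:
numberOfSubdomains 64;

method          parMetis;

// keep both halves of each cyclic patch on the same processor
preservePatches (recycle_1 recycle_2);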
Attached are my blockMeshDict, decomposeParDict, and snappyHexMeshDict files.
I'm running OpenFOAM v1.7.1.

case.zip

Thanks!

Tony

June 29, 2011, 10:43   #2
Tony (tonyuprm)
Member | Join Date: Jun 2010 | Posts: 54
Hi all,

I was able to get it to work by running reconstructParMesh and then decomposePar again (sketched below). This is not an efficient fix, since these serial utilities will become a bottleneck as the grids get bigger. I found the temporary fix in this bug report:

Bug
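For anyone following the same route, the workaround boils down to a command sequence along these lines (a sketch only; the exact flags, merge tolerance and processor count depend on the case and OpenFOAM version):

Code:
# refine the mesh in parallel as before
mpirun -np 64 snappyHexMesh -overwrite -parallel

# workaround: merge the processor meshes back into a single mesh in constant/polyMesh ...
reconstructParMesh -constant -mergeTol 1e-6

# ... then discard the old decomposition and decompose the reconstructed mesh again
rm -rf processor*
decomposePar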
