CFD Online Discussion Forums

HagenC March 8, 2017 03:41

wrong zero boundary dir in parallel after snappyHexMesh
 
Hi everyone,
I probably miss something basic, but I have no clue what it is.
I am running a parallel computation with OpenFOAM-v1612+ on 3 processors. The procedure is:
Code:

blockMesh
mpirun -np 3 redistributePar -decompose -parallel
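
For reference, redistributePar -decompose takes its settings from system/decomposeParDict. A minimal sketch for this 3-processor run could look like the following, with scotch picked only as an example method:
Code:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 3;       // matches mpirun -np 3

method          scotch;     // any decomposition method works here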

In the boundaryField of processor0/0/p I have procBoundary0to1 but not procBoundary0to2, which I think is fine, since the first and the last processor domains may simply not share any faces. This agrees with processor0/constant/polyMesh/boundary, where likewise only procBoundary0to1 exists. But then, after
Code:

mpirun -np 3 snappyHexMesh -overwrite -parallel
I get procBoundary0to2 in processor0/constant/polyMesh/boundary, but the boundaryField of processor0/0/p still contains only procBoundary0to1. So trying to run
Code:

mpirun -np 3 simpleFoam -parallel
gives the error message:
Code:

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v1612+                                |
|   \\  /    A nd           | Web:      www.OpenFOAM.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : v1612+
Exec   : simpleFoam -parallel
Date   : Mar 08 2017
Time   : 09:30:46
Host   : "simmachine"
PID    : 20842
Case   : /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady
nProcs : 3
Slaves :
2
(
"simmachine.20843"
"simmachine.20844"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


SIMPLE: convergence criteria
    field p    tolerance 0.01
    field U    tolerance 0.001
    field "(k|epsilon|omega)"    tolerance 0.001

Reading field p

[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] size 0 is not equal to the given value of 784
[0]
[0] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor0/0/p.boundaryField.inlet from line 26 to line 28.
[1] [2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Cannot find patchField entry for procBoundary2to0
[2]
[2] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor2/0/p.boundaryField from line 26 to line 42.
[2]
[2]    From function void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[2]    in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 191.
[2]
FOAM parallel run exiting
[2]
[0]
[1]
[1]
--> FOAM FATAL IO ERROR:
[1] size 0 is not equal to the given value of 1704
[1]
[1] file: /home/hagen/OpenFOAM/hagen-v1612+/run/stenose/RAS/kOmegaSST/long/steady/processor1/0/p.boundaryField.outlet from line [0] 32 to line 33.
[1]
[1]    From function Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int]
[1]    in file    From function /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
[1]
FOAM parallel run exiting
Foam::Field<Type>::Field(const Foam::word&, const Foam::dictionary&, Foam::label) [with Type = double; Foam::label = int][1]

[0]    in file /home/hagen/OpenFOAM/OpenFOAM-v1612+/src/OpenFOAM/lnInclude/Field.C at line 304.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
FOAM parallel run exiting
[0]
[simmachine:20840] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[simmachine:20840] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I can solve that by reconstructing and decomposing again:
Code:

mpirun -np 3 redistributePar -reconstruct -constant -parallel
mpirun -np 3 redistributePar -decompose -parallel

but there must be an easier way!
If I only decompose again after snappyHexMesh, without reconstructing first, I end up with only the blockMesh mesh (the snappyHexMesh result is lost).
I appreciate any recommendations.
Thank you.

HagenC March 8, 2017 09:30

Solved
 
OK, I found a solution for this.
It seems that OpenFOAM somehow gets confused by the field information in the 0 folder.
If you keep only an empty dummy 0 folder in the case directory, no 0 folder is created in the processor directories. So after snappyHexMesh you can use
Code:

restore0Dir -processor
to create the processor 0 directories from a 0.org folder in the case directory. (Mind sourcing the run functions first with . $WM_PROJECT_DIR/bin/tools/RunFunctions.)
Afterwards simpleFoam runs just fine.
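Putting the pieces together, the whole procedure can be collected into an Allrun-style script along the lines of the sketch below (the processor count and solver are the ones used above; the case directory is assumed to contain the empty dummy 0 folder plus the 0.org / 0.orig folder with the fields):
Code:

#!/bin/sh
cd "${0%/*}" || exit 1                           # run from the case directory

# the OpenFOAM run functions provide restore0Dir
. $WM_PROJECT_DIR/bin/tools/RunFunctions

blockMesh
mpirun -np 3 redistributePar -decompose -parallel
mpirun -np 3 snappyHexMesh -overwrite -parallel

# copy the fields from 0.org / 0.orig into each processor*/0
restore0Dir -processor

mpirun -np 3 simpleFoam -parallel
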
By the way, with an older version (v1606+) I still struggled with some procAddressing issues. These can be solved by running
Code:

mpirun -np 3 renumberMesh -overwrite -constant -parallel
after snappyHexMesh. The mesh will be renumbered and the invalid procAddressing files will be deleted.
Best,
Hagen

HagenC March 13, 2017 04:47

Solved 2
 
I realized I forgot to mention one important thing:
You need to have a
Code:

"proc.*"
{
    type    processor;
}

boundary condition entry in each of your field files (p, U, omega, and so on) in the 0.org folder (0.orig in version v1612+).
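As an illustration, a 0.org/p for this case could look roughly like the sketch below; inlet and outlet appear in the error log above, while the walls patch, the boundary condition types and the values are only placeholders:
Code:

FoamFile
{
    version     2.0;
    format      ascii;
    class       volScalarField;
    object      p;
}

// kinematic pressure for incompressible simpleFoam
dimensions      [0 2 -2 0 0 0 0];

internalField   uniform 0;

boundaryField
{
    inlet
    {
        type            zeroGradient;
    }
    outlet
    {
        type            fixedValue;
        value           uniform 0;
    }
    walls
    {
        type            zeroGradient;
    }

    // wildcard entry for the procBoundary* patches created by decomposition
    "proc.*"
    {
        type            processor;
    }
}
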
Best,
Hagen

