CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Error running simpleFoam in parallel (https://www.cfd-online.com/Forums/openfoam-solving/148729-error-running-simplefoam-parallel.html)

Yuby February 18, 2015 07:36

Error running simpleFoam in parallel
 
Hi FOAMers!

I recently posted another thread, but now I have a different issue and I would like to know if you can help me. I have searched the forums but I haven't found anything about this error.

I ran decomposePar in order to run snappyHexMesh in parallel, and after that I typed mpirun -np 8 simpleFoam -parallel to run simpleFoam in parallel, but I get the error below.

Can you help me to find the reason?

Thank you very much indeed in advance!

Code:

usuario@usuario-SATELLITE-P50-A-14G:~/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon$ mpirun -np 8 simpleFoam -parallel
/*---------------------------------------------------------------------------*\
| =========                |                                                |
| \\      /  F ield        | OpenFOAM: The Open Source CFD Toolbox          |
|  \\    /  O peration    | Version:  2.3.0                                |
|  \\  /    A nd          | Web:      www.OpenFOAM.org                      |
|    \\/    M anipulation  |                                                |
\*---------------------------------------------------------------------------*/
Build  : 2.3.0-f5222ca19ce6
Exec  : simpleFoam -parallel
Date  : Feb 18 2015
Time  : 13:25:03
Host  : "usuario-SATELLITE-P50-A-14G"
PID    : 7464
Case  : /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon
nProcs : 8
Slaves :
7
(
"usuario-SATELLITE-P50-A-14G.7465"
"usuario-SATELLITE-P50-A-14G.7466"
"usuario-SATELLITE-P50-A-14G.7467"
"usuario-SATELLITE-P50-A-14G.7468"
"usuario-SATELLITE-P50-A-14G.7469"
"usuario-SATELLITE-P50-A-14G.7470"
"usuario-SATELLITE-P50-A-14G.7471"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

[4]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] Cannot find patchField entry for procBoundary4to5
[4]
[4] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor4/0/p.boundaryField from line 28 to line 21.
[4]
[4]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[4]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[4]
FOAM parallel run exiting
[4]
[5]
[5]
[6]
[6]
[6] --> FOAM FATAL IO ERROR:
[6] Cannot find patchField entry for procBoundary6to5
[6]
[6] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor6/0/p.boundaryField from line 28 to line 21.
[6]
[7]
[7]
[7] --> FOAM FATAL IO ERROR:
[7] Cannot find patchField entry for procBoundary7to4
[7]
[7] file: [6]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[6]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[6]
FOAM parallel run exiting
[6]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Cannot find patchField entry for procBoundary1to0
[1]
[1] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor1/0/p.boundaryField from line 28 to line 21.
[1]
[1]    From function /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor7/0/p.boundaryField from line 28 to line 21.
[7]
[7]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[7]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[7]
FOAM parallel run exiting
[7]
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] Cannot find patchField entry for procBoundary0to2
[0]
[0] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor0/0/p.boundaryField from line 28 to line 21.
[0]
[0]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[0]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[0]
FOAM parallel run exiting
[0]
GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[1]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Cannot find patchField entry for procBoundary2to0
[2]
[2] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor2/0/p.boundaryField from line 28 to line 21.
[2]
[2]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[2]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] Cannot find patchField entry for procBoundary3to0
[3]
[3] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor3/0/p.boundaryField from line 28 to line 21.
[3]
[3]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[3]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[3]
FOAM parallel run exiting
[3]
[5] --> FOAM FATAL IO ERROR:
[5] Cannot find patchField entry for procBoundary5to4
[5]
[5] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor5/0/p.boundaryField from line 28 to line 21.
[5]
[5]    From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[5]    in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[5]
FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 7469 on
node usuario-SATELLITE-P50-A-14G exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[usuario-SATELLITE-P50-A-14G:07463] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[usuario-SATELLITE-P50-A-14G:07463] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages


Svensen February 20, 2015 03:46

It seems that you simply have not defined your boundary conditions. It is not a problem with the parallel execution. Try to run the program in serial and you will get the same error.

Yuby February 24, 2015 11:15

No, it doesn't work even in serial. :(

Can anyone help me?

I would be very grateful indeed.

Svensen February 24, 2015 11:26

I've already told you that the problem is not in the parallel execution. You have defined the boundary conditions incorrectly.
If you can, post your U and p files here and I will help you.

thiagopl February 24, 2015 11:26

That's what Svensen said. This has nothing to do with parallel running; something is wrong with your boundary conditions:

Quote:

Cannot find patchField entry for procBoundary4to5...

alexeym February 24, 2015 11:32

Hi,

Can you please post the sequence of actions you performed to get this error? In particular, how did you run snappyHexMesh?

According to the message, something happened to the decomposition: simpleFoam cannot find the patches corresponding to the processor boundaries.
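
If you want to check this, you can compare the processor patches that decomposePar created in each processor mesh with the procBoundary entries actually present in the decomposed p field. Just a sketch, assuming the usual processor*/constant/polyMesh layout:

Code:

# sketch: list the processor patches in each decomposed mesh and the
# procBoundary entries present in the corresponding 0/p file
for d in processor*
do
    echo "== $d =="
    grep -o 'procBoundary[0-9]*to[0-9]*' "$d/constant/polyMesh/boundary" | sort -u
    grep -o 'procBoundary[0-9]*to[0-9]*' "$d/0/p" | sort -u
done

Every procBoundary patch listed for the mesh should have a matching entry in the field file.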

Yuby February 24, 2015 11:43

Sorry!

I mean, it works in serial!

Sorry for not explaining well.

My run file is:

Code:

#!/bin/sh
cd constant/triSurface;
surfaceOrient frisbee.stl "(1e10 1e10 1e10)" frisbee.stl;
surfaceCheck frisbee.stl >surfaceCheck.log;
cd ../../;

cd ${0%/*} || exit 1    # run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication surfaceFeatureExtract

runApplication blockMesh

runApplication decomposePar
runParallel snappyHexMesh 8 -overwrite

#- For non-parallel running
#cp -r 0.org 0 > /dev/null 2>&1

#- For parallel running
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1

cp -r 0.org 0

runApplication reconstructParMesh -constant


Yuby February 24, 2015 11:47

And after that, I run:

Code:

mpirun -np 8 simpleFoam -parallel

And then, the error appears.

Yuby February 24, 2015 11:48

Do you think it has to do with the type of decomposition?

I have tried both hierarchical and scotch decompositions and I get the same error. :(

Thank you very much for your replies!

alexeym February 24, 2015 11:52

Hi,

This is fatal for the decomposed case:

Code:

#- For parallel running
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1

(if I understand it correctly, you simply delete the 0 folder from each processor* folder and copy the vanilla 0.org folder into the processor* folders)

You see, the fields in the 0 folder are also decomposed; here is an example of a modified file:

Code:

boundaryField
{
    ...
    procBoundary0to1
    {
        type            processor;
        value          uniform 0;
    }
    procBoundary0to2
    {
        type            processor;
        value          uniform 0;
    }
}

simpleFoam complains about the absence of these boundary entries.

So either run reconstructParMesh, delete the processor* folders, and run decomposePar again, or try to keep the decomposed 0 folders inside the processor* folders.
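
In case it helps, a minimal sketch of the first option, assuming the mesh has already been reconstructed into constant/polyMesh (e.g. with reconstructParMesh -constant, as in your run script):

Code:

# sketch: re-decompose mesh and fields together so that decomposePar
# writes the procBoundary entries into processor*/0 itself
rm -rf processor*          # remove the stale processor directories
rm -rf 0 && cp -r 0.org 0  # start again from the undecomposed initial fields
decomposePar               # decomposes the mesh AND the 0 fields together
mpirun -np 8 simpleFoam -parallel

With this route there is no need to copy 0.org into the processor* folders by hand; decomposePar writes the processor boundary entries itself.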

Yuby February 25, 2015 17:57

Thank you very much indeed, Alexey!!!

That is the solution to this problem.

Completely pleased!:)

libindaniel2000 May 21, 2016 22:20

I have spent roughly 6 hours on this file. No luck.
 
This is my Allrun file. Is this what you meant when you said the two lines were fatal?
Code:

#!/bin/sh
cd ${0%/*} || exit 1    # Run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication surfaceFeatureExtract

runApplication blockMesh

runApplication decomposePar
runParallel snappyHexMesh 4 -overwrite
 #mpirun -np 4 snappyHexMesh -overwrite -parallel >log.snappyHexMesh

#- For non-parallel running
#cp -r 0.org 0 > /dev/null 2>&1

#- For parallel running
#ls -d processor* | xargs -I {} rm -rf ./{}/0
#ls -d processor* | xargs -I {} cp -r 0.org ./{}/0
reconstructPar -latestTime
runParallel patchSummary 4
runParallel potentialFoam 4
runParallel $(getApplication) 4

runApplication reconstructParMesh -constant
runApplication reconstructPar -latestTime

# ----------------------------------------------------------------- end-of-file

I still get the same error as mentioned above. What am I missing?

I have tried both the simple and hierarchical decomposition methods, but still no luck. I have also tried running mpirun directly without using runParallel, but still no luck.
Also, the same two lines are used in the motorBike tutorial and its Allrun file works just fine.

rudolf.hellmuth October 18, 2016 04:53

Quote:

Originally Posted by libindaniel2000 (Post 601156)
I still get the same error as mentioned above. What am I missing?

I have tried both the simple and hierarchical decomposition methods, but still no luck. I have also tried running mpirun directly without using runParallel, but still no luck.
Also, the same two lines are used in the motorBike tutorial and its Allrun file works just fine.

I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc shown below:
Code:

boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}

So, when you copy the dictionaries to processor*, the solver will find the BC definition in "caseDicts/setConstraintTypes".
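
A quick way to spot field files that are still missing the include, assuming the usual 0.org layout, is for example:

Code:

# grep -L lists the files that do NOT contain the given pattern,
# i.e. the field files still missing the setConstraintTypes include
grep -L setConstraintTypes 0.org/*

Any file listed there would still lack the processor patch entries after being copied into processor*/0.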

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam because of the hash sign (#).

Best regards,
Rudolf

bowen1024 April 17, 2017 16:27

Quote:

Originally Posted by rudolf.hellmuth (Post 621890)
I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc shown below:
Code:

boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}

So, when you copy the dictionaries to processor*, the solver will find the BC definition in "caseDicts/setConstraintTypes".

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam because of the hash sign (#).

Best regards,
Rudolf

Thanks! This works!

BenGher October 7, 2021 04:38

Quote:

Originally Posted by rudolf.hellmuth (Post 621890)
I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc shown below:
Code:

boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}

So, when you copy the dictionaries to processor*, the solver will find the BC definition in "caseDicts/setConstraintTypes".

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam because of the hash sign (#).

Best regards,
Rudolf


Thanks! I was going crazy trying to understand why the serial run was working but not the parallel one. But does anyone know why? According to https://www.cfd-online.com/Forums/op...ainttypes.html it seems it sets the same BC when using cyclic or empty patches.

However, in my case I am setting up a domain with only ordinary patches (inlet/outlet) and I still had that problem.

