
Error running simpleFoam in parallel


#1 · February 18, 2015, 08:36
Rubén (Yuby), Member, Madrid
Hi FOAMers!

I recently posted another thread, but I have run into a different issue and would like to know if you can help. I have searched the forums but haven't found anything about this error.

I ran decomposePar in order to run snappyHexMesh in parallel, and afterwards I typed mpirun -np 8 simpleFoam -parallel to run simpleFoam in parallel, but I got the error below.

Can you help me find the cause?

Thank you very much indeed in advance!

Code:
usuario@usuario-SATELLITE-P50-A-14G:~/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon$ mpirun -np 8 simpleFoam -parallel
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.3.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.3.0-f5222ca19ce6
Exec   : simpleFoam -parallel
Date   : Feb 18 2015
Time   : 13:25:03
Host   : "usuario-SATELLITE-P50-A-14G"
PID    : 7464
Case   : /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon
nProcs : 8
Slaves : 
7
(
"usuario-SATELLITE-P50-A-14G.7465"
"usuario-SATELLITE-P50-A-14G.7466"
"usuario-SATELLITE-P50-A-14G.7467"
"usuario-SATELLITE-P50-A-14G.7468"
"usuario-SATELLITE-P50-A-14G.7469"
"usuario-SATELLITE-P50-A-14G.7470"
"usuario-SATELLITE-P50-A-14G.7471"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

[4] 
[4] 
[4] --> FOAM FATAL IO ERROR: 
[4] Cannot find patchField entry for procBoundary4to5
[4] 
[4] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor4/0/p.boundaryField from line 28 to line 21.
[4] 
[4]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[4]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[4] 
FOAM parallel run exiting
[4] 
[5] 
[5] 
[6] 
[6] 
[6] --> FOAM FATAL IO ERROR: 
[6] Cannot find patchField entry for procBoundary6to5
[6] 
[6] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor6/0/p.boundaryField from line 28 to line 21.
[6] 
[7] 
[7] 
[7] --> FOAM FATAL IO ERROR: 
[7] Cannot find patchField entry for procBoundary7to4
[7] 
[7] file: [6]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[6]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[6] 
FOAM parallel run exiting
[6] 
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] Cannot find patchField entry for procBoundary1to0
[1] 
[1] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor1/0/p.boundaryField from line 28 to line 21.
[1] 
[1]     From function /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor7/0/p.boundaryField from line 28 to line 21.
[7] 
[7]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[7]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[7] 
FOAM parallel run exiting
[7] 
[0] 
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] Cannot find patchField entry for procBoundary0to2
[0] 
[0] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor0/0/p.boundaryField from line 28 to line 21.
[0] 
[0]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[0]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[0] 
FOAM parallel run exiting
[0] 
GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[1]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[1] 
FOAM parallel run exiting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL IO ERROR: 
[2] Cannot find patchField entry for procBoundary2to0
[2] 
[2] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor2/0/p.boundaryField from line 28 to line 21.
[2] 
[2]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[2]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[2] 
FOAM parallel run exiting
[2] 
[3] 
[3] 
[3] --> FOAM FATAL IO ERROR: 
[3] Cannot find patchField entry for procBoundary3to0
[3] 
[3] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor3/0/p.boundaryField from line 28 to line 21.
[3] 
[3]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[3]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[3] 
FOAM parallel run exiting
[3] 
[5] --> FOAM FATAL IO ERROR: 
[5] Cannot find patchField entry for procBoundary5to4
[5] 
[5] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor5/0/p.boundaryField from line 28 to line 21.
[5] 
[5]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[5]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[5] 
FOAM parallel run exiting
[5] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 7469 on
node usuario-SATELLITE-P50-A-14G exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[usuario-SATELLITE-P50-A-14G:07463] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[usuario-SATELLITE-P50-A-14G:07463] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

#2 · February 20, 2015, 04:46
Svensen, Member
It seems that you simply have not defined your boundary conditions; it is not a problem with the parallel execution. Try executing the program in serial and you will get the same error.
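For instance, running the solver in serial from the case directory should reproduce the error (a minimal sketch; the log file name is my own choice, not from the thread):

Code:
# run simpleFoam in serial and keep the output in a log file
simpleFoam > log.simpleFoam 2>&1
# inspect the end of the log for the FATAL IO ERROR
tail -n 30 log.simpleFoam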

#3 · February 24, 2015, 12:15
Rubén (Yuby), Member, Madrid
No, it doesn't work even in serial.

Can anyone help me?

I would be very grateful indeed.

#4 · February 24, 2015, 12:26
Svensen, Member
I've already told you that the problem is not in the parallel execution: you have defined the boundary conditions incorrectly.
If you can, post your U and p files here and I will help you.

#5 · February 24, 2015, 12:26
Thiago Parente (thiagopl), Member, Diamantina, Brazil
It's as Svensen said: this has nothing to do with parallel running; something is wrong with your boundary conditions:

Quote:
Cannot find patchField entry for procBoundary4to5...

#6 · February 24, 2015, 12:32
Alexey Matveichev (alexeym), Senior Member, Nancy, France
Hi,

Can you please post the sequence of actions you performed to get this error? In particular, how did you run snappyHexMesh?

According to the message, something has happened to the decomposition: simpleFoam cannot find the patchField entries corresponding to the processor boundaries.
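One quick way to see the mismatch (my own suggestion, not something from the thread; paths assume the standard decomposed-case layout) is to compare the processor patches in the decomposed mesh against the entries in the decomposed field files:

Code:
# processor patches the decomposed mesh defines
grep procBoundary processor0/constant/polyMesh/boundary
# patchField entries actually present in the decomposed pressure field
grep procBoundary processor0/0/p

If the first command lists procBoundary* patches that the second does not show, the field files and the decomposition are out of sync.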


#8 · February 24, 2015, 12:47
Rubén (Yuby), Member, Madrid
Sorry! I meant that it does work in serial! Sorry for not explaining it well.

My run script is:

Code:
#!/bin/sh
# Orient the STL face normals (using an outside point) and check the surface
cd constant/triSurface;
surfaceOrient frisbee.stl "(1e10 1e10 1e10)" frisbee.stl;
surfaceCheck frisbee.stl >surfaceCheck.log;
cd ../../;

cd ${0%/*} || exit 1    # run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication surfaceFeatureExtract

runApplication blockMesh

runApplication decomposePar
runParallel snappyHexMesh 8 -overwrite

#- For non-parallel running
#cp -r 0.org 0 > /dev/null 2>&1

#- For parallel running: delete each processor*/0 and copy 0.org in its place
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1

cp -r 0.org 0

runApplication reconstructParMesh -constant
And after that:

Code:
mpirun -np 8 simpleFoam -parallel
And then the error appears.

Last edited by Yuby; February 24, 2015 at 14:17.

#9 · February 24, 2015, 12:48
Rubén (Yuby), Member, Madrid
Do you think it could be related to the type of decomposition?

I tried both hierarchical and scotch decompositions and I get the same error.
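(For reference, the decomposition method is chosen in system/decomposeParDict; a minimal fragment with illustrative values, assuming the 8 subdomains used above, not the actual dictionary from this case:)

Code:
// fragment of system/decomposeParDict (illustrative values only)
numberOfSubdomains  8;

method              scotch;

// alternative mentioned above:
// method              hierarchical;
// hierarchicalCoeffs
// {
//     n       (2 2 2);
//     delta   0.001;
//     order   xyz;
// }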

Thank you very much for your replies!

#10 · February 24, 2015, 12:52
Alexey Matveichev (alexeym), Senior Member, Nancy, France
Hi,

This is fatal for the decomposed case:

Code:
#- For parallel running
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1
(If I understand it correctly, you delete the 0 folder from each processor* folder and then copy the vanilla, undecomposed 0.org folder in its place.)

You see, the fields in the 0 folder are decomposed as well; here is an example of a modified field file:

Code:
boundaryField
{
    ...
    procBoundary0to1
    {
        type            processor;
        value           uniform 0;
    }
    procBoundary0to2
    {
        type            processor;
        value           uniform 0;
    }
}
simpleFoam complains about the absence of these processor boundary entries.

So either run reconstructParMesh, delete the processor* folders, and run decomposePar again; or try keeping the 0 folders that are already in the processor* folders instead of overwriting them.
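A minimal sketch of the first option (the commands are the ones already used in this thread; the exact sequence is my reading of the advice, assuming reconstructParMesh -constant has already rebuilt the complete mesh):

Code:
cp -r 0.org 0          # clean initial fields in the case root
rm -rf processor*      # discard the stale decomposition
decomposePar           # decompose mesh and fields together
mpirun -np 8 simpleFoam -parallel

This way decomposePar itself writes the processor*/0 field files, including the procBoundary* entries that simpleFoam could not find.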

#11 · February 25, 2015, 18:57
Rubén (Yuby), Member, Madrid
Thank you very much indeed, Alexey!

That was the solution to the problem.

Completely pleased!

