
Error running simpleFoam in parallel

Old   February 18, 2015, 07:36
Default Error running simpleFoam in parallel
  #1
Member
 
Rubén
Join Date: Oct 2014
Location: Munich
Posts: 47
Rep Power: 11
Yuby is on a distinguished road
Hi FOAMers!

I have recently posted another thread, but I have run into a new issue and I would like to know if you can help me. I have searched the forums but haven't found anything about this error.

I ran decomposePar in order to run snappyHexMesh in parallel, and after that I run mpirun -np 8 simpleFoam -parallel, but I get the error below.

Can you help me find the cause?

Thank you very much indeed in advance!

Code:
usuario@usuario-SATELLITE-P50-A-14G:~/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon$ mpirun -np 8 simpleFoam -parallel
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.3.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.3.0-f5222ca19ce6
Exec   : simpleFoam -parallel
Date   : Feb 18 2015
Time   : 13:25:03
Host   : "usuario-SATELLITE-P50-A-14G"
PID    : 7464
Case   : /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon
nProcs : 8
Slaves : 
7
(
"usuario-SATELLITE-P50-A-14G.7465"
"usuario-SATELLITE-P50-A-14G.7466"
"usuario-SATELLITE-P50-A-14G.7467"
"usuario-SATELLITE-P50-A-14G.7468"
"usuario-SATELLITE-P50-A-14G.7469"
"usuario-SATELLITE-P50-A-14G.7470"
"usuario-SATELLITE-P50-A-14G.7471"
)

Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

[4] 
[4] 
[4] --> FOAM FATAL IO ERROR: 
[4] Cannot find patchField entry for procBoundary4to5
[4] 
[4] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor4/0/p.boundaryField from line 28 to line 21.
[4] 
[4]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[4]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[4] 
FOAM parallel run exiting
[4] 
[5] 
[5] 
[6] 
[6] 
[6] --> FOAM FATAL IO ERROR: 
[6] Cannot find patchField entry for procBoundary6to5
[6] 
[6] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor6/0/p.boundaryField from line 28 to line 21.
[6] 
[7] 
[7] 
[7] --> FOAM FATAL IO ERROR: 
[7] Cannot find patchField entry for procBoundary7to4
[7] 
[7] file: [6]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[6]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[6] 
FOAM parallel run exiting
[6] 
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: 
[1] Cannot find patchField entry for procBoundary1to0
[1] 
[1] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor1/0/p.boundaryField from line 28 to line 21.
[1] 
[1]     From function /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor7/0/p.boundaryField from line 28 to line 21.
[7] 
[7]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[7]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[7] 
FOAM parallel run exiting
[7] 
[0] 
[0] 
[0] --> FOAM FATAL IO ERROR: 
[0] Cannot find patchField entry for procBoundary0to2
[0] 
[0] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor0/0/p.boundaryField from line 28 to line 21.
[0] 
[0]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[0]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[0] 
FOAM parallel run exiting
[0] 
GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[1]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[1] 
FOAM parallel run exiting
[1] 
[2] 
[2] 
[2] --> FOAM FATAL IO ERROR: 
[2] Cannot find patchField entry for procBoundary2to0
[2] 
[2] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor2/0/p.boundaryField from line 28 to line 21.
[2] 
[2]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[2]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[2] 
FOAM parallel run exiting
[2] 
[3] 
[3] 
[3] --> FOAM FATAL IO ERROR: 
[3] Cannot find patchField entry for procBoundary3to0
[3] 
[3] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor3/0/p.boundaryField from line 28 to line 21.
[3] 
[3]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[3]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[3] 
FOAM parallel run exiting
[3] 
[5] --> FOAM FATAL IO ERROR: 
[5] Cannot find patchField entry for procBoundary5to4
[5] 
[5] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor5/0/p.boundaryField from line 28 to line 21.
[5] 
[5]     From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[5]     in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[5] 
FOAM parallel run exiting
[5] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 7469 on
node usuario-SATELLITE-P50-A-14G exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[usuario-SATELLITE-P50-A-14G:07463] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[usuario-SATELLITE-P50-A-14G:07463] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Old   February 20, 2015, 03:46
Default
  #2
Senior Member
 
Join Date: Jan 2015
Posts: 150
Rep Power: 11
Svensen is on a distinguished road
It seems that you simply have not defined your boundary conditions. It is not a problem with the parallel execution. Try running the program in serial and you will get the same error.
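
(A minimal sketch of such a serial check, assuming the undecomposed case is set up in the current directory and keeping a log file for inspection:)

Code:
simpleFoam > log.simpleFoam 2>&1    # serial run; inspect log.simpleFoam for the same boundary error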

Old   February 24, 2015, 11:15
Default
  #3
Member
 
Rubén
Join Date: Oct 2014
Location: Munich
Posts: 47
Rep Power: 11
Yuby is on a distinguished road
No, it doesn't work even in serial.

Can anyone help me?

I would be very grateful indeed.

Old   February 24, 2015, 11:26
Default
  #4
Senior Member
 
Join Date: Jan 2015
Posts: 150
Rep Power: 11
Svensen is on a distinguished road
I've already told you that the problem is not in the parallel execution. You have defined the boundary conditions incorrectly.
If you can, post your U and p files here and I will help you.

Old   February 24, 2015, 11:26
Default
  #5
Member
 
Thiago Parente Lima
Join Date: Sep 2011
Location: Diamantina, Brazil.
Posts: 62
Rep Power: 14
thiagopl is on a distinguished road
That's what Svensen said. It has nothing to do with parallel running; something is wrong with your boundary conditions:

Quote:
Cannot find patchField entry for procBoundary4to5...
__________________
Fields of interest: buoyantFoam, chtMultRegionFoam.

Old   February 24, 2015, 11:32
Default
  #6
Senior Member
 
Alexey Matveichev
Join Date: Aug 2011
Location: Nancy, France
Posts: 1,930
Rep Power: 38
alexeym has a spectacular aura about
Hi,

Can you please post the sequence of actions you performed to get this error? In particular, how did you run snappyHexMesh?

According to the message, something happened to the decomposition: simpleFoam cannot find the patches corresponding to the processor boundaries.


Old   February 24, 2015, 11:47
Default
  #8
Member
 
Rubén
Join Date: Oct 2014
Location: Munich
Posts: 47
Rep Power: 11
Yuby is on a distinguished road
Sorry!

I mean, it works in serial!

Sorry for not explaining it well.

My run file is:

Code:
#!/bin/sh
cd constant/triSurface;
surfaceOrient frisbee.stl "(1e10 1e10 1e10)" frisbee.stl;
surfaceCheck frisbee.stl >surfaceCheck.log;
cd ../../;

cd ${0%/*} || exit 1    # run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication surfaceFeatureExtract

runApplication blockMesh

runApplication decomposePar
runParallel snappyHexMesh 8 -overwrite

#- For non-parallel running
#cp -r 0.org 0 > /dev/null 2>&1

#- For parallel running
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1

cp -r 0.org 0

runApplication reconstructParMesh -constant
And after that:

Code:
 mpirun -np 8 simpleFoam -parallel
And then the error appears.

Last edited by Yuby; February 24, 2015 at 13:17.

Old   February 24, 2015, 11:48
Default
  #9
Member
 
Rubén
Join Date: Oct 2014
Location: Munich
Posts: 47
Rep Power: 11
Yuby is on a distinguished road
Do you think that it has to do with the type of decomposition?

I tried both the hierarchical and scotch decompositions and I get the same error.

Thank you very much for your replies!
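
(For reference, the decomposition method is selected in system/decomposeParDict; a minimal sketch assuming 8 subdomains and the scotch method, which needs no extra coefficients:)

Code:
numberOfSubdomains 8;

method          scotch;   // "simple" and "hierarchical" additionally need their
                          // respective simpleCoeffs/hierarchicalCoeffs sub-dictionaries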

Old   February 24, 2015, 11:52
Default
  #10
Senior Member
 
Alexey Matveichev
Join Date: Aug 2011
Location: Nancy, France
Posts: 1,930
Rep Power: 38
alexeym has a spectacular aura about
Hi,

This is fatal for the decomposed case:

Code:
#- For parallel running
ls -d processor* | xargs -i rm -rf ./{}/0 $1
ls -d processor* | xargs -i cp -r 0.org ./{}/0 $1
(if I understand correctly, you just delete the 0 folder from each processor* folder and copy the vanilla 0.org folder into them)

You see, the fields in the 0 folder are also decomposed; here is an example of a decomposed field file:

Code:
boundaryField
{
    ...
    procBoundary0to1
    {
        type            processor;
        value           uniform 0;
    }
    procBoundary0to2
    {
        type            processor;
        value           uniform 0;
    }
}
simpleFoam complains about the absence of these patch entries.

So either run reconstructParMesh, delete the processor* folders, and run decomposePar again, or try to keep the existing 0 folders inside the processor* folders.
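
(A minimal sketch of the first option, assuming 0.org holds the undecomposed initial fields and the commands are run from the case root:)

Code:
reconstructParMesh -constant         # rebuild the complete mesh from the processor* directories
rm -rf processor*                    # discard the old decomposition
rm -rf 0 && cp -r 0.org 0            # restore the undecomposed initial fields
decomposePar                         # decompose mesh and fields together; this writes
                                     # processor*/0 with the procBoundary* entries
mpirun -np 8 simpleFoam -parallel    # the parallel run can now read its processor fields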

Old   February 25, 2015, 17:57
Smile
  #11
Member
 
Rubén
Join Date: Oct 2014
Location: Munich
Posts: 47
Rep Power: 11
Yuby is on a distinguished road
Thank you very much indeed, Alexey!!!

That is the solution to this problem.

Completely pleased!

Old   May 21, 2016, 22:20
Unhappy Have spent roughly 6 hours on this file. No luck.
  #12
New Member
 
LD
Join Date: May 2016
Location: Savannah, GA
Posts: 7
Rep Power: 9
libindaniel2000 is on a distinguished road
This is my Allrun file. Is this what you meant when you said the two lines were fatal?
Code:
#!/bin/sh
cd ${0%/*} || exit 1    # Run from this directory

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

runApplication surfaceFeatureExtract

runApplication blockMesh

runApplication decomposePar
runParallel snappyHexMesh 4 -overwrite
 #mpirun -np 4 snappyHexMesh -overwrite -parallel >log.snappyHexMesh

#- For non-parallel running
#cp -r 0.org 0 > /dev/null 2>&1

#- For parallel running
#ls -d processor* | xargs -I {} rm -rf ./{}/0
#ls -d processor* | xargs -I {} cp -r 0.org ./{}/0
reconstructPar -latestTime
runParallel patchSummary 4
runParallel potentialFoam 4
runParallel $(getApplication) 4

runApplication reconstructParMesh -constant
runApplication reconstructPar -latestTime

# ----------------------------------------------------------------- end-of-file
I still get the same error as mentioned above. What am I missing?

I have tried both the simple and hierarchical decomposition methods, but still no luck. I have also tried running mpirun directly instead of runParallel, but still no luck.
Also, those two lines are used in the motorBike tutorial, and its Allrun file works just fine.

Old   October 18, 2016, 04:53
Default
  #13
Member
 
Rudolf Hellmuth
Join Date: Sep 2012
Location: Dundee, Scotland
Posts: 40
Rep Power: 13
rudolf.hellmuth is on a distinguished road
Quote:
Originally Posted by libindaniel2000 View Post
I still get the same error as mentioned above. What am I missing?

I have tried both the simple and hierarchical decomposition methods, but still no luck. I have also tried running mpirun directly instead of runParallel, but still no luck.
Also, those two lines are used in the motorBike tutorial, and its Allrun file works just fine.
I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc line shown below:
Code:
boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}
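
(For context, that file is shipped under OpenFOAM's etc/caseDicts directory and supplies default boundary entries for the constraint patch groups, including the processor group. The sketch below is illustrative only, not the verbatim file, and the exact contents vary between OpenFOAM versions:)

Code:
processor
{
    type            processor;
    value           $internalField;
}
empty
{
    type            empty;
}
// ...similar entries for cyclic, symmetry, wedge, etc.
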
So, when you copy the dictionaries to the processor* folders, the solver will still find the BC definitions for the processor patches in "caseDicts/setConstraintTypes".

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam file because of the hash sign (#).

Best regards,
Rudolf

Old   April 17, 2017, 16:27
Default
  #14
New Member
 
Join Date: Dec 2016
Posts: 6
Rep Power: 9
bowen1024 is on a distinguished road
Quote:
Originally Posted by rudolf.hellmuth View Post
I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc line shown below:
Code:
boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}
So, when you copy the dictionaries to the processor* folders, the solver will still find the BC definitions for the processor patches in "caseDicts/setConstraintTypes".

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam file because of the hash sign (#).

Best regards,
Rudolf
Thanks! This works!

Old   October 7, 2021, 04:38
Default
  #15
New Member
 
Ben Gherardi
Join Date: Jun 2016
Posts: 17
Rep Power: 9
BenGher is on a distinguished road
Quote:
Originally Posted by rudolf.hellmuth View Post
I think I ran into the same problem, and I figured out how to fix it.

In the 0/* dictionaries you have to have the #includeEtc line shown below:
Code:
boundaryField
{
    //- Set patchGroups for constraint patches
    #includeEtc "caseDicts/setConstraintTypes"
...
}
So, when you copy the dictionaries to the processor* folders, the solver will still find the BC definitions for the processor patches in "caseDicts/setConstraintTypes".

I had deleted that #includeEtc line because ParaView on Windows was not reading my case.foam file because of the hash sign (#).

Best regards,
Rudolf

Thanks! I was going crazy trying to understand why the serial run was working but not the parallel one. But does anyone know why? Judging from #includeEtc "caseDicts/setConstraintTypes", it seems it only sets the same BC for constraint patches such as cyclic or empty.

However, in my case the domain has only ordinary patches (inlet/outlet) and I still had this problem.
__________________
Enjoy the flow

