Yuby
February 18, 2015 07:36
Error running simpleFoam in parallel
Hi FOAMers!
I recently posted another thread, but I have run into a different issue and would like to ask for your help. I have searched the forums but haven't found anything about this error.
I ran decomposePar in order to do my snappy in parallel, and afterwards I ran mpirun -np 8 simpleFoam -parallel to run simpleFoam in parallel, but I get the error below.
Can you help me find the reason?
Thank you very much in advance!
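For context, each decomposed field file (e.g. processor4/0/p) is expected to contain a patchField entry for every processor patch that decomposePar creates. A minimal sketch of the kind of entry the error below says is missing (the patch name procBoundary4to5 is taken from the message for processor 4; the value shown is only a placeholder, not from my case):

```
// Sketch of the processor-patch entry expected in processor4/0/p
// (patch name from the error message; value is a placeholder)
boundaryField
{
    // ... the case's physical patches ...

    procBoundary4to5
    {
        type    processor;
        value   uniform 0;
    }
}
```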
Code:
usuario@usuario-SATELLITE-P50-A-14G:~/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon$ mpirun -np 8 simpleFoam -parallel
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.3.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.3.0-f5222ca19ce6
Exec : simpleFoam -parallel
Date : Feb 18 2015
Time : 13:25:03
Host : "usuario-SATELLITE-P50-A-14G"
PID : 7464
Case : /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon
nProcs : 8
Slaves :
7
(
"usuario-SATELLITE-P50-A-14G.7465"
"usuario-SATELLITE-P50-A-14G.7466"
"usuario-SATELLITE-P50-A-14G.7467"
"usuario-SATELLITE-P50-A-14G.7468"
"usuario-SATELLITE-P50-A-14G.7469"
"usuario-SATELLITE-P50-A-14G.7470"
"usuario-SATELLITE-P50-A-14G.7471"
)
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
Reading field p
[4]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] Cannot find patchField entry for procBoundary4to5
[4]
[4] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor4/0/p.boundaryField from line 28 to line 21.
[4]
[4] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[4] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[4]
FOAM parallel run exiting
[4]
[5]
[5]
[6]
[6]
[6] --> FOAM FATAL IO ERROR:
[6] Cannot find patchField entry for procBoundary6to5
[6]
[6] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor6/0/p.boundaryField from line 28 to line 21.
[6]
[7]
[7]
[7] --> FOAM FATAL IO ERROR:
[7] Cannot find patchField entry for procBoundary7to4
[7]
[7] file: [6] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[6] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[6]
FOAM parallel run exiting
[6]
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] Cannot find patchField entry for procBoundary1to0
[1]
[1] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor1/0/p.boundaryField from line 28 to line 21.
[1]
[1] From function /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor7/0/p.boundaryField from line 28 to line 21.
[7]
[7] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[7] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[7]
FOAM parallel run exiting
[7]
[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] Cannot find patchField entry for procBoundary0to2
[0]
[0] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor0/0/p.boundaryField from line 28 to line 21.
[0]
[0] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[0] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[0]
FOAM parallel run exiting
[0]
GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[1] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] Cannot find patchField entry for procBoundary2to0
[2]
[2] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor2/0/p.boundaryField from line 28 to line 21.
[2]
[2] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[2] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] Cannot find patchField entry for procBoundary3to0
[3]
[3] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor3/0/p.boundaryField from line 28 to line 21.
[3]
[3] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[3] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[3]
FOAM parallel run exiting
[3]
[5] --> FOAM FATAL IO ERROR:
[5] Cannot find patchField entry for procBoundary5to4
[5]
[5] file: /home/usuario/OpenFOAM/usuario-2.3.0/run/Frisbee_KEpsilon/processor5/0/p.boundaryField from line 28 to line 21.
[5]
[5] From function GeometricField<Type, PatchField, GeoMesh>::GeometricBoundaryField::readField(const DimensionedField<Type, GeoMesh>&, const dictionary&)
[5] in file /home/usuario/OpenFOAM/OpenFOAM-2.3.0/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 209.
[5]
FOAM parallel run exiting
[5]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 5 with PID 7469 on
node usuario-SATELLITE-P50-A-14G exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[usuario-SATELLITE-P50-A-14G:07463] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[usuario-SATELLITE-P50-A-14G:07463] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages