November 7, 2013, 13:53
simpleFoam parallel

#1
New Member
Andrew Mortimer
Join Date: Oct 2013
Posts: 15
Rep Power: 12
Hi,

I'm having a bit of trouble running simpleFoam in parallel. I am using the motorBike tutorial and trying to run it on 6 cores (the processor is an i7-4930K). I ran blockMesh, surfaceFeatureExtract and snappyHexMesh, then commented out the functions section of the controlDict file (following a tutorial from a lecturer). After that I ran decomposePar and viewed the individual meshes in paraFoam, and everything seemed to have split up evenly. As the next step I ran

Code:
mpirun -np 6 simpleFoam -parallel

which produced the following output:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.2                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.2-9240f8b967db
Exec   : simpleFoam -parallel
Date   : Nov 07 2013
Time   : 18:47:22
Host   : "andrew-pc"
PID    : 620
Case   : /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel
nProcs : 6
Slaves : 5
(
"andrew-pc.621"
"andrew-pc.622"
"andrew-pc.623"
"andrew-pc.624"
"andrew-pc.625"
)
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 3

Reading field p

[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor0/3/p at line 0.
[0]     From function regIOobject::readStream()
[0]     in file db/regIOobject/regIOobjectRead.C at line 73.
[0] FOAM parallel run exiting
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor1/3/p at line 0.
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 73.
[1] FOAM parallel run exiting
[2] --> FOAM FATAL IO ERROR:
[2] cannot find file
[2] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor2/3/p at line 0.
[2]     From function regIOobject::readStream()
[2]     in file db/regIOobject/regIOobjectRead.C at line 73.
[2] FOAM parallel run exiting
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor3/3/p at line 0.
[3]     From function regIOobject::readStream()
[3]     in file db/regIOobject/regIOobjectRead.C at line 73.
[3] FOAM parallel run exiting
[4] --> FOAM FATAL IO ERROR:
[4] cannot find file
[4] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor4/3/p at line 0.
[4]     From function regIOobject::readStream()
[4]     in file db/regIOobject/regIOobjectRead.C at line 73.
[4] FOAM parallel run exiting
[5] --> FOAM FATAL IO ERROR:
[5] cannot find file
[5] file: /home/andrew/OpenFOAM/andrew-2.2.2/run/motorBikeParallel/processor5/3/p at line 0.
[5]     From function regIOobject::readStream()
[5]     in file db/regIOobject/regIOobjectRead.C at line 73.
[5] FOAM parallel run exiting
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 5 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 621 on
node andrew-pc exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[andrew-pc:00619] 5 more processes have sent help message help-mpi-api.txt / mpi-abort
[andrew-pc:00619] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I re-ran the same command twice more, redirecting the output (mpirun -np 6 simpleFoam -parallel > simpleFoamParallel.log), and got the same set of "cannot find file processorN/3/p" errors from all six ranks each time.

Thanks for any help,
Andrew
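The failure described above can be sketched with mock directories. This is a hedged reconstruction, not the actual case: the suspected cause (an assumption, though consistent with "Create mesh for time = 3" in the log) is that snappyHexMesh was run without -overwrite, so the final mesh was written to time directory 3, while decomposePar only distributed the initial fields into each processor*/0/. The solver then starts from the latest time and looks for processor*/3/p, which was never created:

```shell
#!/bin/sh
# Mock layout reproducing the suspected failure mode.
# Assumption: mesh ends up in time directory 3, fields only in 0/.
tmp=$(mktemp -d)
cd "$tmp" || exit 1

for i in 0 1 2 3 4 5; do
    # decomposePar writes the initial fields into processor*/0/ ...
    mkdir -p "processor$i/0" "processor$i/3/polyMesh"
    touch "processor$i/0/p" "processor$i/0/U"
    # ... but nothing ever creates processor*/3/p.
done

# The lookup the solver effectively performs at startup:
for i in 0 1 2 3 4 5; do
    if [ ! -f "processor$i/3/p" ]; then
        echo "processor$i/3/p missing"
    fi
done
```

If this is indeed the cause, one commonly suggested remedy is to run snappyHexMesh with its -overwrite option (so the mesh replaces the one in constant/ instead of going into a new time directory) before running decomposePar; alternatively, the 0/ fields can be made available at the time directory the solver actually starts from.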