October 9, 2017, 10:21
How to run bubbleInterTrackFoam in parallel?
#1
New Member
Join Date: May 2017
Posts: 2
Rep Power: 0
Hello everybody,

I tried to run the solver bubbleInterTrackFoam to simulate a rising bubble. However, when I try to run the case in parallel I get the following error:

[1] --> FOAM FATAL IO ERROR:
[1] cannot open file
[1]
[1] file: /media/jato/Jato/simulation/bubbleintertrackfoam/bubble3D-bidirectional-parallel/processor1/0/fluidIndicator at line 0.
[1]
[1]     From function regIOobject::readStream()
[1]     in file db/regIOobject/regIOobjectRead.C at line 61.
[1]
FOAM parallel run exiting

[0] --> FOAM FATAL IO ERROR:
[0] cannot open file
[0]
[0] file: /media/jato/Jato/simulation/bubbleintertrackfoam/bubble3D-bidirectional-parallel/processor0/0/fluidIndicator at line 0.
[0]
[0]     From function regIOobject::readStream()
[0]     in file db/regIOobject/regIOobjectRead.C at line 61.
[0]
FOAM parallel run exiting

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 28311 on
node jato-Precision-Tower-7810 exiting improperly. There are two reasons
this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

Could you please give me a hint?
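For reference, this is roughly the workflow I would expect for such a parallel run. It is only a sketch of the standard decomposePar / mpirun sequence, not my exact commands: the processor count of 2 just matches the two ranks in the log above, and the case path is the one from the error message.

# Sketch of the standard foam-extend parallel workflow (assumed commands,
# not necessarily the exact ones used; case path taken from the error above).
cd /media/jato/Jato/simulation/bubbleintertrackfoam/bubble3D-bidirectional-parallel

decomposePar                       # split the case according to system/decomposeParDict

# The error complains that processor*/0/fluidIndicator cannot be opened, so it
# is worth listing the decomposed 0/ directories to see which fields are there:
ls processor0/0/ processor1/0/

mpirun -np 2 bubbleInterTrackFoam -parallel | tee log.bubbleInterTrackFoam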