CFD Online Discussion Forums > OpenFOAM Bugs > foam-extend-3.0 ggi linearUpwind (https://www.cfd-online.com/Forums/openfoam-bugs/128711-foam-extend-3-0-ggi-linearupwind.html)

Pekka January 19, 2014 14:52

foam-extend-3.0 ggi linearUpwind
 
Hi,

I have run into a problem when the linearUpwind div scheme is used together with a GGI interface. The error message points to an MPI failure.
Code:


[0]
[0]
[0] --> FOAM FATAL IO ERROR:
[0] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
[0]
[0] file: IOstream at line 0.
[0]
[0]    From function IOstream::fatalCheck(const char*) const
[0]    in file db/IOstreams/IOstreams/IOstream.C at line 108.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[Kone3:18215] *** An error occurred in MPI_Recv
[Kone3:18215] *** on communicator MPI_COMM_WORLD
[Kone3:18215] *** MPI_ERR_TRUNCATE: message truncated
[Kone3:18215] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 18215 on
node Kone3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

The error occurs when the GGI pair is split across multiple CPUs; if the GGI pair stays on the same CPU, the model works fine. The error can be reproduced, for example, in the tutorial case "turboPassageRotating" by changing the div scheme from
Code:

div(phi,U)      Gauss upwind;
to
Code:

div(phi,U)      Gauss linearUpwind Gauss linear;
Similar models also work fine with the "Gauss linear;" div scheme. (In the linearUpwind specification above, the trailing "Gauss linear" is the gradient scheme that linearUpwind uses for its upwind-biased correction.)
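
For context, here is how the divSchemes block in system/fvSchemes looks with the failing setting. This is only a minimal sketch; the entries other than div(phi,U) are illustrative assumptions, not copied from the tutorial case:
Code:


divSchemes
{
    default          none;

    // failing combination: linearUpwind across a decomposed GGI pair
    div(phi,U)       Gauss linearUpwind Gauss linear;

    // illustrative assumptions, not taken from the tutorial case
    div(phi,k)       Gauss upwind;
    div(phi,epsilon) Gauss upwind;
}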
Is this a genuine bug or a user error? All ideas and comments on solving this problem are welcome.
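
For reference, the decomposition follows the usual foam-extend pattern for parallel GGI runs, where the GGI face zones are listed under globalFaceZones in system/decomposeParDict. A minimal sketch; the subdomain count, method, and zone names here are hypothetical:
Code:


numberOfSubdomains 4;

method          metis;   // assumption; any standard method applies

// foam-extend keeps the listed face zones global (present on all
// processors) so GGI can work in parallel; zone names are hypothetical
globalFaceZones
(
    insideZone
    outsideZone
);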

BR/Pekka

elvis January 24, 2014 03:44

Hi,
if I am not mistaken, there is a Mantis bug tracker:
http://sourceforge.net/apps/mantisbt.../main_page.php
