CFD Online Discussion Forums > OpenFOAM Bugs > FE-3.1 Parallel issues with BCs (gMax, reduce...) (https://www.cfd-online.com/Forums/openfoam-bugs/139466-fe-3-1-parallel-issues-bcs-gmax-reduce.html)

Phicau July 24, 2014 15:40

FE-3.1 Parallel issues with BCs (gMax, reduce...)
 
Dear all,

I have submitted a new bug report in FOAM-extend:

https://sourceforge.net/p/openfoam-e...ndrelease/251/


After releasing IHFOAM I realized that there is some trouble when running in parallel, as MPI appears to run out of buffer memory (MPI_ERR_TRUNCATE: message truncated). This is not actually the case, since MPI_BUFFER_SIZE is set to 20000000, the same value as in other versions in which no problem arises.

I then narrowed the problem down and discovered that it is triggered when calling the parallel functions gMax, gMin, reduce...

However, the problem is not always present; it depends on which field the boundary condition is applied to. The same BC works when applied to pressure, but crashes when applied to alpha1.

I have created a small example that reproduces the problem:
- customZeroGrad: a custom zero gradient BC with some parallel operations and Info statements; a sketch of the kind of calls it makes is shown after this list. (Sorry for the large number of defined-but-unused variables; I created the BC from the wave generation one.)
- cavity_new: a very simple case derived from the cavity tutorial, ready to run with icoFoam and interFoam
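
Roughly the kind of code inside customZeroGrad that exercises the parallel calls; the function below is only a sketch I put together for illustration (the name reportPatchExtents and the variables are not from the actual IHFOAM source). In an fvPatchField this sort of thing would typically sit in updateCoeffs() or evaluate():

Code:

#include "fvCFD.H"              // mesh, fields, Info/Pout, etc.
#include "PstreamReduceOps.H"   // reduce(), minOp/maxOp

// Sketch only: the style of parallel calls made from the custom BC.
// The function and variable names are illustrative, not the IHFOAM code.
void reportPatchExtents(const fvPatch& patch)
{
    // x-coordinates of the patch face centres owned by this processor
    const scalarField patchX(patch.Cf().component(vector::X));

    // Way 1: explicit local loop followed by a global reduce
    scalar minX = GREAT;
    scalar maxX = -GREAT;
    forAll(patchX, faceI)
    {
        minX = min(minX, patchX[faceI]);
        maxX = max(maxX, patchX[faceI]);
    }
    reduce(minX, minOp<scalar>());
    reduce(maxX, maxOp<scalar>());

    // Way 2: the convenience global functions, which misbehave when the
    // patch has no faces on some of the processors
    const scalar gMinX = gMin(patchX);
    const scalar gMaxX = gMax(patchX);

    Pout<< "reduce:    xMin " << minX  << ", xMax " << maxX  << nl
        << "gMin/gMax: xMin " << gMinX << ", xMax " << gMaxX << endl;
}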

The way to reproduce is as follows:
- Compile customZeroGrad:

Code:

cd customZeroGrad
./localMake

- Run the case

Code:

cd cavity_new
./runParallelIcoFoam
./cleanCase
./runParallelInterFoam_OK
./cleanCase
./runParallelInterFoam_FAIL

The first case runs without problems. The BC is applied to the p field.
The second case is also OK. The BC is applied to pd.
The third case fails. The BC is applied to alpha1. The error message is:

Code:

[user:PID] *** An error occurred in MPI_Recv
[user:PID] *** on communicator MPI_COMM_WORLD
[user:PID] *** MPI_ERR_TRUNCATE: message truncated
[user:PID] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
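
For reference, MPI_ERR_TRUNCATE means that a posted receive buffer is smaller than the message that actually arrives, i.e. a send/receive mismatch, not that MPI has run out of buffer memory, which is consistent with MPI_BUFFER_SIZE already being large enough. A minimal standalone sketch (plain MPI, nothing to do with FOAM) that provokes the same abort:

Code:

// Minimal standalone reproduction of MPI_ERR_TRUNCATE: the receiver
// posts a 2-int buffer for a 4-int message, so the message is truncated
// and, with the default MPI_ERRORS_ARE_FATAL handler, the job aborts.
#include <mpi.h>
#include <vector>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 1)
    {
        std::vector<int> sendBuf(4, 42);   // sender ships 4 ints
        MPI_Send(sendBuf.data(), 4, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }
    else if (rank == 0)
    {
        std::vector<int> recvBuf(2);       // receiver expects only 2
        MPI_Recv(recvBuf.data(), 2, MPI_INT, 1, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);       // -> MPI_ERR_TRUNCATE
    }

    MPI_Finalize();
    return 0;
}

(compile with mpicxx and run with mpirun -np 2)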

There are actually two bugs here.

----------------------------------------------------------
First, when the case does run, gMin and gMax do not work as expected. The minimum and maximum x-coordinates of the patch are calculated in two different ways. With an explicit loop followed by a reduce statement the result is correct (minX = 0.1 and maxX = 0.2).

With gMin and gMax, on the other hand, a warning appears:


Code:

--> FOAM Warning :
    From function max(const UList<Type>&)
    in file ~/foam/foam-extend-3.1/src/foam/lnInclude/FieldFunctions.C at line 321
    empty field, returning zero

and the results are wrong:

Code:

GM Procesador 1: xMin 0, xMax 0.2
GM Procesador 0: xMin 0, xMax 0.2
GM Procesador 2: xMin 0, xMax 0.2
GM Procesador 3: xMin 0, xMax 0.2

The maximums are calculated correctly for all the processors, but the minimums are not.
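
My understanding (to be confirmed by someone who knows the library internals better) is that gMin and gMax take a local min()/max() of each processor's own list and then combine the local results across processors. On the processors where the patch has no faces, the local call hits the "empty field, returning zero" branch quoted above, and that zero then wins the global minimum (every real x-coordinate is >= 0.1), while it is harmless to the maximum only because all the coordinates happen to be positive. A plain C++ sketch of that arithmetic, with made-up local values:

Code:

// Plain C++ sketch (no MPI/FOAM): combine one local extremum per
// processor, where processors with an empty patch report 0, as the
// warning above describes.
#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    // Local results per processor; 0.0 stands for "empty patch on this
    // processor, so the local min()/max() returned zero".
    std::vector<double> localMins = {0.1, 0.0, 0.12, 0.0};
    std::vector<double> localMaxs = {0.2, 0.0, 0.18, 0.0};

    double gMinX = *std::min_element(localMins.begin(), localMins.end());
    double gMaxX = *std::max_element(localMaxs.begin(), localMaxs.end());

    // gMinX is 0 (wrong, should be 0.1); gMaxX is 0.2 (right, but only
    // because every real coordinate happens to be larger than zero).
    std::cout << "xMin " << gMinX << ", xMax " << gMaxX << '\n';
    return 0;
}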

----------------------------------------------------------
The second bug is the crash when the same parallel functions are called for the alpha1 field, even though they have proven to work fine when the BC is applied to the other fields...


Best regards,

Pablo

kmooney August 1, 2014 03:29

At OFW9 Hrv mentioned spotting a rather drastic bug in a few parallel aspects of the VOF code. He didn't go into much detail, but perhaps you two have discovered the same problem! Regardless, this should be something we can sort out; a proper MPI reduce isn't the most complicated thing out there.

Cheers,
Kyle

Phicau August 1, 2014 06:22

Hi Kyle,

I also think the issue I reported is pretty straightforward to fix. I have been doing some tests and apparently only two minor changes are needed:

/foam-extend-3.1/src/foam/fields/Fields/Field/FieldFunctions.C

from line 310:
Code:

template<class Type>
Type max(const UList<Type>& f)
{
    if (f.size())
    {
        Type Max(f[0]);
        TFOR_ALL_S_OP_FUNC_F_S(Type, Max, =, max, Type, f, Type, Max)
        return Max;
    }
    else
    {
        /*
        WarningIn("max(const UList<Type>&)")
            << "empty field, returning zero" << endl;

        return pTraits<Type>::zero;
        */
        return pTraits<Type>::min;
    }
}

TMP_UNARY_FUNCTION(Type, max)

template<class Type>
Type min(const UList<Type>& f)
{
    if (f.size())
    {
        Type Min(f[0]);
        TFOR_ALL_S_OP_FUNC_F_S(Type, Min, =, min, Type, f, Type, Min)
        return Min;
    }
    else
    {
        /*
        WarningIn("min(const UList<Type>&)")
            << "empty field, returning zero" << endl;

        return pTraits<Type>::zero;
        */
        return pTraits<Type>::max;
    }
}

TMP_UNARY_FUNCTION(Type, min)

After making the changes I recompiled the whole of FOAM-extend, plus the IHFOAM BCs and solvers.

These modifications work for me; however, I don't know whether they have any side effects, as Bernhard pointed out here: http://www.cfd-online.com/Forums/ope...llel-runs.html. I believe they shouldn't, though.
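
My reasoning for why the change should be safe: assuming gMax and gMin really are a local max()/min() followed by a global reduce, pTraits<Type>::min and pTraits<Type>::max are effectively the lowest and highest values the type can take, so they act as the identity elements of the max and min reductions and a processor with an empty field can no longer win either of them. A quick plain C++ sanity check of that property (using std::numeric_limits as a stand-in for pTraits):

Code:

// Quick check (plain C++) that the lowest/highest representable value
// cannot perturb a max/min reduction, unlike the old "return zero".
#include <algorithm>
#include <cassert>
#include <limits>

int main()
{
    const double realMin = 0.1;   // true patch minimum
    const double realMax = 0.2;   // true patch maximum

    // What an empty-field processor reports after the change
    const double emptyForMax = std::numeric_limits<double>::lowest();
    const double emptyForMin = std::numeric_limits<double>::max();

    assert(std::max(realMax, emptyForMax) == realMax);  // max unaffected
    assert(std::min(realMin, emptyForMin) == realMin);  // min unaffected

    // With the old behaviour (0.0), std::min(realMin, 0.0) would give 0.
    return 0;
}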

If this is the solution, one bug less!

Best,

Pablo

