FE-3.1 Parallel issues with BCs (gMax, reduce...)
Dear all,
I have submitted a new bug report in FOAM-extend: https://sourceforge.net/p/openfoam-e...ndrelease/251/

After releasing IHFOAM I realized that there is some trouble running in parallel: MPI reports that it has run out of buffer memory (MPI_ERR_TRUNCATE: message truncated). This is not actually the case, as MPI_BUFFER_SIZE is 20000000, the same value used with other versions in which no problem arises. I then narrowed the problem down and discovered that it is triggered when calling the parallel functions: gMax, gMin, reduce... However, the problem is not always present; it depends on which field the boundary condition is applied to. The same BC works when applied to pressure, but crashes when applied to alpha1.

I have created a small example that reproduces the problem:

- customZeroGrad: a custom zero-gradient BC with some parallel operations and Info statements. (Sorry for the large number of defined-but-unused variables; I created the BC from the wave-generation one.)
- cavity_new: a very simple case derived from the cavity tutorial, ready to run with icoFoam and interFoam.

The way to reproduce is as follows:

- Compile customZeroGrad:

Code:
cd customZeroGrad

- Run cavity_new:

Code:
cd cavity_new

The second case is also OK; the BC is applied to pd. The third case fails; the BC is applied to alpha1. The error message is:

Code:
[user:PID] *** An error occurred in MPI_Recv

First, when the case does run, gMin and gMax do not work as expected. The minimum and maximum points of the patch are calculated in two different ways. With a loop and a reduce statement the result is correct (minX = 0.1 and maxX = 0.2). With gMin and gMax, however, a warning appears:

Code:
--> FOAM Warning :

Code:
GM Procesador 1: xMin 0, xMax 0.2

The second bug is the failure when the parallel functions are called for the alpha1 field, even though they have proven to work elsewhere...

Best regards,
Pablo
At OFW9 Hrv mentioned spotting a rather drastic bug in a few parallel aspects of the VOF code. He didn't go into much detail, but perhaps you two have discovered the same problem! Regardless, this should be something we can sort out; a proper MPI reduce isn't the most complicated thing out there.
Cheers,
Kyle
Hi Kyle,
I also think the issue I reported is fairly straightforward to fix. I have been doing some tests, and apparently only two minor changes are needed in /foam-extend-3.1/src/foam/fields/Fields/Field/FieldFunctions.C, from line 310:

Code:
template<class Type>

These modifications work for me. However, I don't know whether they have any side effects, as Bernhard pointed out here: http://www.cfd-online.com/Forums/ope...llel-runs.html , but I believe they shouldn't. If this is the solution, that's one bug less!

Best,
Pablo