Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Programming & Development

MPI Error in Custom Utility


February 7, 2022, 06:43

jackdoubleyou
New Member
Join Date: Nov 2016
Posts: 8

Hi everyone,

I've written a bespoke utility that identifies droplets and other structures in a VOF field, and I want to parallelise it so that it can be incorporated into a multiphase solver. The parallelisation is almost complete; however, I've run into some edge cases that I cannot solve.

Without going into endless detail, the parallelisation works by passing droplet IDs across processor boundaries using a volScalarField that tracks which droplet each cell belongs to. The utility then writes out a connectivity file for each processor, so that any droplets crossing processor boundaries can be reconstructed in a post-processing step. My current issue seems to occur only when a processor domain contains no droplets to identify.

The following loop iterates over all processor patches and, for each face whose adjacent cell value is greater than 0, sets the patch face value to that cell-centre value.

forAll(mesh.boundaryMesh(), patchi)
{
    if (isA<processorPolyPatch>(mesh.boundaryMesh()[patchi]))
    {
        forAll(id.boundaryField()[patchi], facei)
        {
            const label adjacentCell =
                mesh.boundaryMesh()[patchi].faceCells()[facei];

            if (id[adjacentCell] > 0)
            {
                id.boundaryFieldRef()[patchi][facei] = id[adjacentCell];
            }
        }
    }
}

The utility then crashes at the initEvaluate and evaluate calls with an MPI_Wait error:

[proteus:25213] *** An error occurred in MPI_Wait
[proteus:25213] *** reported by process [1815216129,7]
[proteus:25213] *** on communicator MPI_COMM_WORLD
[proteus:25213] *** MPI_ERR_TRUNCATE: message truncated
[proteus:25213] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[proteus:25213] ***    and potentially your MPI job)

I suspect that the processor with no droplets reaches this section of the code well before the others, and that this somehow causes the crash, although I can't see why it should.

In an attempt to synchronise the processors before this loop, I added the following reduce call, but it also throws an error:

label tmp = Pstream::myProcNo();
reduce(tmp, maxOp<label>());

[proteus:25771] Read -1, expected 86400, errno = 14
[proteus:25771] *** An error occurred in MPI_Recv
[proteus:25771] *** reported by process [4918845867728568321,7]
[proteus:25771] *** on communicator MPI_COMM_WORLD
[proteus:25771] *** MPI_ERR_TRUNCATE: message truncated
[proteus:25771] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[proteus:25771] ***    and potentially your MPI job)

Any thoughts on what might be happening? I haven't been able to find anything useful based on the error messages, and OpenFOAM's parallel implementation is something I'm fairly new to. I'm using OpenFOAM v5.0, compiled on SLED 15 SP3 with GCC 7.5.0 and OpenMPI 3.1.1.


Tags: mpi error, openfoam v5.0
