CFD Online Discussion Forums


Gerry Kan September 21, 2020 03:31

Boundary Conditions in Parallel
 
Howdy Foamers:

I am wondering at what point in the solver time loop boundary conditions are applied. Initially I thought it might be in the "*Eqn.H" files, but that did not seem to be the case. I also didn't see anything that would indicate an fvPatchField in either the solver source code (pimpleFoam.C, for instance) or createFields.H.

In other words, where would I expect to see fvPatchFields for the relevant governing equations being declared and called in the OpenFOAM object hierarchy?

Thank you in advance, Gerry.

P.S. - I am asking because I implemented a custom fvPatchField. At the moment it works in serial, but hangs in parallel without a core dump or any error messages. Although I believe this possibly has something to do with patch indexing in parallel, the hang takes place outside of my fvPatchField member function calls. Going through the OpenFOAM call graph didn't give me any insight as to where (else) I should be looking for the problem.

dlahaye September 21, 2020 06:24

* Solving an equation typically involves going through the following four steps (a minimal sketch is at the end of this post):
(1/4) define the equation, with an fvScalarMatrix as the output;
(2/4) relax the equation by calling the relax member function of the class, meaning putting as much weight on the diagonal of the matrix as possible;
(3/4) apply the constraints by calling the constraint (or correct?) member function; and
(4/4) solve the equation by calling the solve member function.

* The definitions of relax and correct as member functions of fvMatrix are given in https://github.com/OpenFOAM/OpenFOAM...rix/fvMatrix.H

* See also Chapter 18 of the book by Moukalled, Mangani and Darwish (The Finite Volume Method in Computational Fluid Dynamics).
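
For illustration, here is a minimal sketch of these four steps as they appear in a typical UEqn.H (loosely modeled on pimpleFoam; the exact terms, and whether constraints come from fvOptions, differ per solver and OpenFOAM version):

Code:

// Sketch of a momentum predictor, loosely following pimpleFoam's
// UEqn.H; U, phi, p, turbulence and fvOptions come from the
// solver's createFields.H.

// (1/4) Define the equation: assembling the fvVectorMatrix pulls in
// the boundary-condition coefficients of U's patch fields.
fvVectorMatrix UEqn
(
    fvm::ddt(U)
  + fvm::div(phi, U)
  + turbulence->divDevReff(U)
);

// (2/4) Under-relax: shift weight onto the matrix diagonal.
UEqn.relax();

// (3/4) Apply constraints (here via fvOptions).
fvOptions.constrain(UEqn);

// (4/4) Solve: the boundary contributions enter the linear system.
solve(UEqn == -fvc::grad(p));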

Gerry Kan September 22, 2020 08:38

Dear Domenico:

Thanks for your response. I can see these steps by looking at the *Eqn.H files. I also thought that the boundary conditions would have been applied in the constraining step (your step 3/4).

The only thing I don't get is that, even if I comment out all the equation definition and solving bits in the *Eqn.H files (so that they essentially define the necessary variables but do nothing), the boundary conditions are still being called. This means they must be applied somewhere else; I just don't know where.

Gerry.

Gerry Kan September 22, 2020 09:07

Dear Domenico:

On second thought ... I am going through fvMatrix from the link in your message, and I see a number of boundary-related functions. I will look in this direction and see whether it leads me somewhere.
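
For reference, the boundary-related hooks I am looking at live on fvPatchField; roughly (paraphrased declarations from memory, so a sketch only; exact signatures vary between OpenFOAM versions):

Code:

// Virtual members of fvPatchField<Type>; each boundary condition
// overrides these to inject its contribution into the fvMatrix.

//- Update the coefficients associated with the patch field;
//  called during matrix assembly
virtual void updateCoeffs();

//- Evaluate the patch field, e.g. from correctBoundaryConditions()
virtual void evaluate
(
    const Pstream::commsTypes commsType = Pstream::commsTypes::blocking
);

//- Matrix diagonal coefficients for evaluating the patch value
virtual tmp<Field<Type>> valueInternalCoeffs
(
    const tmp<scalarField>& weights
) const;

//- Matrix source coefficients for evaluating the patch value
virtual tmp<Field<Type>> valueBoundaryCoeffs
(
    const tmp<scalarField>& weights
) const;

//- Matrix diagonal coefficients for the patch-normal gradient
virtual tmp<Field<Type>> gradientInternalCoeffs() const;

//- Matrix source coefficients for the patch-normal gradient
virtual tmp<Field<Type>> gradientBoundaryCoeffs() const;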

Gerry.

dlahaye September 22, 2020 11:27

Dear Gerry,

Thank you for your input.

I do not know whether Dirichlet (fixed value) and Neumann (fixed flux) boundary conditions are treated on an equal footing in OpenFOAM.

Does this help: https://github.com/UnnamedMoose/Basi.../OFtutorial7.C

What solver and test case are you using for your test?

Good luck, Domenico.

Gerry Kan September 22, 2020 14:46

Dear Domenico:

Thanks for your input. Yes, I have read this on previous occasions, and based on it I have implemented several custom boundary conditions.

What I am trying to do is implement a zeroGradient boundary condition that works on a cyclic patch (in my case called "cyclicZeroGradient"). I would like to prevent certain scalar variables, for instance mass fraction, from building up over time through the cyclic boundary condition.

I took the existing cyclicFvPatchField, renamed it (say, to "cyclicZeroGradient"), and modified the member functions neighbourPatchField() and updateInterfaceMatrix() so that they behave like zeroGradientFvPatchField.

This works on a single CPU in the manner I expected, but in parallel the solver hangs immediately after the call to neighbourPatchField(). However, I also noticed that, under this implementation, reverting cyclicZeroGradient back to cyclic also hangs in parallel. I hope it is simply that I broke something I was not aware of.

It also seems to me that the cyclic patch and cyclicFvPatchField function differently from typical fvPatchFields like fixedValue and fixedGradient, as the mapping of the cyclic boundaries is already done (in blockMesh, for instance) without using an extra utility to modify the mesh to reflect the coupling. I would like to take advantage of this. A rough skeleton of what I mean follows.
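
For concreteness, here is a hypothetical skeleton of the class (member names follow what I wrote above, and the updateInterfaceMatrix signature is the one I recall from OpenFOAM 7; both differ between versions, so treat this as a sketch, not working code):

Code:

#include "cyclicFvPatchField.H"

// Hypothetical cyclicZeroGradient patch field: derived from the
// coupled cyclic BC but made to behave like zeroGradient.
template<class Type>
class cyclicZeroGradientFvPatchField
:
    public cyclicFvPatchField<Type>
{
public:

    //- Return the values seen across the coupled patch; modified to
    //  return this side's internal values so the BC acts like
    //  zeroGradient. Caution: on a decomposed mesh the two halves of
    //  a cyclic may live on different ranks, so any communication
    //  triggered here must be executed by every rank that shares the
    //  interface, or the run deadlocks.
    virtual tmp<Field<Type>> neighbourPatchField() const;

    //- Interface contribution to the matrix during the solve
    //  (signature as in OpenFOAM 7; other versions differ)
    virtual void updateInterfaceMatrix
    (
        Field<Type>& result,
        const Field<Type>& psiInternal,
        const scalarField& coeffs,
        const Pstream::commsTypes commsType
    ) const;
};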

Again thanks in advance,
Gerry.

P.S. - I also have a custom solver, but I imagine this new fvPatchField can be tested with something relatively simple such as pimpleFoam.

Gerry Kan September 23, 2020 04:50

I managed to narrow down the problem. The hang is caused by the following snippet of code, which sits outside of the boundary condition in question.

Code:

if (Pstream::master())
{
    scalar domainMass = gSum(cellMass);
}

The strange thing is that this code used to work a week ago on all the clusters I worked with. As far as I know I have not touched this part of the code for over a month, so why it has all of a sudden started to hang I have no idea.

Gerry.

Gerry Kan September 23, 2020 06:07

Probably my bad: it seems that if gSum() is called outside of the Pstream::master() condition block, the problem goes away.

It hadn't hung before now probably because this block of code was never executed at run time until recently (i.e., when the solver started to hang).
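
That is consistent with gSum() being a collective reduction: every rank has to take part, so a call guarded by Pstream::master() leaves the other ranks waiting forever. The working arrangement looks like this (sketch; cellMass as in the snippet above):

Code:

// gSum() wraps a global (all-rank) reduction, so every MPI rank
// must reach this call; only the use of the result should be
// restricted to the master.
scalar domainMass = gSum(cellMass);

if (Pstream::master())
{
    // e.g. log the value (Info prints only on the master anyway)
    Info<< "domain mass = " << domainMass << nl;
}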

Gerry.

