Parallelising an explicit solver: is correctBoundaryConditions enough?
April 28, 2021, 04:49
#1
New Member
Shannon Leakey
Join Date: Mar 2019
Posts: 10
Rep Power: 7
Hello everyone
I am programming a second-order Godunov-type scheme in OpenFOAM using dual-time stepping. It works fine in serial, but has problems in parallel... sometimes. In parallel it sometimes converges in pseudo-time, but sometimes (if there are rapid changes near a processor boundary) it gets stuck in an extreme feedback loop at the processor boundary and blows up! This makes me think the problem is in the synchronisation between different processors.

How the scheme works

There are conserved variables Q and primitive variables W. In each pseudo-time iteration, (limited) gradients of W are calculated, then these are used to reconstruct the Riemann states W_L and W_R at each interface, which are used to calculate the conservative flux F(Q) with a Riemann solver. The flux F(Q) is then used to update the conserved variables Q, which are then used to update the primitive variables W.

correctBoundaryConditions

I use correctBoundaryConditions:
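For concreteness, the kind of update described above can be sketched as a minimal, self-contained C++ example. This is deliberately not OpenFOAM code and not the poster's solver: it assumes scalar linear advection on a periodic 1-D grid, minmod limiting, and a Rusanov flux, with all names illustrative.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minmod slope limiter: returns 0 when the neighbouring slopes disagree in sign.
static double minmod(double a, double b) {
    if (a * b <= 0.0) return 0.0;
    return (std::fabs(a) < std::fabs(b)) ? a : b;
}

// One explicit MUSCL update for du/dt + a du/dx = 0 on a periodic 1-D grid.
// The steps mirror the description above: limited gradients -> reconstructed
// left/right face states -> Riemann (Rusanov) flux -> conservative update.
static std::vector<double> musclStep(const std::vector<double>& u,
                                     double a, double dt, double dx) {
    const int n = static_cast<int>(u.size());
    auto idx = [n](int i) { return (i % n + n) % n; };  // periodic wrap

    // 1. Limited slope in each cell.
    std::vector<double> slope(n);
    for (int i = 0; i < n; ++i)
        slope[i] = minmod(u[i] - u[idx(i - 1)], u[idx(i + 1)] - u[i]);

    // 2-3. Reconstruct the states either side of face i+1/2 and evaluate
    // the Rusanov flux there.
    std::vector<double> flux(n);
    for (int i = 0; i < n; ++i) {
        double uL = u[i]          + 0.5 * slope[i];           // from the left cell
        double uR = u[idx(i + 1)] - 0.5 * slope[idx(i + 1)];  // from the right cell
        flux[i] = 0.5 * (a * uL + a * uR) - 0.5 * std::fabs(a) * (uR - uL);
    }

    // 4. Conservative update of the cell averages.
    std::vector<double> uNew(n);
    for (int i = 0; i < n; ++i)
        uNew[i] = u[i] - dt / dx * (flux[i] - flux[idx(i - 1)]);
    return uNew;
}
```

Note that step 1 needs the neighbouring cell values, while steps 2-3 need the neighbouring *slopes* as well: on a partitioned mesh both sets of data must be valid on the far side of each processor boundary.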
MPI thoughts

I have trawled through dbnsFoam on foam-extend, these forums, and the general internet, and the best reference I have found is this presentation: Instructional workshop on OpenFOAM programming. It seems to suggest that each runTime loop sends out information in a non-blocking way, and that correctBoundaryConditions acts like a blocking synchronisation point where this information is received. If this is true, then perhaps I need another exchange of information after the gradients are calculated? (Sorry if I used the wrong words - I don't know much about MPI at the moment.)

Would anybody be able to shed light on blocking and non-blocking calls in OpenFOAM? Should correctBoundaryConditions solve all my problems, or do I need something else?

Thanks in advance,
Shannon
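The blocking/non-blocking pattern described above can be illustrated in plain, self-contained C++ without MPI or OpenFOAM, using std::promise/std::future as a stand-in: setting the promise is the non-blocking "send", and future.get() is the blocking "receive" a rank must hit before using the neighbour's data. The names and the two-exchange structure (values first, then gradients) are illustrative assumptions, not the library's API.

```cpp
#include <cassert>
#include <functional>
#include <future>

// Per-rank halo channels for one processor boundary: what this rank
// publishes for its neighbour, and what it expects back.
struct Halo {
    std::promise<double> valueOut, gradOut;  // outgoing "sends"
    std::future<double>  valueIn,  gradIn;   // incoming "receives"
};

// Exchange boundary values, compute the gradient across the processor
// boundary, then exchange the freshly computed gradients as well.
// Returns the gradient computed by the neighbouring rank.
double exchangeAndGradient(double myBoundaryValue, Halo& halo, double dx) {
    halo.valueOut.set_value(myBoundaryValue);        // non-blocking "send"
    double nbrValue = halo.valueIn.get();            // blocking "receive"
    double grad = (nbrValue - myBoundaryValue) / dx; // needs the neighbour's value
    halo.gradOut.set_value(grad);                    // second exchange: gradients
    return halo.gradIn.get();                        // also blocking
}
```

The point relevant to the question: if the reconstruction needs the neighbour's gradients as well as the neighbour's values, one synchronisation per iteration is not enough; the second exchange above is the analogue of re-synchronising after the gradients are calculated.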
April 29, 2021, 13:35
#2
Senior Member
OpenFOAM inherits blocking and non-blocking calls from MPI.
For correctBoundaryConditions to take care of inter-processor boundaries, a naive approach might not suffice. It would help if you could share more information on what you are trying to accomplish.
May 3, 2021, 12:27
Left/right/owner/neighbour
#3
New Member
Shannon Leakey
Join Date: Mar 2019
Posts: 10
Rep Power: 7
Thanks, Domenico, for your reply.

I read up on MPI and went through the code for correctBoundaryConditions - I am now pretty sure correctBoundaryConditions is not the problem. I have also verified this with some classic Info/Pout outputs in the solver.

I think the problem was that I defined the left/right Riemann states with respect to owner/neighbour cells. The foam-extend solver dbnsFoam seems to do this in src/dbns/numericFlux/numericFlux.C, and I was taking it as inspiration. However, I don't think this makes sense, as the owner might not always be on the left with respect to the velocity direction, for example at a processor boundary. Perhaps dbnsFoam corrects for this inside the Riemann solver, which is how it gets away with it. I need to do some thinking about this!