Possible problem in cyclic boundary conditions in parallel cases
Hello,
I was running a simple scalar transport problem in a periodic square (all four sides periodic), with a fixed sinusoidal velocity profile. The initial condition for the scalar is shown in the attachment Initial.png: values are 0 and 1 at t = 0. If the case is run in serial mode, the result at t = 0.1 is shown in Serial.png, with the expected behaviour. However, when the case is split over two processors, at the same time there are inconsistencies along the decomposition surface (the vertical line in the middle of the domain; I used the "simple" decomposition for the pictures, but a similar problem is present with METIS too), where the concentration increases unexpectedly to 2 (see Parallel.png). Half of the periodic patch is affected as well. The blockMeshDict is attached for your convenience. Edit: OpenFOAM version: 1.5.x from the git repository, updated today. Best, 
Hi again :)
Good news: it's not a bug in the cyclic conditions; it's due to how I set the velocity. I have a fixed velocity field that changes at specified times, as a function of position, so I did something like:

    forAll(mesh.cells(), celli)
    {
        scalar x = mesh.C()[celli].component(vector::X);
        scalar y = mesh.C()[celli].component(vector::Y);

        if (...)
        {
            U[celli] = vector(sin(y*...), 0.0, 0.0);
        }
        else
        {
            U[celli] = vector(0.0, sin(x*...), 0.0);
        }
    }

The problem is that the velocity field is not updated correctly along the boundary introduced by the domain decomposition. Is there a better way to do this that also works in parallel? Best, 
Call correctBoundaryConditions on the field after changing the internal field.
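For reference, a sketch of how that fix would look in the loop above (the branch condition and the sine arguments here are hypothetical placeholders, since the originals were elided in the post):

    // Update the internal field cell by cell, as before
    forAll(mesh.cells(), celli)
    {
        const scalar x = mesh.C()[celli].component(vector::X);
        const scalar y = mesh.C()[celli].component(vector::Y);

        // "y > 0.5", kx and ky are made-up stand-ins for the
        // elided condition and wavenumbers in the original code
        if (y > 0.5)
        {
            U[celli] = vector(Foam::sin(ky*y), 0.0, 0.0);
        }
        else
        {
            U[celli] = vector(0.0, Foam::sin(kx*x), 0.0);
        }
    }

    // Evaluate all patch fields, including the processor patches
    // introduced by the decomposition and the cyclic patches, so
    // the halo values on each processor are refreshed to match
    // the updated internal field
    U.correctBoundaryConditions();

Without the final call, only the internal field is changed; the processor-patch values keep their old (zero or stale) state, which is exactly what produced the artefacts along the decomposition surface.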

Thanks :D
