psosnows November 22, 2010 04:21

correction number and parallel computing
 
2 Attachment(s)
Hello FOAMers,

Lately I came across a problem when running a very simple case in parallel.

OpenFOAM version: 1.7.1

I was using a slightly modified icoFoam solver (with an added source term) with the standard PISO loop.

The case was a plane channel flow with quite good mesh resolution (128^3).

The schemes used were:
backward for time advancement,
Gauss linear corrected for all the others.

The case was run with a time step that kept the Courant number around 0.4.

The mesh was orthogonal, so no non-orthogonal corrections were needed.

I was running the case with just one pressure correction. I am aware that this is not sufficient to obtain a proper pressure solution, but the problem is not related to that.
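For reference, the scheme and PISO settings above correspond to dictionary entries roughly like these (a sketch only, using standard icoFoam-style keywords; the actual case files may differ in detail):

Code:
    // system/fvSchemes (sketch)
    ddtSchemes
    {
        default         backward;
    }
    gradSchemes
    {
        default         Gauss linear;
    }
    divSchemes
    {
        div(phi,U)      Gauss linear;
    }
    laplacianSchemes
    {
        default         Gauss linear corrected;
    }
    snGradSchemes
    {
        default         corrected;
    }

    // system/fvSolution (sketch): single pressure correction on an orthogonal mesh
    PISO
    {
        nCorrectors              1;
        nNonOrthogonalCorrectors 0;
    }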

The case runs fine with the serial solver.

However, any attempt to run it in parallel causes the solver to blow up: during the first time step the pressure solver exceeds the allowed number of iterations (the standard 1001), and at the next time step the solution diverges.

I have observed the problem after decomposing the mesh into both 2 and 16 domains.

This worries me a bit; it seems there is some flaw in the procedure for solving cases in parallel.

In addition, I ran the same case using Crank-Nicolson time advancement. The serial computation always worked fine, but once again, after decomposition, the case blew up (usually after a few time steps).

If possible, please take a look at this matter.
I am attaching the logs of both serial and parallel runs of the one-pressure-correction case.

Best,
Pawel S.

sina.s January 2, 2013 09:17

Hi Pawel!

I am facing exactly the same problem you described in this thread (I am using OF 2.1.x). It would be really helpful to know how you managed to solve it, or which parameters are responsible for this behaviour in parallel runs (is it the BCs, the type of solvers, or something else?).

Thank you

Sina

psosnows January 2, 2013 10:41

Hello Sina,

Unfortunately, as you can see from the short "history" of this post, it has not drawn much attention and most likely just got lost in the depths of the forum.

Regarding the problem: since I work mostly on simple cases, for time's sake I just ran serial computations. In recent days I was about to get back to parallel runs, and I have to say I am quite disappointed that the problem is still present in 2.1.x.

I believe the issue lies somewhere deep in the code, and that was the reason I started this thread.

Regarding suggestions, try checking the size of the smallest cell. I know this has little to do with parallel computing, but quite often you end up with cells of the order of 1e-30 m^3, which is small enough to cause round-off trouble in double precision and is easy to miss. "Playing" with fvSchemes and fvSolution may also help. I do not know what you are working on; if it just has to "shine", maybe changing a scheme or the precision will help.
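To give an idea of what I mean by "playing" with fvSolution, the pressure solver entry is the usual place to start (just a sketch; the numbers below are examples to experiment with, not a recommendation for any specific case):

Code:
    // system/fvSolution (sketch): tolerances to experiment with
    solvers
    {
        p
        {
            solver          PCG;
            preconditioner  DIC;
            tolerance       1e-07;   // absolute tolerance: try tightening or relaxing
            relTol          0.01;    // relative tolerance per solver call
            maxIter         1000;    // cap on solver iterations
        }
    }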

Anyway, good luck. I hope that if this topic gets some attention, the problem may soon be solved, or at least someone will explain how to deal with it.

Best,
Pawel

Matt_B March 19, 2013 12:22

Guys,
I'm experiencing the same problem as you: my solver runs without any problem in serial mode, but quickly blows up in parallel, with the pressure equation not converging at all. Searching this forum I found a thread that probably explains why the pressure is calculated so badly: http://www.cfd-online.com/Forums/ope...am-solved.html

The suggestion there is to change the matrix solver, for pressure only, from PCG to GAMG, because PCG reportedly does not behave well in parallel.
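In practice that means replacing the p entry in fvSolution with a GAMG one, roughly like this (a sketch only; the smoother and tolerances below are typical values to be tuned per case):

Code:
    // system/fvSolution (sketch): GAMG for the pressure equation
    p
    {
        solver                 GAMG;
        smoother               GaussSeidel;
        tolerance              1e-07;
        relTol                 0.01;
        nPreSweeps             0;
        nPostSweeps            2;
        cacheAgglomeration     on;
        agglomerator           faceAreaPair;
        nCellsInCoarsestLevel  100;
        mergeLevels            1;
    }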

Matteo

psosnows March 19, 2013 13:58

Hello Matteo,

Thank you very much for that tip. In the not so distant future I will be moving towards bigger simulations, and parallel processing will be a necessity. Knowing a possible remedy for these instabilities is very helpful.

Best,
Pawel

Yahoo June 17, 2013 20:56

1 Attachment(s)
Hi
I have been stuck on a problem related to simulating solidification on multiple processors. The problem is that at the pressure reference cell (pRefCell) I get very weird behavior. Here is the liquid fraction contour when the simulation is run on multiple processors.
Attachment 22779
A similar problem has been reported in:
(1) Poisson eq w setReference works serial diverges in parallel
http://www.cfd-online.com/Forums/ope...-parallel.html

(2) interfoam blows up on parallel run
http://www.cfd-online.com/Forums/ope...allel-run.html

(3) temperature anomaly at pressure reference cell
http://www.cfd-online.com/Forums/ope...ence-cell.html
What has been suggested in these posts is: (1) using the GAMG solver instead of PCG for pressure in parallel runs, and (2) adjusting the fluxes after including the buoyancy term. I have applied both suggestions (although the second one does not fully make sense to me), but I still have the problem.
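For reference, this is roughly how suggestion (1) and the pressure reference are set in my fvSolution (a sketch only: depending on the solver the pressure field may be p or p_rgh, the sub-dictionary may be PISO/PIMPLE/SIMPLE, and the reference cell label below is just an example):

Code:
    // system/fvSolution (sketch)
    solvers
    {
        p        // or p_rgh, depending on the solver
        {
            solver          GAMG;
            smoother        GaussSeidel;
            tolerance       1e-08;
            relTol          0.01;
        }
    }

    PIMPLE   // or PISO/SIMPLE, depending on the solver
    {
        nCorrectors              2;
        nNonOrthogonalCorrectors 0;
        pRefCell                 0;   // example cell label only
        pRefValue                0;
    }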
Please let me know if you have any comments.

