correction number and parallel computing

Post #1, November 22, 2010, 04:21
Pawel Sosnowski (psosnows), Senior Member, Munich, Germany
Hello FOAMers,

Lately I came across a problem when trying to run a very simple case in parallel.

OpenFOAM version: 1.7.1

I was using a slightly modified icoFoam solver (with an added source term) with the standard PISO loop.

The case was a plane channel flow with quite good mesh resolution (128^3 cells).

The schemes used were:
backward for time advancement,
Gauss linear corrected for all the others.

The case was run with a time step that kept the Courant number around 0.4.

The mesh was orthogonal, so no non-orthogonality corrections were needed.

I was running the case with just 1 pressure correction. I am aware that this is not sufficient to obtain a proper pressure solution, but the problem is not related to that.
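
For reference, here is a minimal sketch of the system/fvSchemes and PISO settings matching the description above; only the schemes and the corrector count mentioned in this post are meaningful, and everything else is illustrative.

Code:
// system/fvSchemes (sketch; only the schemes named above are taken from this post)
ddtSchemes
{
    default         backward;
}

gradSchemes
{
    default         Gauss linear;
}

divSchemes
{
    default         Gauss linear;
}

laplacianSchemes
{
    default         Gauss linear corrected;
}

interpolationSchemes
{
    default         linear;
}

snGradSchemes
{
    default         corrected;
}

// system/fvSolution, PISO sub-dictionary (sketch)
PISO
{
    nCorrectors              1;
    nNonOrthogonalCorrectors 0;
}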

It is possible to run the case with the serial solver.

At the same time, any attempt to run it in parallel causes the solver to blow up: during the first time step the pressure solver exceeds the maximum number of iterations (the standard 1001), and at the next time step the solution diverges.

I have observed the problem after dividing the mesh into either 2 or 16 domains.
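
For completeness, a minimal sketch of the kind of system/decomposeParDict used for such a split; the decomposition method is not stated in this thread, so the simple method with two subdomains is only an example.

Code:
// system/decomposeParDict (sketch; method and subdomain layout are only an example)
numberOfSubdomains 2;

method          simple;

simpleCoeffs
{
    n               (2 1 1);
    delta           0.001;
}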

This worries me a bit. It seems as if there is some flaw in the procedure for solving parallel cases.

In addition, I ran the same case using Crank-Nicolson time advancement. The serial computation always worked fine, but once again, after decomposition, the case blew up (usually after a few time steps).

If possible, please take a look at this matter.
I attach the logs of both the serial and the parallel runs of the one-pressure-correction case.

Best,
Pawel S.
Attached Files:
log_0_parallel.txt (6.6 KB)
log_0_serial_beggining.txt (10.8 KB)

Post #2, January 2, 2013, 09:17
Sina Saremi (sina.s), New Member, Copenhagen, Denmark
Hi Pawel!

I am facing exactly the same problem as you described in this thread (I am using OF 2.1.x). It would be really helpful to know how you managed to solve this problem, or which set of parameters is responsible for such behaviour in parallel runs (is it the BCs? the type of solvers? something else?).

Thank you

Sina

Post #3, January 2, 2013, 10:41
Pawel Sosnowski (psosnows), Senior Member, Munich, Germany
Hello Sina,

Unfortunately, as you can see from the short "history" of this thread, it has not attracted much attention and most likely just got lost in the depths of the forum.

Regarding the problem: since I work mostly on simple cases, for time's sake I just ran serial computations. In recent days I was about to get back to parallel runs, and I have to say I am quite disappointed that the problem is still present in 2.1.x.

I believe the issue lies somewhere deep in the code, and that was the reason I started this thread.

Regarding suggestions, try checking the size of the smallest cell. I know this has little to do with parallel computing, but quite often you end up with cells with volumes of the order of 1e-30 m^3, which runs into the accuracy limits of double precision and is easy to miss. Also, "playing" with fvSchemes and fvSolution may help. I do not know what you are working on; if it just has to "shine", maybe changing a scheme or the precision will help.
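
The minimum cell volume is reported by the checkMesh utility. As an illustration of the kind of fvSolution tweak meant above, here is a typical pressure/velocity solver block with tightened tolerances; the solver choices and values are illustrative, not settings taken from this thread.

Code:
// system/fvSolution (sketch; solvers and tolerances are illustrative)
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-07;
        relTol          0;
    }

    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-06;
        relTol          0;
    }
}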

Anyway, good luck. I hope that if this topic gets some attention, the problem will soon be solved, or at least someone will explain how to deal with it.

Best,
Pawel

Post #4, March 19, 2013, 12:22
Matt_B, Member
Guys,
I'm experiencing the same problem as you: my solver runs without any problem in serial mode, but in parallel it quickly blows up, with the pressure equation not converging at all. Searching this forum I found a thread that probably explains why the pressure is calculated so badly: http://www.cfd-online.com/Forums/ope...am-solved.html

There it is suggested to change the matrix solver for the pressure equation only, from PCG to GAMG, because PCG does not parallelize well.
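
For anyone who wants to try this, a minimal sketch of a GAMG entry for the pressure equation; the coefficients below are typical values, not settings taken from the linked thread.

Code:
// system/fvSolution, pressure solver switched to GAMG (sketch; coefficients are typical values)
p
{
    solver                 GAMG;
    smoother               GaussSeidel;
    tolerance              1e-06;
    relTol                 0.01;
    nPreSweeps             0;
    nPostSweeps            2;
    cacheAgglomeration     on;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  100;
    mergeLevels            1;
}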

Matteo

Post #5, March 19, 2013, 13:58
Pawel Sosnowski (psosnows), Senior Member, Munich, Germany
Hello Matteo,

Thank you very much for that tip. In the not-so-distant future I will be moving towards bigger simulations, and parallel processing will be a necessity. Knowing a possible solution to these instabilities is very helpful.

Best,
Pawel

Post #6, June 17, 2013, 20:56
Yahoo, New Member
Hi,
I have been stuck on a problem related to the simulation of solidification on multiple processors. The problem is that at the pressure reference cell (pRefCell) I get very weird behaviour. Here is the liquid-fraction contour when the simulation is run on multiple processors:
(attached image: Untitled.jpg)
A similar problem has been reported in:
(1) Poisson eq w setReference works serial diverges in parallel
http://www.cfd-online.com/Forums/ope...-parallel.html

(2) interfoam blows up on parallel run
http://www.cfd-online.com/Forums/ope...allel-run.html

(3) temperature anomaly at pressure reference cell
http://www.cfd-online.com/Forums/ope...ence-cell.html
What has been suggested in these posts is: (1) using the GAMG solver instead of PCG for the pressure equation in parallel runs, and (2) adjusting the fluxes after including the buoyancy term. I have applied both suggestions, although the second one does not fully make sense to me, and I still have the problem.
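
For reference, in the standard solvers the pressure reference is read from the PISO/PIMPLE sub-dictionary of system/fvSolution; a minimal sketch is shown below. The cell index and value are illustrative, and a custom solidification solver may read them from a different dictionary.

Code:
// system/fvSolution, PIMPLE sub-dictionary (sketch; reference cell index and value are illustrative)
PIMPLE
{
    nCorrectors              2;
    nNonOrthogonalCorrectors 0;
    pRefCell                 0;
    pRefValue                0;
}
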
Please let me know if you have any comments.
