Radically Different GAMG Pressure Solve Iterations with Varying Processor Count

#1 | August 30, 2012, 12:32
Matthew J. Churchfield (mchurchf), Member
Location: Boulder, Colorado, USA
I am performing a scaling study of OpenFOAM, using channel flow DNS as the test case. I am finding that PCG scales well down to roughly 10K-20K cells/core. GAMG also scales well, though not down to as few cells/core as PCG, and it is much faster than PCG.
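
For reference, a PCG pressure solver block for this kind of comparison would look something like the sketch below; these are typical DIC-preconditioned settings, not necessarily the exact ones used in this study:

p
{
    solver          PCG;
    preconditioner  DIC;
    tolerance       1e-5;
    relTol          0.05;
}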

However, there is some anomalous GAMG behavior that I am trying to understand. The best example is as follows: I ran a 315M-cell case for roughly 2000 time steps on 1024, 2048, and 4096 cores, using the default scotch decomposition. Everything is identical across the three runs except the number of cores. The 2048-core run requires roughly twice as many final pressure solve iterations to reach the same tolerance as the 1024- and 4096-core runs. In general I have seen that, for a fixed problem size, the number of final pressure solve iterations grows slightly as the core count increases, but this 2048-core case is an outlier. Does anyone have any idea why this may have occurred?
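
For concreteness, a minimal sketch of the system/decomposeParDict implied above; only numberOfSubdomains changes between the three runs, and the FoamFile header is standard boilerplate:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}

numberOfSubdomains  2048;   // the only entry varied: 1024, 2048, or 4096

method              scotch; // default scotch decomposition, no coefficients set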

My pressure solver settings, run with OF-2.1.0, are as follows:


p
{
    solver                GAMG;
    tolerance             1e-5;
    relTol                0.05;
    smoother              DIC;
    nPreSweeps            0;
    nPostSweeps           2;
    nFinestSweeps         2;
    cacheAgglomeration    true;
    nCellsInCoarsestLevel 100;
    agglomerator          faceAreaPair;
    mergeLevels           1;
}

pFinal
{
    solver                GAMG;
    tolerance             1e-6;
    relTol                0.0;
    smoother              DIC;
    nPreSweeps            0;
    nPostSweeps           2;
    nFinestSweeps         2;
    cacheAgglomeration    true;
    nCellsInCoarsestLevel 100;
    agglomerator          faceAreaPair;
    mergeLevels           1;
}


Thank you,

Matt Churchfield

#2 | August 31, 2012, 06:23
Bruno Santos (wyldckat), Retired Super Moderator
Location: Lisbon, Portugal
Hi Matt,

Looks like you've hit a corner case due to the number of subdomains. I know I've seen some explanations on this subject before...

As for your question: possibly the tolerance values are affecting how error accumulates between matrix operations, which leads to this specific problem. For example, imagine that the cumulative errors are well distributed across matrix operations with 1024 and 4096 processors, but with 2048 they are not. That would make for the perfect error storm.


Another possibility is the number of cells available per processor: 315M cells / 2048 processors ~= 154k cells, which is well above 50k cells, so I'd guess there are enough cells to go around. Of course, you should confirm that scotch isn't unbalancing the distribution, e.g. leaving only ~40k cells on one processor while the remaining ~110k of its share are spread across the others; decomposePar reports the cell count assigned to each processor in its output, so this is easy to check.


By the way, another detail I found some time ago that might help: http://www.cfd-online.com/Forums/ope...tml#post367979 (post #8) - it is possible to do multi-level decomposition! A sketch of what that looks like follows below.
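
A minimal sketch of a multi-level system/decomposeParDict; the two-level 128 x 16 = 2048 split and the level names here are illustrative assumptions, e.g. one level across compute nodes and one across the cores of each node:

numberOfSubdomains  2048;

method              multiLevel;

multiLevelCoeffs
{
    // levels are applied in order; the sub-dictionary names are arbitrary
    nodes
    {
        numberOfSubdomains  128;    // e.g. one subdomain per compute node
        method              scotch;
    }
    cores
    {
        numberOfSubdomains  16;     // then split each node's block per core
        method              scotch;
    }
}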

Best regards,
Bruno


Tags: gamg, scaling, multigrid

