August 30, 2012, 12:32
Radically Different GAMG Pressure Solve Iterations with Varying Processor Count
Matthew J. Churchfield
Join Date: Nov 2009
Location: Boulder, Colorado, USA
Posts: 49
Rep Power: 10
I am performing a scaling study of OpenFOAM, using channel flow DNS as the test case. I am finding that PCG scales well until roughly 10K-20K cells/core is reached. GAMG does not scale down quite as far as PCG does, but it is much faster than PCG.
However, there is some anomalous behavior that I am trying to understand with GAMG. The best example is as follows: I ran a case with 315M cells for roughly 2000 time steps. I tried this on 1024, 2048, and 4096 cores. I am using default scotch decomposition. Everything is exactly the same in all three cases, except for the number of cores used. The 2048 case requires roughly twice the final pressure solve iterations to achieve the same tolerance as the 1024 and 4096 cases. In general, I have seen that for a fixed problem size, as the number of cores used is increased, the number of final pressure solve iterations required increases slightly, but this 2048 case is an outlier. Does anyone have any idea why this may have occurred?
My pressure solver settings are as follows, and I used OF-2.1.0:
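(The settings block itself did not survive the quote; purely for context, a typical GAMG pressure solver entry in fvSolution for OF-2.1.x looks something like the following. The values here are illustrative defaults, not necessarily the ones the poster used.)

```
// fvSolution excerpt - illustrative GAMG settings, not the poster's actual ones
p
{
    solver                 GAMG;
    tolerance              1e-06;
    relTol                 0.01;
    smoother               GaussSeidel;
    nPreSweeps             0;
    nPostSweeps            2;
    cacheAgglomeration     on;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  100;
    mergeLevels            1;
}
```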
August 31, 2012, 06:23
Join Date: Mar 2009
Location: Lisbon, Portugal
Blog Entries: 39
Rep Power: 103
Looks like you've hit a corner case due to the number of divisions. I know I've seen some explanations on this subject... OK, two I've found:
Another possibility is the number of cells available per processor: 315M cells / 2048 processors ~= 154k cells, well above 50k cells. Mmm, I guess that's enough cells to go around. Of course, you should confirm that scotch isn't unbalancing the distribution, e.g. assigning only ~40k cells to a single processor and spreading the remaining ~110k cells' worth among the other processors.
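The cells-per-core figures above are easy to check; a quick sketch using the numbers from the thread (the ~50k threshold is the rule of thumb mentioned earlier):

```python
# Average cells per MPI rank for the fixed 315M-cell case,
# assuming a perfectly even decomposition.
TOTAL_CELLS = 315_000_000

def cells_per_core(n_cores):
    """Average cells per rank (integer division; real partitions vary slightly)."""
    return TOTAL_CELLS // n_cores

for n in (1024, 2048, 4096):
    print(f"{n:5d} cores -> ~{cells_per_core(n):,} cells/core")
```

All three counts stay above the ~50k rule of thumb, so raw cell count alone doesn't explain the 2048-core outlier; checking the actual per-rank balance from the decomposition is the next step.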
By the way, another detail I found some time ago that might help: SnappyHexmesh crashes with many processes, post #8 - it's possible to do multi-level decomposition!
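The multi-level decomposition mentioned there is configured in decomposeParDict; a minimal sketch (the per-level subdomain counts here are illustrative - their product must equal numberOfSubdomains):

```
// decomposeParDict excerpt - illustrative two-level scotch decomposition
numberOfSubdomains  2048;
method              multiLevel;

multiLevelCoeffs
{
    level0
    {
        numberOfSubdomains  128;    // coarse split, e.g. one group per node
        method              scotch;
    }
    level1
    {
        numberOfSubdomains  16;     // fine split within each group
        method              scotch;    // 128 x 16 = 2048
    }
}
```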
Tags: gamg, scaling, multigrid