CFD Online Discussion Forums


mechy May 5, 2013 13:52

best values for GAMG solver (nCellsInCoarsestLevel , nPreSweeps 0, nPostSweeps)
 
What are the best values for the GAMG parameters, especially nCellsInCoarsestLevel, nPreSweeps, and nPostSweeps?

Is there a formula to calculate them based on the number of cells?

Regards

chegdan May 9, 2013 11:17

Try this

http://www.cfd-online.com/Forums/ope...on-meshes.html

There are several other threads, but the settings and concepts from that thread are the ones I use in my own simulations.
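For orientation, these parameters live in the solver entry for p in system/fvSolution. A typical tutorial-style GAMG block looks like this (illustrative values, not a tuned recommendation):

Code:

p
{
    solver                 GAMG;        // geometric-algebraic multigrid
    tolerance              1e-06;       // absolute residual target
    relTol                 0.01;        // relative residual reduction per solve
    smoother               GaussSeidel;
    nPreSweeps             0;           // smoothing sweeps before restriction
    nPostSweeps            2;           // smoothing sweeps after prolongation
    cacheAgglomeration     on;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  10;          // target size of the coarsest level
    mergeLevels            1;
}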

mechy May 10, 2013 12:09

Dear Daniel,

Thanks so much for your answer. In most of the OpenFOAM tutorials the value of nCellsInCoarsestLevel is set to 10 (or 20), but in your link it is set to 500 (or sqrt(#cells)). Do you know why OpenFOAM uses 10 or 20?

Regards

chegdan May 10, 2013 12:23

This is one of those situations where something has been tested and found to work for a particular situation. You can always try sqrt(nCells), 10, 20, 50, or 100 and then see which one is best for your case.

I have read that 10 or 20 is too low for larger cases and can be inefficient, which is why I stay a bit higher with sqrt(nCells). However, it is a case-by-case choice.
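As a concrete example of the sqrt(nCells) rule of thumb (plain arithmetic, rounded):

Code:

// sqrt(nCells) rule of thumb for nCellsInCoarsestLevel:
//   nCells = 100 000    ->  sqrt ~  316
//   nCells = 1 000 000  ->  sqrt ~ 1000
//   nCells = 5 000 000  ->  sqrt ~ 2236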

kmooney May 11, 2013 12:24

I've seen similar changes in convergence speed from altering the nCellsInCoarsestLevel setting. In one case, pressure converged twice as fast for a relatively large case (600k cells) on about 50 processors. Unfortunately, I don't have the solver settings on hand.

It is definitely worth investigating before starting big parametric or large parallel cases.

mechy May 12, 2013 01:09

Quote:

Originally Posted by chegdan (Post 426556)
This is one of those situations where something has been tested and found to work for a particular situation. You can always try sqrt(nCells), 10, 20, 50, or 100 and then see which one is best for your case.

I have read that 10 or 20 is too low for larger cases and can be inefficient, which is why I stay a bit higher with sqrt(nCells). However, it is a case-by-case choice.


Hi Daniel

By increasing the nCellsInCoarsestLevel value, convergence improved, but a high value of nCellsInCoarsestLevel may also increase the CPU time of the solution. In other words, with a high value of nCellsInCoarsestLevel the solution is more certain to converge, just at a higher CPU cost. Am I right?

Regards

KateEisenhower February 9, 2017 07:42

Quote:

Originally Posted by chegdan (Post 426556)
This is one of those situations where something has been tested and found to work for a particular situation. You can always try sqrt(nCells), 10, 20, 50, or 100 and then see which one is best for your case.

Does anyone know where this recommendation (nCellsInCoarsestLevel = sqrt(nCells)) comes from?

Best regards,

Kate

Artur.Ant October 5, 2019 09:46

Remember that the OpenFOAM tutorials are only demos. In OpenFOAM there are no default options like in Fluent or other commercial software; don't take the tutorial settings as ones that are guaranteed to work!


Geon-Hong December 10, 2019 19:26

Test on the nCellsInCoarsestLevel value
 
I tested the parameter nCellsInCoarsestLevel and would like to share the results here:

First of all, I tested using a ship simulation case with 5 million cells, which is a moderately large case. The GAMG solver was used to solve the pressure Poisson equation. The test was run three times for each case to measure the elapsed time for the first 100 iterations, and the results were averaged. The domain was decomposed into 64, 128, and 256 subdomains. All other parameters, such as nPreSweeps, nPostSweeps, etc., remained at their default values. Additionally, I also ran the simulation with a PCG solver using the DIC preconditioner to compare the results.
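For reference, the DIC-preconditioned PCG entry for p would look something like this in system/fvSolution (a sketch; the exact tolerances used in this test were not posted):

Code:

p
{
    solver          PCG;     // preconditioned conjugate gradient
    preconditioner  DIC;     // diagonal incomplete-Cholesky
    tolerance       1e-06;   // illustrative only
    relTol          0.01;    // illustrative only
}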


Code:

Elapsed time for the first 100 iterations, in seconds, vs. number of subdomains:

Setting         256      128       64
----------  -------  -------  -------
DICPCG       177.88   425.11   987.52
GAMG   10    953.95  1102.91  1421.36
GAMG  100   1426.52  1441.30  1592.25
GAMG  200   1721.87  1657.35  1706.38
GAMG  400   2043.28  1976.25  1901.76
GAMG  800   2603.98  2530.56  2274.23
GAMG 1600   3235.37  3445.93  2897.09
GAMG 3200   4751.91  4750.79  3962.82

The numbers after GAMG in the first column are the nCellsInCoarsestLevel values. As you can see, the computation took longer as nCellsInCoarsestLevel increased.

Before I started this test, I expected that there would be an optimum point for the nCellsInCoarsestLevel parameter, but there wasn't one; the computational time simply increased with the parameter. So I doubt that sqrt(#cells) is the optimal setting for nCellsInCoarsestLevel (where does this recommendation come from?).

Moreover, the GAMG solver performed worse overall than the DICPCG solver (so I no longer use the GAMG solver).

Interestingly, GAMG's speed degraded less than DICPCG's as the number of subdomains increased, and for nCellsInCoarsestLevel >= 200 the runs with fewer subdomains were actually faster.

I hope to discuss this result with you and hear your opinion on it. If you have any suggestions or things to test, please let me know.

Regards,
Geon-Hong

makaveli_lcf February 5, 2021 05:21

Nice test Geon-Hong!


I have extended that test for myself, and can report that the real optimum depends on which level of tolerance you need to reach.

For my case (an MHD flow), the CG solver stopped converging: after 400 iterations the residual remained stuck at 1e-07. Despite the faster individual CG iterations, GAMG easily brought the residual down to 1e-14 in 15-20 iterations.

So, when comparing GAMG with CG solvers, you should check what you actually get for the solution of your problem.
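One way to make that comparison fair is to fix an absolute tolerance and disable the relative one, so both solvers must reach the same residual before stopping (a minimal sketch, not my actual settings):

Code:

p
{
    solver      GAMG;
    smoother    GaussSeidel;
    tolerance   1e-14;   // absolute residual both candidates must reach
    relTol      0;       // 0 disables the relative-tolerance early exit
}

Then compare the wall-clock time and the "Final residual" reported in the solver log for each candidate.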


Cheers,
Alexander




