Home > Forums > OpenFOAM Running, Solving & CFD

AMG parallelisation and convection schemes


November 20, 2007, 06:14   #1

Christian Lindbäck (christian), Member
I'm running LES on internal flow passing an orifice plate. There are 3 million hexahedral cells. AMG is used with the following settings:

p GAMG
{
    tolerance 1e-04;
    relTol 0;
    smoother GaussSeidel;
    nCellsInCoarsestLevel 30;
    mergeLevels 1;
    agglomerator faceAreaPair;
    cacheAgglomeration on;
    // nPreSweeps 0;
    // nPostSweeps 2;
    // nFinestSweeps 2;
    // scaleCorrection yes;
    // directSolveCoarsest no;
};
pFinal GAMG
{
    tolerance 1e-04;
    relTol 0;
    smoother GaussSeidel;
    nCellsInCoarsestLevel 30;
    mergeLevels 1;
    agglomerator faceAreaPair;
    cacheAgglomeration on;
    // nPreSweeps 0;
    // nPostSweeps 2;
    // nFinestSweeps 2;
    // scaleCorrection yes;
    // directSolveCoarsest no;
};

I'm running on a dual-core machine. I've compared running on 1 CPU with running on 2 CPUs; see below:

-------------- 1 CPU -----------------------

Time = 0.0127

Courant Number mean: 0.029355 max: 1.3072
DILUPBiCG: Solving for Ux, Initial residual = 0.00181139, Final residual = 3.63695e-06, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 0.0207766, Final residual = 3.83961e-06, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0209138, Final residual = 3.63663e-06, No Iterations 3
GAMG: Solving for p, Initial residual = 0.247526, Final residual = 9.88771e-05, No Iterations 62
time step continuity errors : sum local = 2.27516e-08, global = -5.17908e-09, cumulative = 1.71557e-07
GAMG: Solving for p, Initial residual = 0.0506679, Final residual = 9.04641e-05, No Iterations 12
time step continuity errors : sum local = 2.07179e-08, global = -3.83544e-09, cumulative = 1.67722e-07
ExecutionTime = 37679.3 s ClockTime = 45218 s

Time = 0.01275

Courant Number mean: 0.0293701 max: 1.3049
DILUPBiCG: Solving for Ux, Initial residual = 0.00182424, Final residual = 3.75133e-06, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 0.0208073, Final residual = 2.84017e-07, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0209474, Final residual = 3.01873e-07, No Iterations 3
GAMG: Solving for p, Initial residual = 0.249727, Final residual = 9.40309e-05, No Iterations 61
time step continuity errors : sum local = 2.15569e-08, global = -4.70248e-09, cumulative = 1.63019e-07
GAMG: Solving for p, Initial residual = 0.0506867, Final residual = 9.95558e-05, No Iterations 12
time step continuity errors : sum local = 2.30218e-08, global = -4.36871e-09, cumulative = 1.58651e-07
ExecutionTime = 37848.4 s ClockTime = 45388 s

-------------------------------------------

------------- 2 CPU -----------------------

Time = 0.0142

Courant Number mean: 0.0298694 max: 1.25606
DILUPBiCG: Solving for Ux, Initial residual = 0.00415434, Final residual = 3.0531e-06, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 0.026599, Final residual = 1.81261e-07, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0327937, Final residual = 9.15543e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 0.27006, Final residual = 9.92377e-05, No Iterations 178
time step continuity errors : sum local = 3.20367e-08, global = -3.3541e-09, cumulative = -1.08914e-08
GAMG: Solving for p, Initial residual = 0.0598454, Final residual = 9.88563e-05, No Iterations 37
time step continuity errors : sum local = 3.10378e-08, global = -3.38161e-09, cumulative = -1.4273e-08
ExecutionTime = 2829.56 s ClockTime = 2969 s

Time = 0.01425

Courant Number mean: 0.0298782 max: 1.21613
DILUPBiCG: Solving for Ux, Initial residual = 0.00416074, Final residual = 1.55868e-06, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 0.0264671, Final residual = 3.28098e-07, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 0.0325682, Final residual = 3.72951e-07, No Iterations 3
GAMG: Solving for p, Initial residual = 0.277771, Final residual = 9.83602e-05, No Iterations 178
time step continuity errors : sum local = 3.08924e-08, global = 3.23832e-09, cumulative = -1.10347e-08
GAMG: Solving for p, Initial residual = 0.0579332, Final residual = 9.59878e-05, No Iterations 37
time step continuity errors : sum local = 3.10846e-08, global = 3.44964e-09, cumulative = -7.58509e-09
ExecutionTime = 3115 s ClockTime = 3269 s

-------------------------------------------
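For reference, the ExecutionTime stamps above are cumulative, so the per-step cost is the difference between consecutive stamps (about 169 s per step in serial versus about 285 s on 2 CPUs here). A small script along these lines can pull the pressure iteration counts and timings out of a log; the regular expressions assume the standard OpenFOAM log lines shown above:

```python
import re

def summarise(log_text):
    """Pull GAMG pressure iteration counts and cumulative
    ExecutionTime stamps out of an OpenFOAM solver log."""
    p_iters = [int(n) for n in re.findall(
        r"GAMG: Solving for p,.*No Iterations (\d+)", log_text)]
    exec_times = [float(t) for t in re.findall(
        r"ExecutionTime = ([\d.]+) s", log_text)]
    return p_iters, exec_times

# Two consecutive steps from the serial run above.
serial_log = """\
GAMG: Solving for p, Initial residual = 0.247526, Final residual = 9.88771e-05, No Iterations 62
ExecutionTime = 37679.3 s ClockTime = 45218 s
GAMG: Solving for p, Initial residual = 0.0506679, Final residual = 9.04641e-05, No Iterations 12
ExecutionTime = 37848.4 s ClockTime = 45388 s
"""

iters, times = summarise(serial_log)
print(iters)                 # pressure iterations per solve
print(times[-1] - times[0])  # seconds elapsed between the two stamps
```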

1) Why does the AMG solver need more iterations when running in parallel? This is not what I've seen before on this machine with a similar case; there I got a small speed-up on 2 CPUs. Here I'm actually losing time by running in parallel.

2) What are recommended settings for the AMG solver? I'm especially interested in the number of cells in the coarsest level when running in serial and in parallel, respectively.

3) In general, shouldn't it be possible to get a speed-up of 2 when running on two CPUs with AMG? What kind of case would give me a speed-up of two?

4) I'm using Crank-Nicolson for time discretisation. In your experience, what is the gain in accuracy (relative to all the other error sources in the computation), and what is the increase in computational time, compared to backward?

5) What would you say is a reasonable convergence criterion (for p)? I don't want to converge p so tightly that the extra accuracy just drowns in all the other errors.

6) For convection, both midpoint and linear discretisation give me the velocity field in the plot below (after a few time steps). I assume what is seen are wiggles? I used vanLeer for the RANS solution that I start from. Do you believe vanLeer is too diffusive for LES?



Best regards,
Christian

November 20, 2007, 14:53   #2

Srinath Madhavan (a.k.a pUl|) (msrinath80), Senior Member, Edmonton, AB, Canada
Hi Christian,

I can only answer some of your questions:

Q2) What are recommended settings for the AMG solver? I'm especially interested in the number of cells in the coarsest level when running in serial and in parallel, respectively.

A2) See response by Hrv in this[1] post.
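A rough back-of-the-envelope figure to go with that (my own estimate, not from Hrv's post): with mergeLevels 1 the faceAreaPair agglomerator merges cells roughly pairwise, so each level about halves the cell count, and the depth of the multigrid hierarchy for Christian's 3M-cell case follows directly:

```python
import math

n_cells = 3_000_000    # fine-grid cells in Christian's case
n_coarsest = 30        # nCellsInCoarsestLevel from fvSolution
ratio = 2              # assumed coarsening rate: pairwise agglomeration
                       # roughly halves the cell count per level

levels = math.ceil(math.log(n_cells / n_coarsest, ratio))
print(levels)  # about 17 levels of agglomeration
```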


Q3) In general, shouldn't it be possible to get a speed-up of 2 when running on two CPUs with AMG? What kind of case would give me a speed-up of two?

A3) Yes. I get it all the time. Some of the important requirements for getting a scaling of two are:

i) Decent problem size (at least 0.25 million cells)
ii) Make sure you have a decent interconnect (gigabit is good, infiniband is better, SMP is best)
iii) Don't run your simulation on the following configurations, as they give at best a 1.2x speedup:

a) hyperthreaded CPUs (a load of crap is what this is for parallel CFD)
b) Dual/Quad/Octo-core CPU offerings from Intel/AMD (These CPUs steadily approach the performance of hyperthreaded CPUs as you move from left to right)

References:

[1] http://www.cfd-online.com/OpenFOAM_D...es/1/3360.html

December 17, 2007, 07:45   #3

Christian Lindbäck (christian), Member
When running this LES of the flow past the orifice plate I'm using a turbulentInlet. Running with fluctuations of 0 % gives me a solution that runs smoothly with a few pressure iterations per time step. However, when running with 5 % fluctuations I get temporal oscillations, and a lot of pressure iterations are needed. Why is this, and what can I do to remedy it?

I'm using a fluctuation scale of (0.05 0.05 0.05). The inlet uses a mapped RANS profile and is normal to the x axis. As the reference scale for the fluctuations I'm using the mapped velocity profile.

If you can help me somehow, please respond ASAP.

Best regards,
Christian Svensson

December 17, 2007, 09:21   #4

Eugene de Villiers (eugene), Senior Member
The turbulentInlet boundary produces inlet velocities that are not divergence free, i.e. they do not obey continuity. The pressure solver therefore has to work a lot harder to converge. turbulentInlet is a very poor choice of inlet condition for velocity, since the random oscillations will be damped out before they can transition to proper turbulence anyway.
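Eugene's point is easy to demonstrate numerically. The sketch below (purely illustrative, not how turbulentInlet is actually implemented) adds 5 % independent random fluctuations to a uniform velocity field and evaluates a central-difference divergence: the uniform field is exactly divergence-free, while the perturbed one is not, and it is exactly this continuity error that the pressure solver has to absorb.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 32, 1.0 / 32          # grid points per direction and spacing

def divergence(u, v, w, h):
    """Central-difference divergence of a 3-D velocity field."""
    return (np.gradient(u, h, axis=0)
            + np.gradient(v, h, axis=1)
            + np.gradient(w, h, axis=2))

# Uniform mean flow: exactly divergence-free.
u0 = np.ones((n, n, n))
v0 = np.zeros_like(u0)
w0 = np.zeros_like(u0)

# Same flow with 5% independent random fluctuations on each component,
# mimicking what a white-noise inlet does to the velocity.
u1 = u0 + 0.05 * rng.standard_normal(u0.shape)
v1 = 0.05 * rng.standard_normal(u0.shape)
w1 = 0.05 * rng.standard_normal(u0.shape)

print(np.abs(divergence(u0, v0, w0, h)).max())   # exactly zero
print(np.abs(divergence(u1, v1, w1, h)).max())   # far from zero
```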
