CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Convergence/flow development airfoil (https://www.cfd-online.com/Forums/openfoam-solving/86663-convergence-flow-development-airfoil.html)

MadsR March 29, 2011 12:13

Convergence/flow development airfoil
 
Hello,

Sorry this gets a bit lengthy, but I hope you don't mind too much, dear OpenFOAM user.

We have a NACA0012 test case in OpenFOAM (simpleFoam, k-omega, nothing else fancy) and in ANSYS CFX (also k-omega), and we compare the two cases. Lift and drag compare pretty well.
One thing that really shows a difference, though, is the time it takes for the lift and drag coefficients to converge. We monitor the U, p, k and omega residuals and watch them go down fairly quickly, but then the flow needs to develop until we reach steady behaviour of the lift and drag coefficients. This is all known and expected, of course :-)
Now, the number of iterations before we reach Cl/Cd convergence is around 700 in CFX and 50.000 (!) in OpenFOAM. A huge difference - also in wall clock time: CFX takes around 2-3 hours (single core) whereas OpenFOAM takes more than 24 hours (single core). Cell count is 150.000, structured hex. Bear with me on these rough figures.

Now, there is a huge caveat here of course: CFX is a coupled solver and OpenFOAM segregated (in our case SIMPLE). Usually, a steady simulation converges in much fewer iterations using a coupled solver compared to a segregated one. Therefore I would expect CFX to converge in much fewer iterations than OpenFOAM. But, I would also expect iterations to go much faster in OpenFOAM for the same reasons - and they are faster. But at 50.000 iterations OpenFOAM is still much, much slower than CFX.
In CFX we can accelerate convergence (and the flow development to steady state) by changing the CFL number (a pseudo CFL-number since our simulation is steady, not transient). This gives fast convergence of not only the residuals but also the integrated variables as lift and drag.

One thing I could think of is the differencing schemes we use: in CFX we use "High Resolution" (an upwind/linearUpwind blend) or Specified Blend (linearUpwind) - there is no substantial difference in convergence times between the two - but in OpenFOAM we use QUICK (divSchemes) and linear (the rest), which can be (or will be) more unstable. But still: such a huge difference? And the residuals converge fairly quickly (at least 3-4 decades) in OpenFOAM.

A second thought: could we run pisoFoam instead and increase the time step?

I have searched around this forum without luck. I hope that someone can enlighten me on this one.

Help is greatly appreciated,
Thanks,
Mads

FelixL March 29, 2011 13:20

Hello, Mads,


your OpenFOAM settings don't seem optimal to me. I've run airfoil cases with slightly more complex configurations and grids and didn't see such extreme convergence times. I used SpalartAllmaras most of the time, though, but I don't think the choice of turbulence model makes such a huge difference in convergence rate.

In my opinion, simple airfoil configurations (low AoA, incompressible, 100k-400k cells) should be reasonably converged within a few thousand iterations. But this of course depends upon a lot of things.

What do you use as a starting field? Running potentialFoam first usually accelerates convergence a lot. Using PCG for the pressure equation (instead of GAMG) usually helped me a lot, too.
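For reference, a pressure-solver entry along these lines might look like this (a sketch of a typical fvSolution fragment; the tolerance values are illustrative, not taken from Mads's case):

```
// fvSolution (sketch) -- illustrative values, not from the case files
solvers
{
    p
    {
        solver          PCG;     // preconditioned conjugate gradient, as suggested above
        preconditioner  DIC;     // diagonal incomplete Cholesky
        tolerance       1e-08;
        relTol          0.01;
    }
}
```

potentialFoam is simply run on the case directory before simpleFoam to provide a better initial velocity/pressure field.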

A look at your numerical setup in OpenFOAM would be great.


Greetings,
Felix.

MadsR March 30, 2011 04:04

Hi Felix, thanks for your answer. I have attached the ./system files.

We are already using potentialFoam for initializing the simulations but thanks for the advice. We found that potentialFoam really helps a lot. Actually, I have made a simple channel test-case which shows that we get wild oscillations in the U-field and completely unrecognizable results when we do not initialize with potentialFoam. A bit strange.
We are using PCG, but have tried GAMG without any noticeable speedup (probably because we don't do many iterations in each "time"-step and our 2D setup is relatively small).

/Mads

Attachment 7088

FelixL March 30, 2011 10:01

Hi, Mads,


in my experience, OpenFOAM is very sensitive to initial conditions, especially on grids with some non-orthogonal cells. Most of the time, running potentialFoam works like a charm so it's best practice to me.

A few questions:
- are you using the k-Omega turbulence model or the 4-equation transitional model? (I'm asking because there are entries for gamma and ReThetatTilda)
- do you really want to use QUICK? It reduces to 2nd order when used in a FVM anyway, so why not use linear or some other scheme which is 2nd order accurate? You then avoid the numerical dissipation QUICK introduces.
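For illustration, switching the convection scheme is a one-line change in fvSchemes (a sketch; field names assume a standard simpleFoam setup, and the commented line shows the QUICK entry being replaced):

```
// fvSchemes (sketch) -- a 2nd-order alternative to QUICK (illustrative)
divSchemes
{
    default         none;
    div(phi,U)      Gauss linear;   // 2nd order central, no numerical dissipation
    // div(phi,U)   Gauss QUICK;    // formally 3rd order, but reduces to 2nd order in FVM
}
```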

In fvSolution, setting the under-relaxation factors to 0.3 for p, 0.7 for U and 0.5 for the other quantities might speed up your simulation (a rule of thumb: alpha_p + alpha_U should be 1.0 for optimum performance).
And maybe you should decrease the relTol for your p equation to the order of the relTol for the other equations, or maybe one order of magnitude below that.
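The rule-of-thumb settings described above would look something like this in fvSolution (a sketch; in newer OpenFOAM releases relaxationFactors is split into fields/equations sub-dictionaries):

```
// fvSolution (sketch) -- under-relaxation rule of thumb: alpha_p + alpha_U = 1.0
relaxationFactors
{
    p               0.3;   // pressure: relax strongly
    U               0.7;   // momentum: complements p so the sum is 1.0
    k               0.5;   // turbulence quantities
    omega           0.5;
}
```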


Convergence behavior also depends upon the mesh and the BCs. For example, if your far field boundaries are too close to the airfoil, OpenFOAM might have troubles, even when CFX doesn't really struggle.


Greetings,
Felix.

MadsR March 30, 2011 11:25

Felix,

- thank you for the comprehensive answer. Actually, I didn't know that QUICK reduces to 2nd order (the surprise of the day :-) - could you please send me a reference on this? I don't have much experience with QUICK either, I should add. I searched around the web and it seems there is some controversy about this. Interesting. I do know that QUICK can be unstable and lead to overshoots, but we don't face such issues in the case at hand. I wouldn't expect QUICK to create significant numerical dissipation, though - that's one of its main advantages.

- yes, we are validating the Menter/Langtry transition model implementation, hence the entries for Re_theta and gamma

- we are aware of the relax_U + relax_p = 1.0 recommendation, as you can also see in the fvSolution file. We have played around with these settings and found no change as long as we obey it, and wiggles and/or divergence when the sum goes beyond 1.0.

- you are completely right about the influence of the outer boundaries; it is remarkable how much they can influence the result. I have done a lot of sensitivity studies in CFX for 2D airfoils on most conceivable parameters, including the distance to the outer boundaries. We are using 60 chords in each direction (upstream, downstream, above, below - 120x120 chords in total), which is well within the asymptotic range of the Cl and Cd values in the CFX boundary-distance sensitivity analysis.

thanks again, keep the valuable information coming :-)

/Mads

FelixL March 31, 2011 05:15

Hello, Mads,


in finite volume methods using the midpoint rule (which is 2nd order accurate) for the surface integrals, every higher-than-2nd-order interpolation scheme (such as QUICK) effectively reduces to 2nd order accuracy. I can't point you to a specific passage right now, but I am pretty sure Ferziger & Peric cover this in their textbook. Anyway, it doesn't hurt to use higher-order schemes - I just wanted to "warn" you that using a 3rd order scheme doesn't mean you'll get 3rd order accuracy. And since QUICK's leading truncation term is diffusion-like, it produces numerical dissipation (not much, but it's there) - that's why I personally prefer 2nd order central schemes, which have no numerical dissipation at all.


So you're using the Menter/Langtry model - that's another cup of tea. Is it your own implementation or are you using mine? It's a 4-equation turbulence model, so it doesn't surprise me that OpenFOAM needs lots of iterations to solve 6 (actually 7) strongly coupled equations.

But here's another idea: what about your omega residuals - when do they reach 1e-8? Can you get me a residual plot of the whole simulation and of, let's say, the first 5000 iterations?


Greetings,
Felix.

MadsR March 31, 2011 07:58

Hi Felix,

thanks. I just re-read the page in Ferziger :-) and you are completely right, thanks for pointing that out. Ferziger says the same as you: due to the midpoint rule used for the surface integrals, the overall accuracy is typically not more than 2nd order. One loses the improved accuracy of QUICK (and of other >2nd-order schemes, for that matter).

The first truncation term of UDS looks like a diffusion term (proportional to d^2phi/dx^2); for CDS the leading term is proportional to d^3phi/dx^3 and for QUICK to d^4phi/dx^4, so I would not expect more numerical diffusion from QUICK than from CDS - or what do you say?

The Menter/Langtry (your implementation with altered correlation factors) entries should not confuse things in this thread (sorry about that), we see the same loooong convergence times with and without this transition model.

The plot is attached.

Thanks,
Mads

FelixL March 31, 2011 08:24

Hello, Mads,


yes, you are right: the first truncation term of QUICK acts like an additional diffusion term, and it is of order 3 (its coefficient scales with the cube of the grid spacing). In CDS this term doesn't exist (and neither do any other even-derivative, diffusion-like terms), because they vanish in the Taylor expansion. This means CDS theoretically has no numerical dissipation at all! Since the QUICK term is of third order, this shouldn't really be an issue, though. I didn't mean to confuse you - I've been working too much with LES lately, where even third-order numerical dissipation can be an issue.
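Schematically, for the convective term u dphi/dx on a uniform 1-D grid with spacing Delta x, the leading truncation errors of the three schemes discussed here are (standard Taylor-series results along the lines of Ferziger & Peric; signs and constants are indicative only):

```latex
\begin{aligned}
\text{UDS:}   &\quad \epsilon \sim \frac{u\,\Delta x}{2}\,\frac{\partial^2 \phi}{\partial x^2}
              &&\text{(even derivative: diffusive, 1st order)}\\[4pt]
\text{CDS:}   &\quad \epsilon \sim \frac{u\,\Delta x^2}{6}\,\frac{\partial^3 \phi}{\partial x^3}
              &&\text{(odd derivative: dispersive, no dissipation)}\\[4pt]
\text{QUICK:} &\quad \epsilon \sim u\,\Delta x^3\,\frac{\partial^4 \phi}{\partial x^4}
              &&\text{(even derivative: diffusive, but 3rd order)}
\end{aligned}
```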

Thanks for the residuals plot. Some of the residuals seem to be stalling, especially omega. I suggest setting the (absolute) tolerance for every quantity (especially p) to something like 1e-10. This makes sure the equations are still being solved even when the residuals are already fairly low.
Plus, it wouldn't hurt to set the tolerance for omega to 1e-16, since the omega residuals usually decrease pretty quickly and sometimes the omega equation isn't being solved any more after a few iterations.
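A sketch of what these tightened tolerances might look like in fvSolution (solver and preconditioner choices are illustrative, not from the case files):

```
// fvSolution (sketch) -- tight absolute tolerances so the equations
// keep being solved even at low residual levels (illustrative values)
solvers
{
    p     { solver PCG;   preconditioner DIC;  tolerance 1e-10; relTol 0.01; }
    U     { solver PBiCG; preconditioner DILU; tolerance 1e-10; relTol 0.1;  }
    omega { solver PBiCG; preconditioner DILU; tolerance 1e-16; relTol 0.1;  }
}
```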


Hope this helps,
Greetings,
Felix.

linnemann March 31, 2011 09:07

Hi guys

Thx for a good discussion :-)

just a small input, instead of this

Quote:

Plus it wouldn't hurt to set tolerance for omega to 1e-16 since the omega residuals usually decrease pretty quick and sometimes the omega equation isn't being solved anymore after a few iterations.
just put these two lines in the fvSolution entry where you solve for omega (or any other variable):

maxIter 100;
minIter 1;

Then you can control the minimum and maximum number of iterations and always make sure each equation is solved at least once.
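In context, such an entry might look like this (a sketch with illustrative solver settings; as discussed later in the thread, minIter is not available in every OpenFOAM distribution):

```
// fvSolution (sketch) -- forcing at least one sweep per outer iteration
omega
{
    solver          PBiCG;
    preconditioner  DILU;
    tolerance       1e-08;
    relTol          0.1;
    minIter         1;     // solve at least once, even if already below tolerance
    maxIter         100;   // cap the inner iterations
}
```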

FelixL March 31, 2011 09:40

Hey, linnemann,


minIter 1; sounds great to me! I didn't know of this one before, but it seems pretty straightforward when there already is a maxIter ;)

I know this is off-topic, and I can't try it right now because I am at work: will the solver still stop once each equation has reached its tolerance level?

Thanks for that remark!


Greetings,
Felix.

linnemann March 31, 2011 09:42

To put it short

Yes :)

FelixL March 31, 2011 09:47

Good news, I am excited to try this later when I come home ;) Thanks again.


Greetings,
Felix.

aerothermal April 1, 2011 07:58

Congrats on the high-level discussion.
I will try some of these tips right away. :)

FelixL April 1, 2011 13:17

Quote:

Originally Posted by linnemann (Post 301675)
just put these two lines in the fvSolution entry where you solve for omega (or any other variable):

maxIter 100;
minIter 1;


Thanks for the input, but unfortunately minIter isn't available in the official OpenFOAM distributions, so I have to go the old fashioned way.

Thanks again, though!


Greetings,
Felix.

MadsR April 4, 2011 09:03

Hey,

thanks for the great inputs.

minIter was a brilliant idea though :-)

We have changed all the residual tolerances down to 1e-10 - 1e-16, but we still see very slow convergence - or, put more correctly, slow flow development.
Admittedly, it's been a while since I've been outside the safety zone of the coupled CFX solver, so this might just be how it is... still, OpenFOAM is much, much slower than CFX on this case. There must be something I can do about it, because I don't see why OF should be any slower than CFX.

/Mads

FelixL April 4, 2011 23:16

Good morning, Mads,


Quote:

Originally Posted by MadsR (Post 302136)
Admittedly, it's been a while since I've been outside the safety-zone of the coupled solver of CFX so this might be just how it is...still OpenFOAM is much, much slower than CFX on this case.

I still can't believe OpenFOAM is performing so badly on this case! As I said, this is totally against my personal experience. When you say "flow development", I guess you mean the convergence of integral flow parameters like lift and drag?

Here's an idea: if it's not confidential you might wanna upload or send me the mesh and I will do some simulations from scratch in between worktimes. There's gotta be something to improve this behavior!


Edit: Wait - I just had another look at your controlDict file and saw that you're simulating an AoA of 17°. Did you also observe slow convergence for the 0° AoA case, or is it just at higher angles? If it's the latter, forget all my comments about personal experience, because it is mostly limited to low-AoA cases - sorry for not noticing this earlier.

Just a few comments on that: a high AoA means complicated flow patterns with much more closely coupled pressure, velocity (and turbulence) fields. So an increase in the necessary iterations with a segregated solver at higher angles of attack seems natural to me. Still, 50.000 is a big number, and maybe someone else who has run similar cases with SIMPLE-based solvers can comment on their experience.


Greetings,
Felix.

MadsR April 5, 2011 02:34

Hi Felix, and thanks again for your input.

Yes with flow development I mean the convergence of the integrated values. The high angle of attack in the controlDict file is coincidental, we see the long convergence times for all AoAs. I completely agree with you on the large difference of the flow pattern when going from the linear region at lower AoAs to the separated flow behaviour in the stalled region.

My master's student, José, is actually responsible for these simulations; he is using a NACA0012 as a reference case and starting point for more confidential geometries. I will ask him to upload his case in this thread.

Thanks,
best regards
Mads

jms April 5, 2011 11:08

Hi Felix,

I have sent the case you asked Mads for to your e-mail, because I cannot upload it here.
Thank you very much for your help.

Regards,

José

FelixL April 5, 2011 14:13

Hello, José, hello Mads,


thanks for the case files.

I'm currently editing the files according to my personal experience, and I am trying to limit the numerical schemes as little as possible. The limiters can cause some trouble with the convergence of the simulation (see e.g. this thread: http://www.cfd-online.com/Forums/ope...ime-steps.html ). I'm having problems keeping the simulation bounded, though (even with upwinding on the divergence terms!), because, well, the mesh doesn't look too good in some places, in my opinion.

Especially the corners of the blunt trailing edge are troublesome: if you look at the attached picture you can see that the skewness there is very high! checkMesh doesn't complain, but in my eyes this doesn't look good, and I am pretty sure that's one reason why you needed to apply limiters to nearly every discretization scheme. I know CFX works without problems, but OpenFOAM really is picky when it comes to mesh quality.

I'm still fiddling around; maybe I can get it to work without running out of bounds - I surely won't make it this evening, maybe tomorrow. Currently my only suggestion is: get rid of the limiters, and the only way to do that would be to remesh, especially the trailing edge. An H-topology would surely increase the cell count, but it would remove the skewness issues there.


Greetings,
Felix.

FelixL April 6, 2011 02:08

Good morning,


I'll make it quick: I think it's working now, even without much limiting. I can't finish the run now, but I think it'll converge within about 5000 iterations, maybe less.

The changes I made are attached. Please use upwinding for the turbulent quantities when you first start simpleFoam, otherwise the simulation will blow up. After 100-200 iterations you can comment out the upwind entries in fvSchemes.
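A sketch of what the startup fvSchemes entries might look like (field names assume a standard k-omega style setup; the limitedLinear alternative shown in the comments is just an example, not necessarily what is in the attached files):

```
// fvSchemes (sketch) -- upwind the turbulence equations for the first
// 100-200 iterations, then switch back to a higher-order scheme
divSchemes
{
    default         none;
    div(phi,U)      Gauss linearUpwind grad(U);
    div(phi,k)      Gauss upwind;      // startup only; later e.g. Gauss limitedLinear 1;
    div(phi,omega)  Gauss upwind;      // startup only
}
```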

I hope this works, now I gotta rush to work.


Greetings,
Felix.
