
Shifting from SIMPLE 2nd order to COUPLED pseudo-transient bumps computation time hugely

April 12, 2019, 17:44   #1
New Member
 
Sri Harsha Maddila
Join Date: Mar 2018
Posts: 10
Hello,

I am running a cold-flow simulation of a turbine combustion chamber for my project. My approach is to move from SIMPLE with first-order schemes (250 iterations) to second-order upwind (500 iterations) and finally to COUPLED pseudo-transient (2000+ iterations) to arrive at the solution. While the SIMPLE iterations take around 2 hours to complete, the pseudo-transient COUPLED (or plain COUPLED, for that matter) iterations take 400+ hours according to the estimate FLUENT prints in the console, and the estimate keeps increasing iteration after iteration!

Here's a picture of the mesh cross-section: https://1drv.ms/u/s!As6k8_h0tQpljv5xqp7UyEwktHg3JQ

The mesh has 3,314,331 polyhedral cells.
Operating conditions: 4% pressure drop from the plenum (left) to the outlet (right, the exit of the combustion chamber)
I am running this case in parallel on my Inspiron 5558 (Solver: 4 processes; 16 GB RAM, dual-core i7 Ultrabook processor).

Could anyone tell me why this is happening? Is this merely a lack of computational resources? What else could be done to achieve convergence?

P.S.: A layman's explanation of what a timescale factor is and what it means would be much appreciated as well (I used 0.1 in this simulation).

April 15, 2019, 13:21   #2
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,676
1) Per iteration, COUPLED is slightly slower than the SIMPLE algorithm. As long as COUPLED is running only slightly slower each iteration, it is not a hardware problem; if it suddenly takes 100x longer to do one iteration, then it's a hardware issue. This is easy to check at runtime by seeing whether you have used up all your RAM (a quick way to check is sketched just below).
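A minimal sketch of that check from outside Fluent, in Python (this assumes the third-party psutil package is available on the machine; Task Manager on Windows or free -g on Linux shows the same information):

Code:
import psutil  # third-party package: pip install psutil

# Snapshot of system memory while the solver is running.
mem = psutil.virtual_memory()
print(f"used {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB "
      f"({mem.percent:.0f}%)")

# If this sits near 100% while COUPLED iterates, the machine is swapping
# to disk, and the per-iteration time blowing up is a memory limit,
# not a property of the algorithm.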

2) Needing many iterations and still not converging means you have a convergence problem. You just need to tweak the settings until you get the desired behavior. There is no clear-cut recipe for this, because you have to figure out what is limiting your convergence.

3) Time scale: you must read this and look at equation 20-92. Fluent takes a wild guess at a length scale (0.3 times the domain length) and another wild guess at a velocity scale (the maximum velocity in the domain), and uses these to calculate a time scale for your case. This way of estimating the time scale is not a one-method-to-rule-them-all time scale. If you arbitrarily extend your domain (say, extend the outlet by 1 km), your new length scale is, voila, longer, despite the flow in the nozzle being exactly the same. Because the automatic time-scale calculation is this arbitrary, there is an arbitrary timescale factor for you to tweak for your case. Adjust the timescale factor until you get a reasonable pseudo-time-step size, or do pseudo-time-stepping the right way and just specify the pseudo-time-step size directly (a back-of-the-envelope version of the estimate is sketched below).
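To make that recipe concrete, here is a back-of-the-envelope Python sketch of the automatic estimate described above. The domain length, peak velocity, and factor are made-up placeholder numbers, and Fluent's actual implementation has more cases than this:

Code:
# Pseudo-time-step estimate following the recipe above:
# length scale ~ 0.3 * domain length, velocity scale ~ maximum velocity.
# All numbers are placeholders for illustration only.
L_domain = 0.5          # m, end-to-end length of the computational domain
U_max = 120.0           # m/s, current maximum velocity magnitude
timescale_factor = 0.1  # the "timescale factor" the OP asked about

length_scale = 0.3 * L_domain
dt_auto = length_scale / U_max            # automatic time scale
dt_pseudo = timescale_factor * dt_auto    # what the solver marches with

print(f"automatic time scale: {dt_auto:.2e} s")                               # 1.25e-03 s
print(f"pseudo-time step with factor {timescale_factor}: {dt_pseudo:.2e} s")  # 1.25e-04 s

Note that extending the outlet by 1 km would inflate L_domain, and hence dt_auto, even though the flow in the chamber is unchanged; that is exactly why the factor exists.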


If you specify a very small time-step then you'll need more iterations to converge (assuming convergence is not limited by numerical instabilities). If you specify a very large time-step then you can converge in fewer iterations (at risk of being numerically unstable). If you have numerical instabilities, then all hell breaks loose.

Last edited by LuckyTran; April 15, 2019 at 16:28.

April 15, 2019, 14:25   #3
New Member
 
Sri Harsha Maddila
Join Date: Mar 2018
Posts: 10
Well, then it is a hardware limitation. Thank you for the insight! I couldn't open any other app while Fluent was running, because I was using all 4 processes and a single mouse click could crash my system.

Sorry for the ambiguous question; there was no problem achieving convergence. The simulation used to converge with SIMPLE second-order iterations, typically around 600 iterations in (at 1e-03). But is there no way other than the COUPLED scheme to get the scaled residuals down to 1e-06? The SIMPLE 2nd-order scheme isn't able to go below 1e-04.

Thank you for your help!

April 15, 2019, 16:32   #4
Senior Member
 
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,676
Yeah, COUPLED uses up a bit more RAM than SIMPLE, so if you were close to your RAM limit with SIMPLE, it's easy to suddenly be out of memory.



Quote:
Originally Posted by harsha2398
But is there no way other than the COUPLED scheme to get the scaled residuals down to 1e-06? The SIMPLE 2nd-order scheme isn't able to go below 1e-04.

This is a feature, not necessarily a problem. Residuals going down is a hint that your solution is converging, but eventually, with some combination of settings, you hopefully achieve iterative convergence and saturate your grid. When you change settings, you change the way you solve the equations, and of course the residuals change. Residuals at 1e-04 aren't necessarily better or worse than residuals at 1e-06, and neither is outright wrong (the toy example below shows how the residual scaling alone can move that number).
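To see how the scaling alone moves the number, here is a toy Python sketch of a scaled residual of the form used in Fluent-style documentation, sum(|cell imbalance|) / sum(|a_P * phi_P|). The matrix and the implicit under-relaxation trick are made-up illustrations, not Fluent internals:

Code:
import numpy as np

# Toy 1D diffusion system A*phi = b on n cells (a stand-in for one
# transport equation).
n = 50
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
phi_exact = np.linspace(0.0, 1.0, n)
b = A @ phi_exact

# One fixed iterate, "equally converged" in every case below.
phi = phi_exact + 1e-5 * np.random.default_rng(0).standard_normal(n)
imbalance = b - A @ phi  # per-cell imbalance; unchanged by implicit
                         # relaxation when evaluated at phi_old = phi

def scaled_residual(a_p):
    """sum(|imbalance|) / sum(|a_P * phi_P|) -- Fluent-style scaling."""
    return np.abs(imbalance).sum() / np.abs(a_p * phi).sum()

for alpha in (1.0, 0.5, 0.1):
    a_p = np.diag(A) / alpha  # implicit relaxation inflates a_P by 1/alpha
    print(f"alpha = {alpha:>3}: scaled residual = {scaled_residual(a_p):.2e}")

Same iterate, three different residual levels: the reported number depends on the coefficient scaling, not only on how converged the solution actually is.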
