A strange situation: a bigger maxCo leads to a slower run
I changed maxCo and observed the following (run time is the wall-clock time needed to simulate 1 microsecond of physical time):
maxCo = 0.1  -> run time = 32 s
maxCo = 0.15 -> run time = 45 s
maxCo = 0.2  -> run time = 50 s
How can such a thing be possible? (Well, it must be possible because it happened :D, but what could be the cause?)
My case is compressible and unsteady.
I thought it might be because of the unsteadiness, but the run time was ~33 s again when I changed maxCo back to 0.1.
I figured it out!
It's because of the tolerances. When the time step is larger, each step is harder to converge, so the solver needs MORE iterations per step to bring the residuals down to the specified tolerance. The extra iterations can outweigh the savings from taking fewer steps, and the total run time can actually increase.
It also depends on the tolerances you specify.
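The trade-off above can be sketched with a toy cost model (the numbers below are illustrative assumptions, not measurements from my case): doubling the time step halves the number of steps, but if each step then needs more than twice the iterations to converge, the total run time goes up.

```python
SIM_TIME = 1e-6  # physical time to simulate, in seconds


def run_time(dt, iters_per_step, cost_per_iter=1e-3):
    """Estimated wall-clock time: steps x iterations per step x cost per iteration."""
    n_steps = SIM_TIME / dt
    return n_steps * iters_per_step * cost_per_iter


# Small Courant number: many cheap steps, each converging in few iterations.
small = run_time(dt=1e-9, iters_per_step=4)

# Doubled time step: half the steps, but each needs more iterations
# to reach the same residual tolerance (assumed 10 here).
large = run_time(dt=2e-9, iters_per_step=10)

print(small, large)  # the larger time step can cost more overall
```

The model is deliberately crude, but it captures why "fewer steps" does not automatically mean "less work" once per-step convergence cost is counted.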
"HIGHER timeStep is NOT necessarily equal to LOWER Time"( at least in unsteady simulations I know its so)
Are there any other points that I may have missed?