Which factors can suppress the amplitude of a wave?
I apply a small time-dependent disturbance to a steady boundary-layer flow to generate a Tollmien–Schlichting (TS) wave. However, I find that the wave's amplitude is smaller than it should be in the high-Reynolds-number region. Does anyone know the reason?
The numerical scheme I use is a second-order backward difference in time (fully implicit) with second-order centered differences in space. Is it possible that the wave's amplitude is being suppressed by the numerical dissipation of the second-order backward difference?
In my experience the second-order backward difference in time is highly dissipative for Courant numbers above 1–2 (see Hirsch): essentially the whole resolved frequency spectrum is strongly damped, except for a narrow band at the highest resolved frequencies. So this could be the cause of your problem.
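The damping described above can be checked directly: applying the second-order backward difference (BDF2) to the oscillatory test problem u' = -i*omega*u gives a characteristic polynomial whose principal root has modulus below 1, and the damping grows with omega*dt (which plays the role of the Courant number for a convected wave). A minimal sketch, with the frequencies chosen purely for illustration:

```python
import numpy as np

def bdf2_amplification(theta):
    """Modulus of the principal root of the BDF2 characteristic polynomial
    for the test problem u' = -1j*omega*u, with theta = omega*dt.
    BDF2: (3u^{n+1} - 4u^n + u^{n-1})/(2dt) = z/dt * u^{n+1}, z = -1j*theta,
    so the characteristic polynomial is (3 - 2z) r^2 - 4 r + 1 = 0."""
    z = -1j * theta
    roots = np.roots([3 - 2 * z, -4, 1])
    # The physical (principal) root is the one closest to the exact
    # amplification factor exp(z); the other root is spurious.
    return np.abs(roots[np.argmin(np.abs(roots - np.exp(z)))])

for theta in [0.1, 0.5, 1.0, 2.0]:
    # |r| < 1 means the wave amplitude is damped every time step
    print(f"omega*dt = {theta:3.1f}:  |r| = {bdf2_amplification(theta):.4f}")
```

The modulus stays close to 1 for small omega*dt but drops off noticeably as omega*dt approaches 1–2, which is exactly the amplitude suppression discussed here.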
Usually, for time-dependent problems, the convective part is treated explicitly for this reason (and also because large Courant numbers are not physical for unsteady convection). If you're using Fluent, then you have to choose a time step such that the Courant number stays in an optimal range of about 0.1–0.5.
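Choosing the time step from a target Courant number CFL = u*dt/dx is a one-line calculation; the sketch below uses hypothetical values for the convection speed and grid spacing, so substitute your own:

```python
def timestep_for_cfl(u_max, dx_min, cfl_target=0.5):
    """Time step that gives CFL = u_max * dt / dx_min == cfl_target,
    based on the fastest convection speed and the finest grid spacing."""
    return cfl_target * dx_min / u_max

# Hypothetical example: freestream speed 30 m/s, finest streamwise
# spacing 1e-4 m, targeting the suggested CFL of 0.5.
dt = timestep_for_cfl(30.0, 1e-4, cfl_target=0.5)
```

Using the most restrictive (smallest dx, largest u) cell guarantees the Courant number stays at or below the target everywhere in the domain.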