Temperature loss in compressible solvers in high-speed flows
Hi all, during the past week I've been running validation tests on the heat transfer modeling performance of the chtMultiRegionFoam solver (version 1.7.1, with the latest updates released for the 1.7.x line). The test case is very simple and easy to compare against basic scientific literature: flow at 20°C over a flat plate held at a constant temperature of 100°C.

The laminar validation was satisfactory: the calculated mean convective heat transfer coefficient agreed with the analytical Nu(Re,Pr) correlation for laminar boundary layers to within 10%. But when I switched to the fully turbulent boundary layer case, some strange things occurred. Since the domain is not very big (the plate is 0.5 m long), to reach a fully turbulent condition (plate length >> transition length) I had to increase the far-field velocity of the incoming flow from 2 m/s to 50 m/s. After doing that, the simulation exhibits very unstable behavior, but the most disappointing effect is that in some portions of the fluid domain the temperature falls significantly below 20°C (down to about 0°C in the worst cases), which of course is not physical!
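To make the setup easier to follow, here is a quick back-of-envelope sketch of the two checks mentioned above: the transition length argument behind the 50 m/s choice, and the laminar Nu(Re,Pr) correlation used in the validation. The air properties and the critical Reynolds number Re_x ~ 5e5 are my own assumptions, not values from the case files:

```python
import math

# Assumed air properties at ~20 deg C (not from the original case setup):
nu = 1.5e-5   # kinematic viscosity [m^2/s]
k = 0.026     # thermal conductivity [W/(m K)]
Pr = 0.71     # Prandtl number
L = 0.5       # plate length [m]

def transition_length(U, Re_tr=5e5):
    """Distance from the leading edge where transition is expected,
    using the classical critical Reynolds number Re_x ~ 5e5."""
    return Re_tr * nu / U

def h_laminar_mean(U):
    """Mean convective coefficient from the laminar flat-plate
    correlation Nu_L = 0.664 Re_L^0.5 Pr^(1/3)."""
    Re_L = U * L / nu
    Nu_L = 0.664 * math.sqrt(Re_L) * Pr ** (1.0 / 3.0)
    return Nu_L * k / L

print(transition_length(2))    # ~3.75 m >> 0.5 m plate -> laminar case
print(transition_length(50))   # ~0.15 m << 0.5 m plate -> mostly turbulent
print(h_laminar_mean(2))       # mean h for the laminar validation [W/(m^2 K)]
```

At 2 m/s the whole plate sits well inside the laminar region, which is why that case could be validated against the laminar correlation; at 50 m/s transition is predicted well before mid-plate.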
After a lot (really a lot) of time spent on trials and investigations to find where the problem is, my conclusions so far are:
-it is not a mesh quality or discretization scheme issue: the mesh is as good as it gets (hexas with power-law-growing inflation layers near the solid surface), and both first-order (upwind) and limited second-order (linearUpwind with limited gradients) schemes exhibit the same behavior;
-it also does not seem to be a time-step length problem: reducing the maxCo limiter from 0.95 to 0.3 had no effect on the solution;
-I also don't think the problem lies in turbulence modeling: I used RAS models with wall functions, and the temperature loss appeared with both standard and RNG k-epsilon; moreover, it also appeared with turbulence modeling turned off, just by increasing the far-field velocity of the flow.
So, judging also from a couple of older posts reporting similar solver behavior (see for instance http://www.cfd-online.com/Forums/ope...-enthalpy.html or http://www.cfd-online.com/Forums/ope...ture-loss.html), it seems there are some (in my opinion) non-negligible performance limitations in a number of OpenFOAM's compressible solvers once velocities rise above a few m/s. I still have to investigate the problem more thoroughly from a modeling point of view (implementation of the energy equation), but I think this could be an interesting topic for discussion. Can someone share some experience about it?
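One quick sanity check on the energy-equation angle, under my own assumption of a constant cp for air: the largest static-temperature change that converting all the flow kinetic energy into sensible enthalpy could produce at these speeds is dT = U^2 / (2 cp), which at 50 m/s is only about a degree. A ~20 K undershoot is an order of magnitude larger than that, which to me suggests a numerical problem rather than a physical kinetic-energy effect alone:

```python
# Assumed specific heat of air (not from the original post):
cp = 1005.0   # [J/(kg K)]

def kinetic_dT(U):
    """Static-temperature equivalent of the flow kinetic energy,
    dT = U^2 / (2 cp)."""
    return U ** 2 / (2.0 * cp)

print(kinetic_dT(50))   # ~1.2 K at 50 m/s, versus the ~20 K undershoot observed
```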
Thank you all