Heat transfer with RANS wall functions over a flat plate (validation with Fluent)
Hi all,
I am following this test case (forced convection over a flat plate): https://confluence.cornell.edu/displ...+Specification — OpenFOAM vs (Fluent & theory & experiment). It is a compressible RANS case on a low-Re grid with the realizable k-epsilon model. In the link above, the Fluent, theory and experiment results are in agreement, and my aim is to show the same from the OpenFOAM side. I use the 2.1.x (latest git) rhoSimpleFoam solver with realizableKE (RANS turbulence model), so I try to use the same settings as in Fluent.

Test 1: laminar. The heat transfer values at the plate are in agreement with Fluent. I used the standard wallHeatFlux utility (wallHeatFlux -latestTime).

Test 2: with turbulence model. The heat transfer values are quite different! My doubt is either the realizableKE model or the wall functions:

mut: mutkWallFunction
k: compressible::kqRWallFunction
epsilon: compressible::epsilonWallFunction
alphat: alphatJayatillekeWallFunction

Since the pressure variations are very small, it is not a good idea to work with the absolute pressure field, so I modified the thermophysical models and recompiled the rhoSimpleFoam solver. Now I have a gauge pressure formulation for the pressure. Even so, I do not know why I get a different wall heat flux value on the plate compared to Fluent (and, as I said above, the Fluent results are in agreement with experiment and theory).

To build the modified solver: inside rhoSimpleFoam, run ./Allwmake. If you need the laminar test case, let me know. The heat flux values from Fluent and OpenFOAM (in case you do not have Fluent): Code:
#position   Fluent   OpenFOAM

If someone is curious to validate the OpenFOAM wall functions, here is the link for the OpenFOAM test case: http://cdn.anonfiles.com/1335277397283.zip

Thanks
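For reference, a minimal sketch of the corresponding 0/ boundary entries on the wall patch. The patch name `plate` and the initial values are assumptions, not taken from the case, and the exact wall-function type names vary slightly between OpenFOAM versions:

```cpp
// 0/mut  -- wall patch entry (patch name "plate" is an assumption)
plate
{
    type            mutkWallFunction;
    value           uniform 0;
}

// 0/k
plate
{
    type            compressible::kqRWallFunction;
    value           uniform 0.01;       // placeholder initial value
}

// 0/epsilon
plate
{
    type            compressible::epsilonWallFunction;
    value           uniform 0.01;       // placeholder initial value
}

// 0/alphat
plate
{
    type            alphatJayatillekeWallFunction;
    Prt             0.85;               // turbulent Prandtl number
    value           uniform 0;
}
```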
Hi,
I observe the same differences in the heat flux values between Fluent and OpenFOAM for this test case. I use the buoyantSimpleFoam solver with realizableKE, and I do not understand these differences. Does someone have an idea of the reason for them? I think it may come from how the heat flux is calculated. I compute the heat flux as k * magGradT, where magGradT is calculated with "foamCalc magGrad T", and I compare against Fluent's Total Surface Heat Flux, but I am not sure the two are equivalent.

Best regards,
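One possible source of the mismatch: "foamCalc magGrad T" gives the cell-centred gradient magnitude, whereas Fluent's Total Surface Heat Flux (like OpenFOAM's wallHeatFlux utility) is evaluated at the wall face, and the two can differ noticeably in the first cell. A minimal sketch of that gap (plain Python; k, the temperatures and the spacings are illustrative assumptions, not the case data):

```python
# Two discrete estimates of the wall heat flux q = k * dT/dy for a
# near-wall temperature profile. All numbers are illustrative only.

k = 0.0257                 # thermal conductivity of air, W/(m K)
T_wall = 400.0             # wall (face) temperature, K
T1, y1 = 390.0, 1.0e-4     # first cell-centre temperature and wall distance
T2 = 382.0                 # second cell-centre temperature

# Wall-face gradient (what wallHeatFlux effectively uses via snGrad):
q_face = k * (T_wall - T1) / y1

# Cell-centred gradient in the first cell, Gauss-style: difference of the
# two face temperatures over the cell height (faces at y = 0 and y = 2*y1).
T_face12 = 0.5 * (T1 + T2)                 # interpolated internal face value
q_cell = k * (T_wall - T_face12) / (2.0 * y1)

print("wall-face estimate:    %.0f W/m2" % q_face)
print("cell-centred estimate: %.0f W/m2" % q_cell)
# The cell-centred value underestimates the wall flux whenever the
# temperature profile is steepest right at the wall, as it is here.
```

So for an apples-to-apples comparison with Fluent's Total Surface Heat Flux, the wallHeatFlux utility (a wall-face quantity) is the closer analogue than k * magGradT from the near-wall cells.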
Hi,
In order to simplify my case, I created a coarse blockMesh grid with yPlus (or yStar) from 16 to 23, and changed the turbulence model to standard k-epsilon instead of realizableKE. I have Fluent results for standard k-epsilon with the standard wall function, so I wanted to verify the same in OpenFOAM. Unfortunately, the results are still not comparable, and I feel there could be a bug somewhere. I used rhoSimpleFoam, although a buoyancy solver could also be used; it makes no difference for this case.

The heat flux in OpenFOAM is qDot = alphaEff * grad(h), evaluated at the wall faces, but I am not sure what Fluent uses. Here is the complete case set: http://cdn.anonfiles.com/1336156429443.zip

Once results comparable to Fluent are obtained, one can say that the OpenFOAM wall functions are working as expected.

Thanks
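On the flux definitions themselves: for constant cp, OpenFOAM's qDot = alphaEff * grad(h) is the same quantity as an effective-conductivity flux k_eff * grad(T), since in mass units alphaEff = mu/Pr + mut/Prt and h = cp*T. A minimal consistency sketch (plain Python; all property values, including the mut produced by the wall function, are assumptions):

```python
# Check that OpenFOAM's q = alphaEff * d(h)/dy and an effective-
# conductivity form q = k_eff * dT/dy agree for constant cp.
# All property values here are illustrative assumptions.

cp  = 1005.0      # specific heat of air, J/(kg K)
mu  = 1.8e-5      # dynamic viscosity, kg/(m s)
Pr  = 0.71        # molecular Prandtl number
Prt = 0.85        # turbulent Prandtl number
mut = 4.0e-5      # turbulent viscosity near the wall (assumed value)

alpha  = mu / Pr          # laminar thermal diffusivity, mass units (= k/cp)
alphat = mut / Prt        # turbulent thermal diffusivity (alphat field)
k      = cp * alpha       # molecular conductivity, W/(m K)

dTdy = 1.0e5              # wall-normal temperature gradient, K/m (assumed)
dhdy = cp * dTdy          # enthalpy gradient for constant cp

q_openfoam = (alpha + alphat) * dhdy       # alphaEff * grad(h)
q_keff     = (k + cp * mut / Prt) * dTdy   # (k + k_t) * grad(T)

print(q_openfoam, q_keff)   # the two agree to round-off
```

If the flux definitions agree like this, a persistent gap between the two codes at yPlus 16-23 points at the wall treatment itself rather than at the post-processing.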
Hello,
Did you succeed in getting accurate results? Did you find any improvement? Thx, Fred
I cannot run either case for some reason. I am trying to find a validation case where heat transfer to the walls is solved, a wall function is used, and the flow is internal. Does anyone know of any cases that can be validated?

Hi,
As far as I remember, my test case is correct: OpenFOAM and Fluent gave well comparable results. The problem was on my side while calculating the heat flux. The values posted are also correct; maybe they will help you to experiment further. Rgds,