Problem with grid dependency test
I'm running a grid dependency test on a simple case and I'm getting odd results: as I refine the mesh, my results oscillate more and more, seemingly at random.
My case is the following: a simple box (1.5 x 2 x 0.9 m^3) with an inlet cylinder (5 cm radius) on one face and an outlet cylinder (10 cm radius) on the opposite face. Inside the box there are two smaller boxes (about 0.3 x 0.2 x 0.2 m^3 each) that produce a heat flux of about 1500 W/m^2. A mass flux is imposed on the inlet face, and the inlet temperature is 283 K.
I set the target surface mesh size of the two inner boxes to 0.02 m, with 5 prism layers of total thickness 0.01 m. For the walls of the big box, I imposed 3 prism layers with a total thickness of 0.01 m as well. I remeshed the inlet cylinder.
All these parameters are now fixed, and I want to vary the base size to find its optimal value. As I decreased the base size from 0.1 m to 0.03 m, my results (the surface-average temperature of the inner boxes) converged and were stable from 0.1 m down to 0.05 m, but on finer meshes they start oscillating randomly with an amplitude of 10 degrees or more. I would have expected that the finer the mesh, the better the results, but that's not the case.
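For what it's worth, a common way to quantify a grid dependence study like this is the Grid Convergence Index based on Richardson extrapolation (Roache / Celik et al.). Below is a minimal sketch with hypothetical monitor values; the base sizes match the post, but the temperatures are placeholders, not my actual results:

```python
import math

# Grids indexed 1 = fine, 2 = medium, 3 = coarse (Celik et al. convention).
# h = representative cell sizes (base size, m); phi = monitored quantity
# (surface-average temperature, K). The phi values below are made up.
h = [0.03, 0.05, 0.10]
phi = [300.1, 300.4, 301.2]

r21 = h[1] / h[0]          # refinement ratio fine -> medium
r32 = h[2] / h[1]          # refinement ratio medium -> coarse
e21 = phi[1] - phi[0]
e32 = phi[2] - phi[1]
s = math.copysign(1.0, e32 / e21)

# Observed order of convergence p, solved by fixed-point iteration
# because the refinement ratios are not equal.
p = 2.0
for _ in range(50):
    q = math.log((r21**p - s) / (r32**p - s))
    p = abs(math.log(abs(e32 / e21)) + q) / math.log(r21)

# Fine-grid Grid Convergence Index with safety factor 1.25.
gci_fine = 1.25 * abs(e21 / phi[0]) / (r21**p - 1)
print(f"observed order p = {p:.2f}, GCI_fine = {100 * gci_fine:.3f} %")
```

If the fine-mesh monitor never settles, this kind of analysis only makes sense on a time-averaged value of the oscillating signal rather than an instantaneous one.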
Do you have any idea why it behaves this way? Am I doing something wrong?
Thank you very much for your help!