Turbulence model and mesh size
I am currently working on modelling the flow behaviour in a fairly large reservoir. Basically, the inlet comes from a plant while the outlet is a circular weir. Both are modelled as mass sources.
I used a finer nested mesh around the inlet and a coarser one everywhere else. With this setup, I was able to reproduce the flow behaviour described by previous on-site measurements.
I then decided to run sensitivity tests on the mesh. When I set a uniform finer mesh (the same size as the previously nested one), I was not able to reproduce the same flow behaviour, even after a longer simulated time...
I cannot figure out why at the moment. I am using the RNG model, setting a constant value of TLEN (which works better in this case than the new algorithm) at 7% of the smallest domain dimension. I suspect it could come from the turbulence modelling, as I have checked that it doesn't come from the interpolation at the nested mesh interface...
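For what it's worth, the TLEN rule described above (a constant turbulent mixing length taken as 7% of the smallest domain dimension) can be sketched as a quick calculation. This is only an illustration of the arithmetic in the post; the domain dimensions used in the example are made-up placeholders, not values from the actual model.

```python
# Sketch of the TLEN rule from the post: a constant turbulent
# mixing length set to a fraction (here 7%) of the smallest
# domain dimension. Dimensions below are illustrative only.

def tlen_from_domain(dx: float, dy: float, dz: float,
                     fraction: float = 0.07) -> float:
    """Return a constant turbulent length scale as a fraction
    of the smallest domain dimension."""
    return fraction * min(dx, dy, dz)

# Hypothetical reservoir: 50 m x 30 m plan area, 4 m deep.
# The smallest dimension is the depth, so TLEN = 0.07 * 4 m.
tlen = tlen_from_domain(50.0, 30.0, 4.0)
print(tlen)  # 0.28 (metres)
```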
Is it possible that the cell size could affect the rate of production, transport, or dissipation of turbulent kinetic energy?
Thanks a lot for your help !
The current FLOW-3D newsletter hints & tips section describes the mesh sensitivity of the turbulence model. Basically, the mesh can't be too fine or too coarse, and there are ways of determining the right size range using assumptions about the flow behavior. See http://www.flow3d.com/resources/news...ewsletter.html for details. See if you can fit the results you're seeing (both correct and incorrect results) to the guidelines in the newsletter.
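The "not too fine, not too coarse" idea from the reply could be turned into a rough sanity check that compares a candidate cell size against the turbulent length scale. The ratio bounds below are illustrative assumptions, not values from the FLOW-3D newsletter; the actual guidelines there should take precedence.

```python
# Rough sanity check for mesh size vs. turbulent length scale (TLEN).
# The coarse/fine ratio bounds are assumed placeholders, NOT the
# official FLOW-3D newsletter guidelines.

def cell_size_flag(cell_size: float, tlen: float,
                   coarse_ratio: float = 1.0,
                   fine_ratio: float = 0.1) -> str:
    """Classify a cell size relative to TLEN using placeholder ratios:
    flag cells larger than coarse_ratio*TLEN or smaller than
    fine_ratio*TLEN."""
    if cell_size > coarse_ratio * tlen:
        return "possibly too coarse"
    if cell_size < fine_ratio * tlen:
        return "possibly too fine"
    return "within assumed range"

# With TLEN = 0.28 m (from 7% of a hypothetical 4 m depth):
print(cell_size_flag(0.5, 0.28))   # possibly too coarse
print(cell_size_flag(0.02, 0.28))  # possibly too fine
print(cell_size_flag(0.1, 0.28))   # within assumed range
```

Running both the nested-mesh cell size and the uniform fine-mesh cell size through a check like this against the newsletter's actual bounds might show which side of the valid range each mesh falls on.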