Modelling (CHT) Natural Convection over a Heatsink
Hi,
I've created a very dense mesh for a heatsink and the surrounding fluid. For reasons unknown to me, I've been struggling to keep the model from crashing (fatal overflow in the linear solver). This has led me to believe that the mesh is refined enough to be picking up transient vortex events, and that the steady-state assumption is causing the model to fail.

The ANSYS support team suggested increasing the timescales (probably to smooth over these flow features) and using a turbulence model to add stability to the solution. My calculations show the flow should be laminar, but the ANSYS service request team suggested running the model with the SST turbulence model, as this provides the necessary damping. They say that because the velocities are very low, it shouldn't affect the heat transfer between the heatsink and the fluid much.

My understanding is that the SST model enforces a wall function based on the turbulent velocity profile and the law of the wall. Doesn't this mean there will be much higher velocity gradients in the near-wall region as a result of this wall function being applied, and therefore higher rates of heat transfer? Or am I missing something? I'm really just looking for an explanation or a direction to some relevant reading; I'm not sure why this is an acceptable solution. Thanks.
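For context, a quick way to sanity-check the laminar-flow claim for buoyancy-driven flow is a Rayleigh-number estimate. The sketch below uses entirely hypothetical dimensions and temperatures (none are from the post above); with air properties at a moderate film temperature, Ra well below the commonly quoted transition value of ~1e9 for a vertical surface would support the laminar assumption.

```python
# Sketch: Rayleigh-number estimate for natural convection over a heatsink.
# All geometry and temperature values are hypothetical placeholders.

g = 9.81          # gravitational acceleration, m/s^2
beta = 1 / 300.0  # thermal expansion coefficient of air (~1/T_film), 1/K
dT = 30.0         # surface-to-ambient temperature difference, K (assumed)
L = 0.05          # characteristic fin height, m (assumed)
nu = 1.6e-5       # kinematic viscosity of air, m^2/s
alpha = 2.2e-5    # thermal diffusivity of air, m^2/s

# Ra = g * beta * dT * L^3 / (nu * alpha)
Ra = g * beta * dT * L**3 / (nu * alpha)
print(f"Ra = {Ra:.3e}")

# Transition to turbulence for a vertical plate is usually quoted
# around Ra ~ 1e9; values orders of magnitude below that point to
# a laminar regime.
print("laminar regime expected:", Ra < 1e9)
```

With these placeholder numbers Ra comes out in the 1e5 range, far below the transition value, which is the kind of result that would back the original poster's laminar estimate.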
And also this FAQ: http://www.cfd-online.com/Wiki/Ansys...gence_criteria
I recommend following the FAQ advice on "my simulation converges for a while..." instead of adding a turbulence model to a laminar flow.