Temperature-dependent heat flux
So I am trying to set up a model in Fluent wherein I specify a modified flux boundary condition on a wall in my domain. In this case, the wall is adjacent to a solid zone. I intend to model the wall as a thin opaque surface which, when subjected to a certain incident flux, absorbs and re-emits radiation. The difference between what is absorbed and what is emitted (to the external surroundings) is then the amount of heat conducted through the solid adjacent to the wall.
So, my boundary condition at the wall becomes:
wall_abs*Q_wall = wall_eps*sigma_sbc*T_wall^4 - k_solid*grad(T)
wall_abs = absorptivity of the wall; wall_eps = emissivity of the wall; sigma_sbc = Stefan-Boltzmann constant; k_solid = thermal conductivity of the solid adjacent to the wall; grad(T) = temperature gradient at the wall; T_wall = wall temperature.
When I implement the BC at the wall as temperature dependent using DEFINE_PROFILE, it doesn't work out very well and returns unphysical results. I basically had my profile written out as follows (wrapped in the usual face loop):

#include "udf.h"

DEFINE_PROFILE(net_wall_flux, t, i)
{
    real eps_wall = 0.2;  /* emissivity of the wall */
    real abs_wall = 0.2;  /* absorptivity of the wall */
    real H = 1000.0;      /* incident irradiation [W/m^2] */
    face_t f;

    begin_f_loop(f, t)
    {
        /* net flux conducted from the wall into the solid cells adjacent to the wall */
        F_PROFILE(f, t, i) = abs_wall*H - eps_wall*SIGMA_SBC*pow(F_T(f, t), 4.0);
    }
    end_f_loop(f, t)
}
Does anybody have any suggestions on how to go about this?