Combined convection/radiation boundary
Hi everybody,
I am currently investigating a cavity which is subject to radiation from a solar simulator (an assembly of mirrors and a Xe arc lamp) and the convective losses from it. Therefore, I set up a large cylindrical fluid domain around the cavity in order for the walls not to influence the buoyancy-driven flow around the cavity. From a raytracing code I obtain radiative fluxes on the different surfaces within the cavity, but as the cavity is open there is a certain fraction of reradiation from those surfaces leaving the cavity and heating up the surrounding walls. (which sooner or later results in a solver blowup) I attempted to model the surroundings as openings but I got very unphysical and large mass fluxes and velocities so I decided to keep the setup as an enclosure. I am now in need for a boundary condition for these surrounding walls, which acts as a fully transparent wall, such that the surroundings are not constantly heated by reradiation. I would be very happy to read about any experiences you have with this issue. Meanwhile, my first attempt will be an inet/outlet, setting all velocity components to zero, blackbody T to 0 and T to T_inf (about 293K). Please discuss with me the degree of modelling absurdity I am proposing... :) Cheers, Max |