April 16, 2009, 07:41
Join Date: Mar 2009
Posts: 4 | Rep Power: 9
I'm doing some sensitivity analysis on altering the absorption coefficient of a subdomain, and am getting a bit muddled trying to figure out its actual impact.
The setup is as follows (and shown in the attached image):
A 100 W/m2 directional radiation source is applied to an opening (inlet), set at a fixed temperature of 303 K, with a very small velocity to aid convergence. 10 metres away, the opening (outlet) is set to the same fixed temperature. All other surfaces are symmetry planes, there is no buoyancy, and the Monte Carlo radiation model is used.
In the results file, the incident radiation for a zero absorption coefficient on the xmax face is around 2008 W/m2, and similarly for the xmin face. When the radiation source is set to zero, the incident radiation drops to 1909 W/m2 on the inlet and outlet, confirming the 100 W/m2 difference. My question is: where is the other 1900 W/m2 coming from??
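One way to sanity-check where a number of that size could come from: if the reported "incident radiation" is the intensity integrated over the full sphere of directions, i.e. G = 4*sigma*T^4 for an isotropic blackbody field, then the thermal emission from the 303 K boundaries alone is of the right magnitude. This interpretation of the reported quantity is an assumption, not something confirmed in the post; a minimal check of the arithmetic:

```python
# Sanity check of the numbers in the post, assuming (not confirmed) that the
# solver reports incident radiation as G = 4*sigma*T^4 for an isotropic
# blackbody field at the fixed boundary temperature.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
T = 303.0         # fixed temperature of the openings, K

E_b = SIGMA * T**4  # blackbody emissive power, ~478 W/m^2
G = 4.0 * E_b       # isotropic incident radiation, ~1912 W/m^2

print(f"sigma*T^4   = {E_b:.0f} W/m^2")
print(f"4*sigma*T^4 = {G:.0f} W/m^2")  # close to the reported 1909 W/m^2
```

With this reading, the ~1900 W/m2 baseline would simply be the re-emission from the 303 K surfaces, and the directional source adds the remaining ~100 W/m2 on top.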
Thanks in advance,