C_P(c,t) is giving incorrect values.
I have been struggling with this problem in my UDF. I am trying to make density a function of static pressure. I have read that C_P(c,t) is supposed to return the static pressure relative to the operating pressure that is set. I interpret that as:
C_P(c,t) = Static Pressure - Operating Pressure
So in my UDF I define Pressure to be:
P = C_P(c,t)+RP_Get_Real("operating-pressure")
However, when I include a Message statement to print out the pressure values, I get values anywhere between -7,000,000 and 7,000,000 Pa. I need my static pressure to be around 100 Pa for my equation to work.
Does anyone have suggestions as to how I should handle C_P(c,t)?
Density is changing with pressure in a manner similar to:
rho = rho_initial/(1+c/P)
where c is a value changing with time.
Even when I initialize the solution (including via a DEFINE_INIT UDF) so that the entire domain is at 100 Pa, the Message printout keeps displaying zero.
If I run a quick contour plot, the static pressure is 100 Pa.
The solution diverges immediately at the start of the simulation: if the pressure is zero, then the density expression divides by zero, given how I have defined it.
Did you solve this issue?
I am facing a similar problem.
Yes, I did solve my issue. There was a problem with another part of my code which gave incorrect values.
To use the static pressure correctly, your UDF should contain something like:
real P_operating = RP_Get_Real("operating-pressure");
real P_absolute = C_P(c,t) + P_operating;
The implementation obviously depends on what your UDF looks like. If you attach it, I can try to take a look at it.