Problems with the density at walls for a compressible flow
I am trying to run a simulation of compressible flow in a pipe using the Launder-Sharma k-epsilon turbulence model in OpenFOAM version 1.5-dev (my mesh contains some interfaces).
For this simulation I use a compressible solver, rhoPorousSimpleFoam, and I have defined the following boundary conditions:
- Inlet: total pressure
- Outlet: static pressure
- Wall: k and epsilon fixed to 1e-20 (as required by the low-Re Launder-Sharma k-epsilon model); for p and T, I use zeroGradient.
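To make the setup concrete, the relevant entries in my 0/ directory look roughly like the sketch below. The patch names and numerical values here are illustrative, not my exact case files:

```
// 0/p (illustrative values)
inlet
{
    type            totalPressure;
    p0              uniform 101325;
    value           uniform 101325;
}
outlet
{
    type            fixedValue;     // static pressure
    value           uniform 100000;
}
wall
{
    type            zeroGradient;
}

// 0/k and 0/epsilon at the wall (low-Re Launder-Sharma: near-zero fixed value)
wall
{
    type            fixedValue;
    value           uniform 1e-20;
}

// 0/T at the wall
wall
{
    type            zeroGradient;
}
```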
For this simulation, I use a relaxation factor of 0.05 for the density.
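In fvSolution this corresponds to something like the excerpt below; the factors for the other fields are shown only for context and are assumptions, not my exact settings:

```
// system/fvSolution excerpt (illustrative)
relaxationFactors
{
    rho             0.05;   // strong under-relaxation of the density
    p               0.3;
    U               0.7;
    k               0.7;
    epsilon         0.7;
}
```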
My problem is that, after some iterations, the density keeps growing and reaches values of 100 and above.
Post-processing in ParaView, I can see that the high densities occur at the walls (in the first cell layer next to the walls), but I do not understand why.
I have already remeshed my geometry to get a better mesh (orthogonal cells at the walls), but the problem persists.
I have also increased the number of nNonOrthogonalCorrectors, but the problem persists.
To isolate the problem, I meshed a straight pipe and ran it with the same setup. With the straight pipe (pure hexahedra) there is no problem; with the complex geometry (hexcore mesh from TGrid) the density blows up, and I do not understand what happens.
I presume there is a problem somewhere in my setup, but I cannot see where.
Do you have any idea?