Numerical dispersion?
Hello, I'm modelling water flow between two smooth parallel plates. The model is a long thin box, 26.8 cm long, 17.7 cm wide, and 1.06 mm high. I'm using a laminar solver and recording the discharge through the outlet while gradually increasing the hydraulic gradient from 0.0005 to 1. I expect the discharge to be linearly related to the gradient, but I found that as the gradient increases, the ratio discharge/gradient is not constant: it decreases noticeably. Do you think non-linear pressure losses could arise in this system and affect the linearity, or could numerical dispersion cause this discrepancy? (I tried refining the mesh, but it doesn't change the results.)
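For reference, here is a quick sanity check of what linear (plane Poiseuille / "cubic law") behaviour would predict for this geometry, and the corresponding Reynolds number at each gradient. The water properties (density and viscosity at roughly 20 °C) are my assumptions, not values from the post:

```python
# Plane Poiseuille ("cubic law") check: for fully developed laminar flow
# between smooth parallel plates, discharge scales linearly with the
# hydraulic gradient i:  Q = (rho * g * w * h**3 / (12 * mu)) * i
# Fluid properties are assumed (water at ~20 C); geometry is from the post.

rho = 1000.0    # water density, kg/m^3 (assumed)
mu = 1.0e-3     # dynamic viscosity, Pa*s (assumed)
g = 9.81        # gravitational acceleration, m/s^2
w = 0.177       # plate width, m
h = 1.06e-3     # aperture between the plates, m

def cubic_law_discharge(i):
    """Theoretical discharge (m^3/s) for hydraulic gradient i."""
    return rho * g * w * h**3 / (12.0 * mu) * i

def reynolds(i):
    """Reynolds number based on the hydraulic diameter 2h of a wide slot."""
    u = cubic_law_discharge(i) / (w * h)  # mean velocity, m/s
    return rho * u * 2.0 * h / mu

for i in (0.0005, 0.01, 0.1, 1.0):
    print(f"i={i:<7g} Q={cubic_law_discharge(i):.3e} m^3/s  Re={reynolds(i):.0f}")
```

With these assumed properties the Reynolds number at the top of your gradient range comes out near 2000, i.e. approaching the transitional range, and the laminar entrance length (roughly 0.05·Re·2h) becomes comparable to the 26.8 cm plate length. So it seems plausible that inertial/developing-flow losses, rather than numerical dispersion, explain the drop in discharge/gradient at high gradients.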