Measuring numerical diffusion and dispersion - strange effects
I was attempting to measure numerical diffusion and dispersion when I encountered unexpected behavior. The test case is attached.
The test case is 1D: an acoustic wave is initialized, with cyclic boundary conditions in the direction of wave propagation. In the example, 30 wavelengths fit into the domain. The compressible solver coodles is used with a small acoustic CFL number of 0.2; in general, the solver parameters are set to overly conservative values. I then intended to compare the analytical pressure time series at a monitor point with the numerical solution.
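For reference, this is a minimal sketch of the comparison I had in mind: the analytical pressure signal of a plane acoustic wave at a monitor point, against which the probe data from the solver would be plotted. All numbers here (wavelength, amplitude, monitor position) are placeholders, not the values from the attached case:

```python
import numpy as np

# Placeholder parameters -- not taken from the attached case:
gamma, R, T0 = 1.4, 287.0, 300.0      # air at roughly 300 K
p0, amp = 1.0e5, 100.0                # mean pressure and acoustic amplitude [Pa]
c = np.sqrt(gamma * R * T0)           # speed of sound
wavelength = 0.1                      # assumed; 30 of these fit the cyclic domain
f = c / wavelength                    # acoustic frequency

x_mon = 0.025                         # assumed monitor-point position [m]
t = np.linspace(0.0, 10.0 / f, 2001)  # ten acoustic periods

# Analytical pressure time series for a right-running plane wave
p_exact = p0 + amp * np.sin(2.0 * np.pi * f * (t - x_mon / c))

# With p_num loaded from the solver's probe output, the growth of the mean
# would show up as a trend in (p_num - p_exact). For the analytical signal
# alone the mean stays at p0:
drift = np.mean(p_exact) - p0
print(abs(drift) < 1e-6 * p0)
```

The same check applied to the numerical probe data is what revealed the growing means described below.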
Oddly, I found the mean pressure, velocity, and temperature all increase over the run time.
For typical solver parameters, the results are much worse. I would really like to know whether this is an issue with the setup or with the code, and would appreciate your input.