CFD Online Discussion Forums > OpenFOAM
Cumulative continuity error large in parallel simulations (http://www.cfd-online.com/Forums/openfoam/81457-cumulative-continuity-error-large-parallel-simulations.html)

xiao October 27, 2010 09:42

Cumulative continuity error large in parallel simulations
 
1 Attachment(s)
Dear all,

It is not the first time people have reported getting different results from serial and parallel simulations in OpenFOAM.
One immediate explanation is that preconditioning in parallel indeed has some randomness, depending on the order of operations:
http://www.cfd-online.com/Forums/ope...llel-runs.html
Also, one has to admit that preconditioning and iterative linear solvers are no easy task in parallel.

However, the problem I have now cannot be explained by any of the things I found in the forum or anything I can imagine:

My cumulative continuity error is large in the parallel simulations,
even though the pressure equation converges OK! Serial simulations are fine.

Here is a summary of my solver and case:
Solver: pisoFoam (RANS with the LaunderSharmaKE model). I added a random forcing in certain regions as a source term in the momentum equation (without the forcing, everything was OK); a sketch of this kind of source-term addition is given after this summary.

Case: periodic channel flow with a small hump in the middle (even with the simple Re395 channel flow case, the problem below still exists, but is less severe). The mesh is OK, with a maximum non-orthogonality of 30 degrees.

BC: walls on the top and bottom; cyclic BCs in the streamwise and spanwise directions.

fvSolution/fvSchemes: both GAMG and PCG/PBiCG were tried. Pressure converges to a tolerance of 1e-8 (relTol 0) for the final solve (pFinal), otherwise to 1e-6 (relTol 0.04). Velocity, k and epsilon converge to 1e-8 or smaller.
nCorrectors 2;
nNonOrthogonalCorrectors 1;
Time stepping: Euler
Spatial: limitedLinear for div, linear for Laplacian.
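
For concreteness, here is a minimal sketch (not my exact code) of how such a body-force source term can be added to the pisoFoam momentum predictor, OpenFOAM 1.x style. The field name randomForcing and how/where it is filled are just placeholders.

Code:

// Sketch only: a body-force source term added to the pisoFoam momentum
// predictor. The field "randomForcing" and the way it is filled each
// time step are placeholders, not the actual implementation.

volVectorField randomForcing
(
    IOobject
    (
        "randomForcing",
        runTime.timeName(),
        mesh,
        IOobject::NO_READ,
        IOobject::AUTO_WRITE
    ),
    mesh,
    dimensionedVector("zero", dimAcceleration, vector::zero)
);

// ... fill randomForcing in the selected cells at each time step ...

fvVectorMatrix UEqn
(
    fvm::ddt(U)
  + fvm::div(phi, U)
  + turbulence->divDevReff(U)
 ==
    randomForcing   // extra momentum source (per unit mass)
);

solve(UEqn == -fvc::grad(p));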

Here is the output for the last step:

Time = 8000

Courant Number mean: 0.00746705 max: 0.0869933
GAMG: Solving for Ux, Initial residual = 0.00035758, Final residual = 9.2098e-09, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.00168457, Final residual = 4.57093e-12, No Iterations 2
GAMG: Solving for Uz, Initial residual = 0.00175593, Final residual = 6.0287e-12, No Iterations 2
GAMG: Solving for p, Initial residual = 0.00536092, Final residual = 0.000108022, No Iterations 2
GAMG: Solving for p, Initial residual = 0.000290028, Final residual = 7.85811e-06, No Iterations 3
RAS time step continuity errors : sum local = 6.63938e-07, global = -6.63775e-07, cumulative = -0.0306293
GAMG: Solving for p, Initial residual = 6.99397e-05, Final residual = 2.91147e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 1.1464e-05, Final residual = 8.45853e-09, No Iterations 10
time step continuity errors : sum local = 6.63775e-07, global = -6.63775e-07, cumulative = -0.03063
GAMG: Solving for epsilon, Initial residual = 0.00102844, Final residual = 2.50781e-12, No Iterations 2
GAMG: Solving for k, Initial residual = 0.00102089, Final residual = 2.54785e-12, No Iterations 2
RAS uncorrected Ubar = 0.020188 gradP = 1.27122e-05
ExecutionTime = 22810.4 s ClockTime = 24581 s
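
For reference, the "time step continuity errors" line is printed by continuityErrs.H; roughly (paraphrasing the 1.x sources from memory, details may differ slightly between versions) it computes:

Code:

// Roughly what continuityErrs.H does in the 1.x sources:
volScalarField contErr = fvc::div(phi);

scalar sumLocalContErr = runTime.deltaT().value()*
    mag(contErr)().weightedAverage(mesh.V()).value();

scalar globalContErr = runTime.deltaT().value()*
    contErr.weightedAverage(mesh.V()).value();

cumulativeContErr += globalContErr;

Info<< "time step continuity errors : sum local = " << sumLocalContErr
    << ", global = " << globalContErr
    << ", cumulative = " << cumulativeContErr
    << endl;

So "cumulative" is just a signed running sum of the per-step "global" error; if the global error has the same sign every step, it grows steadily, which is exactly what I see in parallel.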

The time history of the cumulative continuity error and of the (instantaneous) global continuity error is plotted in the attached figure. Note that the latter is scaled by 1e4 so that it appears on the same plot.

The serial runs are normal: the cumulative error stays of the same order as the instantaneous continuity error or smaller, even with a large forcing on the momentum equation.

I think the cyclic boundary condition may be the reason. The flux through those boundaries is not exactly zero but of the order of 1e-5, as I checked with patchIntegrate (a sketch of an equivalent check is given below). But why do the continuity errors in the parallel cases all have the same sign, so that they accumulate instead of cancelling each other? And why does this only appear in parallel? I don't understand.
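
Here is a small utility-style sketch of the check I mean (not my exact procedure): it sums the face flux phi over each boundary patch, using gSum so the sum is reduced over all processors when run in parallel.

Code:

// Sketch: net flux of phi through each boundary patch. gSum reduces the
// sum over all processors in a parallel run (a plain sum() would only
// give the local part). Assumes phi has been written to the time directory.

#include "fvCFD.H"

int main(int argc, char *argv[])
{
    #include "setRootCase.H"
    #include "createTime.H"
    #include "createMesh.H"

    surfaceScalarField phi
    (
        IOobject
        (
            "phi",
            runTime.timeName(),
            mesh,
            IOobject::MUST_READ
        ),
        mesh
    );

    forAll(mesh.boundaryMesh(), patchI)
    {
        Info<< mesh.boundaryMesh()[patchI].name()
            << " net flux = "
            << gSum(phi.boundaryField()[patchI]) << endl;
    }

    return 0;
}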

Any insights or suggestions are appreciated!


Best
Heng

iznal February 23, 2011 09:04

same situation
 
Hi,

I've got the same problem as yours. Have you tried simulating it with the whole system? Does this problem really come from the cyclicGgi face?

Thanks

xiao February 23, 2011 12:22

The problem is due to the combination of cyclic faces, parallel computing, and the forcing term in my momentum equation. When I do a serial computation, everything is fine; when I don't have the forcing term, everything is fine. It is the combination of all three (cyclic + parallel computing + forcing) that leads to this problem.

Heng

iznal March 7, 2011 05:13

one more question
 
Quote:

Originally Posted by xiao (Post 296587)
The problem is due to the combination of cyclic faces, parallel computing, and the forcing term in my momentum equation. When I do a serial computation, everything is fine; when I don't have the forcing term, everything is fine. It is the combination of all three (cyclic + parallel computing + forcing) that leads to this problem.

Heng

Thanks for the quick reply. I'm really confused here. Does OpenFOAM have any way to solve this problem? I'm using OpenFOAM 1.5 right now; have you tried the newer versions? Could their solvers be better than the ones in OpenFOAM 1.5?

