CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (http://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Serial run OK parallel one fails (http://www.cfd-online.com/Forums/openfoam-solving/60427-serial-run-ok-parallel-one-fails.html)

r2d2 November 16, 2005 11:36

Hi,
I wonder if someone could give me some help/advice on a problem that I have:
I am doing a run with a modified version of "oodles", and the problem I encounter is that the "serial" version, i.e. on one processor, works fine (I am not sure it produces the right result, but at least it runs), whilst the parallel case (e.g. on 4 processors with LAM) fails at the first time step with the message:

Starting time loop

Time = 0.25

Mean and max Courant Numbers = 0 0.001875
deltaT = 0.275
BICCG: Solving for Ux, Initial residual = 1, Final residual = 1.94e-06, No Iterations 1
BICCG: Solving for Uy, Initial residual = 1, Final residual = 1.94e-06, No Iterations 1
BICCG: Solving for Uz, Initial residual = 1, Final residual = 2.38839e-06, No Iterations 1


--> FOAM FATAL ERROR : Continuity error cannot be removed by adjusting the outflow.
Please check the velocity boundary conditions and/or run potentialFoam to initialise the outflow.

From function adjustPhi(surfaceScalarField& phi, const volVectorField& U, const volScalarField& p)
in file adjustPhi/adjustPhi.C

...whilst in the "serial" case I get:

Starting time loop

Time = 0.25

Mean and max Courant Numbers = 0 0.001875
deltaT = 0.275
BICCG: Solving for Ux, Initial residual = 1, Final residual = 8.64385e-07, No Iterations 1
BICCG: Solving for Uy, Initial residual = 1, Final residual = 8.64385e-07, No Iterations 1
BICCG: Solving for Uz, Initial residual = 1, Final residual = 1.19653e-06, No Iterations 1
ICCG: Solving for p, Initial residual = 0.893427, Final residual = 8.78504e-07, No Iterations 165
time step continuity errors : sum local = 1.94843e-09, global = 1.40878e-21, cumulative = 1.40878e-21
ICCG: Solving for p, Initial residual = 0.000204348, Final residual = 7.83401e-07, No Iterations 15
time step continuity errors : sum local = 2.93751e-09, global = 1.21109e-21, cumulative = 2.61988e-21
BICCG: Solving for cscl, Initial residual = 1, Final residual = 5.25619e-15, No Iterations 2
min(cscl) = 0
max(cscl) = 1
ExecutionTime = 20.35 s

...and so on.

I think I did all the setup steps right for the parallel case, because I have run oodles and other solvers in parallel before without any problem.
Cheers,
Radu

r2d2 November 16, 2005 11:47

I think I figured out what was wrong: I was splitting the domain so that the periodic patches no longer match. If the x and y directions are periodic, one has to split along the non-periodic one, i.e. z. Now it runs all right; a sketch of such a decomposition is below.
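For reference, a decomposition that splits only along z could be set up with entries like the following in system/decomposeParDict (a minimal sketch, not taken from the original post; the simple method, the 4 subdomains and the delta value are only illustrative):

numberOfSubdomains 4;

// geometric "simple" decomposition into 1 x 1 x 4 blocks, i.e. split only
// along z, so that the periodic patches in x and y are not cut by
// processor boundaries (illustrative values for a 4-processor run)
method          simple;

simpleCoeffs
{
    n           (1 1 4);
    delta       0.001;
}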
Cheers again,
Radu

maka November 16, 2005 13:44

This is not true.

Quote:

if x and y directions are periodic one has to split along the non-periodic one.

You do NOT have to split the domain in the non-periodic direction. I tried to run channelOodles in parallel, split in the x direction (which is periodic), and it works. Also, logically, if the code dictated the direction in which you have to split the domain, that would be a serious limitation to scalability, because the split direction should be chosen based on the minimum area/volume ratio of the partitions (among other considerations such as load balance and simplicity of the communication pattern) in order to reduce communication overhead. channelOodles does NOT put a constraint on the direction of splitting.

I think the error message is trying to tell you something. You may start by trying to run the standard channelOodles in parallel with the same partitions (a possible decomposition is sketched below), and if it runs (it did for me), you can start suspecting the modifications you made to the solver as the reason behind the error.
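As a point of comparison, a decomposition of the standard channelOodles case that splits along the periodic x direction could look like this in system/decomposeParDict (again only a sketch; the simple method and the coefficient values are illustrative, not taken from the thread):

numberOfSubdomains 4;

// geometric "simple" decomposition into 4 x 1 x 1 blocks, i.e. split
// along the periodic streamwise direction x (illustrative values)
method          simple;

simpleCoeffs
{
    n           (4 1 1);
    delta       0.001;
}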

best regards,
Maka.

