
Serial run OK, parallel one fails

November 16, 2005, 10:36   #1
Radu Mustata (r2d2)
Hi,
I wonder if someone could give me some help or advice on a problem I have:
I am running a modified version of "oodles". In the serial case, i.e. on one processor, everything works fine (I am not sure it produces the right result, but at least it runs), whereas the parallel case (on e.g. 4 processors with LAM) fails at the first time step with the message:

Starting time loop

Time = 0.25

Mean and max Courant Numbers = 0 0.001875
deltaT = 0.275
BICCG: Solving for Ux, Initial residual = 1, Final residual = 1.94e-06, No Iterations 1
BICCG: Solving for Uy, Initial residual = 1, Final residual = 1.94e-06, No Iterations 1
BICCG: Solving for Uz, Initial residual = 1, Final residual = 2.38839e-06, No Iterations 1


--> FOAM FATAL ERROR : Continuity error cannot be removed by adjusting the outflow.
Please check the velocity boundary conditions and/or run potentialFoam to initialise the outflow.

From function adjustPhi(surfaceScalarField& phi, const volVectorField& U, const volScalarField& p)
in file adjustPhi/adjustPhi.C

...whilst in the "serial" case I get

Starting time loop

Time = 0.25

Mean and max Courant Numbers = 0 0.001875
deltaT = 0.275
BICCG: Solving for Ux, Initial residual = 1, Final residual = 8.64385e-07, No Iterations 1
BICCG: Solving for Uy, Initial residual = 1, Final residual = 8.64385e-07, No Iterations 1
BICCG: Solving for Uz, Initial residual = 1, Final residual = 1.19653e-06, No Iterations 1
ICCG: Solving for p, Initial residual = 0.893427, Final residual = 8.78504e-07, No Iterations 165
time step continuity errors : sum local = 1.94843e-09, global = 1.40878e-21, cumulative = 1.40878e-21
ICCG: Solving for p, Initial residual = 0.000204348, Final residual = 7.83401e-07, No Iterations 15
time step continuity errors : sum local = 2.93751e-09, global = 1.21109e-21, cumulative = 2.61988e-21
BICCG: Solving for cscl, Initial residual = 1, Final residual = 5.25619e-15, No Iterations 2
min(cscl) = 0
max(cscl) = 1
ExecutionTime = 20.35 s

..and etc...

I think I did all the setup steps right for the parallel case, because I have run oodles and other solvers in parallel before without any problem.
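For reference, the case was decomposed with decomposePar; a minimal decomposeParDict of the kind I mean is sketched below (the subdomain counts are only an example to show the format, not necessarily my actual split):

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    // 2 x 2 x 1 regular split: cuts in x and y, none in z
    n           (2 2 1);
    delta       0.001;
}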
Cheers,
Radu

November 16, 2005, 10:47   #2
Radu Mustata (r2d2)
I think I found what was wrong: I was splitting the domain so that the periodic patches no longer matched. That is, if the x and y directions are periodic, one has to split along the non-periodic one, z. Now it runs all right.
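With the simple decomposition method this amounts to putting all the cuts in z, along the lines of (again only a sketch with illustrative numbers):

Code:
numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    // all cuts along z; the periodic x and y directions are left uncut
    n           (1 1 4);
    delta       0.001;
}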
Cheers again,
Radu

November 16, 2005, 12:44   #3
Maka Mohu (maka)
This is not true.

Quote:
if x and y directions are periodic one has to split along the non-periodic one

You do NOT have to split the domain in the non-periodic direction. I tried to run channelOodles in parallel, split in the x direction (which is periodic), and it works. Also, logically, if the code dictated the direction in which you have to split the domain, that would be a serious limitation to scalability, because the split direction should be chosen to minimise the area/volume ratio of the partitions (among other considerations such as load balance and simplicity of the communication pattern) in order to reduce communication overhead. channelOodles does NOT put a constraint on the direction of splitting.
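To put a rough number on the area/volume argument (a simple illustration, assuming a unit-cube domain split into 4 equal subdomains):

4 slabs along one direction:   3 internal interfaces of area 1 x 1, total area 3
2 x 2 split in two directions: 2 internal interfaces of area 1 x 1, total area 2

so forcing all the cuts into a single direction can only increase the inter-processor communication area for the same number of subdomains.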

I think the error message is trying to tell you something. You may start by trying to run the standard channelOodles in parallel with the same partitions, and if it runs (it did for me), you can start suspecting the modifications you made to the solver as the reason for the error.
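The test itself is only a few commands, roughly as below (assuming the OpenFOAM 1.x style root/case arguments and a LAM host file; adapt to your installation):

Code:
lamboot <hostfile>
decomposePar <root> <case>
mpirun -np 4 channelOodles <root> <case> -parallel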

best regards,
Maka.
