
April 3, 2007, 07:44 

#1 
Senior Member

I was performing a parallel performance study on our new cluster. I decomposed a problem using metis onto 2, 4, 8, 12, 16, 24, 32, 48, and 64 processors. In each case, the processorWeights were set uniformly to 1.
Strangely, the 24-processor case diverged, while all other cases ran OK. If I changed the 24-processor case to 23 or 25 processors, it ran fine. I also tried changing one of the processorWeights in a 24-proc job to 1.2; in this case, the simulation also ran fine. Therefore, the 24-proc, processorWeights=1 case seems to have hit a situation where the metis decomposition did something unexpected. Has anyone else seen something similar? What is the best way to diagnose this problem, and to avoid it in future simulations? Thanks, Eric 
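For reference, the decomposition setup described above would look roughly like the sketch below (a guess at the poster's settings; the exact metisCoeffs syntax varies between OpenFOAM versions):

```
// system/decomposeParDict -- sketch of the setup described above,
// here for the 24-processor case; syntax varies between versions
numberOfSubdomains  24;

method              metis;

metisCoeffs
{
    // uniform weights, one entry per subdomain
    processorWeights ( 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 );
}
```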

April 3, 2007, 07:56 

#2 
Senior Member
Hrvoje Jasak
Join Date: Mar 2009
Location: London, England
Posts: 1,783
Rep Power: 22 
(I bet) you are using a Krylov space solver which carries a dependency on domain decomposition in its path to convergence. Just tighten the tolerance on the offending equations (again, I would guess pressure) and try again. It will work fine.
It usually helps to provide some more info than just "it diverged". What happened: a NaN, divergence in a single equation, a linear solver bottoming out, or something else? Hrv
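Tightening the tolerance on the pressure equation means lowering the linear solver tolerance in system/fvSolution; a minimal sketch, with illustrative values only (solver choice and settings depend on your case and OpenFOAM version):

```
// system/fvSolution (excerpt) -- illustrative values, not a
// recommendation; pick the solver/preconditioner appropriate to
// your case and version
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-08;   // tightened from e.g. 1e-06
        relTol          0;       // do not stop on relative reduction alone
    }
}
```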
__________________
Hrvoje Jasak Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk 

April 3, 2007, 08:24 

#3 
Senior Member

OK, here are a few more details:
1. The simulation is of an experimental model in our water tunnel for tip-vortex/rudder interaction.
2. The grid has approx. 1e6 cells, with wall spacing set for wall functions.
3. The current focus is a steady simulation, so I was using simpleFoam. All variables were under-relaxed (p = 0.1, everything else at 0.3).
4. The simulation first showed signs of divergence in the epsilon equation, with a "bounding epsilon, min..." statement. A few time steps later, continuity errors became large and the sim fell apart. Pressure does not appear to be the culprit.
5. The other decomposed cases ran fine, and the ones that I have plotted (4-proc, 16-proc, 48-proc) look good.
6. I tried rerunning with a tightened-up convergence criterion for epsilon, and reduced relaxation for epsilon. No luck.
7. Let me study this case a bit more, and I'll report back. Eric 
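The under-relaxation described in point 3 would be set in system/fvSolution; a sketch, written in the newer fields/equations syntax (older OpenFOAM versions list all fields directly under relaxationFactors):

```
// system/fvSolution (excerpt) -- the under-relaxation described
// above (p = 0.1, everything else 0.3); syntax shown is the newer
// fields/equations form
relaxationFactors
{
    fields
    {
        p               0.1;
    }
    equations
    {
        U               0.3;
        k               0.3;
        epsilon         0.3;
    }
}
```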

April 3, 2007, 08:29 

#4 
Senior Member
Hrvoje Jasak
Join Date: Mar 2009
Location: London, England
Posts: 1,783
Rep Power: 22 
Do you have bounded discretisation on k and epsilon (i.e. TVD or upwind on convection and Gauss limited linear 1 or similar on diffusion)?
If so, your unbounded epsilon is due to insufficient convergence of the pressure equation, giving poor fluxes. Sorting out the discretisation and/or tightening the pressure tolerance will work. Enjoy, Hrv
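In fvSchemes terms, bounded discretisation on k and epsilon as suggested above might look like the sketch below (scheme names are examples of the kind of settings meant, not the poster's actual configuration):

```
// system/fvSchemes (excerpt) -- one way to get bounded
// discretisation on the turbulence equations; example schemes only
divSchemes
{
    div(phi,k)        Gauss upwind;          // or a TVD/limited scheme
    div(phi,epsilon)  Gauss upwind;          // e.g. Gauss limitedLinear 1
}

laplacianSchemes
{
    default           Gauss linear limited 1;
}
```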
__________________
Hrvoje Jasak Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk 

November 20, 2008, 09:08 

#6 
Senior Member
Wolfgang Heydlauff
Join Date: Mar 2009
Location: Germany
Posts: 136
Rep Power: 10 
Hallo,
yes, I had the same problem (OF 1.5): a very simple box with 100x100 cells, decomposed equally in y onto 2, 4 and 8 processors. It ran well with 1, 2 and 8 processors, but diverged about halfway through the run with 4 processors. Making the case more demanding (simpleGrading 1000 1000 1) shows that the more CPUs one uses, the more unstable the calculation becomes. Any comments on that? 
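The graded box described above would come from a blockMeshDict along these lines (a sketch reconstructing the poster's setup; vertex numbering and cell counts are assumptions):

```
// blockMeshDict (excerpt) -- sketch of the graded 100x100 box
// described above; hex vertex ordering is illustrative
blocks
(
    hex (0 1 2 3 4 5 6 7) (100 100 1) simpleGrading (1000 1000 1)
);
```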

November 20, 2008, 17:15 

#7 
Senior Member
Francesco Del Citto
Join Date: Mar 2009
Location: Zürich Area, Switzerland
Posts: 219
Rep Power: 10 
If you are running double precision, you can try to switch off floatTransfer in etc/controlDict (it should be 1 now; set it to 0) and run again.
There is a thread in this forum about this topic... Hope this helps, Francesco 
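The switch lives in the global OpenFOAM controlDict (under etc/ in the installation, not the case's system/controlDict); a sketch:

```
// etc/controlDict (global installation config, not the case
// controlDict) -- the switch mentioned above
OptimisationSwitches
{
    // 1 = compress inter-processor transfers to single precision,
    // 0 = transfer in full (double) precision
    floatTransfer   0;
}
```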

November 21, 2008, 06:18 

#8 
Senior Member
Wolfgang Heydlauff
Join Date: Mar 2009
Location: Germany
Posts: 136
Rep Power: 10 
Hallo Dragos,
yes, thank you. It runs more stably and cuts off later; by fine-tuning the precision it runs. Thx. For the others, have a look at this topic: http://www.cfdonline.com/OpenFOAM_D.../126/9155.html 

May 29, 2009, 09:01 
boundary problem in parallel processing

#9 
Member
Join Date: Mar 2009
Location: adelaide, SA, Australia
Posts: 31
Rep Power: 9 
Hi all,
I'm running a parallel simulation of flow over a square cylinder, with the domain decomposed onto 32 processors. When I reconstruct the domain and plot vorticity contours, there is a misalignment of the vortex contours along the wake centerline and also at a few lines across the streamwise axis. Any idea what is happening? Thanks 

July 9, 2009, 00:37 

#10 
New Member
Alireza Mahdavifar
Join Date: Jul 2009
Location: Kingston, ON, Canada
Posts: 4
Rep Power: 9 
I have the same problem with flow over a cylinder. Why do you think this is related to decomposition and parallel processing?


July 9, 2009, 02:41 

#11 
Member
Join Date: Mar 2009
Location: adelaide, SA, Australia
Posts: 31
Rep Power: 9 
I simulated the same case using one and also two processors and found no misalignment in the vortex contours. I only got the problem when I simulated using 32 processors. That is why I thought it must be due to the decomposition or parallel processing.
But I solved the problem by using a higher-order convective scheme: from second-order upwind to QUICK, which is third-order accurate. I'm not sure why a higher-order scheme prevents this from happening. Thanks 
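The scheme change described above would be made in system/fvSchemes; a sketch (the entry names and the availability of a QUICK scheme depend on the OpenFOAM version, so treat this as illustrative):

```
// system/fvSchemes (excerpt) -- switching the convection scheme
// as described above; scheme availability depends on the version
divSchemes
{
    // before: second-order upwind
    // div(phi,U)    Gauss linearUpwind grad(U);

    // after: QUICK (third-order accurate)
    div(phi,U)       Gauss QUICK;
}
```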
