
Sensitivity to decomposeParDict variables metis

April 3, 2007, 07:44   #1
egp (Eric Paterson), Senior Member

I was performing a parallel performance study on our new cluster. I decomposed a problem using metis into 2, 4, 8, 12, 16, 24, 32, 48, and 64 processors. In each case, the processorWeights were set uniformly to 1.
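
For reference, the decompositions were driven by a decomposeParDict along these lines; a sketch only, using the 1.x-era metisCoeffs syntax, with the weights written out for the 24-processor case:

    // system/decomposeParDict -- illustrative sketch
    numberOfSubdomains  24;

    method              metis;

    metisCoeffs
    {
        // one weight per subdomain; uniform weights as in the study
        processorWeights
        (
            1 1 1 1 1 1 1 1
            1 1 1 1 1 1 1 1
            1 1 1 1 1 1 1 1
        );
    }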

Strangely, the 24-processor case diverged, while all the other cases ran OK. If I changed the 24-processor case to 23 or 25 processors, it ran fine. I also tried changing one of the processorWeights in a 24-proc job to 1.2; in that case, the simulation also ran fine.

Therefore, the 24-proc, processorWeights = 1 case seems to have hit a situation where the METIS decomposition did something unexpected.

Has anyone else seen something similar? What is the best way to diagnose this problem, and to avoid it in future simulations?

Thanks, Eric

April 3, 2007, 07:56   #2
hjasak (Hrvoje Jasak), Senior Member

(I bet) you are using a Krylov space solver which carries a dependency on domain decomposition in its path to convergence. Just tighten the tolerance on the offending equations (again, I would guess pressure) and try again. It will work fine.
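
In practice, that means dropping the tolerance on p in fvSolution; a sketch of the idea (solver choice and exact values are illustrative):

    // system/fvSolution -- illustrative: tighten the pressure solve
    solvers
    {
        p
        {
            solver          PCG;     // Krylov (conjugate gradient) solver; GAMG is another common choice
            preconditioner  DIC;
            tolerance       1e-08;   // tightened from a typical 1e-06 or 1e-07
            relTol          0;
        }
    }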

It usually helps to provide some more info than just "it diverged". What happened: a NaN, divergence in a single equation, a linear solver bottoming out, or something else?

Hrv
__________________
Hrvoje Jasak
Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk

April 3, 2007, 08:24   #3
egp (Eric Paterson), Senior Member

OK, here are a few more details:

1. The simulation is of an experimental model in our water tunnel for tip-vortex/rudder interaction.

2. Grid has approx. 1e6 cells with wall-spacing set for wall functions.

3. Current focus is a steady simulation, so I was using simpleFoam. All variables were under-relaxed (p = 0.1, everything else at 0.3; see the sketch at the end of this post).

4. The simulation first showed signs of divergence in the epsilon equation, with a "bounding epsilon, min...." statement. A few time steps later, continuity errors became large and the simulation fell apart. Pressure does not appear to be the culprit.

5. Other decomposed cases ran fine and the ones that I have plotted (4-proc, 16-proc, 48-proc) look good.

6. I tried re-running with tighter convergence criteria for epsilon and reduced relaxation on epsilon. No luck.

7. Let me study this case a bit more, and I'll report back.

Eric
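
The under-relaxation in item 3 corresponds to an fvSolution block along these lines (illustrative; newer OpenFOAM versions nest these under fields and equations sub-dictionaries):

    // system/fvSolution -- illustrative relaxation settings from item 3
    relaxationFactors
    {
        p        0.1;   // pressure heavily under-relaxed
        U        0.3;
        k        0.3;
        epsilon  0.3;
    }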

April 3, 2007, 08:29   #4
hjasak (Hrvoje Jasak), Senior Member

Do you have bounded discretisation on k and epsilon (i.e. TVD or upwind on convection and Gauss limited linear 1 or similar on diffusion)?

If so, your unbounded epsilon is due to insufficient convergence of the pressure equation, giving poor fluxes. Sorting out the discretisation and/or tightening the pressure tolerance will work.
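
A bounded set of schemes of that kind would look roughly like this in fvSchemes (an illustrative sketch, not a prescription):

    // system/fvSchemes -- illustrative bounded discretisation for k and epsilon
    divSchemes
    {
        div(phi,k)        Gauss upwind;   // or a TVD scheme such as Gauss limitedLinear 1
        div(phi,epsilon)  Gauss upwind;
    }

    laplacianSchemes
    {
        default           Gauss linear limited 1;   // limited non-orthogonal correction
    }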

Enjoy,

Hrv
__________________
Hrvoje Jasak
Providing commercial FOAM/OpenFOAM and CFD Consulting: http://wikki.co.uk

April 3, 2007, 08:51   #5
egp (Eric Paterson), Senior Member

Hrv, thanks for the suggestions! I'll test and report later today.

November 20, 2008, 08:08   #6
wolle1982 (Wolfgang Heydlauff), Senior Member

Hello,

yes, I had the same problem (OF 1.5).

A very simple box with 100x100 cells, decomposed equally onto 2, 4, and 8 processors: it ran fine with 1, 2, and 8 processors, but diverged about halfway through the run with 4 processors.

Making the mesh more demanding (simpleGrading 1000 1000 1) suggests that the more CPUs one uses, the more unstable the calculation becomes; the grading is sketched below.
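
The grading refers to a blockMeshDict blocks entry of this form (an illustrative sketch; the vertex numbering is assumed):

    // constant/polyMesh/blockMeshDict -- illustrative block with strong grading
    blocks
    (
        hex (0 1 2 3 4 5 6 7) (100 100 1) simpleGrading (1000 1000 1)
    );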

Any comments on that?

November 20, 2008, 16:15   #7
fra76 (Francesco Del Citto), Senior Member

If you are running double precision, you can try switching off floatTransfer in etc/controlDict (it is probably set to 1 now; set it to 0) and run again.
There is a thread in this forum about this topic...
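
For reference, the switch lives in the global controlDict under OptimisationSwitches; an illustrative excerpt (the exact location and neighbouring entries may differ between versions):

    // $WM_PROJECT_DIR/etc/controlDict -- illustrative excerpt
    OptimisationSwitches
    {
        // 1 = compress parallel transfers to single precision,
        // 0 = exchange full double precision between processors
        floatTransfer   0;
    }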

Hope this helps,
Francesco

November 21, 2008, 05:18   #8
wolle1982 (Wolfgang Heydlauff), Senior Member

Hello Dragos,

yes, thank you. It runs more stably now and only cuts off later; by fine-tuning the precision, it runs through.

Thanks.

For the others, see this thread:
http://www.cfd-online.com/OpenFOAM_D.../126/9155.html

May 29, 2009, 09:01   #9
boundary problem in parallel processing
mali, Member

Hi all,

I'm running a parallel simulation of flow over a square cylinder, with the domain decomposed onto 32 processors.

When I reconstruct the domain and plot vorticity contours, there is a misalignment of the vortex contours along the wake centreline and also along a few lines across the streamwise axis.

Any idea what is happening?

Thanks


July 9, 2009, 00:37   #10
ali84 (Alireza Mahdavifar), New Member

I have the same problem with flow over a cylinder. Why do you think this is related to the decomposition and parallel processing?

July 9, 2009, 02:41   #11
mali, Member

I simulated the same case on one processor and also on 2 processors and found no misalignment in the vortex contours; I only got the problem when I ran on 32 processors. That is why I thought it must be due to the decomposition or the parallel processing.

But I solved the problem by switching to a higher-order convection scheme, from second-order upwind to QUICK, which is third-order accurate. I'm not sure why the higher-order scheme prevents this from happening; the corresponding scheme change is sketched below.
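
If the case were set up in OpenFOAM, that change would correspond to the convection entry in fvSchemes, roughly as follows (illustrative):

    // system/fvSchemes -- illustrative switch of the convection scheme for U
    divSchemes
    {
        // previously a second-order upwind scheme, e.g. Gauss linearUpwind grad(U)
        div(phi,U)   Gauss QUICK;   // third-order QUICK interpolation
    }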

Thanks
