CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   Sensitivity to decomposeParDict variables metis (https://www.cfd-online.com/Forums/openfoam-solving/58309-sensitivity-decomposepardict-variables-metis.html)

egp April 3, 2007 07:44

I was performing a parallel performance study on our new cluster. I decomposed a problem using metis into 2, 4, 8, 12, 16, 24, 32, 48, and 64 processors. In each case, the processorWeights were set uniformly to 1.
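A decomposeParDict for the 24-processor case would look roughly like this (sketch only; the exact syntax may differ between OpenFOAM versions):

Code:

    numberOfSubdomains  24;

    method              metis;

    metisCoeffs
    {
        // uniform weights, one entry per processor
        processorWeights
        (
            1 1 1 1 1 1 1 1
            1 1 1 1 1 1 1 1
            1 1 1 1 1 1 1 1
        );
    }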

Strangely, the 24 processor case diverged, while all other cases ran OK. If I changed the 24-processor case to 23- or 25-processors, it ran fine. I also tried changing one of the processorWeights in a 24-proc job to 1.2. In this case, the simulation also ran fine.

Therefore, the 24-proc processorWeights=1 case seems to have hit a situation where the Metis decomp did something unexpected.

Has anyone else seen something similar? What is the best way to diagnose this problem, and to avoid it in future simulations?

Thanks, Eric

hjasak April 3, 2007 07:56

(I bet) you are using a Krylov space solver which carries a dependency on domain decomposition in its path to convergence. Just tighten the tolerance on the offending equations (again, I would guess pressure) and try again. It will work fine.
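For example, something along these lines in fvSolution (dictionary-style syntax shown here; the solver choice and values are only illustrative):

Code:

    solvers
    {
        p
        {
            solver          PCG;
            preconditioner  DIC;
            tolerance       1e-07;   // tightened, e.g. from 1e-06
            relTol          0;
        }
    }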

It usually helps to provide some more info than just "it diverged". What happened: a NaN, divergence in a single equation, a linear solver bottoming out, or something else?

Hrv

egp April 3, 2007 08:24

OK, here are a few more details:

1. The simulation is of an experimental model in our water tunnel for tip-vortex/rudder interaction.

2. Grid has approx. 1e6 cells with wall-spacing set for wall functions.

3. Current focus is steady simulation. Therefore, I was using simpleFoam. All variables were under-relaxed (p = 0.1, everything else at 0.3); the relaxation settings are sketched after this list.

4. Simulation first showed signs of divergence in the epsilon equation, with a "bounding epsilon, min...." statement. A few time steps later, continuity errors became large and the sim fell apart. Pressure does not appear to be the culprit.

5. Other decomposed cases ran fine and the ones that I have plotted (4-proc, 16-proc, 48-proc) look good.

6. I tried re-running with tightened-up convergence criteria for epsilon, and reduced relaxation for epsilon. No luck.

7. Let me study this case a bit more, and I'll report back.
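As mentioned in point 3, the relaxation settings are roughly as follows (sketch only, older-style fvSolution syntax, with the usual k-epsilon field names assumed):

Code:

    relaxationFactors
    {
        p               0.1;
        U               0.3;
        k               0.3;
        epsilon         0.3;
    }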

Eric

hjasak April 3, 2007 08:29

Do you have bounded discretisation on k and epsilon (i.e. TVD or upwind on convection and Gauss limited linear 1 or similar on diffusion)?

If so, your unbounded epsilon is due to insufficient convergence of the pressure equation, giving poor fluxes. Sorting out the discretisation and/or tightening the pressure tolerance will work.
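In fvSchemes terms, something like this (illustrative only; the exact entry names depend on the turbulence model and OpenFOAM version):

Code:

    divSchemes
    {
        div(phi,U)        Gauss limitedLinear 1;   // or Gauss upwind
        div(phi,k)        Gauss upwind;
        div(phi,epsilon)  Gauss upwind;
    }

    laplacianSchemes
    {
        default           Gauss linear limited 1;
    }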

Enjoy,

Hrv

egp April 3, 2007 08:51

Hrv, thanks for the suggestions! I'll test and report later today.

wolle1982 November 20, 2008 08:08

Hello,

Yes, I had the same problem (OF 1.5).

A very simple box with 100x100 cells, decomposed equally in the y-direction onto 2, 4, and 8 processors. It runs fine with 1, 2, and 8 processors, but with 4 processors it diverges after about half of the calculation.
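For the 4-processor case the decomposition is the obvious even split in y; roughly (sketch, assuming the simple decomposition method):

Code:

    numberOfSubdomains  4;

    method              simple;

    simpleCoeffs
    {
        n               (1 4 1);
        delta           0.001;
    }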

Making the case more demanding (simpleGrading 1000 1000 1) shows that the more CPUs one uses, the more unstable the calculation becomes.

Any comments on that?

fra76 November 20, 2008 16:15

If you are running double precision, you can try to switch off floatTransfer in etc/controlDict (it should currently be set to 1; set it to 0) and run again.
There is a thread in this forum about this topic...
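The switch sits in the global controlDict (e.g. $WM_PROJECT_DIR/etc/controlDict) under OptimisationSwitches; a sketch:

Code:

    OptimisationSwitches
    {
        // 1 = truncate parallel transfers to single precision,
        // 0 = transfer in full (double) precision
        floatTransfer   0;
    }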

Hope this helps,
Francesco

wolle1982 November 21, 2008 05:18

Hello Dragos,

Yes, thank you. It now runs MORE stably and only breaks down later; by fine-tuning the precision it runs through.

Thx

For others, have a look at this topic:
http://www.cfd-online.com/OpenFOAM_D.../126/9155.html

mali May 29, 2009 09:01

boundary problem in parallel processing
 
Hi all,

I'm running a parallel simulation of flow over a square cylinder, with the domain decomposed into 32 processors.

When I reconstruct the domain and plot the vorticity contours, there is a misalignment of the vortex contours along the wake centreline and also along a few lines across the streamwise axis.

Any idea what happens?

Thanks

http://img297.imageshack.us/img297/9162/vortex.jpg

ali84 July 9, 2009 00:37

I have the same problem with flow over a cylinder. Why do you think this is related to decomposition and parallel processing?

mali July 9, 2009 02:41

I simulated the same case using one and also two processors and found no misalignment of the vortex contours. I only got that problem when I simulated using 32 processors. That is why I thought it must be due to the decomposition or parallel processing.

But I solved that problem by using a higher-order convective scheme: from 2nd-order upwind to QUICK (third-order accuracy). I'm not sure why a higher-order scheme can prevent this from happening.
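In OpenFOAM fvSchemes terms, the change was roughly the following (sketch only; I'm assuming linearUpwind as the OpenFOAM equivalent of "2nd order upwind", and the exact scheme names and arguments vary between versions):

Code:

    divSchemes
    {
        // before: second-order upwind-biased convection
        //div(phi,U)    Gauss linearUpwind grad(U);

        // after: QUICK (third-order accurate on uniform meshes)
        div(phi,U)      Gauss QUICK;
    }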

Thanks

