
March 12, 2009, 05:23 

#1 
Member
Vivien
Join Date: Mar 2009
Posts: 52
Rep Power: 10 
Hi, All,
I am running a transient case with both icoFoam and turbFoam; the mesh is composed of 300,000 elements. I use parallel computing with 15 processors (64-bit), but it turns out to be very slow. An adaptive time step is used. Below is part of the output from the two solvers.

ICOFOAM:

Time = 0.090534016
Courant Number mean: 0.0042189121 max: 0.60001494
deltaT = 5.051708e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.5933376e-05, Final residual = 9.3428658e-09, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1.7857106e-05, Final residual = 1.1072651e-09, No Iterations 2
DILUPBiCG: Solving for Uz, Initial residual = 1.6937772e-05, Final residual = 4.5817176e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.6714912e-05, Final residual = 9.753555e-09, No Iterations 937
DICPCG: Solving for p, Initial residual = 0.00022323789, Final residual = 9.9495147e-09, No Iterations 870
time step continuity errors : sum local = 1.1644498e-12, global = 3.4127869e-14, cumulative = 5.5155785e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139686.95 s ClockTime = 140261 s

Time = 0.090539068
Courant Number mean: 0.0042189072 max: 0.60001581
deltaT = 5.0515749e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.6457677e-05, Final residual = 4.6470092e-09, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 1.8183019e-05, Final residual = 1.1097131e-09, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 1.7496236e-05, Final residual = 8.7850725e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.916581e-05, Final residual = 9.5537303e-09, No Iterations 934
DICPCG: Solving for p, Initial residual = 0.00022626984, Final residual = 9.5267804e-09, No Iterations 914
time step continuity errors : sum local = 1.1548003e-12, global = 1.7614837e-14, cumulative = 5.3394302e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139696.76 s ClockTime = 140271 s

Time = 0.090544119
Courant Number mean: 0.0042188909 max: 0.60001492
deltaT = 5.0514493e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.5920686e-05, Final residual = 9.2325137e-09, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1.7845416e-05, Final residual = 9.8299531e-10, No Iterations 2
DILUPBiCG: Solving for Uz, Initial residual = 1.6922093e-05, Final residual = 4.5258186e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.8415655e-05, Final residual = 9.4206439e-09, No Iterations 938
DICPCG: Solving for p, Initial residual = 0.00023378581, Final residual = 9.4035601e-09, No Iterations 864
time step continuity errors : sum local = 1.1591606e-12, global = 2.6889353e-15, cumulative = 5.3663195e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139706.31 s ClockTime = 140281 s

Time = 0.090549171
Courant Number mean: 0.0042188863 max: 0.60001584
deltaT = 5.051316e-06
DILUPBiCG: Solving for Ux, Initial residual = 1.6469925e-05, Final residual = 4.7423517e-09, No Iterations 3
DILUPBiCG: Solving for Uy, Initial residual = 1.8189916e-05, Final residual = 1.2159212e-09, No Iterations 3
DILUPBiCG: Solving for Uz, Initial residual = 1.7507658e-05, Final residual = 8.9428823e-09, No Iterations 2
DICPCG: Solving for p, Initial residual = 3.9920706e-05, Final residual = 9.8779321e-09, No Iterations 933
DICPCG: Solving for p, Initial residual = 0.00023749663, Final residual = 9.958994e-09, No Iterations 914
time step continuity errors : sum local = 1.1595706e-12, global = 1.3994702e-14, cumulative = 5.5062665e-13
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 139716.1 s ClockTime = 140291 s

TURBFOAM:

Time = 0.011246492
Courant Number mean: 0.0021497253 max: 0.29183874
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2243458e-05, Final residual = 6.4891902e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7101833e-05, Final residual = 1.6199912e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3161504e-05, Final residual = 1.6636072e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 6.4919684e-05, Final residual = 9.5536623e-07, No Iterations 870
DICPCG: Solving for p, Initial residual = 6.7920792e-05, Final residual = 9.9707514e-07, No Iterations 13
DICPCG: Solving for p, Initial residual = 8.065135e-06, Final residual = 7.5610997e-07, No Iterations 3
time step continuity errors : sum local = 4.3769708e-12, global = 4.630939e-14, cumulative = 3.7206299e-11
DICPCG: Solving for p, Initial residual = 4.9009229e-06, Final residual = 9.478687e-07, No Iterations 65
DICPCG: Solving for p, Initial residual = 7.1595084e-06, Final residual = 8.0333983e-07, No Iterations 2
DICPCG: Solving for p, Initial residual = 1.1750355e-06, Final residual = 5.0730276e-07, No Iterations 1
time step continuity errors : sum local = 2.93423e-12, global = 1.9983413e-13, cumulative = 3.7406133e-11
DILUPBiCG: Solving for epsilon, Initial residual = 1.0007713e-05, Final residual = 4.9980981e-08, No Iterations 1
DILUPBiCG: Solving for k, Initial residual = 3.374637e-05, Final residual = 2.9900197e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17468.16 s ClockTime = 157562 s

Time = 0.011250659
Courant Number mean: 0.0021497677 max: 0.29185204
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2222925e-05, Final residual = 6.6488393e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7102149e-05, Final residual = 1.639788e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3168021e-05, Final residual = 1.8278849e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 5.2848402e-05, Final residual = 9.6086881e-07, No Iterations 867
DICPCG: Solving for p, Initial residual = 6.600024e-05, Final residual = 9.7140246e-07, No Iterations 13
DICPCG: Solving for p, Initial residual = 7.8676029e-06, Final residual = 7.3652595e-07, No Iterations 3
time step continuity errors : sum local = 4.270429e-12, global = 4.5538476e-14, cumulative = 3.7360594e-11
DICPCG: Solving for p, Initial residual = 4.7932553e-06, Final residual = 9.1380874e-07, No Iterations 24
DICPCG: Solving for p, Initial residual = 4.0275424e-06, Final residual = 7.9633809e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 9.6547429e-07, Final residual = 9.6547429e-07, No Iterations 0
time step continuity errors : sum local = 5.3984369e-12, global = 5.6480473e-13, cumulative = 3.679579e-11
DILUPBiCG: Solving for epsilon, Initial residual = 9.9681975e-06, Final residual = 9.9681975e-06, No Iterations 0
DILUPBiCG: Solving for k, Initial residual = 3.3737499e-05, Final residual = 2.9891437e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17474.9 s ClockTime = 157626 s

Time = 0.011254826
Courant Number mean: 0.0021498106 max: 0.29186488
deltaT = 4.1666667e-06
DILUPBiCG: Solving for Ux, Initial residual = 2.2233614e-05, Final residual = 6.5251044e-08, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 2.7089605e-05, Final residual = 1.6233944e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 2.3153049e-05, Final residual = 1.6663021e-07, No Iterations 1
DICPCG: Solving for p, Initial residual = 6.453333e-05, Final residual = 9.5711004e-07, No Iterations 870
DICPCG: Solving for p, Initial residual = 6.8593867e-05, Final residual = 9.8837497e-07, No Iterations 16
DICPCG: Solving for p, Initial residual = 8.3200747e-06, Final residual = 7.3665399e-07, No Iterations 3
time step continuity errors : sum local = 4.2746891e-12, global = 1.250224e-13, cumulative = 3.6920812e-11
DICPCG: Solving for p, Initial residual = 4.9091131e-06, Final residual = 9.0890313e-07, No Iterations 65
DICPCG: Solving for p, Initial residual = 6.9693982e-06, Final residual = 7.8703718e-07, No Iterations 2
DICPCG: Solving for p, Initial residual = 1.1489768e-06, Final residual = 4.9787673e-07, No Iterations 1
time step continuity errors : sum local = 2.8983148e-12, global = 1.6075718e-13, cumulative = 3.7081569e-11
DILUPBiCG: Solving for epsilon, Initial residual = 1.0005115e-05, Final residual = 5.0014711e-08, No Iterations 1
DILUPBiCG: Solving for k, Initial residual = 3.3738596e-05, Final residual = 2.9894466e-07, No Iterations 1
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for T, Initial residual = 0, Final residual = 0, No Iterations 0
ExecutionTime = 17482.23 s ClockTime = 157694 s

Is there any way to improve the computational performance? Thanks in advance!
Vivien
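A quick way to see how hard the pressure solver is working is to pull the iteration counts for p out of a log like the one above (OpenFOAM ships a `foamLog` utility that does this properly; the snippet below is only a minimal stand-alone sketch, assuming log lines formatted like those shown here):

```python
import re

# Matches lines like:
# DICPCG: Solving for p, Initial residual = 3.67e-05, Final residual = 9.75e-09, No Iterations 937
P_LINE = re.compile(r"Solving for p,.*No Iterations (\d+)")

def p_iterations(log_text):
    """Return the iteration count of every pressure solve found in the log."""
    return [int(m.group(1)) for m in P_LINE.finditer(log_text)]

sample = """\
DICPCG: Solving for p, Initial residual = 3.6714912e-05, Final residual = 9.753555e-09, No Iterations 937
DICPCG: Solving for p, Initial residual = 0.00022323789, Final residual = 9.9495147e-09, No Iterations 870
"""
iters = p_iterations(sample)
print(iters)  # [937, 870]
```

Run on the icoFoam log above, this shows the first pressure solve of every step costing ~900 iterations, which is where the time is going.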

March 12, 2009, 07:15 

#2 
New Member
Greg Collecutt
Join Date: Mar 2009
Location: Brisbane, Queensland, Australia
Posts: 21
Rep Power: 10 
Hi Vivien,
On occasions I have forgotten to put the parallel option on the command line and have run 8 or more simultaneous copies of the single-process version. How I got my PhD I don't know! I doubt you have made this mistake.
I also see that the DICPCG solver is working pretty hard to solve the pressure field. Normally I would expect it to take fewer than 10 iterations to converge when running at a Courant number of 0.3. Sometimes I find the GAMG solver better for stubborn fields, but I have noted that this solver does not parallelise very well.
How well does it run when you use a constant inlet condition (rather than the time-varying one I think you are using)?
Greg. 
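For reference, switching the pressure solver to GAMG is a change in `system/fvSolution`. A minimal sketch is below; the exact syntax varies between OpenFOAM versions, and the tolerance, smoother, and coarsening settings shown are common defaults rather than values taken from Vivien's case:

```
solvers
{
    p
    {
        solver                GAMG;
        tolerance             1e-08;
        relTol                0.01;
        smoother              GaussSeidel;
        nCellsInCoarsestLevel 100;
        agglomerator          faceAreaPair;
        mergeLevels           1;
    }
}
```

The multigrid hierarchy attacks the long-wavelength error that makes plain CG-type solvers grind through hundreds of iterations on the pressure equation.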

March 12, 2009, 09:30 

#3 
Member
Vivien
Join Date: Mar 2009
Posts: 52
Rep Power: 10 
Hi, Greg,
I did put the "parallel" option on the command line... May I ask how fine your mesh is and how long your simulation takes? Which solver did you use? Many thanks!!
Vivien 

March 13, 2009, 05:25 

#4 
Senior Member
Rishi .
Join Date: Mar 2009
Posts: 143
Rep Power: 10 
Hi Vivien,
I read somewhere on the forum that for efficient parallel execution, each processor should get at least 100k cells. In your case, 300k / 15 CPUs = 20k cells per CPU, so there is a lot of communication involved and the network speed is possibly the bottleneck. Variants of Conjugate Gradient (CG) solvers need global data in each iteration step. My suggestions:
1. Run the case on 1 CPU and then on 2 or 4 CPUs on the same node (depending on whether you have dual or quad cores) and check the speedup, if any.
2. Increase the domain size (or reduce the CPU count) so that you have ~100k cells per CPU.
3. Can you post your decomposePar log file? It will give information about the communication required.
Rishi 
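Rishi's arithmetic can be sketched as a back-of-the-envelope check. The 100k-cells-per-CPU figure is just the forum rule of thumb quoted above, not a hard limit, and the helper names here are made up for illustration:

```python
def cells_per_cpu(total_cells: int, n_cpus: int) -> int:
    """Average number of cells each MPI rank gets to work on."""
    return total_cells // n_cpus

def suggested_max_cpus(total_cells: int, min_cells_per_cpu: int = 100_000) -> int:
    """Largest rank count that still keeps min_cells_per_cpu cells per rank
    (rule-of-thumb only; real scaling depends on solver and interconnect)."""
    return max(1, total_cells // min_cells_per_cpu)

print(cells_per_cpu(300_000, 15))   # 20000 cells per rank in Vivien's case
print(suggested_max_cpus(300_000))  # 3 ranks by the 100k rule of thumb
```

By this yardstick, 15 ranks on a 300k-cell mesh leaves each rank with too little computation to hide the inter-processor communication, especially for the globally-coupled pressure solves.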

March 13, 2009, 07:48 

#5 
New Member
Greg Collecutt
Join Date: Mar 2009
Location: Brisbane, Queensland, Australia
Posts: 21
Rep Power: 10 
Rishi, Vivien,
I am using a custom solver, a variant of reactingFoam, which includes a fair amount of Lagrangian particle tracking and computation. In my case I have 300K cells + 200K particles and obtain near-linear speed improvement all the way to 64 CPUs (i.e. fewer than 5,000 cells per process). My mesh is long and thin, so I use simple decomposition along the long axis; each processor mesh then has at most two processor boundaries.
Greg. 
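A `system/decomposeParDict` for the one-axis simple decomposition Greg describes might look like the sketch below. The 15x1x1 split is chosen to match Vivien's 15 processors (which axis carries the split depends on the actual mesh orientation, so treat the `n` vector as an assumption):

```
numberOfSubdomains 15;

method          simple;

simpleCoeffs
{
    // slice only along x, so each sub-domain touches at most two neighbours
    n               (15 1 1);
    delta           0.001;
}
```

Slicing along a single long axis minimises the processor-boundary area per rank, which is what keeps the communication cost low in Greg's case.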

March 16, 2009, 04:32 

#6  
Senior Member

In my opinion the problem is in the p solver; try to use GAMG as the linear solver. In my incompressible simulations it brought the p iterations down from ~900 to ~20...
Ivan 


