
Velocity blows up suddenly after 30,000+ iterations 

October 25, 2010, 18:58 
Velocity blows up suddenly after 30,000+ iterations

#1 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
Hello all,
I am running a 2D turbDyMFoam transient external-flow simulation with the kOmegaSST turbulence model and GGI for the rotating parts. Qualitatively the simulation looks good: all of the flow regimes I have seen in other research papers are forming, and a steady state appears to be almost reached, until the 34,000th time step, where Uy suddenly blows up (output: no. iterations 1000, large final residual), then the time step continuity error blows up, and so on. Is it unusual for this to happen at such a high iteration count? Can wrong BCs cause this? For k and omega I have zeroGradient on all surfaces except fixedValue at the freestream inlet and inletOutlet at the freestream outlet. For U: fixedValue at the freestream inlet, inletOutlet at the freestream outlet (inletValue uniform (0 0 0)), movingWallVelocity on the physical surfaces, and slip on the top and bottom freestream patches. For p: zeroGradient everywhere except slip on the top and bottom freestream patches and fixedValue at the freestream outlet. 
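For concreteness, the U boundary setup described above would look roughly like this in 0/U. This is a sketch only: the patch names (inlet, outlet, top, bottom, blades) and the freestream value are illustrative assumptions, not taken from the actual case files.
Code:
// Sketch of the 0/U boundaryField described in the post.
// Patch names and the inlet velocity are assumptions.
boundaryField
{
    inlet
    {
        type            fixedValue;
        value           uniform (19.3 0 0);   // assumed freestream velocity
    }
    outlet
    {
        type            inletOutlet;
        inletValue      uniform (0 0 0);
        value           uniform (0 0 0);
    }
    top
    {
        type            slip;
    }
    bottom
    {
        type            slip;
    }
    blades          // the moving physical surfaces
    {
        type            movingWallVelocity;
        value           uniform (0 0 0);
    }
}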

October 26, 2010, 04:33 

#2 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Hi Robert,
may I see the log of the last 5 iterations before the simulation crashed? It is always a good idea to post not only a description of your problem but also the errors, the scheme setup, and anything else that can be useful for understanding what is going wrong. Cheers, mad 

October 26, 2010, 10:50 

#3 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
I stopped it before it could crash, when I saw astronomical numbers for the velocity magnitude and the cumulative time step continuity error.
I have attached my fvSchemes file. Here are the last time steps before the first abnormal number appeared; as far as I can tell, the simulation first goes wrong at the Uy solve in the last time step shown (Time = 0.142112). Code:
Courant Number mean: 0.00800153 max: 0.249977 velocity magnitude: 19.2813
deltaT = 4.27214e-06
GGI pair (outsideSlider, insideSlider) : 0.00219448 0.00219443  Diff = 1.32848e-08 or 0.000605373 %
Time = 0.142095

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
    Largest slave weighting factor correction : 2.06758e-05 average: 1.92224e-05
    Largest master weighting factor correction: 4.44089e-16 average: 7.35279e-17
PBiCG:  Solving for Ux, Initial residual = 2.46226e-05, Final residual = 5.11939e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.39498e-05, Final residual = 3.51525e-08, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00242034, Final residual = 9.68673e-07, No Iterations 195
PCG:  Solving for p, Initial residual = 0.00017299, Final residual = 9.96505e-07, No Iterations 94
PCG:  Solving for p, Initial residual = 1.41325e-05, Final residual = 8.53351e-07, No Iterations 4
time step continuity errors : sum local = 2.1204e-12, global = 3.02346e-13, cumulative = 4.10983e-10
PCG:  Solving for p, Initial residual = 6.03939e-05, Final residual = 9.89376e-07, No Iterations 77
PCG:  Solving for p, Initial residual = 1.51437e-05, Final residual = 9.50447e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.38782e-06, Final residual = 7.04864e-07, No Iterations 1
time step continuity errors : sum local = 1.75143e-12, global = 2.2841e-13, cumulative = 4.10755e-10
PCG:  Solving for p, Initial residual = 5.53689e-06, Final residual = 8.46412e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.09958e-06, Final residual = 6.67435e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.8453e-07, Final residual = 6.8453e-07, No Iterations 0
time step continuity errors : sum local = 1.7009e-12, global = 2.70951e-13, cumulative = 4.10484e-10
PBiCG:  Solving for omega, Initial residual = 6.95995e-06, Final residual = 3.11373e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.13897e-05, Final residual = 3.64994e-08, No Iterations 1
ExecutionTime = 23466.6 s  ClockTime = 24298 s

Courant Number mean: 0.00800217 max: 0.249963 velocity magnitude: 19.2672
deltaT = 4.27277e-06
GGI pair (outsideSlider, insideSlider) : 0.00219454 0.00219444  Diff = 1.80767e-08 or 0.000823716 %
Time = 0.142099

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
    Largest slave weighting factor correction : 2.2433e-05 average: 2.09926e-05
    Largest master weighting factor correction: 4.44089e-16 average: 8.0832e-17
PBiCG:  Solving for Ux, Initial residual = 2.46306e-05, Final residual = 5.12302e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.39718e-05, Final residual = 7.7565e-08, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00235717, Final residual = 9.72543e-07, No Iterations 194
PCG:  Solving for p, Initial residual = 0.000171122, Final residual = 9.86475e-07, No Iterations 286
PCG:  Solving for p, Initial residual = 2.08418e-05, Final residual = 8.15853e-07, No Iterations 5
time step continuity errors : sum local = 2.02727e-12, global = 9.79359e-15, cumulative = 4.10474e-10
PCG:  Solving for p, Initial residual = 6.06036e-05, Final residual = 9.72301e-07, No Iterations 72
PCG:  Solving for p, Initial residual = 1.48379e-05, Final residual = 9.36621e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.3686e-06, Final residual = 6.83694e-07, No Iterations 1
time step continuity errors : sum local = 1.69887e-12, global = 6.1656e-14, cumulative = 4.10536e-10
PCG:  Solving for p, Initial residual = 5.634e-06, Final residual = 8.38707e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.09776e-06, Final residual = 6.58525e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.76011e-07, Final residual = 6.76011e-07, No Iterations 0
time step continuity errors : sum local = 1.67978e-12, global = 1.03209e-14, cumulative = 4.10525e-10
PBiCG:  Solving for omega, Initial residual = 6.96026e-06, Final residual = 3.12084e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.13946e-05, Final residual = 3.65695e-08, No Iterations 1
ExecutionTime = 23467.3 s  ClockTime = 24299 s

Courant Number mean: 0.00800324 max: 0.24995 velocity magnitude: 19.252
deltaT = 4.27362e-06
GGI pair (outsideSlider, insideSlider) : 0.00219457 0.00219446  Diff = 2.29347e-08 or 0.00104507 %
Time = 0.142103

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
    Largest slave weighting factor correction : 2.37478e-05 average: 2.23244e-05
    Largest master weighting factor correction: 4.44089e-16 average: 7.74234e-17
PBiCG:  Solving for Ux, Initial residual = 2.46498e-05, Final residual = 5.12892e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40071e-05, Final residual = 2.56052e-07, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00231218, Final residual = 9.92148e-07, No Iterations 403
PCG:  Solving for p, Initial residual = 0.00017901, Final residual = 9.7537e-07, No Iterations 289
PCG:  Solving for p, Initial residual = 1.88655e-05, Final residual = 9.51366e-07, No Iterations 4
time step continuity errors : sum local = 2.3647e-12, global = 1.22702e-14, cumulative = 4.10513e-10
PCG:  Solving for p, Initial residual = 6.0832e-05, Final residual = 9.80863e-07, No Iterations 73
PCG:  Solving for p, Initial residual = 1.54561e-05, Final residual = 9.73776e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.42861e-06, Final residual = 7.12978e-07, No Iterations 1
time step continuity errors : sum local = 1.77216e-12, global = 4.73874e-14, cumulative = 4.1056e-10
PCG:  Solving for p, Initial residual = 5.49348e-06, Final residual = 8.22964e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.07637e-06, Final residual = 6.54928e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.71248e-07, Final residual = 6.71248e-07, No Iterations 0
time step continuity errors : sum local = 1.66844e-12, global = 7.73072e-14, cumulative = 4.10638e-10
PBiCG:  Solving for omega, Initial residual = 6.96094e-06, Final residual = 3.12768e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.14e-05, Final residual = 3.66464e-08, No Iterations 1
ExecutionTime = 23468.1 s  ClockTime = 24300 s

Courant Number mean: 0.00800474 max: 0.249937 velocity magnitude: 19.2358
deltaT = 4.2747e-06
GGI pair (outsideSlider, insideSlider) : 0.00219458 0.00219447  Diff = 2.74837e-08 or 0.00125234 %
Time = 0.142108

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
    Largest slave weighting factor correction : 2.46202e-05 average: 2.32176e-05
    Largest master weighting factor correction: 4.44089e-16 average: 7.18236e-17
PBiCG:  Solving for Ux, Initial residual = 2.46486e-05, Final residual = 5.13601e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40261e-05, Final residual = 3.55732e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 0.00227735, Final residual = 9.77136e-07, No Iterations 403
PCG:  Solving for p, Initial residual = 0.000179432, Final residual = 9.72128e-07, No Iterations 289
PCG:  Solving for p, Initial residual = 1.88636e-05, Final residual = 9.50111e-07, No Iterations 4
time step continuity errors : sum local = 2.36249e-12, global = 1.5894e-14, cumulative = 4.10622e-10
PCG:  Solving for p, Initial residual = 6.15472e-05, Final residual = 9.65375e-07, No Iterations 76
PCG:  Solving for p, Initial residual = 1.63223e-05, Final residual = 8.82104e-07, No Iterations 4
PCG:  Solving for p, Initial residual = 1.4109e-06, Final residual = 6.21973e-07, No Iterations 1
time step continuity errors : sum local = 1.54656e-12, global = 4.73359e-14, cumulative = 4.10669e-10
PCG:  Solving for p, Initial residual = 5.52386e-06, Final residual = 7.93676e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.05287e-06, Final residual = 6.74179e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.88394e-07, Final residual = 6.88394e-07, No Iterations 0
time step continuity errors : sum local = 1.71171e-12, global = 9.08264e-14, cumulative = 4.1076e-10
PBiCG:  Solving for omega, Initial residual = 6.96232e-06, Final residual = 3.13412e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.14059e-05, Final residual = 3.67302e-08, No Iterations 1
ExecutionTime = 23469 s  ClockTime = 24301 s

Courant Number mean: 0.00800667 max: 0.249924 velocity magnitude: 19.2185
deltaT = 4.27601e-06
GGI pair (outsideSlider, insideSlider) : 0.00219458 0.00219448  Diff = 3.1391e-08 or 0.00143038 %
Time = 0.142112

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
    Largest slave weighting factor correction : 2.50498e-05 average: 2.36721e-05
    Largest master weighting factor correction: 4.44089e-16 average: 6.74412e-17
PBiCG:  Solving for Ux, Initial residual = 2.46482e-05, Final residual = 5.14485e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40494e-05, Final residual = 2.55792e+19, No Iterations 1000
PCG:  Solving for p, Initial residual = 1, Final residual = 9.67511e-07, No Iterations 803
PCG:  Solving for p, Initial residual = 0.127367, Final residual = 9.9397e-07, No Iterations 673
PCG:  Solving for p, Initial residual = 0.0152502, Final residual = 9.66232e-07, No Iterations 483
time step continuity errors : sum local = 1.42352e+11, global = 2.26409e+09, cumulative = 2.26409e+09
PCG:  Solving for p, Initial residual = 0.660487, Final residual = 9.85906e-07, No Iterations 793
PCG:  Solving for p, Initial residual = 0.499159, Final residual = 9.9406e-07, No Iterations 750
PCG:  Solving for p, Initial residual = 0.10088, Final residual = 9.8186e-07, No Iterations 535
time step continuity errors : sum local = 2.29023e+10, global = 2.6318e+09, cumulative = 3.67707e+08
PCG:  Solving for p, Initial residual = 0.81849, Final residual = 9.8874e-07, No Iterations 599
PCG:  Solving for p, Initial residual = 0.307484, Final residual = 9.67672e-07, No Iterations 724
PCG:  Solving for p, Initial residual = 0.0769423, Final residual = 9.21628e-07, No Iterations 654
time step continuity errors : sum local = 3.11283e+09, global = 2.92897e+07, cumulative = 3.96997e+08
PBiCG:  Solving for omega, Initial residual = 1, Final residual = 7.05779, No Iterations 1000
bounding omega, min: -1.66497e+29 max: 1.63748e+29 average: 8.64467e+24
PBiCG:  Solving for k, Initial residual = 0.866969, Final residual = 2.14241e-07, No Iterations 11
ExecutionTime = 23476.6 s  ClockTime = 24308 s 

October 26, 2010, 11:07 

#4 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Are you running on multiple processors?
Also, what does the velocity vector field look like at the last saved time step? Actually, I have never run any simulation with your solver, but maybe the cause of your problem is similar to something I have already seen... mad 

October 26, 2010, 11:27 

#5 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
Yes, I am using metis decomposition for 4 processors on a single desktop.
As for the vector field at the iteration where it first displays bad numbers: the whole field is just one color at some very high velocity (on the order of the speed of light). The pressure field is also a single color, with the scale running from infinity to negative infinity, and both ends rendered as the same color. At the iteration just before (I have controlDict set to save every 25 iterations), the flow field looks normal. I think the iteration I posted above is the first messed-up one, because the velocity magnitude still seems normal before the Uy residual blows up. 
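For reference, a metis decomposition across 4 processors is configured in system/decomposeParDict along these lines; a minimal sketch, and the optional processorWeights entry shown here is an assumption about this particular case:
Code:
// Minimal decomposeParDict sketch for 4 equally weighted subdomains.
numberOfSubdomains 4;

method          metis;

metisCoeffs
{
    processorWeights ( 1 1 1 1 );   // optional; equal weights assumed
}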

October 26, 2010, 11:38 

#6 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Could you run the very last iterations using GAMG on p and smoothSolver on U and the turbulence quantities? Start from the last saved time step and go on. Alternatively, run the very last iterations on a single processor. If the solver gets through the time step where you had problems, then we may have found the solution...
I say this because it reminds me of a problem I had some time ago with an incompatibility between DILUPBiCG in parallel and the turbulence quantities... fingers crossed... mad 

October 26, 2010, 11:55 

#7 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
Could you give me some pointers on setting up GAMG? I see that there are many parameters to define, and I have not used it before. How do I choose the value of nSweeps for smoothSolver? (I have looked through the OpenFOAM 1.7 manual, but I use 1.5-dev for GGI.)
Thanks for the suggestions; I will try the single-processor run and let you know whether it works. 

October 26, 2010, 12:06 

#8 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Here it is:
Code:
p
{
    solver                 GAMG;
    tolerance              1e-05;
    relTol                 0;
    smoother               GaussSeidel;
    nPreSweeps             0;
    nPostSweeps            2;
    cacheAgglomeration     true;
    nCellsInCoarsestLevel  10;
    agglomerator           faceAreaPair;
    mergeLevels            1;
}
U
{
    solver                 smoothSolver;
    smoother               GaussSeidel;
    tolerance              1e-03;
    relTol                 0;
}
mad 

October 26, 2010, 12:18 

#9 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
How dependent is convergence on nSweeps and nCellsInCoarsestLevel?
My mesh (GGI, turbomachinery external flow) has about 70,000 cells; is nCellsInCoarsestLevel = 10 appropriate? 

October 26, 2010, 12:24 

#10 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
I really do not know; I have never performed any study on that. It is the commonly suggested setup...
... well, my meshes are usually double the size of yours, and I use the setup posted above... 

October 26, 2010, 13:28 

#11 
Member
Heng Xiao
Join Date: Mar 2009
Location: Zurich, Switzerland
Posts: 58
Rep Power: 17 
It seems that I have the same problem you describe:
"incompatibility between DILUPBiCG in parallel and turbulence quantities". Switching from DILUPBiCG to GAMG helped. It is still running right now; at least it has not crashed so far (it has gone further than it did with PBiCG). I am running pisoFoam with RANS, with periodic and wall boundary conditions on my domain. 

October 26, 2010, 14:13 

#12  
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Let me know if you perform any study on the influence of the different parameters on GAMG. Enjoy, mad 

October 27, 2010, 11:46 

#13  
Member
Heng Xiao
Join Date: Mar 2009
Location: Zurich, Switzerland
Posts: 58
Rep Power: 17 
Could anybody give some insight on that point?
"Incompatibility between DILUPBiCG in parallel and turbulence quantities" -- what causes this incompatibility? I am not really satisfied with a solution that comes without an explanation. Best, Heng


October 27, 2010, 12:00 

#14 
Senior Member
Robert
Join Date: Sep 2010
Posts: 158
Rep Power: 16 
Thanks, maddalena! Your diagnosis seems to have been correct.
Since PISO is a segregated solver (right?) and U was always the first field to suddenly go wrong, I changed only U to smoothSolver and kept everything else in the case the same. The simulation ran to completion (at least twice as far as the previous runs). I will change every other variable besides p to smoothSolver and see whether there is any difference. I am hesitant to use GAMG because I do not know how to make even educated guesses for its parameters; I imagine the standard parameter values do not carry over to external turbomachinery flow. But thanks again! I do not think I would ever have caught such an error. 
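A minimal fvSolution sketch of the change described above: U moved to smoothSolver, p left on PCG. Note this uses the newer dictionary syntax (OpenFOAM 1.5-dev writes solver entries in the older `U smoothSolver { ... };` form), and the tolerances and preconditioner shown are illustrative, not the case's actual values.
Code:
solvers
{
    p
    {
        solver          PCG;            // unchanged
        preconditioner  DIC;            // illustrative
        tolerance       1e-06;
        relTol          0;
    }
    U
    {
        solver          smoothSolver;   // was PBiCG/DILU
        smoother        GaussSeidel;
        nSweeps         1;
        tolerance       1e-06;
        relTol          0;
    }
    // k and omega can be switched the same way if changing U alone
    // is not enough.
}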

October 27, 2010, 12:13 

#15 
Senior Member
maddalena
Join Date: Mar 2009
Posts: 436
Rep Power: 23 
Well, to tell the truth, I simply reported an explanation found over here: http://www.cfdonline.com/Forums/ope...tml#post246561.
But I am happy to be useful!!! mad 

October 19, 2015, 14:52 

#16 
New Member
Han Shao
Join Date: Apr 2015
Posts: 1
Rep Power: 0 
Hi,
Do you have any new ideas about this problem? Running with smoothSolver: OK; running with PBiCG in serial: OK; running with PBiCG in parallel: crash. I have the same problem with the h equation in reactingFoam. Changing to smoothSolver fixes it, but I am curious about the reason. Could you please share some ideas? 
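For the reactingFoam case, the equivalent workaround is to move the h equation to smoothSolver in fvSolution. A hedged sketch, with illustrative tolerances (if the case uses an hFinal entry, it would be changed the same way):
Code:
h
{
    solver          smoothSolver;   // was PBiCG/DILU
    smoother        GaussSeidel;
    tolerance       1e-08;          // illustrative
    relTol          0;
}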


