
Velocity blows up suddenly after 30,000+ iterations

October 25, 2010, 18:58
#1: Robert (lordvon), Senior Member
Hello all,

I am running a 2D transient external-flow simulation with turbDyMFoam, using the kOmegaSST turbulence model and a GGI interface for the rotating parts. Qualitatively the simulation looks good: all of the flow regimes I have seen in other research papers are forming, and it appears to be approaching a steady state, up until the 34,000th time step, where Uy suddenly blows up (the output shows 1000 iterations and a large final residual), then the time step continuity error blows up, and so on.

Is it unusual for this to happen at such a high iteration count? Can wrong BCs cause this? For k and omega I have zeroGradient on all surfaces except fixedValue at the freestream inlet and inletOutlet at the freestream outlet. For U: fixedValue at the freestream inlet, inletOutlet at the freestream outlet (inletValue uniform (0 0 0)), movingWallVelocity on the physical surfaces, and slip on the top and bottom freestream patches. For p: zeroGradient everywhere except slip on the top and bottom freestream patches and fixedValue at the freestream outlet.
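To make the U setup concrete, here is a minimal sketch of the 0/U boundaryField (the patch names and the freestream value are placeholders for illustration, not my actual case files):
Code:
boundaryField
{
    inlet                       // freestream inlet
    {
        type            fixedValue;
        value           uniform (19 0 0);   // placeholder freestream velocity
    }
    outlet                      // freestream outlet
    {
        type            inletOutlet;
        inletValue      uniform (0 0 0);
        value           uniform (0 0 0);
    }
    topAndBottom                // top and bottom freestream patches
    {
        type            slip;
    }
    rotor                       // physical (moving) surfaces
    {
        type            movingWallVelocity;
        value           uniform (0 0 0);
    }
}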

October 26, 2010, 04:33
#2: maddalena, Senior Member
Hi Robert,
may I see the log file for the last few time steps before the simulation crashed?
It is always a good idea to post not only a description of your problem but also the errors, the schemes setup, and anything else that can be useful for understanding what is going wrong.
Cheers

mad

October 26, 2010, 10:50
#3: Robert (lordvon), Senior Member
I stopped it before it could crash when I saw astronomical numbers for velocity magnitude and cumulative time step continuity error.

I have attached my fvSchemes file.

Here are the last time steps before the first abnormal numbers appeared (as far as I can tell, the simulation first goes wrong in the Uy solve of the final time step below):
Code:
Courant Number mean: 0.00800153 max: 0.249977 velocity magnitude: 19.2813
deltaT = 4.27214e-06
GGI pair (outsideSlider, insideSlider) : 0.00219448 0.00219443 Diff = -1.32848e-08 or 0.000605373 %
Time = 0.142095

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
  Largest slave weighting factor correction : 2.06758e-05 average: 1.92224e-05
  Largest master weighting factor correction: 4.44089e-16 average: 7.35279e-17

PBiCG:  Solving for Ux, Initial residual = 2.46226e-05, Final residual = 5.11939e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.39498e-05, Final residual = 3.51525e-08, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00242034, Final residual = 9.68673e-07, No Iterations 195
PCG:  Solving for p, Initial residual = 0.00017299, Final residual = 9.96505e-07, No Iterations 94
PCG:  Solving for p, Initial residual = 1.41325e-05, Final residual = 8.53351e-07, No Iterations 4
time step continuity errors : sum local = 2.1204e-12, global = -3.02346e-13, cumulative = 4.10983e-10
PCG:  Solving for p, Initial residual = 6.03939e-05, Final residual = 9.89376e-07, No Iterations 77
PCG:  Solving for p, Initial residual = 1.51437e-05, Final residual = 9.50447e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.38782e-06, Final residual = 7.04864e-07, No Iterations 1
time step continuity errors : sum local = 1.75143e-12, global = -2.2841e-13, cumulative = 4.10755e-10
PCG:  Solving for p, Initial residual = 5.53689e-06, Final residual = 8.46412e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.09958e-06, Final residual = 6.67435e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.8453e-07, Final residual = 6.8453e-07, No Iterations 0
time step continuity errors : sum local = 1.7009e-12, global = -2.70951e-13, cumulative = 4.10484e-10
PBiCG:  Solving for omega, Initial residual = 6.95995e-06, Final residual = 3.11373e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.13897e-05, Final residual = 3.64994e-08, No Iterations 1
ExecutionTime = 23466.6 s  ClockTime = 24298 s

Courant Number mean: 0.00800217 max: 0.249963 velocity magnitude: 19.2672
deltaT = 4.27277e-06
GGI pair (outsideSlider, insideSlider) : 0.00219454 0.00219444 Diff = -1.80767e-08 or 0.000823716 %
Time = 0.142099

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
  Largest slave weighting factor correction : 2.2433e-05 average: 2.09926e-05
  Largest master weighting factor correction: 4.44089e-16 average: 8.0832e-17

PBiCG:  Solving for Ux, Initial residual = 2.46306e-05, Final residual = 5.12302e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.39718e-05, Final residual = 7.7565e-08, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00235717, Final residual = 9.72543e-07, No Iterations 194
PCG:  Solving for p, Initial residual = 0.000171122, Final residual = 9.86475e-07, No Iterations 286
PCG:  Solving for p, Initial residual = 2.08418e-05, Final residual = 8.15853e-07, No Iterations 5
time step continuity errors : sum local = 2.02727e-12, global = -9.79359e-15, cumulative = 4.10474e-10
PCG:  Solving for p, Initial residual = 6.06036e-05, Final residual = 9.72301e-07, No Iterations 72
PCG:  Solving for p, Initial residual = 1.48379e-05, Final residual = 9.36621e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.3686e-06, Final residual = 6.83694e-07, No Iterations 1
time step continuity errors : sum local = 1.69887e-12, global = 6.1656e-14, cumulative = 4.10536e-10
PCG:  Solving for p, Initial residual = 5.634e-06, Final residual = 8.38707e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.09776e-06, Final residual = 6.58525e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.76011e-07, Final residual = 6.76011e-07, No Iterations 0
time step continuity errors : sum local = 1.67978e-12, global = -1.03209e-14, cumulative = 4.10525e-10
PBiCG:  Solving for omega, Initial residual = 6.96026e-06, Final residual = 3.12084e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.13946e-05, Final residual = 3.65695e-08, No Iterations 1
ExecutionTime = 23467.3 s  ClockTime = 24299 s

Courant Number mean: 0.00800324 max: 0.24995 velocity magnitude: 19.252
deltaT = 4.27362e-06
GGI pair (outsideSlider, insideSlider) : 0.00219457 0.00219446 Diff = -2.29347e-08 or 0.00104507 %
Time = 0.142103

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
  Largest slave weighting factor correction : 2.37478e-05 average: 2.23244e-05
  Largest master weighting factor correction: 4.44089e-16 average: 7.74234e-17

PBiCG:  Solving for Ux, Initial residual = 2.46498e-05, Final residual = 5.12892e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40071e-05, Final residual = 2.56052e-07, No Iterations 2
PCG:  Solving for p, Initial residual = 0.00231218, Final residual = 9.92148e-07, No Iterations 403
PCG:  Solving for p, Initial residual = 0.00017901, Final residual = 9.7537e-07, No Iterations 289
PCG:  Solving for p, Initial residual = 1.88655e-05, Final residual = 9.51366e-07, No Iterations 4
time step continuity errors : sum local = 2.3647e-12, global = -1.22702e-14, cumulative = 4.10513e-10
PCG:  Solving for p, Initial residual = 6.0832e-05, Final residual = 9.80863e-07, No Iterations 73
PCG:  Solving for p, Initial residual = 1.54561e-05, Final residual = 9.73776e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.42861e-06, Final residual = 7.12978e-07, No Iterations 1
time step continuity errors : sum local = 1.77216e-12, global = 4.73874e-14, cumulative = 4.1056e-10
PCG:  Solving for p, Initial residual = 5.49348e-06, Final residual = 8.22964e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.07637e-06, Final residual = 6.54928e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.71248e-07, Final residual = 6.71248e-07, No Iterations 0
time step continuity errors : sum local = 1.66844e-12, global = 7.73072e-14, cumulative = 4.10638e-10
PBiCG:  Solving for omega, Initial residual = 6.96094e-06, Final residual = 3.12768e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.14e-05, Final residual = 3.66464e-08, No Iterations 1
ExecutionTime = 23468.1 s  ClockTime = 24300 s

Courant Number mean: 0.00800474 max: 0.249937 velocity magnitude: 19.2358
deltaT = 4.2747e-06
GGI pair (outsideSlider, insideSlider) : 0.00219458 0.00219447 Diff = -2.74837e-08 or 0.00125234 %
Time = 0.142108

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
  Largest slave weighting factor correction : 2.46202e-05 average: 2.32176e-05
  Largest master weighting factor correction: 4.44089e-16 average: 7.18236e-17

PBiCG:  Solving for Ux, Initial residual = 2.46486e-05, Final residual = 5.13601e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40261e-05, Final residual = 3.55732e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 0.00227735, Final residual = 9.77136e-07, No Iterations 403
PCG:  Solving for p, Initial residual = 0.000179432, Final residual = 9.72128e-07, No Iterations 289
PCG:  Solving for p, Initial residual = 1.88636e-05, Final residual = 9.50111e-07, No Iterations 4
time step continuity errors : sum local = 2.36249e-12, global = -1.5894e-14, cumulative = 4.10622e-10
PCG:  Solving for p, Initial residual = 6.15472e-05, Final residual = 9.65375e-07, No Iterations 76
PCG:  Solving for p, Initial residual = 1.63223e-05, Final residual = 8.82104e-07, No Iterations 4
PCG:  Solving for p, Initial residual = 1.4109e-06, Final residual = 6.21973e-07, No Iterations 1
time step continuity errors : sum local = 1.54656e-12, global = 4.73359e-14, cumulative = 4.10669e-10
PCG:  Solving for p, Initial residual = 5.52386e-06, Final residual = 7.93676e-07, No Iterations 3
PCG:  Solving for p, Initial residual = 1.05287e-06, Final residual = 6.74179e-07, No Iterations 1
PCG:  Solving for p, Initial residual = 6.88394e-07, Final residual = 6.88394e-07, No Iterations 0
time step continuity errors : sum local = 1.71171e-12, global = 9.08264e-14, cumulative = 4.1076e-10
PBiCG:  Solving for omega, Initial residual = 6.96232e-06, Final residual = 3.13412e-08, No Iterations 1
PBiCG:  Solving for k, Initial residual = 1.14059e-05, Final residual = 3.67302e-08, No Iterations 1
ExecutionTime = 23469 s  ClockTime = 24301 s

Courant Number mean: 0.00800667 max: 0.249924 velocity magnitude: 19.2185
deltaT = 4.27601e-06
GGI pair (outsideSlider, insideSlider) : 0.00219458 0.00219448 Diff = -3.1391e-08 or 0.00143038 %
Time = 0.142112

Initializing the GGI interpolator between master/shadow patches: outsideSlider/insideSlider
Evaluation of GGI weighting factors:
  Largest slave weighting factor correction : 2.50498e-05 average: 2.36721e-05
  Largest master weighting factor correction: 4.44089e-16 average: 6.74412e-17

PBiCG:  Solving for Ux, Initial residual = 2.46482e-05, Final residual = 5.14485e-08, No Iterations 1
PBiCG:  Solving for Uy, Initial residual = 6.40494e-05, Final residual = 2.55792e+19, No Iterations 1000
PCG:  Solving for p, Initial residual = 1, Final residual = 9.67511e-07, No Iterations 803
PCG:  Solving for p, Initial residual = 0.127367, Final residual = 9.9397e-07, No Iterations 673
PCG:  Solving for p, Initial residual = 0.0152502, Final residual = 9.66232e-07, No Iterations 483
time step continuity errors : sum local = 1.42352e+11, global = -2.26409e+09, cumulative = -2.26409e+09
PCG:  Solving for p, Initial residual = 0.660487, Final residual = 9.85906e-07, No Iterations 793
PCG:  Solving for p, Initial residual = 0.499159, Final residual = 9.9406e-07, No Iterations 750
PCG:  Solving for p, Initial residual = 0.10088, Final residual = 9.8186e-07, No Iterations 535
time step continuity errors : sum local = 2.29023e+10, global = 2.6318e+09, cumulative = 3.67707e+08
PCG:  Solving for p, Initial residual = 0.81849, Final residual = 9.8874e-07, No Iterations 599
PCG:  Solving for p, Initial residual = 0.307484, Final residual = 9.67672e-07, No Iterations 724
PCG:  Solving for p, Initial residual = 0.0769423, Final residual = 9.21628e-07, No Iterations 654
time step continuity errors : sum local = 3.11283e+09, global = 2.92897e+07, cumulative = 3.96997e+08
PBiCG:  Solving for omega, Initial residual = 1, Final residual = 7.05779, No Iterations 1000
bounding omega, min: -1.66497e+29 max: 1.63748e+29 average: 8.64467e+24
PBiCG:  Solving for k, Initial residual = 0.866969, Final residual = 2.14241e-07, No Iterations 11
ExecutionTime = 23476.6 s  ClockTime = 24308 s
Attached Files: fvSchemes.txt (2.1 KB)

October 26, 2010, 11:07
#4: maddalena, Senior Member
Are you running on multiple processors?
Also, how does the vector field look at the last saved time step?
I have never run any simulation with your solver, but the cause of your problem may be similar to something I have already seen...

mad

October 26, 2010, 11:27
#5: Robert (lordvon), Senior Member
Yes, I am using metis decomposition for 4 processors on a single desktop.
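For completeness, the relevant part of my decomposeParDict is essentially this (only the method and the subdomain count matter here; the rest of the file is omitted):
Code:
numberOfSubdomains 4;

method          metis;

// an optional metisCoeffs sub-dictionary can weight the subdomains,
// but it is not needed for a plain decomposition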

As for the vector field at the iteration where it first displays bad numbers: the whole field is a single color at some absurdly high velocity (on the order of the speed of light). The pressure field is likewise a single color, with the scale running from infinity to negative infinity and both ends mapped to the same color.

At the save just before that (controlDict is set to save every 25 iterations), the flow field looks normal. I think the time step I posted above is the first broken one, because the velocity magnitude still looks normal before the Uy residual blows up.

October 26, 2010, 11:38
#6: maddalena, Senior Member
Could you run the very last iterations using GAMG on p and smoothSolver on U and the turbulence quantities? Start from the last saved time step and go on. Alternatively, run the very last iterations on a single processor. If the solver runs successfully through the time step where you had problems, then we may have found the solution...
I suggest this because it reminds me of a problem I had some time ago with an incompatibility between DILUPBiCG in parallel and turbulence quantities...

crossed fingers...

mad

October 26, 2010, 11:55
#7: Robert (lordvon), Senior Member
Could you give me some pointers on setting up GAMG? I see that there are many parameters to define, and I have not used it before. Also, how do I choose the value of nSweeps for the smoothSolver? (I have looked through the OpenFOAM 1.7 manual; I use 1.5-dev, though, for the GGI.)

Thanks for the suggestions; I will try out the single-processor run and let you know whether it works.

October 26, 2010, 12:06
#8: maddalena, Senior Member
Quote:
Originally Posted by lordvon
Could you give me some pointers on setting up GAMG?
Here it is:
Code:
    p
    {
        solver          GAMG;
        tolerance       1e-05;
        relTol          0;
        smoother        GaussSeidel;
        nPreSweeps      0;          // smoothing sweeps before coarsening
        nPostSweeps     2;          // smoothing sweeps after refinement
        cacheAgglomeration true;    // reuse the agglomeration between solves
        nCellsInCoarsestLevel 10;   // stop coarsening around this many cells
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }

    U
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        nSweeps         1;          // sweeps per residual check (1 is the usual default)
        tolerance       1e-03;
        relTol          0;
    }
Adjust relTol to your needs; these are not very strict tolerances...

mad

October 26, 2010, 12:18
#9: Robert (lordvon), Senior Member
How dependent is convergence on nSweeps and nCellsInCoarsestLevel?

My mesh (GGI, turbomachinery external flow) has about 70,000 cells; is nCellsInCoarsestLevel = 10 appropriate?

October 26, 2010, 12:24
#10: maddalena, Senior Member
Quote:
Originally Posted by lordvon
How dependent is convergence on nSweeps and nCellsInCoarsestLevel?
I really do not know; I have never performed any study on that. It is a commonly suggested setup...
Quote:
Originally Posted by lordvon
My mesh has about 70,000 cells; is nCellsInCoarsestLevel = 10 appropriate?
... well, my meshes are usually twice the size of yours, and I use the setup posted above...
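A rough sanity check on that choice: if one assumes the faceAreaPair agglomerator roughly halves the cell count at each level, coarsening 70,000 cells down to about 10 takes

$\log_2(70000/10) \approx 12.8$

levels, an unremarkable depth for GAMG, so nCellsInCoarsestLevel 10 should be fine for a mesh of this size as well.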

October 26, 2010, 13:28
#11: Heng Xiao (xiao), Member, Zurich, Switzerland
It seems that I have the same problem you describe: an "incompatibility between DILUPBiCG in parallel and turbulence quantities".

Switching from DILUPBiCG to GAMG helped. It is still running right now and has not crashed so far (it is already further along than it got using PBiCG).

I am running pisoFoam with RANS, with periodic and wall boundary conditions on my domain.

October 26, 2010, 14:13
#12: maddalena, Senior Member
Quote:
Originally Posted by xiao
Switching from DILUPBiCG to GAMG helped. It is still running right now and has not crashed so far (it is already further along than it got using PBiCG).
So, in my opinion, your problem is solved!
Let me know if you perform any study on the influence of the different GAMG parameters.

Enjoy

mad

October 27, 2010, 11:46
#13: Heng Xiao (xiao), Member, Zurich, Switzerland
Could anybody give some insight into that point, the "incompatibility between DILUPBiCG in parallel and turbulence quantities"?

What causes the incompatibility? I am not really satisfied with a solution that comes without an explanation.

Best,
Heng


Quote:
Originally Posted by maddalena
Could you run the very last iterations using GAMG on p and smoothSolver on U and the turbulence quantities? Start from the last saved time step and go on. Alternatively, run the very last iterations on a single processor. If the solver runs successfully through the time step where you had problems, then we may have found the solution...
I suggest this because it reminds me of a problem I had some time ago with an incompatibility between DILUPBiCG in parallel and turbulence quantities...

crossed fingers...

mad

October 27, 2010, 12:00
#14: Robert (lordvon), Senior Member
Thanks, maddalena! Your diagnosis seems to have been correct.

Since PISO is a segregated solver (right?) and U was always the first field to suddenly go wrong, I changed only U to smoothSolver and left everything else in the case the same. The simulation ran to completion (at least twice as far as the previous attempts).

I will change every variable besides p to smoothSolver as well and see if it makes any difference, as sketched below. I am hesitant to use GAMG because I do not know how to make even educated guesses for its parameters; I imagine the standard values are not tuned for external turbomachinery flow.
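The entries I mean would be along these lines in fvSolution (a sketch only; the tolerances are guesses, not validated values):
Code:
    k
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
    }

    omega
    {
        solver          smoothSolver;
        smoother        GaussSeidel;
        tolerance       1e-06;
        relTol          0;
    }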

But thanks again! I don't think I would ever have caught such an error on my own.

October 27, 2010, 12:13
#15: maddalena, Senior Member
Well, to tell the truth, I simply reported an explanation found over here: http://www.cfd-online.com/Forums/ope...tml#post246561.
But I am happy to be useful!

mad

October 19, 2015, 14:52
#16: Han Shao (showhand), New Member
Hi,

Do you have any new ideas about this problem?

Running with a smooth solver: OK. Running with PBiCG in serial: OK. Running with PBiCG in parallel: crash.
I have the same problem with the h equation in reactingFoam.
Changing to a smooth solver fixes it, but I am curious about the reason. Could you please share some ideas?
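To be concrete, the fix I mean is replacing the PBiCG entry for h in fvSolution with something like this (the tolerances are my own; only the solver swap matters):
Code:
    h
    {
        solver          smoothSolver;
        smoother        symGaussSeidel;
        tolerance       1e-08;
        relTol          0;
    }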

Quote:
Originally Posted by xiao
Could anybody give some insight into that point, the "incompatibility between DILUPBiCG in parallel and turbulence quantities"?

What causes the incompatibility? I am not really satisfied with a solution that comes without an explanation.

Best,
Heng
