December 4, 2012, 05:45 |
An alarming issue?
|
#1 |
Member
Diego Angeli
Join Date: Mar 2009
Posts: 31
Rep Power: 17 |
Dear FOAMers,
yesterday I experienced a very weird situation. I was in the middle of a lab session and I was teaching a custom tutorial to the students. The case is the famous "mixing elbow", in its isothermal feature, to be solved with simpleFoam. I set it up successfully this time last year, and I was feeling very comfortable in doing it all over again. You can download the zipped case from this link: http://www.mimesis.eu/pool/elbow.zip As you may see, the mesh is quite coarse but not horrible, I built it specifically to achieve a fast and safe execution in the lab sessions. The model, numerics and solver settings are entirely inherited from the pitzDaily case. Questionable but easy for a "first cup of openFoam" besides the usual cavity stuff. I imposed very standard BCs. I retained consciously the corrected laplacianSchemes, well aware that these may cause problems with unstructured grids. I wish to stress that I made such a choice not because I am stupid or masochist, but only to be able to finish the case from start to end, focusing on modifying the files in the 0 folder, in the time span of one session. I have 120 students, and 99% of them never saw any linux and/or text UI before. Next lesson is about fvSchemes and its use. And there it went: - last year it was OF-2.0.x, compiled DP with gcc 4.6 on an ubuntu 11.10 32bit virtual machine (I distributed it to students in order to let them work independently). Everything worked fine (and still does: I re-tried today), "convergence" (with coarse thresholds) is achieved within a hundred of iterations or so, and the result is meaningful. So far, so good. - this year, OF-2.1.x, compiled DP with gcc 4.4 on a debian etch machine (Xeon) and cloned onto all the PCs of the lab: the case stops claiming to be converged after 10 iterations, but it's completely screwed. Then it was utter "panic at the lab". Luckily, we changed the initial BC for epsilon on the fly (from 1e-4 to 1) and it worked. This morning I did some more tests. 
I will sum them up here:
- OF-2.0.x - October 2011 - OpenSUSE 11.4 64-bit - gcc 4.4 DP: FPE after 3 iterations
- OF-2.0.x - October 2012 - Ubuntu 12.04 64-bit - gcc 4.4 SP: FPE after 3 iterations
- OF-2.1.x - January 2012 - Ubuntu 11.10 64-bit - gcc 4.6 DP: FPE after 10 iterations; works after changing the IC for epsilon

In addition, some students with the 2.1.1 precompiled release for Ubuntu got it working with no changes! Hence, it seems that (roughly) the stability of the computation depends on 32- vs 64-bit, and the sensitivity to the initial condition for epsilon depends on the version. That said, I think I will revert to uncorrected laplacianSchemes for next year (which always fixes the issue, btw), in order to avoid such a mess. But, indeed, in my opinion this version-dependence is rather alarming. Please tell me if I got it wrong somewhere.

Thanks in advance,
Regards,
Diego
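For next year's setup, the switch to uncorrected schemes would look like this in system/fvSchemes. This is a sketch based on the pitzDaily-style dictionaries the case inherits; only the relevant entries are shown:

```
// system/fvSchemes (fragment)
laplacianSchemes
{
    default         Gauss linear uncorrected;   // was: Gauss linear corrected
}

snGradSchemes
{
    default         uncorrected;    // keep consistent with the laplacian scheme
}
```

Dropping the non-orthogonal correction trades some accuracy on skewed cells for robustness, which is likely why it fixes the coarse elbow mesh.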
|
December 4, 2012, 20:25 |
|
#2 |
Senior Member
ATM
Join Date: May 2009
Location: United States
Posts: 104
Rep Power: 16 |
This looks interesting. Let me see if my experience is of any use to you:
I have faced some strange issues like this in the past (but I did not delve much into it). The most recent thing I faced is that I am running an LES simulation in parallel on a supercomputer. My case runs normally when I use 12 processors in parallel. However, when I use 20 or 24 processors, it gives me a sigFpe in after a few time steps ( this value is same, for different fvScheme/fvSolution etc. settings I have tried). The supercomp I use utilizes Intel Xeon x5650 CPUs. I wonder whats going on here. the same case with the same mesh ( Mesh is of good quality) runs normally in serial or parallel with 12 procs. |
|
December 5, 2012, 05:21 |
|
#3 |
Member
Diego Angeli
Join Date: Mar 2009
Posts: 31
Rep Power: 17 |
Thank you atm for the feedback. Your experience looks even worse...
|
|
December 5, 2012, 10:19 |
|
#4 |
Senior Member
Olivier
Join Date: Jun 2009
Location: France, grenoble
Posts: 272
Rep Power: 17 |
Hello,

I'm not an expert, but I would just add my 2 cents:
1) I use 2.1.x and, before that, 2.0.x and 1.7.x, and I update with git approximately once per month. Usually you can see some changes, even in the k-epsilon model (yesterday I saw an update to kOmegaSST). I don't follow what was changed (sometimes it is just a typo fix), but you should investigate.
2) If I remember correctly, there was a change in the calculation of residuals in the 2.0 or 2.1 version; this may affect your settings, especially in a case with a bad mesh.
3) About atm's case, this is curious. "It works for me" is not enough: you should not leave this in a grey area, and should check why some decompositions don't work. Do you use GAMG for pressure? Try PCG, another decomposition method, ... and share your results.

Regards,
Olivier