CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Bugs (https://www.cfd-online.com/Forums/openfoam-bugs/)
-   -   OpenFOAM-2.1 instability problems? (https://www.cfd-online.com/Forums/openfoam-bugs/102222-openfoam-2-1-instability-problems.html)

aliqasemi May 25, 2012 08:35

OpenFOAM-2.1 instability problems?
 
I have had some problems using OpenFOAM-2.1, and I was wondering whether anybody else has had similar problems, or has worked on finding a solution.

I have noticed that when I run my two-phase simulations using OpenFOAM-2.1 they are more prone to instability, especially when run in parallel. When using OpenFOAM-1.6-ext, however, my simulations are more stable.

The instabilities occur some time after the start of the simulation and appear as non-physical velocities that slowly grow and finally destroy the simulation results. When I run my code in serial, using smaller time steps makes the simulations less likely to diverge. I haven't seen these instabilities when using OF-1.6-ext. So my first question is what makes OF-1.6-ext more stable. It is presumably some difference in the discretization algorithms, but it would be great to know exactly what it is.

There is another problem which is a bigger issue for me. It happens when using OF-2.1 in parallel, where I have seen instabilities mainly occurring at the processor boundaries. This time the problem could be due to a looser coupling in the linear equation solvers (I use GAMG most of the time), or maybe improper handling of the boundary conditions. I don't have this problem when using OF-1.6-ext with an identical algorithm, so it is not a problem with my code. My simulations are two-phase flow at low capillary numbers, and these instabilities happen when capillary forces are active at the processor boundaries.

At this time, I use OF-2.1 for pre- and post-processing, and OF-1.6-ext for running my simulations. So it is not that bad, but it would be better if I could figure out what the problem is.

Is there anybody else having similar problems?

olivierG May 25, 2012 08:55

hello,

If I remember correctly, the Co calculation changed with the 2.0/2.1 versions, so this may be a starting point to investigate: try a lower time step and compare with 1.6-ext.
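For reference, the relevant time-step controls live in system/controlDict. A sketch with assumed values (standard entry names from the stock solvers; check the tutorials for your version):

```cpp
// system/controlDict -- adjustable time-step controls (values are examples).
adjustTimeStep  yes;
maxCo           0.2;     // if the Co formula changed between versions,
                         // the same maxCo can produce a different deltaT
maxAlphaCo      0.2;     // interface Courant limit (interFoam-family solvers)
maxDeltaT       1e-4;    // hard cap, useful when comparing 1.6-ext and 2.1
```

With a hard maxDeltaT you can force both versions to march at the same step size, which separates the Co-calculation change from other differences.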

regards,
olivier

wyldckat May 25, 2012 17:42

Greetings to both of you!

I knew I had seen reports about this some time ago in the OpenFOAM.org bug tracker, and here they are:
Best regards,
Bruno

alberto May 26, 2012 16:07

Which solver(s) show this problem? interFoam?

aliqasemi May 26, 2012 17:38

Dear Alberto, Bruno and Olivier

Thanks for your help. The modification to the Courant number calculation could explain my problem, or at least part of it. I didn't know about it and was using the same maxCo for both OF-1.6 and OF-2.1. Nevertheless, I have put an additional constraint on the time-step size myself, so I need to check whether fixing this solves the problem.

The code is based on interFoam, but heavily modified.

I will test this out further and post back.

Thanks again,
Ali

aliqasemi September 24, 2012 14:23

update
 
Just to confirm that this problem is NOT due to the time-step size. I should mention that this happens for parallel cases; the difference is not that significant for single-CPU simulations. I suspect it is due to the linear equation solver (the computed pressure is different from the very beginning of the simulation), or I may have done something with the boundary conditions that is not consistent with the new version. Anyway, I'd rather stick with the old OF for now; I might spend more time on this if I get the time.
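One test I might try, to see whether the parallel divergence follows the solver, is swapping GAMG for a Krylov solver on the pressure equation in system/fvSolution. A sketch with assumed tolerances (standard entries, not my exact setup):

```cpp
// system/fvSolution -- hypothetical test: replace GAMG with PCG for p
// to check whether the parallel instability is tied to the multigrid solver.
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-08;   // tighter than a typical GAMG setup
        relTol          0;       // converge fully in every iteration
    }
}
```

If the serial and parallel pressure fields then agree from the first time step, the GAMG settings (agglomeration, smoother) would be the place to look.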

mara61 May 17, 2016 11:46

Dear all,

I am experiencing a similar problem with my solver (laminar melting with buoyancy forces): it works in serial and crashes at some point in parallel. I tried modifying the linear solvers, but without much success. Have you solved the problem? Do you have any suggestions?

Thanks

yeharav January 16, 2017 05:51

Dear all,

I have exactly the same problem. Using fixed time steps in a Boussinesq solver,
the solution explodes when run in parallel but not on a single processor.

yeharav January 16, 2017 06:25

More details
 
Just to give more details:

I use my own solver based on the buoyant Boussinesq solver.
The simulation runs for a while with an adaptable time step.
The average time step is approximately 0.23. At t ~ 50 the parallel solver breaks.
I reconstruct and run on a single processor and it passes. Then I decompose and run in parallel,
and it runs for a while before it breaks again.

When the simulation breaks, I have also tried using small fixed time steps (1e-3), but to no avail.


I use a hierarchical decomposition into 32 processors.
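For reference, a hierarchical decomposition into 32 processors is set up in system/decomposeParDict roughly like this (the 8x2x2 split below is an assumed example, not my exact setup):

```cpp
// system/decomposeParDict -- hierarchical decomposition into 32 subdomains.
numberOfSubdomains  32;
method              hierarchical;

hierarchicalCoeffs
{
    n       (8 2 2);   // splits per direction; must multiply to 32
    delta   0.001;
    order   xyz;       // direction in which to split first
}
```

Rerunning with a different method (e.g. scotch) can help rule out a decomposition-pattern dependence in the crash.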

arzi February 15, 2017 07:39

Pressure Diverges
 
Hello dear users,

I've got stuck on a problem where the pressure solver diverges with the following error. This is a low-Reynolds-number NACA airfoil simulation using pimpleFoam. I would be grateful if anybody could help me with this problem.
A time step of 1e-8 results in the same problem.

Bests,


Error:
------------------------------------------------
Courant Number mean: 5.23271e+031 max: 6.07791e+036
Time = 8e-008

PIMPLE: iteration 1
smoothSolver: Solving for Ux, Initial residual = nan, Final residual = nan, No Iterations 1000
smoothSolver: Solving for Uy, Initial residual = nan, Final residual = nan, No Iterations 1000
GAMG: Solving for p, Initial residual = nan, Final residual = nan, No Iterations 1000
GAMG: Solving for p, Initial residual = nan, Final residual = nan, No Iterations 1000


--> FOAM FATAL IO ERROR:
wrong token type - expected Scalar, found on line 0 the word 'nan'

file: C:/OpenFOAM/16.10/ACER-3.0.x/run/NACA2415-fine-grid/system/data.solverPerformance.p at line 0.

From function operator>>(Istream&, Scalar&)
in file lnInclude/Scalar.C at line 93.

FOAM exiting




Mesh:
-------------------------------------------------------------------
Mesh stats
points: 69172
internal points: 0
faces: 136901
internal faces: 67729
cells: 34105
faces per cell: 6
boundary patches: 4
point zones: 0
face zones: 0
cell zones: 0

Overall number of cells of each type:
hexahedra: 34105
prisms: 0
wedges: 0
pyramids: 0
tet wedges: 0
tetrahedra: 0
polyhedra: 0

Checking topology...
Boundary definition OK.
Cell to face addressing OK.
Point usage OK.
Upper triangular ordering OK.
Face vertices OK.
Number of regions: 1 (OK).

Checking patch topology for multiply connected surfaces...
Patch Faces Points Surface topology
VELOCITY_INLET 433 868 ok (non-closed singly connected)
PRESSURE_FARFIELD 195 392 ok (non-closed singly connected)
AIRFOIL_WALL 334 668 ok (non-closed singly connected)
frontAndBackPlanes 68210 69172 ok (non-closed singly connected)

Checking geometry...
Overall domain bounding box (-1.39977 -1.5 -0.046096) (2.1 1.5 0.046096)
Mesh has 2 geometric (non-empty/wedge) directions (1 1 0)
Mesh has 2 solution (non-empty) directions (1 1 0)
All edges aligned with or perpendicular to non-empty directions.
Boundary openness (-3.35274e-017 2.62186e-018 -1.42503e-019) OK.
Max cell openness = 1.59929e-013 OK.
Max aspect ratio = 218.673 OK.
Minimum face area = 5.05592e-011. Maximum face area = 0.0183235. Face area magnitudes OK.
Min volume = 4.66115e-012. Max volume = 0.0014232. Total volume = 0.878736. Cell volumes OK.
Mesh non-orthogonality Max: 89.8884 average: 11.482
*Number of severely non-orthogonal (> 70 degrees) faces: 423.
Non-orthogonality check OK.
<<Writing 423 non-orthogonal faces to set nonOrthoFaces
Face pyramids OK.
Max skewness = 0.637561 OK.
Coupled point location match (average 0) OK.

Mesh OK.

End
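Given the max non-orthogonality of ~89.9 degrees and the 423 severely non-orthogonal faces reported above, I am also considering adding non-orthogonal correction in the schemes. A sketch of the entries (standard scheme names; the 0.5 limiter coefficient is an assumed value, and newer versions spell it "limited corrected 0.5"):

```cpp
// system/fvSchemes -- limited schemes to stabilize a strongly
// non-orthogonal mesh (values are examples, not from this case).
laplacianSchemes
{
    default         Gauss linear limited 0.5;
}

snGradSchemes
{
    default         limited 0.5;
}
```

Setting nNonOrthogonalCorrectors to 2 or 3 in the PIMPLE dictionary of system/fvSolution is the usual companion change, at the cost of extra pressure solves per step.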

