CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   simpleFoam: divergence after resuming a converged job (https://www.cfd-online.com/Forums/openfoam-solving/165281-simplefoam-divergence-after-resuming-converged-job.html)

pbalz January 13, 2016 07:37

simpleFoam: divergence after resuming a converged job
 
Hi,

I'm simulating a rather complex geometry with four inlets and three outlets. The mesh consists of four separate regions coupled via ACMI interfaces. The reason for this is that snappy couldn't mesh the whole domain in one go (I tried a lot!), so I had to mesh separate regions and connect them. Anyway, as the geometry is complex the mesh is not the best, but it's OK and should run quite well.

For the steady solution I use simpleFoam. With low-velocity boundary conditions (< 10 m/s) it converges and yields physical results. With the high BCs (which I need) simpleFoam diverges, which seems OK... I didn't expect a converged solution straight away.
So my idea was to get an initial converged solution, then stepwise increase the boundary conditions, resume the calculation, and repeat up to the desired values until I get a fully converged solution.
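(For reference, one way to ramp an inlet velocity over iterations in OpenFOAM is a `uniformFixedValue` boundary condition with a table. A minimal sketch, assuming an inlet patch named `inlet` and a target of 50 m/s; the patch name, values, and iteration count are all hypothetical. In steady simpleFoam, "time" counts iterations, so the table ramps per iteration.)

```
// 0/U — ramp the inlet from 10 m/s to 50 m/s over the first 1000 iterations
inlet
{
    type            uniformFixedValue;
    uniformValue    table
    (
        (0    (10 0 0))
        (1000 (50 0 0))
    );
}
```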


Well, it turned out to be slightly harder than that...
The main problem is the following:
When I just resume the converged simulation without any changes, the simulation will blow up within the next few timesteps. Always.

I never got such weird behavior in other cases, so my initial guess is that something with the ACMI patches is not working as intended.

Is this a known bug or does anybody have an idea on how to solve this problem?
A workaround would also work as I really need to get this simulation running.



Unfortunately, I can't upload the case or give you detailed information because of a non-disclosure agreement.
But I can surely provide several logs if this is beneficial for solving my problem.

Any help would be greatly appreciated!
Thanks!

akidess January 14, 2016 02:28

Can you use mergeMeshes to produce a single region instead of ACMI?

pbalz January 14, 2016 05:18

1 Attachment(s)
I've tried that and unfortunately it's not possible. The main reason is that snappyHexMesh really isn't that accurate when it comes to capturing feature edges and surfaces, so the "interfaces" are not 100% coincident and have different resolutions.
It's just a pain to achieve perfectly matching interfaces with snappy for complex geometries (I would say it's impossible).

So that's the reason I'm using ACMIs: "normal" AMIs don't account for unconnected faces.
Btw, I've attached the checkMesh log so you get a rough idea of the mesh. [I had to rename the patches, sorry!]

arjun January 14, 2016 07:56

Quote:

Originally Posted by pbalz (Post 580780)
Hi,

When I just resume the converged simulation without any changes, the simulation will blow up within the next few timesteps. Always.

I think the issue is that, upon reading the converged solution, OpenFOAM cannot reconstruct the fluxes correctly. This can sometimes cause divergence.

To check this, create a simple case, converge it, and resume it.
If you see a jump in the continuity residual, then this is likely the main issue causing the problem.
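(A minimal restart setup for such a test, assuming the usual controlDict layout; the endTime value is only a placeholder and must exceed the time the run has already reached.)

```
// system/controlDict — resume from the last written time directory
startFrom       latestTime;
stopAt          endTime;
endTime         3000;
```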

clktp January 19, 2016 08:53

Actually this happened to my cases a couple of times. I was running some cases and they were killed because I ran out of disk space. I then resubmitted each case from the latest iteration I had, keeping the log file from the earlier run. I let it run for a while and then killed it because it diverged.

Then I compared the two log files and saw that, after resubmitting, the residuals started to increase, while in the older run they kept decreasing at the same iterations. Afterwards I checked whether I had done something wrong, and it happened again in exactly the same way: the residuals started to increase as soon as I resubmitted the case.

Anyway, I didn't think about it much; maybe I did something wrong without noticing, though I have no idea what that could be, since it's the same case, same folders, etc. And my case wasn't anything complex, just a simple flat surface with simpleFoam. Since that day, if something happens and my case is killed, I always restart it from zero, but I have literally no idea why this happens.

pbalz January 19, 2016 09:11

Hi,

Quote:

Originally Posted by arjun (Post 580926)
I think the issue is that upon reading the converged solution openfoam can not construct the fluxes correctly. This sometimes could cause divergence.

That sounds exactly like the problem I'm experiencing at the moment, and I had already thought of that. The main problem remains: how do I solve this issue?
It appears to happen with some cases but not all; e.g. interDyMFoam cases (I tested other solvers as well) work quite well.


Quote:

From that day if something happens and my case is killed, I always restart it from zero, but I have literally no idea.
Good to know that I'm not the only one with exactly this problem. In general, restarting a case from zero worked for me too, but now I'm trying to increase the boundary conditions step-wise (the same way as in Fluent). This should be possible in OpenFOAM too.
Surely this issue is well known to the developers, but maybe there is some reason why it's not that simple to fix... Does anyone have a clue?

clktp January 19, 2016 09:26

I don't think it's a general problem, because even some tutorials first get a solution from potentialFoam to help the case converge more easily. It's more or less the same thing. Actually, until I read this, I was quite sure I had done something stupid without noticing.

I know it sounds quite silly, but did you try renaming your latest-time folder to 0?

arjun January 19, 2016 12:23

The issue is that there are two components to the flux:
one due to velocity and density, and a second due to the Rhie-Chow correction.

The first part can be constructed from the saved data; the second part, however, requires you to know the Ap of the velocity matrix and the pressure gradient. This part is omitted in many solvers because, in theory, it goes to 0 in a converged case.

In practice it is not zero, just small.

My suggestion would be to first run a very few iterations with a very small pressure URF, say 0.01 or so.

(Don't touch the momentum URF; keep it as it is.)

If it is a transient case, increase the number of inner iterations and use a very small pressure URF.

This should give the solver time to recover and improve the fluxes.
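(In OpenFOAM the URFs live in system/fvSolution. A sketch of such a temporary setting; the momentum value shown is only an example of leaving it at whatever it already was.)

```
// system/fvSolution
relaxationFactors
{
    fields
    {
        p               0.01;   // temporarily very small, as suggested above
    }
    equations
    {
        U               0.7;    // leave the momentum URF unchanged
    }
}
```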


Quote:

Originally Posted by clktp (Post 581557)
I don't think it's a general problem, because even in some tutorials first you get a solution from potentialFoam to convergence your case easier. It's more or less the same thing. Actually until I read this, I was quite sure that I did something stupid and didn't notice.

Some cases recover, and a few cannot. I think it depends on how strongly pressure affects the solution.

pbalz January 22, 2016 06:14

Hi Arjun,

thanks for your reply! I will try as you suggested.

karamiag September 7, 2016 03:43

Rhie-Chow unreconstructable flux component
 
Hi Arjun,
could you explain in more detail what you mean when you say that there is a part of the flux
due to Rhie-Chow that is not reconstructable? Could you perhaps suggest some references?
Thank you!

Quote:

Originally Posted by arjun (Post 581592)
The issue is that there are two components to the flux:
one due to velocity and density, and a second due to the Rhie-Chow correction.

The first part can be constructed from the saved data; the second part, however, requires you to know the Ap of the velocity matrix and the pressure gradient. This part is omitted in many solvers because, in theory, it goes to 0 in a converged case.

In practice it is not zero, just small.

My suggestion would be to first run a very few iterations with a very small pressure URF, say 0.01 or so.

(Don't touch the momentum URF; keep it as it is.)

If it is a transient case, increase the number of inner iterations and use a very small pressure URF.

This should give the solver time to recover and improve the fluxes.




Some cases recover and very few can not. I think it depends on how strongly pressure affects the solution.


karamiag September 8, 2016 09:17

I also tried reducing the pressure URF by modifying the fvSolution file, but I did not notice any change.
So I would like to ask which file you would change, and how, to reduce the pressure URF.
Thank you.

Quote:

Originally Posted by arjun (Post 581592)
The issue is that there are two components to the flux.

My suggestion would be to first run a very few iterations with a very small pressure URF, say 0.01 or so.

(Don't touch the momentum URF; keep it as it is.)

If it is a transient case, increase the number of inner iterations and use a very small pressure URF.

This should give the solver time to recover and improve the fluxes.


arjun September 20, 2016 01:10

Quote:

Originally Posted by karamiag (Post 617124)
I also tried reducing the pressure URF by modifying the fvSolution file, but I did not notice any change.
So I would like to ask which file you would change, and how, to reduce the pressure URF.
Thank you.

Someone mentioned that the face fluxes are saved, so it seems there must be something else going on.

My advice was aimed at reconstructing the flux without disturbing the pressure. Someone from OpenFOAM should really look into it.

arjun September 20, 2016 01:15

Quote:

Originally Posted by karamiag (Post 616888)
Hi Arjun,
could you explain in more detail what you mean when you say that there is a part of the flux
due to Rhie-Chow that is not reconstructable? Could you perhaps suggest some references?
Thank you!

http://www.cfd-online.com/Wiki/Rhie-Chow_interpolation


To construct the flux you need the velocity, the pressure gradient, and the Ap variable (the diagonal of the velocity matrix).

Usually the Ap variable is not stored. So even if everything else is saved, one needs to reconstruct the fluxes after the files are read back.

But if the fluxes themselves are saved and restored, then there should be no problem.
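(A sketch of the Rhie-Chow face flux being discussed, following the notation of the linked wiki page; overbars denote linear interpolation of cell values to the face, and \(\mathbf{S}_f\) is the face area vector. The exact form varies between solvers.)

```latex
\phi_f = \overline{\mathbf{u}}_f \cdot \mathbf{S}_f
       - \overline{\left(\frac{V}{A_p}\right)}_f
         \left[\, (\nabla p)_f - \overline{\nabla p}_f \,\right] \cdot \mathbf{S}_f
```

The first term can be rebuilt from the stored velocity field; the second needs Ap, which is typically not written to disk, which is why the correction is lost on restart unless the fluxes themselves are saved.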

