interFoam blows up on parallel run
Hi,
I've got an interFoam case that runs well serially but blows up almost immediately when run in parallel (the p_rgh residual becomes about 1e+47). checkMesh reports: Code:
/*---------------------------------------------------------------------------*\
Enclosed you can find all my setup and boundary conditions. I tried to look in the forum, but I wasn't able to find a clear reason why a case that runs serially should not work in parallel. BTW, the p_rgh residual becomes huge in the very last GAMG calculation, just before the time-step continuity errors are printed, no matter how many correctors I include. Why? Thanks for any help.
Just for reference, I include the error interFoam returns:
Code:
Do you think it could be a meshing problem? Thanks for any help.
Hi. I have a similar issue and am exploring it at the moment. Current ideas include....
The pressure solver parameter `nCellsInCoarsestLevel' may no longer be appropriate with the reduced cell count in each decomposed domain. Smoothness across the entire domain is reduced during the solution because each subdomain is considered in isolation from the others, apart from the information exchanged at the processor interfaces (ok, that sounds dodgy but it's the best I can do from memory).
I'm paraphrasing, but I thought I'd try to contribute as I'm feeling depressed at the lack of feedback I'm getting on questions I've posted elsewhere. I'll see if I can find the articles I read and post them here... Good luck, I'll watch this thread....
Best regards, Mark.
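P.S. To be concrete about the parameter I mean, a typical GAMG entry for p_rgh looks roughly like the sketch below (illustrative values only, not taken from Daniele's case); nCellsInCoarsestLevel is the one I'd experiment with once the per-processor cell count drops:
Code:
p_rgh
{
    solver                 GAMG;
    smoother               GaussSeidel;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  10;    // may need revisiting when each subdomain is small
    mergeLevels            1;
    cacheAgglomeration     on;
    nPreSweeps             0;
    nPostSweeps            2;
    tolerance              1e-07;
    relTol                 0.05;
}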
While I didn't do a great job of an apples-to-apples comparison, I've seen a few parallel interFoam cases that blow up in 1.6-ext but run fine in 2.1.x. It is worth noting that the compilation was done a little differently between the two foam versions: 1.6-ext was built with MVAPICH2 and 2.1.x with OpenMPI. Could that be an issue? I'm not too familiar with the subtleties of MPI implementations.
I was going to try to figure out what the issue is, but I haven't had time to dig into it.
Daniele, this simulation is broken from the start - the continuity is way off before the first pressure correction! To debug, I'd run just one time step, and then check the results to see what happens at the boundaries.
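For example, a controlDict along these lines runs exactly one short fixed step and writes it, so you can inspect the fields at the boundaries (the numbers are only a sketch, adapt them to your case):
Code:
application     interFoam;
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         1e-04;          // equal to deltaT, so only one step is taken
deltaT          1e-04;
adjustTimeStep  no;             // fixed dt for the debug run
writeControl    timeStep;
writeInterval   1;              // write every step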
Hi,
your equation for the turbulent kinetic energy is not converging at all during the iterations. The code explodes when the equation for omega diverges. If your problem is significantly turbulent, you might want to solve k and omega at each time-step by setting "turbOnFinalIterOnly no;" in the PIMPLE subdictionary, and possibly use PIMPLE instead of PISO, with under-relaxation (see below). Also, more generally:
Best,
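P.S. For the under-relaxation at intermediate iterations, a minimal sketch of the PIMPLE and relaxationFactors entries (illustrative values, 2.x dictionary syntax, adapt to your case):
Code:
PIMPLE
{
    momentumPredictor        yes;
    nOuterCorrectors         3;     // > 1 gives true PIMPLE (outer) iterations
    nCorrectors              2;
    nNonOrthogonalCorrectors 1;
    turbOnFinalIterOnly      no;    // solve k/omega in every outer iteration
}

relaxationFactors
{
    equations
    {
        "(U|k|omega)"        0.7;   // relax intermediate iterations
        "(U|k|omega)Final"   1;     // no relaxation on the final iteration
    }
}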
hello,
Just to add my 2 cents... try also setting maxCo to a lower value; 0.5 may be too high, try 0.2. Regards, olivier
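P.S. In controlDict that would look something like this (illustrative values; maxAlphaCo only if your interFoam version reads it):
Code:
adjustTimeStep  yes;
maxCo           0.2;    // lower Courant number, as suggested above
maxAlphaCo      0.2;    // interface Courant number, if available in your version
maxDeltaT       1;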
Thanks to all for so many suggestions. I'm going to try all of them.
Any idea why the sim runs serially without errors?
@Alberto: I'll try to reduce the case size (110 MB) so that I can send it to you.
In the meantime I tried using relTol = 0, but the solver crashed at the first step: Code:
Using the PIMPLE subdict: Code:
PIMPLE
Note: I haven't tried yet to reduce the maximum Courant number. I also checked the decomposed BCs and they are consistent with the full ones. I'm not an expert at all, but the main point, to me, is that the serial case works and the parallel one doesn't. Couldn't it somehow be linked to the processor BCs, as anothr_acc suggested? In the meanwhile I'm running the serial case with the new solution setup, just to see whether and when it crashes.
Just a small update.
It seems to be a meshing problem. I'm running the same geometry meshed without the layer refinement, and it runs in parallel with the original setup (all but the tolerances, which have been corrected as Alberto suggested). I still have some questions:
- Why didn't checkMesh return any errors (even when applied to each processorX case)? Or, better, are you aware of any case where checkMesh says OK when the mesh is actually not?
- Is it common for a critical mesh to become even more critical once decomposed? Shouldn't decomposition be a... transparent operation?
Just a thought... browsing the forum I've got the impression that most of the problems are due to meshing. Rarely does fine-tuning the solution dict move a case from crash to good. From that point of view, a good mesh is very resistant to a bad setup... at least from the solver stability point of view. The checklist of every beginner (like me) should be:
- Mesh
- Schemes
- Solution
Any comments? Anyway, I'll post any news on my current case.
P.S. What decomposition are you using? Scotch?
Best,
Thanks Alberto.
Yes, I'm using scotch. I'm a newbie and I'm working on a not-so-simple geometry; I don't think I'd be able to find a better decomposition than its own :o. Now I'm working on the geometry, removing the smaller (and hopefully negligible) features. My goal is to obtain a good-looking layer distribution on the mesh. At the moment I've got problems at sharp corners; I'll check on the SHM forum.
Quick update:
Using DICGaussSeidel and DILUGaussSeidel as smoothers for the GAMG and smoothSolver solvers seems to solve the problem (just a few time steps so far). I'm just following Alberto's indirect help :) (http://www.cfd-online.com/Forums/ope...tml#post227426). Thanks.
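A sketch of the kind of p_rgh entry I mean (illustrative tolerances; the smoother keyword is the relevant change):
Code:
p_rgh
{
    solver      GAMG;
    smoother    DICGaussSeidel;   // symmetric-matrix smoother for the pressure equation
    tolerance   1e-08;
    relTol      0;
    // other GAMG entries (agglomerator, nCellsInCoarsestLevel, ...) left as before
}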
Wow! These changes worked for me, too! I'm using simpleFoam with a Reynolds stress turbulence model and, like the case above, the solver seemed to run OK as a serial job but in parallel blew up either after a long time for no clear reason, or on the first step.
Changing the smoother used with GAMG fixed the problem immediately. I've also changed my relTols to zero (this is not what caused the fix, but I'm going with it anyway) and switched to smoothSolver for the velocity, epsilon and R equations with the smoother mentioned above. Thanks so much for that tip. My jobs appear to be running now! Best regards, Mark.
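P.S. For reference, the kind of entry I mean for the velocity/turbulence equations (a sketch with illustrative tolerances, not my exact file):
Code:
"(U|epsilon|R)"
{
    solver      smoothSolver;
    smoother    DILUGaussSeidel;   // asymmetric-matrix smoother
    nSweeps     1;
    tolerance   1e-07;
    relTol      0;
}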
Objectively, my pressure residuals dropped two orders of magnitude the moment I made this change. Happy days!
Dear Danvica,
Could you post your fvSolution file, please? I'm a bit confused whether you used GAMG as a preconditioner or as a solver. Thank you!
Hi styleworker,
I'm sorry, but I cannot recover the exact fvSolution I used. Most probably it was: Code:
/*--------------------------------*- C++ -*----------------------------------*\
Daniele