www.cfd-online.com > Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

Interfoam blows on parallel run


May 1, 2012, 05:15, post #1
Interfoam blows on parallel run
Daniele Vicario (danvica), Senior Member, Novara, Italy
Hi,
I've got an interFoam case that runs well serially but blows up almost immediately when run in parallel (the p_rgh residual becomes about 1e+47).

checkMesh reports:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1                                   |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
/*   Windows 32 and 64 bit porting by blueCAPE: http://www.bluecape.com.pt   *\
|  Based on Windows porting (2.0.x v4) by Symscape: http://www.symscape.com   |
\*---------------------------------------------------------------------------*/
Build  : 2.1-c62f134541ee
Exec   : checkmesh
Date   : May 01 2012
Time   : 11:03:55
Host   : "UFFTECNICO7"
PID    : 1220
Case   : M:/f900layer
nProcs : 1
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create polyMesh for time = 0
Time = 0
Mesh stats
    points:           1540991
    faces:            4219890
    internal faces:   3997533
    cells:            1356793
    boundary patches: 5
    point zones:      0
    face zones:       0
    cell zones:       0
Overall number of cells of each type:
    hexahedra:     1166345
    prisms:        50764
    wedges:        0
    pyramids:      0
    tet wedges:    1837
    tetrahedra:    31
    polyhedra:     137816
Checking topology...
    Boundary definition OK.
    Cell to face addressing OK.
    Point usage OK.
    Upper triangular ordering OK.
    Face vertices OK.
    Number of regions: 1 (OK).
Checking patch topology for multiply connected surfaces ...
    Patch               Faces    Points   Surface topology                  
    defaultFaces        0        0        ok (empty)                        
    walls               218118   251141   ok (non-closed singly connected)  
    outlet              2833     3848     ok (non-closed singly connected)  
    inletc              703      816      ok (non-closed singly connected)  
    inleth              703      816      ok (non-closed singly connected)  
Checking geometry...
    Overall domain bounding box (-0.0349486 -0.0143717 -0.27418) (0.215826 0.0145106 0.00527969)
    Mesh (non-empty, non-wedge) directions (1 1 1)
    Mesh (non-empty) directions (1 1 1)
    Boundary openness (1.383e-016 -8.59187e-016 -1.90504e-016) OK.
    Max cell openness = 7.62663e-016 OK.
    Max aspect ratio = 30.8252 OK.
    Minumum face area = 2.14489e-009. Maximum face area = 3.0547e-006.  Face area magnitudes OK.
    Min volume = 4.266e-013. Max volume = 3.09268e-009.  Total volume = 0.000168861.  Cell volumes OK.
    Mesh non-orthogonality Max: 64.9781 average: 9.73097
    Non-orthogonality check OK.
    Face pyramids OK.
    Max skewness = 2.08043 OK.
    Coupled point location match (average 0) OK.
Mesh OK.
End
The attached picture shows the mesh.

Attached you can find my full setup and boundary conditions.

I searched the forum, but I wasn't able to find a clear reason why a case that runs serially would fail in parallel.

BTW, the p_rgh residuals become huge in the very last GAMG solve, just before the time step continuity errors are printed, no matter how many correctors I include. Why?

Thanks for any help.
Attached Images
File Type: jpg scr1.jpg (13.3 KB, 257 views)
File Type: jpg scr3.jpg (72.5 KB, 303 views)
Attached Files
File Type: zip setup.zip (6.8 KB, 86 views)
File Type: zip bc.zip (3.0 KB, 51 views)
__________________
Daniele Vicario

blueCFD2.1 - Windows 7

May 1, 2012, 11:11, post #2
Daniele Vicario (danvica)
Just for reference, here is the error interFoam returns:

Code:
 
M:\f900layer>mpiexec -n 4    interfoam -parallel         
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1                                   |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
/*   Windows 32 and 64 bit porting by blueCAPE: http://www.bluecape.com.pt   *\
|  Based on Windows porting (2.0.x v4) by Symscape: http://www.symscape.com   |
\*---------------------------------------------------------------------------*/
Build  : 2.1-c62f134541ee
Exec   : interfoam -parallel
Date   : May 01 2012
Time   : 17:00:37
Host   : "UFFTECNICO7"
PID    : 9224
Case   : M:/f900layer
nProcs : 4
Slaves : 
3
(
"UFFTECNICO7.9860"
"UFFTECNICO7.8832"
"UFFTECNICO7.9948"
)
Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
 
PIMPLE: Operating solver in PISO mode
Reading field p_rgh
Reading field alpha1
Reading field U
Reading/calculating face flux field phi
Reading transportProperties
Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type RASModel
Selecting RAS turbulence model kOmegaSST
kOmegaSSTCoeffs
{
    alphaK1         0.85034;
    alphaK2         1;
    alphaOmega1     0.5;
    alphaOmega2     0.85616;
    gamma1          0.5532;
    gamma2          0.4403;
    beta1           0.075;
    beta2           0.0828;
    betaStar        0.09;
    a1              0.31;
    c1              10;
}
 
Reading g
Calculating field g.h
time step continuity errors : sum local = 0.00111869, global = -0.00111869, cumulative = -0.00111869
GAMG:  Solving for pcorr, Initial residual = 1, Final residual = 0.0939645, No Iterations 12
GAMG:  Solving for pcorr, Initial residual = 0.016886, Final residual = 0.000640254, No Iterations 2
GAMG:  Solving for pcorr, Initial residual = 0.00475767, Final residual = 0.000198716, No Iterations 2
GAMG:  Solving for pcorr, Initial residual = 0.00223692, Final residual = 8.4695e-005, No Iterations 2
GAMG:  Solving for pcorr, Initial residual = 0.00138859, Final residual = 4.343e-005, No Iterations 2
time step continuity errors : sum local = 0.00013811, global = -6.02976e-008, cumulative = -0.00111875
Courant Number mean: 0.960669 max: 211.701
Starting time loop
Courant Number mean: 0.00226573 max: 0.499295
Interface Courant Number mean: 0 max: 0
deltaT = 2.35849e-006
Time = 2.35849e-006
MULES: Solving for alpha1
Phase-1 volume fraction = 6.59604e-007  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 1.31921e-006  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 1.97881e-006  Min(alpha1) = -3.65119e-025  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.63842e-006  Min(alpha1) = -2.15788e-023  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 1, Final residual = 0.000322717, No Iterations 2
smoothSolver:  Solving for Uy, Initial residual = 1, Final residual = 0.000337929, No Iterations 2
smoothSolver:  Solving for Uz, Initial residual = 1, Final residual = 0.0040445, No Iterations 1
GAMG:  Solving for p_rgh, Initial residual = 1, Final residual = 0.068123, No Iterations 12
GAMG:  Solving for p_rgh, Initial residual = 0.0173844, Final residual = 0.000657355, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00486757, Final residual = 0.000201661, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00227043, Final residual = 8.49585e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00140338, Final residual = 4.25119e-005, No Iterations 2
time step continuity errors : sum local = 3.06836e-007, global = -1.29678e-010, cumulative = -0.00111875
GAMG:  Solving for p_rgh, Initial residual = 0.00129786, Final residual = 0.000116266, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.00122335, Final residual = 5.35063e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000722669, Final residual = 2.37654e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000567776, Final residual = 1.5856e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000467471, Final residual = 1.23498e-005, No Iterations 2
time step continuity errors : sum local = 8.71684e-008, global = 1.61757e-011, cumulative = -0.00111875
GAMG:  Solving for p_rgh, Initial residual = 0.000415674, Final residual = 1.13259e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000353334, Final residual = 9.06858e-006, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000310302, Final residual = 7.84026e-006, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000275727, Final residual = 6.77365e-006, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000249738, Final residual = 7.4289e-008, No Iterations 9
time step continuity errors : sum local = 5.24309e-010, global = -7.94578e-013, cumulative = -0.00111875
smoothSolver:  Solving for omega, Initial residual = 0.000751123, Final residual = 1.55042e-007, No Iterations 3
smoothSolver:  Solving for k, Initial residual = 1, Final residual = 6.12194e-005, No Iterations 3
ExecutionTime = 83.218 s  ClockTime = 83 s
Courant Number mean: 0.00226108 max: 0.473956
Interface Courant Number mean: 2.85931e-006 max: 0.177277
deltaT = 2.48788e-006
Time = 4.84637e-006
MULES: Solving for alpha1
Phase-1 volume fraction = 3.33273e-006  Min(alpha1) = -8.14863e-023  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 4.02704e-006  Min(alpha1) = -1.06512e-022  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 4.72136e-006  Min(alpha1) = -1.82129e-022  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 5.41567e-006  Min(alpha1) = -3.46934e-022  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.347181, Final residual = 0.00123868, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 0.288893, Final residual = 0.0020266, No Iterations 1
smoothSolver:  Solving for Uz, Initial residual = 0.383562, Final residual = 0.00219642, No Iterations 1
GAMG:  Solving for p_rgh, Initial residual = 0.0015376, Final residual = 0.000104287, No Iterations 9
GAMG:  Solving for p_rgh, Initial residual = 0.419083, Final residual = 0.0417686, No Iterations 1
GAMG:  Solving for p_rgh, Initial residual = 0.312039, Final residual = 0.0176243, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.0725684, Final residual = 0.00627527, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.0151413, Final residual = 0.001234, No Iterations 3
time step continuity errors : sum local = 9.63734e-008, global = 6.82369e-011, cumulative = -0.00111875
GAMG:  Solving for p_rgh, Initial residual = 0.0484515, Final residual = 0.00417269, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.0624542, Final residual = 0.00412223, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.0105898, Final residual = 0.000706991, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.00328531, Final residual = 0.000314927, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00140359, Final residual = 0.000102603, No Iterations 3
time step continuity errors : sum local = 3.09792e-009, global = 2.40103e-012, cumulative = -0.00111875
GAMG:  Solving for p_rgh, Initial residual = 0.00327651, Final residual = 0.00030206, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00127124, Final residual = 8.88962e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000592684, Final residual = 3.52354e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000374184, Final residual = 2.36845e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.000279674, Final residual = 8.80247e-008, No Iterations 18
time step continuity errors : sum local = 2.69836e-012, global = 3.12549e-015, cumulative = -0.00111875
smoothSolver:  Solving for omega, Initial residual = 0.000222471, Final residual = 6.54908e-008, No Iterations 2
smoothSolver:  Solving for k, Initial residual = 0.170403, Final residual = 4.19455e-005, No Iterations 2
ExecutionTime = 133.469 s  ClockTime = 134 s
Courant Number mean: 0.00238928 max: 0.494389
Interface Courant Number mean: 3.02348e-006 max: 0.198727
deltaT = 2.51301e-006
Time = 7.35939e-006
MULES: Solving for alpha1
Phase-1 volume fraction = 6.11773e-006  Min(alpha1) = -2.92291e-022  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 6.8198e-006  Min(alpha1) = -5.42996e-022  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 7.52186e-006  Min(alpha1) = -4.32711e-022  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 8.22393e-006  Min(alpha1) = -5.48215e-022  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.00159001, Final residual = 8.08127e-006, No Iterations 1
smoothSolver:  Solving for Uy, Initial residual = 0.00518775, Final residual = 2.83385e-005, No Iterations 1
smoothSolver:  Solving for Uz, Initial residual = 0.00147006, Final residual = 9.37366e-006, No Iterations 1
GAMG:  Solving for p_rgh, Initial residual = 0.209561, Final residual = 0.00619473, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.0441934, Final residual = 0.00174281, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00981531, Final residual = 0.000424857, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00334192, Final residual = 0.000247085, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00158343, Final residual = 0.000131685, No Iterations 2
time step continuity errors : sum local = 1.96911e-009, global = 5.51779e-013, cumulative = -0.00111875
GAMG:  Solving for p_rgh, Initial residual = 0.00272109, Final residual = 8.01031e-005, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00113411, Final residual = 4.98538e-005, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.000662833, Final residual = 4.4928e-005, No Iterations 3
GAMG:  Solving for p_rgh, Initial residual = 0.000521569, Final residual = 5.21471e-005, No Iterations 4
GAMG:  Solving for p_rgh, Initial residual = 0.000461228, Final residual = 1.31055e+015, No Iterations 200
time step continuity errors : sum local = 1.96724e+010, global = -2.87158e+006, cumulative = -2.87158e+006
GAMG:  Solving for p_rgh, Initial residual = 0.999957, Final residual = 0.0290708, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.0871181, Final residual = 0.00777385, No Iterations 1
GAMG:  Solving for p_rgh, Initial residual = 0.0194607, Final residual = 0.000887361, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.0074301, Final residual = 0.000270396, No Iterations 2
GAMG:  Solving for p_rgh, Initial residual = 0.00310641, Final residual = 9.25115e-008, No Iterations 27
time step continuity errors : sum local = 1.23246e+007, global = 72576.5, cumulative = -2.799e+006
smoothSolver:  Solving for omega, Initial residual = 1, Final residual = 0.000966837, No Iterations 4
smoothSolver:  Solving for k, Initial residual = 0.994149, Final residual = 0.000994127, No Iterations 176
ExecutionTime = 271.695 s  ClockTime = 272 s
Courant Number mean: 5.34538e+013 max: 3.90337e+019
Interface Courant Number mean: 430.395 max: 2.22887e+008
deltaT = 3.21903e-026
--> FOAM Warning : 
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 6 to 7 to distinguish between timeNames at time 7.35939e-006
Time = 7.359389e-006
--> FOAM Warning : 
    From function Time::operator++()
    in file db/Time/Time.C at line 1010
    Increased the timePrecision from 7 to 8 to distinguish between timeNames at time 7.35939e-006
job aborted:
[ranks] message
[0] process exited without calling finalize
[1-3] terminated
---- error analysis -----
[0] on Ufftecnico7
interfoam ended prematurely and may have crashed. exit code 3
---- error analysis -----
The serial execution is ok.

Do you think it could be a meshing problem?

Thanks for any help.

May 1, 2012, 11:28, post #3
ms (anothr_acc), Member, West London
Hi. I have a similar issue and am exploring it at the moment. Current ideas include:

The pressure solver parameter `nCellsInCoarsestLevel` may no longer be appropriate given the reduced cell count in each subdomain.
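
For context, `nCellsInCoarsestLevel` sits in the GAMG entry of system/fvSolution. A minimal sketch with illustrative values (not taken from the case files in this thread):

```
p_rgh
{
    solver                GAMG;
    smoother              GaussSeidel;
    tolerance             1e-08;
    relTol                0.05;
    // GAMG stops coarsening when a level holds roughly this many cells.
    // After domain decomposition each processor mesh is smaller, so a
    // value tuned for the full mesh can leave too few coarse levels.
    nCellsInCoarsestLevel 100;
}
```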

Smoothness across the entire domain is reduced during the solution, as each subdomain is considered in isolation from the others apart from the information exchanged at the interfaces (OK, that sounds dodgy, but it's the best I can do from memory).

I'm paraphrasing, but I thought I'd try to contribute, as I'm feeling depressed at the lack of feedback on questions I've posted elsewhere. I'll see if I can find the articles I read and post them here... Good luck, I'll watch this thread.

Best regards,

Mark.

May 1, 2012, 11:40, post #4
Kyle Mooney (kmooney), Senior Member, San Francisco, CA USA
While I didn't do a great job of an apples-to-apples comparison, I've seen a few parallel interFoam cases that blow up in 1.6-ext but work fine in 2.1.x. It is worth noting that the compilation was done a little differently between the two foam versions: 1.6-ext used MVAPICH2 and 2.1.x used OpenMPI. Could that be an issue? I'm not too familiar with the subtleties of MPI implementations.

I was going to try to figure out what the issue is, but I haven't had time to dig into it.

May 2, 2012, 03:37, post #5
Anton Kidess (akidess), Senior Member, Germany
Daniele, this simulation is broken from the start: the continuity error is way off before the first pressure correction! To debug, I'd run just one time step and then check the results to see what happens at the boundaries.
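
One way to do that, sketched as a system/controlDict fragment with illustrative values (the endTime here is an assumption, chosen to stop after roughly one step at this case's initial deltaT):

```
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         2.5e-06;   // about one step at the initial deltaT
deltaT          1e-06;
adjustTimeStep  yes;
writeControl    timeStep;
writeInterval   1;         // write every step so boundary fields can be inspected
```
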
__________________
*On twitter @akidTwit
*Spend as much time formulating your questions as you expect people to spend on their answer.

May 2, 2012, 03:56, post #6
Alberto Passalacqua (alberto), Senior Member, Ames, Iowa, United States
Hi,

your equation for the turbulent kinetic energy is not converging at all during the iterations, and the code explodes when the equation for omega diverges. If your problem is significantly turbulent, you might want to solve k and omega at each time step by setting "turbOnFinalIterOnly no;" in the PIMPLE subdictionary, and possibly use PIMPLE instead of PISO, with under-relaxation (see below).

Also, more generally:
  1. Since your mesh does not have particular non-orthogonality problems, adding four non-orthogonal correction steps is excessive. They are simply trying to compensate for a problem that is most likely elsewhere (typically, check your boundary conditions).
  2. The relTol value for pcorr should be zero. Note that the linear solver quits too early on the first iteration over pcorr: you require a tolerance of 1.0e-8, but it quits at 1.0e-4 due to your relTol.
  3. All the linear-solver settings for the final iteration should have relTol set to zero.
  4. To use under-relaxation, you should set nOuterCorrectors to a value larger than 1, which enables the full PIMPLE algorithm.
P.S. If you post a small case that reproduces the problem, it would be easier to give more targeted answers.
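
As a sketch only, points 2 to 4 above might translate into system/fvSolution entries like these (an illustrative fragment, not taken from the posted case; names and values are assumptions):

```
solvers
{
    pcorr
    {
        solver     GAMG;
        smoother   GaussSeidel;
        tolerance  1e-08;
        relTol     0;          // point 2: no early exit on pcorr
    }

    p_rgh
    {
        solver     GAMG;
        smoother   GaussSeidel;
        tolerance  1e-08;
        relTol     0.01;
    }

    p_rghFinal
    {
        $p_rgh;                // inherit the base p_rgh settings
        relTol     0;          // point 3: final iteration converges fully
    }
}

PIMPLE
{
    momentumPredictor   yes;
    nOuterCorrectors    3;     // point 4: > 1 enables the full PIMPLE loop
    nCorrectors         2;
    nNonOrthogonalCorrectors 1;
    turbOnFinalIterOnly no;    // solve k and omega at every outer iteration
}

relaxationFactors              // note: syntax varies between OpenFOAM versions
{
    "(U|k|omega)"  0.7;
}
```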


Best,
__________________
Alberto Passalacqua

GeekoCFD - A free distribution based on openSUSE 64 bit with CFD tools, including OpenFOAM. Available as in both physical and virtual formats (current status: http://albertopassalacqua.com/?p=1541)
OpenQBMM - An open-source implementation of quadrature-based moment methods.

To obtain more accurate answers, please specify the version of OpenFOAM you are using.

May 2, 2012, 06:03, post #7
Olivier (olivierG), Senior Member, Grenoble, France
Hello,
Just to add my two cents: also try setting maxCo to a lower value; 0.5 may be too high, try 0.2.
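
For reference, the Courant limits live in system/controlDict when adjustable time stepping is enabled. An illustrative fragment (maxAlphaCo is the interface Courant limit used by interFoam):

```
adjustTimeStep  yes;
maxCo           0.2;   // lower flow Courant limit, as suggested
maxAlphaCo      0.2;   // interface Courant limit for interFoam
maxDeltaT       1;
```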

regards,
olivier

May 2, 2012, 16:26, post #8
Daniele Vicario (danvica)
Thanks to all for the many suggestions; I'm going to try all of them.
Any idea why the simulation runs serially without errors?

May 3, 2012, 00:59, post #9
Daniele Vicario (danvica)
@Alberto: I'll try to reduce the case size (110 MB) so that I can send it to you.

In the meantime I tried using relTol = 0, but the solver crashes at the first step:

Code:
 
M:\f900layer>mpiexec -n 4    interfoam -parallel         
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1                                   |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
/*   Windows 32 and 64 bit porting by blueCAPE: http://www.bluecape.com.pt   *\
|  Based on Windows porting (2.0.x v4) by Symscape: http://www.symscape.com   |
\*---------------------------------------------------------------------------*/
Build  : 2.1-c62f134541ee
Exec   : interfoam -parallel
Date   : May 03 2012
Time   : 06:35:14
Host   : "UFFTECNICO7"
PID    : 5656
Case   : M:/f900layer
nProcs : 4
Slaves : 
3
(
"UFFTECNICO7.6140"
"UFFTECNICO7.4864"
"UFFTECNICO7.9408"
)
Pstream initialized with:
    floatTransfer     : 0
    nProcsSimpleSum   : 0
    commsType         : nonBlocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
 
PIMPLE: Operating solver in PISO mode
Reading field p_rgh
Reading field alpha1
Reading field U
Reading/calculating face flux field phi
Reading transportProperties
Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type RASModel
Selecting RAS turbulence model kOmegaSST
kOmegaSSTCoeffs
{
    alphaK1         0.85034;
    alphaK2         1;
    alphaOmega1     0.5;
    alphaOmega2     0.85616;
    gamma1          0.5532;
    gamma2          0.4403;
    beta1           0.075;
    beta2           0.0828;
    betaStar        0.09;
    a1              0.31;
    c1              10;
}
 
Reading g
Calculating field g.h
time step continuity errors : sum local = 0.00111869, global = -0.00111869, cumulative = -0.00111869
GAMG:  Solving for pcorr, Initial residual = 1, Final residual = 9.44929e-009, No Iterations 100
GAMG:  Solving for pcorr, Initial residual = 0.0168697, Final residual = 8.39749e-009, No Iterations 37
time step continuity errors : sum local = 2.54022e-008, global = -2.22964e-010, cumulative = -0.00111869
Courant Number mean: 0.964422 max: 230.286
Starting time loop
Courant Number mean: 0.00209202 max: 0.499536
Interface Courant Number mean: 0 max: 0
deltaT = 2.1692e-006
Time = 2.1692e-006
MULES: Solving for alpha1
Phase-1 volume fraction = 6.06664e-007  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 1.21333e-006  Min(alpha1) = 0  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 1.81999e-006  Min(alpha1) = -1.00739e-024  Max(alpha1) = 1
MULES: Solving for alpha1
Phase-1 volume fraction = 2.42666e-006  Min(alpha1) = -1.41385e-023  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 1, Final residual = 1.38163e-012, No Iterations 6
smoothSolver:  Solving for Uy, Initial residual = 1, Final residual = 1.37788e-012, No Iterations 6
smoothSolver:  Solving for Uz, Initial residual = 1, Final residual = 5.12925e-011, No Iterations 5
GAMG:  Solving for p_rgh, Initial residual = 1, Final residual = 5.24428e+006, No Iterations 200
GAMG:  Solving for p_rgh, Initial residual = 0.793564, Final residual = 1.09358e+022, No Iterations 200
time step continuity errors : sum local = 2.17739e+023, global = 1.92673e+019, cumulative = 1.92673e+019
GAMG:  Solving for p_rgh, Initial residual = 1, Final residual = 8.64916e-009, No Iterations 46
GAMG:  Solving for p_rgh, Initial residual = 0.0824733, Final residual = 9.41259e-009, No Iterations 31
time step continuity errors : sum local = 4.78625e+031, global = 1.38544e+030, cumulative = 1.38544e+030
GAMG:  Solving for p_rgh, Initial residual = 1, Final residual = 8.28675e-009, No Iterations 48
GAMG:  Solving for p_rgh, Initial residual = 0.0725632, Final residual = 7.96813e-008, No Iterations 21
time step continuity errors : sum local = 4.52331e+070, global = 1.36619e+069, cumulative = 1.36619e+069
job aborted:
[ranks] message
[0] process exited without calling finalize
[1-3] terminated
---- error analysis -----
[0] on Ufftecnico7
interfoam ended prematurely and may have crashed. exit code 3
---- error analysis -----

Using the PIMPLE subdict:

Code:
PIMPLE
{
    //pRefCell        0;
    //pRefValue       0;
 
    turbOnFinalIterOnly no;
    momentumPredictor yes;
    nCorrectors     3;
    nOuterCorrectors    2;
    nNonOrthogonalCorrectors 1;
    nAlphaCorr      1;
    nAlphaSubCycles 4;
    cAlpha          1;
}
gives the same result.

Note: I haven't yet tried reducing the maximum Courant number.

I also checked the decomposed boundary conditions, and they are consistent with the full-case ones.

I'm not an expert at all, but the main point, to me, is that the serial case works and the parallel one doesn't. Couldn't it somehow be linked to the processor boundary conditions, as anothr_acc suggested?

In the meantime I'm running the serial case with the new solution setup, just to see whether and when it crashes.

May 4, 2012, 04:30, post #10
Daniele Vicario (danvica)
Just a small update.

It seems to be a meshing problem. I'm running the same geometry meshed without the layer refinement, and it runs in parallel with the original setup (all but the tolerances, which I corrected as Alberto suggested).

I still have some questions:

- Why didn't checkMesh return any errors (even when applied to each processorX case)? Or, better, are you aware of any situation that could make checkMesh say OK when the mesh actually isn't?

- Is it common for a borderline mesh to become even more critical once decomposed? Shouldn't decomposition be a... transparent operation?

Just a thought: browsing the forum, I've got the impression that most problems are due to meshing.
Rarely does fine-tuning the solution dict move a case from crashing to good.
From that point of view, a good mesh is very resistant to a bad setup... at least as far as solver stability is concerned.

The check-list of every beginner (like me) should be:

- Mesh
- Scheme
- Solution

Any comments? Anyway, I'll post any news on my current case.

May 5, 2012, 02:42, post #11
Alberto Passalacqua (alberto)
Quote:
Originally Posted by danvica
Just a small update.

It seems to be a meshing problem. I'm running the same geometry meshed without the layer refinement, and it runs in parallel with the original setup (all but the tolerances, which I corrected as Alberto suggested).

I still have some questions:

- Why didn't checkMesh return any errors (even when applied to each processorX case)? Or, better, are you aware of any situation that could make checkMesh say OK when the mesh actually isn't?
The checkMesh utility tends to be quite picky in my experience, and it checks the standard features of a mesh to establish its quality, so if it says the mesh is OK you should not experience problems due to it.

Quote:
- Is it common for a borderline mesh to become even more critical once decomposed? Shouldn't decomposition be a... transparent operation?
Decomposition itself is a transparent operation, but some of the solution algorithms (for example, some linear solvers) necessarily have to work differently in a parallel run. In some cases running in parallel requires (generally slightly) different settings.

P.S. What decomposition are you using? Scotch?

Quote:
Just a thought: browsing the forum, I've got the impression that most problems are due to meshing.
Rarely does fine-tuning the solution dict move a case from crashing to good.
From that point of view, a good mesh is very resistant to a bad setup... at least as far as solver stability is concerned.
Yes, I absolutely agree, and this is not specific to a given code. Creating a good mesh is a good investment of time, because you will save it later both in terms of stability (which may also mean computational time) and in terms of quality of the results (you won't have to re-run cases to improve them). If you work with multiphase flows this becomes even more important (use a hex mesh every time you can :-)).

Best,
pmdelgado2 likes this.
__________________
Alberto Passalacqua

GeekoCFD - A free distribution based on openSUSE 64 bit with CFD tools, including OpenFOAM. Available as in both physical and virtual formats (current status: http://albertopassalacqua.com/?p=1541)
OpenQBMM - An open-source implementation of quadrature-based moment methods.

To obtain more accurate answers, please specify the version of OpenFOAM you are using.
alberto is offline   Reply With Quote

Old   May 5, 2012, 06:42
Default
  #12
Senior Member
 
Daniele Vicario
Join Date: Mar 2009
Location: Novara, Italy
Posts: 142
Rep Power: 17
danvica is on a distinguished road
Thanks Alberto.

Yes, I'm using scotch. I'm a newbie and I'm working on a not-so-simple geometry; I don't think I could find a better decomposition than the one it produces.

Now I'm working on the geometry, removing the smaller (and hopefully negligible) features. My goal is to obtain a good-looking layer distribution on the mesh. At the moment I have problems at sharp corners; I'll check on the SHM forum.
__________________
Daniele Vicario

blueCFD2.1 - Windows 7
danvica is offline   Reply With Quote

Old   May 6, 2012, 05:37
Default
  #13
Senior Member
 
Daniele Vicario
Join Date: Mar 2009
Location: Novara, Italy
Posts: 142
Rep Power: 17
danvica is on a distinguished road
Quick update:

Using DICGaussSeidel and DILUGaussSeidel as the smoothers for the GAMG and smoothSolver solvers seems to solve the problem (only a few time steps so far).

I'm just following Alberto's indirect help (http://www.cfd-online.com/Forums/ope...tml#post227426). Thanks.
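For readers following along, the change amounts to swapping the smoother entries in system/fvSolution. A sketch based on this thread (not necessarily Daniele's exact dictionaries):

```
p_rgh
{
    solver          GAMG;
    smoother        DICGaussSeidel;   // symmetric smoother, suited to the pressure equation
    tolerance       1e-07;
    relTol          0;
    nCellsInCoarsestLevel 500;
    agglomerator    faceAreaPair;
    mergeLevels     1;
}

U
{
    solver          smoothSolver;
    smoother        DILUGaussSeidel;  // asymmetric smoother, suited to the momentum equation
    tolerance       1e-07;
    relTol          0;
    nSweeps         1;
}
```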
__________________
Daniele Vicario

blueCFD2.1 - Windows 7
danvica is offline   Reply With Quote

Old   June 15, 2012, 06:14
Default
  #14
Member
 
ms
Join Date: Mar 2009
Location: West London
Posts: 47
Rep Power: 17
anothr_acc is on a distinguished road
Wow! These changes worked for me, too! I'm using simpleFoam with a Reynolds stress turbulence model and, like the case above, the solver seemed to run fine as a serial job but in parallel blew up either after a long time for no clear reason, or on the first step.

Changing the smoother used with GAMG fixed the problem immediately. I've also changed my relTols to zero (this did not cause the fix, but I'm keeping it anyway) and switched to the smooth solver for the velocity, epsilon and R equations, with the smoother specified above.

Thanks so much for that tip. My jobs appear to be running now!

Best regards,

Mark.
anothr_acc is offline   Reply With Quote

Old   June 15, 2012, 06:48
Default
  #15
Member
 
ms
Join Date: Mar 2009
Location: West London
Posts: 47
Rep Power: 17
anothr_acc is on a distinguished road
Objectively, my pressure residuals dropped two orders of magnitude the moment I made this change. Happy days!
anothr_acc is offline   Reply With Quote

Old   December 5, 2012, 08:52
Default
  #16
Member
 
Join Date: May 2012
Posts: 55
Rep Power: 14
styleworker is on a distinguished road
Dear Danvica,

could you post your fvSolution file, please? I'm a bit confused whether you used GAMG as a preconditioner or as a solver.

Thank you!
styleworker is offline   Reply With Quote

Old   December 22, 2012, 02:09
Default
  #17
Senior Member
 
Daniele Vicario
Join Date: Mar 2009
Location: Novara, Italy
Posts: 142
Rep Power: 17
danvica is on a distinguished road
Hi styleworker,
I'm sorry, but I cannot recover the exact fvSolution file I used.

The most likely version:

Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.0.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      fvSolution;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
solvers
{
    pcorr                                  // linear equation system solver for p
    {
        solver          GAMG;           // very efficient multigrid solver
        tolerance       1e-07;          // solver finishes if either absolute
        relTol          0;          // tolerance is reached or the relative
                                        // tolerance here
        minIter         3;              // a minimum number of iterations
        maxIter         200;            // limit on the number of iterations
        smoother        GaussSeidel;    // GaussSeidel (DIC is an alternative for GAMG)
        nPreSweeps      1;              // 1 for p, set to 0 for all other!
        nPostSweeps     2;              // 2 is fine
        nFinestSweeps   2;              // 2 is fine
        scaleCorrection true;           // true is fine
        directSolveCoarsestLevel false; // false is fine
        cacheAgglomeration on;          // on is fine; set to off, if dynamic
                                        // mesh refinement is used!
        nCellsInCoarsestLevel 500;      // 500 is fine,
                                        // otherwise sqrt(number of cells)
        agglomerator    faceAreaPair;   // faceAreaPair is fine
        mergeLevels     1;              // 1 is fine
    }
    p_rgh
    {
        solver          GAMG;           // very efficient multigrid solver
        tolerance       1e-07;          // solver finishes if either absolute
        relTol          0;          // tolerance is reached or the relative
                                        // tolerance here
        minIter         3;              // a minimum number of iterations
        maxIter         200;            // limit on the number of iterations
        smoother        GaussSeidel;    // GaussSeidel (DIC is an alternative for GAMG)
        nPreSweeps      1;              // 1 for p, set to 0 for all other!
        nPostSweeps     2;              // 2 is fine
        nFinestSweeps   2;              // 2 is fine
        scaleCorrection true;           // true is fine
        directSolveCoarsestLevel false; // false is fine
        cacheAgglomeration on;          // on is fine; set to off, if dynamic
                                        // mesh refinement is used!
        nCellsInCoarsestLevel 500;      // 500 is fine,
                                        // otherwise sqrt(number of cells)
        agglomerator    faceAreaPair;   // faceAreaPair is fine
        mergeLevels     1;              // 1 is fine
    }
    p_rghFinal
    {
        $p_rgh;
        tolerance       1e-06;
        relTol          0;
    }
    U                                   // linear equation system solver for U
    {
        solver          smoothSolver;   // solver type
        smoother        GaussSeidel;    // smoother type
        tolerance       1e-07;          // solver finishes if either absolute
        relTol          0;           // tolerance is reached or the relative
                                        // tolerance here
        nSweeps         1;              // setting for smoothSolver
        maxIter         200;            // limitation of iterations number
 
    }
 
    UFinal                                   // linear equation system solver for U
    {
        solver          smoothSolver;   // solver type
        smoother        GaussSeidel;    // smoother type
        tolerance       1e-06;          // solver finishes if either absolute
        relTol          0;           // tolerance is reached or the relative
                                        // tolerance here
        nSweeps         1;              // setting for smoothSolver
        maxIter         200;            // limitation of iterations number
 
    }
 
    k
    {
        solver           smoothSolver;
        smoother         GaussSeidel;
        tolerance        1e-7;
        relTol           0;
        nSweeps          1;
 
    }
    kFinal
    {
        solver           smoothSolver;
        smoother         GaussSeidel;
        tolerance        1e-7;
        relTol           0;
        nSweeps          1;
 
    }
 
    omega
    {
        solver           smoothSolver;
        smoother         GaussSeidel;
        tolerance        1e-7;
        relTol           0;
        nSweeps          1;
 
    }
 
    omegaFinal
    {
        solver           smoothSolver;
        smoother         GaussSeidel;
        tolerance        1e-7;
        relTol           0;
        nSweeps          1;
 
    }
}
relaxationFactors
{
    p              0.2;                // 0.3 is stable, decrease for bad mesh
    U               0.6;                // 0.7 is stable, decrease for bad mesh
    k               0.5;
    omega           0.5;
}
PIMPLE
{
    //pRefCell        0;
    //pRefValue       0;
 
    turbOnFinalIterOnly yes;
    momentumPredictor yes;
    nCorrectors     3;
    nOuterCorrectors    2;
    nNonOrthogonalCorrectors 1;
    nAlphaCorr      1;
    nAlphaSubCycles 4;
    cAlpha          1;
}
 
// ************************************************************************* //
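To address the preconditioner-versus-solver question directly: in the dictionaries above GAMG is used as the solver itself. For comparison, a sketch of the alternative usage, with GAMG preconditioning a conjugate-gradient solver (illustrative settings only; the exact syntax may differ between OpenFOAM releases, so check the one you are running):

```
// GAMG as the linear solver itself (as in this thread)
p_rgh
{
    solver          GAMG;
    smoother        DICGaussSeidel;
    nCellsInCoarsestLevel 500;
    agglomerator    faceAreaPair;
    mergeLevels     1;
    tolerance       1e-07;
    relTol          0;
}

// GAMG as a preconditioner for PCG (symmetric matrices)
p_rgh
{
    solver          PCG;
    preconditioner  GAMG;             // multigrid settings are read from this same dictionary
    smoother        DICGaussSeidel;
    nCellsInCoarsestLevel 500;
    agglomerator    faceAreaPair;
    mergeLevels     1;
    tolerance       1e-07;
    relTol          0;
}
```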
I'll post any further information.

Daniele
__________________
Daniele Vicario

blueCFD2.1 - Windows 7
danvica is offline   Reply With Quote

