mpirun interFoam very stable --> then blows up

February 22, 2013, 03:49   #1
Ziad (ghadab), New Member | Join Date: Oct 2012 | Location: Raunheim | Posts: 5
Dear friends,

I am working with the interFoam solver, running the simulation on 24 processors. After two days of computation, one of the processes crashes. Reading the log up to the error, everything looks stable (maxCo, residuals, time step, ...). I have copied the last lines of the log file below.

MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216176 Min(alpha1) = -5.5510116e-20 Max(alpha1) = 1.0000333
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216177 Min(alpha1) = -1.5739172e-19 Max(alpha1) = 1.0000347
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216178 Min(alpha1) = -2.0048211e-19 Max(alpha1) = 1.0000359
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216179 Min(alpha1) = -1.6316185e-19 Max(alpha1) = 1.000037
GAMG: Solving for p_rgh, Initial residual = 0.00055114557, Final residual = 7.5653985e-06, No Iterations 2
GAMG: Solving for p_rgh, Initial residual = 8.8321274e-05, Final residual = 6.1051387e-06, No Iterations 1
time step continuity errors : sum local = 4.5327434e-10, global = -5.4431254e-12, cumulative = -0.00078028933
GAMG: Solving for p_rgh, Initial residual = 0.00016425054, Final residual = 4.5803235e-06, No Iterations 2
GAMGPCG: Solving for p_rgh, Initial residual = 5.0422281e-05, Final residual = 1.520426e-06, No Iterations 1
time step continuity errors : sum local = 1.1287501e-10, global = 1.3621927e-13, cumulative = -0.00078028933
DILUPBiCG: Solving for k, Initial residual = 0.00023313691, Final residual = 8.5579005e-09, No Iterations 3
bounding k, min: -1.2733491e-05 max: 4.7929403 average: 0.042688034
ExecutionTime = 232572.69 s ClockTime = 233069 s

Courant Number mean: 0.011293328 max: 0.50499715
Interface Courant Number mean: 0.00034175293 max: 0.13210925
deltaT = 8.5249158e-06
Time = 1.25438208

MULES: Solving for alpha1
Phase-1 volume fraction = 0.9321618 Min(alpha1) = -1.7904223e-19 Max(alpha1) = 1.0000387
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216181 Min(alpha1) = -7.3661288e-20 Max(alpha1) = 1.0000406
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216183 Min(alpha1) = -3.8216149e-20 Max(alpha1) = 1.0000422
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216184 Min(alpha1) = -8.0706645e-20 Max(alpha1) = 1.0000437
GAMG: Solving for p_rgh, Initial residual = 0.00072098495, Final residual = 8.618023e-06, No Iterations 2
GAMG: Solving for p_rgh, Initial residual = 0.00010279397, Final residual = 6.9378374e-06, No Iterations 1
time step continuity errors : sum local = 5.0463324e-10, global = 1.4211672e-12, cumulative = -0.00078028932
GAMG: Solving for p_rgh, Initial residual = 0.00016345643, Final residual = 4.7007859e-06, No Iterations 2
GAMGPCG: Solving for p_rgh, Initial residual = 5.0320654e-05, Final residual = 1.5890004e-06, No Iterations 1
time step continuity errors : sum local = 1.1557738e-10, global = -6.230831e-13, cumulative = -0.00078028933
DILUPBiCG: Solving for k, Initial residual = 0.00023066785, Final residual = 6.6803586e-09, No Iterations 3
bounding k, min: -9.637777e-05 max: 4.9497763 average: 0.042689423
ExecutionTime = 232574.05 s ClockTime = 233071 s

Courant Number mean: 0.011173537 max: 0.50453059
Interface Courant Number mean: 0.00033831272 max: 0.13069177
deltaT = 8.4479993e-06
Time = 1.25439053

MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216185 Min(alpha1) = -1.5646013e-19 Max(alpha1) = 1.0000451
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216186 Min(alpha1) = -8.945875e-20 Max(alpha1) = 1.0000463
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216187 Min(alpha1) = -2.122793e-19 Max(alpha1) = 1.0000474
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216189 Min(alpha1) = -1.8830937e-19 Max(alpha1) = 1.0000485
GAMG: Solving for p_rgh, Initial residual = 0.00067100595, Final residual = 8.5266915e-06, No Iterations 2
GAMG: Solving for p_rgh, Initial residual = 9.7152886e-05, Final residual = 6.6805452e-06, No Iterations 1
time step continuity errors : sum local = 4.7779724e-10, global = -1.6421516e-12, cumulative = -0.00078028933
GAMG: Solving for p_rgh, Initial residual = 0.00016094755, Final residual = 4.6225229e-06, No Iterations 2
GAMGPCG: Solving for p_rgh, Initial residual = 4.9303045e-05, Final residual = 1.5754646e-06, No Iterations 1
time step continuity errors : sum local = 1.1267799e-10, global = 2.7460826e-13, cumulative = -0.00078028933
DILUPBiCG: Solving for k, Initial residual = 0.00022857712, Final residual = 5.4361084e-09, No Iterations 3
bounding k, min: -0.00070291813 max: 5.1064455 average: 0.042690956
ExecutionTime = 232575.42 s ClockTime = 233072 s

Courant Number mean: 0.011072711 max: 0.50475466
Interface Courant Number mean: 0.00033544306 max: 0.12948992
deltaT = 8.3598681e-06
Time = 1.25439889

MULES: Solving for alpha1
Phase-1 volume fraction = 0.9321619 Min(alpha1) = -7.8526875e-20 Max(alpha1) = 1.0000496
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216191 Min(alpha1) = -1.0447194e-19 Max(alpha1) = 1.0000507
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216192 Min(alpha1) = -8.0890711e-20 Max(alpha1) = 1.0000516
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216193 Min(alpha1) = -2.1416833e-19 Max(alpha1) = 1.0000525
GAMG: Solving for p_rgh, Initial residual = 0.00067214463, Final residual = 8.3396353e-06, No Iterations 2
GAMG: Solving for p_rgh, Initial residual = 9.3977145e-05, Final residual = 6.43286e-06, No Iterations 1
time step continuity errors : sum local = 4.5116681e-10, global = -1.6635553e-12, cumulative = -0.00078028933
GAMG: Solving for p_rgh, Initial residual = 0.00015785756, Final residual = 4.5155882e-06, No Iterations 2
GAMGPCG: Solving for p_rgh, Initial residual = 4.8203437e-05, Final residual = 1.5168408e-06, No Iterations 1
time step continuity errors : sum local = 1.0638073e-10, global = 2.1630499e-13, cumulative = -0.00078028933
DILUPBiCG: Solving for k, Initial residual = 0.00022623408, Final residual = 3.8751548e-09, No Iterations 3
bounding k, min: -0.018329866 max: 5.2509637 average: 0.042692732
ExecutionTime = 232576.78 s ClockTime = 233073 s

Courant Number mean: 0.010957187 max: 0.50412939
Interface Courant Number mean: 0.00033184779 max: 0.12811218
deltaT = 8.2856681e-06
Time = 1.25440717

MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216194 Min(alpha1) = -6.9553818e-20 Max(alpha1) = 1.0000535
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216195 Min(alpha1) = -1.49288e-19 Max(alpha1) = 1.0000544
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216197 Min(alpha1) = -8.8329747e-20 Max(alpha1) = 1.0000552
MULES: Solving for alpha1
Phase-1 volume fraction = 0.93216198 Min(alpha1) = -1.8334624e-19 Max(alpha1) = 1.000056
GAMG: Solving for p_rgh, Initial residual = 0.00062626096, Final residual = 8.1036593e-06, No Iterations 2
GAMG: Solving for p_rgh, Initial residual = 8.8361519e-05, Final residual = 6.1779555e-06, No Iterations 1
time step continuity errors : sum local = 4.2607103e-10, global = -5.6739447e-13, cumulative = -0.00078028933
GAMG: Solving for p_rgh, Initial residual = 0.00015557742, Final residual = 4.3762882e-06, No Iterations 2
GAMGPCG: Solving for p_rgh, Initial residual = 4.7961358e-05, Final residual = 1.5262668e-06, No Iterations 1
time step continuity errors : sum local = 1.0526006e-10, global = -2.1962042e-13, cumulative = -0.00078028933
[0] #0  Foam::error::printStack(Foam::Ostream&)
[1] #0  Foam::error::printStack(Foam::Ostream&)
[2] #0  Foam::error::printStack(Foam::Ostream&)
[... interleaved printStack output from ranks 3-23 ...]
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.


I hope you can help me; I can provide any data you need, such as the system/, 0/ and constant/ folders.

Best Regards

Ziad
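
A side note on reading the log above: the minimum of k reported by the "bounding k" lines grows from about -1.3e-05 to -1.8e-02 over the four quoted time steps, i.e. the turbulence field was already drifting out of bounds before the stack trace appears. A quick way to watch that trend over a full run (a sketch, assuming the solver output was redirected to a file named log.interFoam; adjust the name to your setup):

Code:
# min(k) reported by each "bounding k" line (4th field)
grep 'bounding k' log.interFoam | awk '{print $4}'

# adaptive time-step and Courant-number history
grep '^deltaT' log.interFoam
grep '^Courant Number' log.interFoam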

February 22, 2013, 04:43   #2
Bernhard, Senior Member | Join Date: Sep 2009 | Location: Delft | Posts: 790
I have been experiencing similar error messages recently. Maybe you can look in this thread for hints: http://www.cfd-online.com/Forums/ope...ork-error.html
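
A note on the fork() warning itself: when an OpenFOAM process crashes, its printStack handler spawns an external child process to resolve the backtrace, and it is this fork that Open MPI complains about. The warning is therefore usually a symptom of the crash handler running, not the cause of the crash. If the message itself gets in the way, Open MPI can be told to suppress it (a sketch; the process count and redirection mirror Ziad's setup, adjust as required):

Code:
mpirun --mca mpi_warn_on_fork 0 -np 24 interFoam -parallel > log.interFoam 2>&1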

February 22, 2013, 05:07   #3
Ziad (ghadab), New Member | Join Date: Oct 2012 | Location: Raunheim | Posts: 5
Quote:
Originally Posted by Bernhard
I have been experiencing similar error messages recently. Maybe you can look in this thread for hints: http://www.cfd-online.com/Forums/ope...ork-error.html
Dear Bernhard,

Thank you for your reply. I am working on a cluster and I have no idea whether other applications run in parallel with my simulation (but I do not think so!). I started the same simulation twice; in the second run I reduced maxCo from 0.8 to 0.5. Both blew up.
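
For reference, the limits being varied here live in system/controlDict. A sketch of the relevant time-step control entries, with illustrative values matching the 0.5 limit from the second run:

Code:
// system/controlDict (excerpt; values illustrative)
adjustTimeStep  yes;
maxCo           0.5;   // flow Courant limit (reduced from 0.8)
maxAlphaCo      0.5;   // interface Courant limit used by interFoam
maxDeltaT       1;     // upper bound on the adaptive time step

The "Interface Courant Number" lines in the log are governed by maxAlphaCo, so it may be worth lowering both limits, not just maxCo.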

October 27, 2013, 11:34   #4
Albert Tong (tfuwa), Member | Join Date: Dec 2010 | Location: Perth, WA, Australia | Posts: 76
Quote:
Originally Posted by ghadab
Dear Bernhard,

Thank you for your reply. I am working on a cluster and I have no idea whether other applications run in parallel with my simulation (but I do not think so!). I started the same simulation twice; in the second run I reduced maxCo from 0.8 to 0.5. Both blew up.
Hi Ziad,

Just wondering if you have solved this. I am experiencing the same problem: the simulation blows up long after it appeared stable. Can you please share your solution? Many thanks.
__________________
Kind regards,

Albert


Tags: interfoam, mpirun

