
Crash with interDyMFoam + dynamicRefineFvMesh + 64procs

March 20, 2017, 13:12   #1
Roberto Ribeiro
New Member
Join Date: Jan 2017
Posts: 3
Hello everyone,

I'm facing a crash in a practically out-of-the-box run with foam-extend-3.1 (and also 3.2 and 4.0).

Basically, I take the damBreakWithObstacle tutorial, adjust it to run on 64 processors, and get the following error at an arbitrary time step:


Code:
Courant Number mean: 0.00153715 max: 0.111756 velocity magnitude: 1.28822
Time = 0.017

Selected 5 cells for refinement out of 102711.

    From function void polyMesh::initMesh()
    in file meshes/polyMesh/polyMeshInitMesh.C at line 81
    Truncating neighbour list at 22329 for backward compatibility
Refined from 102711 to 102746 cells.
Selected 0 split points out of a possible 8633.
time step continuity errors : sum local = 2.9183e-06, global = -5.04178e-11, cumulative = -2.40664e-05
GAMGPCG:  Solving for pcorr, Initial residual = 1, Final residual = 2.78744e-05, No Iterations 4
time step continuity errors : sum local = 1.02223e-09, global = 3.25557e-14, cumulative = -2.40664e-05
Courant Number mean: 0.00153664 max: 0.108621 velocity magnitude: 1.05436
MULES: Solving for alpha1
Liquid phase volume fraction = 0.0848214  Min(alpha1) = -7.32253e-22  Max(alpha1) = 1
MULES: Solving for alpha1
Liquid phase volume fraction = 0.0848214  Min(alpha1) = -1.49078e-22  Max(alpha1) = 1
MULES: Solving for alpha1
Liquid phase volume fraction = 0.0848214  Min(alpha1) = -1.21907e-22  Max(alpha1) = 1
smoothSolver:  Solving for Ux, Initial residual = 0.0297867, Final residual = 2.39419e-08, No Iterations 3
smoothSolver:  Solving for Uy, Initial residual = 0.0290037, Final residual = 2.9586e-08, No Iterations 3
smoothSolver:  Solving for Uz, Initial residual = 0.0495207, Final residual = 3.42538e-08, No Iterations 3
GAMG:  Solving for pd, Initial residual = 0.00403362, Final residual = 0.000170153, No Iterations 1
time step continuity errors : sum local = 7.24378e-05, global = 3.60611e-09, cumulative = -2.40628e-05
GAMG:  Solving for pd, Initial residual = 0.000210132, Final residual = 6.90479e-06, No Iterations 3
time step continuity errors : sum local = 2.92656e-06, global = 5.70459e-08, cumulative = -2.40058e-05
GAMG:  Solving for pd, Initial residual = 2.78919e-05, Final residual = 1.33396e-06, No Iterations 3
time step continuity errors : sum local = 5.65676e-07, global = 1.19196e-08, cumulative = -2.39939e-05
GAMGPCG:  Solving for pd, Initial residual = 5.38728e-06, Final residual = 9.72962e-09, No Iterations 3
time step continuity errors : sum local = 4.12615e-09, global = -5.75693e-11, cumulative = -2.39939e-05
ExecutionTime = 7.64 s  ClockTime = 12 s

Courant Number mean: 0.00162271 max: 0.111354 velocity magnitude: 1.27342
Time = 0.018

Selected 4 cells for refinement out of 102746.

    From function void polyMesh::initMesh()
    in file meshes/polyMesh/polyMeshInitMesh.C at line 81
    Truncating neighbour list at 22329 for backward compatibility
Refined from [48]
[48]
[48] --> FOAM FATAL IO ERROR:
[48] incorrect first token, expected <int> or '(', found on line 0 the word 'x'
[48]
[48] file: IOstream at line 0.
[48]
[48]     From function operator>>(Istream&, List<T>&)
[48]     in file /work/01502/rri/work/openfoam/extend-3.1/builds/master-sandy/foam-extend-3.1/src/foam/lnInclude/ListIO.C at line 149.
[48]
FOAM parallel run exiting
[48]
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 48
TACC: MPI job exited with code: 1

TACC: Shutdown complete. Exiting.
I'm running this on a cluster with IntelMPI, and OpenFOAM was compiled with ICC. The run used 4 identical nodes. The mesh size is the default, I disabled the adaptive time step for simplicity, and those were basically my only changes.
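
For reference, the decomposition setup was along these lines. This is a minimal sketch: the method and coefficients below are illustrative assumptions, not copied from my case; any scheme that produces 64 subdomains applies.

Code:
// system/decomposeParDict: sketch of a 64-way decomposition
// (method and coefficients assumed for illustration)
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  64;

method              simple;

simpleCoeffs
{
    n               (4 4 4);    // 4 x 4 x 4 = 64 subdomains
    delta           0.001;
}

The case is then decomposed with decomposePar and launched with mpirun -np 64 interDyMFoam -parallel.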

Any help would be appreciated.
Thanks

April 10, 2017, 11:11   #2
Roberto Ribeiro
New Member
Join Date: Jan 2017
Posts: 3
Hi,

I've been trying to identify the source of the problem. So far I've identified a frequent crash point in Foam::hexRef8::setRefinement, specifically where it calls syncTools::syncEdgeList. It frequently triggers:

Code:
[19] --> FOAM FATAL IO ERROR:
[19] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
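
For context, the call in question follows the usual syncTools pattern: fill per-edge data locally, then combine it across coupled patches. This is a rough paraphrase, not a verbatim copy of the sources; the exact signature and combine operator may differ between foam-extend versions.

Code:
// Sketch of the per-edge synchronisation inside
// Foam::hexRef8::setRefinement (paraphrased, names assumed).
#include "polyMesh.H"
#include "syncTools.H"

using namespace Foam;

void syncEdgeDecisions(const polyMesh& mesh)
{
    // One entry per mesh edge; -1 means "no midpoint assigned yet".
    labelList edgeMidPoint(mesh.nEdges(), -1);

    // ... local refinement decisions fill edgeMidPoint here ...

    // Exchange across processor/cyclic patches. The receiving side
    // parses the transferred data as a List<T>; that parse is where
    // the "reading first token" error above is raised.
    syncTools::syncEdgeList
    (
        mesh,
        edgeMidPoint,
        maxEqOp<label>(),   // combine: keep the larger coupled value
        labelMin            // null value for unset edges
    );
}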

It seems that inconsistent data between the coupled processor patches causes problems when parsing the messages. The inconsistency is probably generated by the refinement, either on the current step or on the previous one. I have enabled all the pertinent checks, but none reported any problem.
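
To make the parsing failure concrete: the reader that aborts, operator>>(Istream&, List<T>&) in ListIO.C, accepts only an integer size or a '(' as the first token of an incoming list. Roughly, paraphrased from the sources and simplified:

Code:
// Simplified paraphrase of the List<T> reader in ListIO.C
// (ASCII path only; most error handling trimmed).
#include "labelList.H"
#include "token.H"

using namespace Foam;

void readLabelList(Istream& is, labelList& L)
{
    token firstToken(is);

    if (firstToken.isLabel())
    {
        // Sized format: N ( e0 e1 ... )
        L.setSize(firstToken.labelToken());
        is.readBegin("List");
        forAll(L, i)
        {
            is >> L[i];
        }
        is.readEnd("List");
    }
    else if (firstToken.isPunctuation())
    {
        // Unsized format: ( e0 e1 ... ), read until ')'
        // (the real reader collects tokens into a linked list here)
    }
    else
    {
        // Anything else, e.g. the word 'x' from my first post, aborts:
        // "incorrect first token, expected <int> or '('"
        FatalIOErrorIn("readLabelList", is)
            << "incorrect first token" << exit(FatalIOError);
    }
}

So the receiving rank is reading a stream that is misaligned or belongs to a different exchange, which fits the coupled-patch inconsistency theory. The checks I mention above were along the lines of checkMesh -parallel -allTopology -allGeometry plus the hexRef8 debug switch, and none of them flagged anything.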

Any help would be extremely appreciated.
Thank you



