OpenFOAM crash with dynamicMesh, cyclicAMI and MPI DAPL UD

March 13, 2018, 10:15
Aaron Endres (Join Date: Jun 2016)
Dear OpenFOAM users,

I have noticed a problem with the annularThermalMixer tutorial for rhoPimpleDyMFoam in newer OpenFOAM versions (3.0 through 5.0, as well as v1606+ and v1706+; OpenFOAM 2.3.1 works fine). When running rhoPimpleDyMFoam on 2 Haswell nodes (56 cores) with MPI DAPL enabled, the solver crashes, usually during the update of the cyclic AMI boundary condition. The error seems related to the communication between the nodes, since it does not appear when running on just one node with the same decomposition. It also occurs regardless of the decomposition method (scotch or hierarchical). The failure always surfaces as an IO error:
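For reference, switching between the two decomposition methods only requires changing one entry in system/decomposeParDict. This is a minimal sketch for the 56-core run described above; the hierarchical split counts (4 7 2) are my assumption, any factorisation of 56 would do:

```
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  56;

method          scotch;       // or: hierarchical

hierarchicalCoeffs
{
    n           (4 7 2);      // 4*7*2 = 56 subdomains (assumed split)
    order       xyz;
}
```

The crash happens with either method, so the decomposition itself does not seem to be the trigger.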

[34] --> FOAM FATAL IO ERROR:
[34] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
[34] file: IOstream at line 0.
[34] From function void Foam::IOstream::fatalCheck(const char*) const
[34] in file db/IOstreams/IOstreams/IOstream.C at line 109.
FOAM parallel run exiting

or

[29] --> FOAM FATAL IO ERROR:
[29] Expected a ')' or a '}' while reading List, found on line 0 the label 0
[29] file: IOstream at line 0.
[29] From function char Foam::Istream::readEndList(const char*)
[29] in file db/IOstreams/IOstreams/Istream.C at line 155.
FOAM parallel run exiting

With these MPI settings, the error occurs:
export I_MPI_DAPL_UD=enable
export I_MPI_FABRICS_LIST=dapl,ofa,tcp
export I_MPI_FABRICS=shm:dapl
When switching to ofa instead of dapl, no errors occur:
unset I_MPI_DAPL_UD
unset I_MPI_FABRICS_LIST
export I_MPI_FABRICS=shm:ofa
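Put together, this is the workaround I am using for now. It is a sketch assuming Intel MPI (the I_MPI_* variables) and mpirun on the PATH; the commented-out launch line is cluster-specific:

```shell
# Force the shm:ofa fabric instead of DAPL before launching the parallel run.
unset I_MPI_DAPL_UD
unset I_MPI_FABRICS_LIST
export I_MPI_FABRICS=shm:ofa

# Sanity check that the fabric selection took effect.
echo "I_MPI_FABRICS=$I_MPI_FABRICS"

# mpirun -np 56 rhoPimpleDyMFoam -parallel   # actual launch, site-specific
```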
Have any of you experienced similar problems? Do you think this is an OpenFOAM bug, or could it be related to the MPI setup of the cluster I am running OpenFOAM on? The strange thing is that cases without dynamic mesh and cyclicAMI work fine with dapl.

Thanks in advance for your help!
Aaron
Tags
cyclicami, dapl, dynamic mesh, openfoam 5.x
