OpenFOAM crash with dynamicMesh, cyclicAMI and MPI DAPL UD

March 13, 2018, 10:15 | #1
Aaron Endres (aendres), New Member, Join Date: Jun 2016, Posts: 12
Dear OpenFOAM users,

I have noticed a problem with the annularThermalMixer tutorial for rhoPimpleDyMFoam in newer OpenFOAM versions (3.0 to 5.0, as well as v1606+ and v1706+; OpenFOAM 2.3.1 worked fine). When running rhoPimpleDyMFoam on 2 Haswell nodes (56 cores) with MPI DAPL enabled, the solver crashes, usually during the update of the cyclicAMI boundary condition. The error seems to be related to the communication between the nodes, since it does not appear when running on just one node with the same decomposition. It occurs regardless of the decomposition method (scotch or hierarchical), and it is always something related to IO operations:

[34] --> FOAM FATAL IO ERROR:
[34] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token
[34] file: IOstream at line 0.
[34] From function void Foam::IOstream::fatalCheck(const char*) const
[34] in file db/IOstreams/IOstreams/IOstream.C at line 109.
FOAM parallel run exiting

or

[29] --> FOAM FATAL IO ERROR:
[29] Expected a ')' or a '}' while reading List, found on line 0 the label 0
[29] file: IOstream at line 0.
[29] From function char Foam::Istream::readEndList(const char*)
[29] in file db/IOstreams/IOstreams/Istream.C at line 155.
FOAM parallel run exiting
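
The decomposition itself seems uncritical; for reference, a minimal system/decomposeParDict for the 56-core run might look like this (the subdomain count and method are simply the values I used, and the crash occurs with both methods):

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// 2 Haswell nodes x 28 cores
numberOfSubdomains  56;

// scotch and hierarchical both lead to the crash
method              scotch;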

With these MPI settings, the error occurs:

export I_MPI_DAPL_UD=enable
export I_MPI_FABRICS_LIST=dapl,ofa,tcp
export I_MPI_FABRICS=shm:dapl

When switching to ofa instead of dapl, no errors occur:

unset I_MPI_DAPL_UD
unset I_MPI_FABRICS_LIST
export I_MPI_FABRICS=shm:ofa
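
In case it is useful, this is roughly how I apply the working settings in a job script (the OpenFOAM path and core count are specific to my setup):

#!/bin/bash
# fall back from DAPL to OFA for inter-node communication
unset I_MPI_DAPL_UD
unset I_MPI_FABRICS_LIST
export I_MPI_FABRICS=shm:ofa

# load the OpenFOAM environment (site-specific path)
source /opt/OpenFOAM/OpenFOAM-5.x/etc/bashrc

# decompose and run on 2 nodes x 28 cores
decomposePar
mpirun -np 56 rhoPimpleDyMFoam -parallel > log.rhoPimpleDyMFoam 2>&1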
Has anyone of you experienced similar problems? Do you think this is an OpenFOAM bug, or could it be related to the MPI setup of the cluster I am running OpenFOAM on? The strange thing is that everything works fine with dapl as long as there is no dynamic mesh and no cyclicAMI.

Thanks in advance for your help!
Aaron

February 12, 2019, 04:41 | #2
xshmuel, New Member, Join Date: Jun 2018, Posts: 11
Hi aendres,

I was wondering if you had solved your issue? I am currently trying to implement a dynamic mesh for dsmcFoamPlus and I am getting the same error.

February 12, 2019, 06:34 | #3
Aaron Endres (aendres), New Member, Join Date: Jun 2016, Posts: 12
Hi xshmuel,

I still don't know what exactly causes the errors, but by changing the MPI settings as described above I was able to run the code without errors.

February 24, 2019, 22:03 | #4
Dongxu Wang (wdx_cfd), Member, Join Date: Sep 2018, Location: China, Posts: 33
Quote:
Originally Posted by aendres
Hi xshmuel,

I still don't know what exactly causes the errors, but by changing the MPI settings as described above I was able to run the code without errors.
Hi aendres,

I am facing a very similar problem which confuses me a lot. My program can run in serial mode, but it crashes when run in parallel. I want to try your settings, but I don't know where to modify them.

Here is my problem:
Why my program cannot ran in parallel???

Thank you very much!

February 25, 2019, 02:48 | #5
Aaron Endres (aendres), New Member, Join Date: Jun 2016, Posts: 12
Hi Dongxu,

The MPI settings are modified with simple bash commands. Just copy the three 'unset' and 'export' commands into the batch script you submit on the cluster, or into the terminal from which you start OpenFOAM. You could also add them to the etc/bashrc file in your OpenFOAM source directory once you have made sure that they fix your problem.
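
For example, once you have confirmed that they help, you could persist the settings like this (adjust the path to your own installation):

# append the working fabric settings to the OpenFOAM environment file,
# so every session that sources it picks them up
cat >> $HOME/OpenFOAM/OpenFOAM-5.x/etc/bashrc << 'EOF'
unset I_MPI_DAPL_UD
unset I_MPI_FABRICS_LIST
export I_MPI_FABRICS=shm:ofa
EOF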

February 25, 2019, 05:18 | #6
Dongxu Wang (wdx_cfd), Member, Join Date: Sep 2018, Location: China, Posts: 33
Quote:
Originally Posted by aendres
Hi Dongxu,

The MPI settings are modified with simple bash commands. Just copy the three 'unset' and 'export' commands into the batch script you submit on the cluster, or into the terminal from which you start OpenFOAM. You could also add them to the etc/bashrc file in your OpenFOAM source directory once you have made sure that they fix your problem.
Hi aendres,

Thank you for your reply!

I've tried the commands in my terminal, but they don't help in my case.

I think this may be a bug or something on my side, because my program runs fine in serial mode. The critical point is how to terminate the PIMPLE outer loop: as soon as my newly added exit criterion is used, the problem occurs. So I think I should pay more attention to the residual control of the PIMPLE algorithm.
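
In case it helps someone else, the built-in outer-loop convergence check is configured in the PIMPLE dictionary of system/fvSolution, roughly like this (the tolerances below are placeholders; depending on the OpenFOAM version the keyword is residualControl or outerCorrectorResidualControl):

PIMPLE
{
    nOuterCorrectors          50;  // upper bound on outer iterations
    nCorrectors               2;
    nNonOrthogonalCorrectors  1;

    // leave the outer loop early once these residuals are reached
    residualControl
    {
        U
        {
            tolerance  1e-5;
            relTol     0;
        }
        p
        {
            tolerance  1e-5;
            relTol     0;
        }
    }
}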

GL


Tags
cyclicami, dapl, dynamic mesh, openfoam 5.x

