
MPI bug?

May 31, 2023, 16:33   #1
Simbelmynė (Senior Member, Join Date: May 2012, Posts: 546)
Hey Foamers,

I have added a simple scalar transport equation to interFoam (let's say it is Temperature). It runs well in serial. However, when running in parallel it diverges if the diffusivity is too high (but works well if the diffusivity is low).

Both simple and scotch decomposition diverge, although at different points in the simulation. Similarly, with a different number of processes (anything other than one), it diverges at different points in time.

Any thoughts?

OpenFOAM v9
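
For reference, a minimal sketch of the kind of added equation meant here, written in the usual OpenFOAM style (the field name T, the diffusivity DT and the file name TEqn.H are illustrative assumptions, not the actual code):

Code:
// TEqn.H -- hypothetical scalar ("temperature") transport added to interFoam.
// T is advected with the volumetric flux phi and diffused with a constant
// diffusivity DT read from constant/transportProperties.
fvScalarMatrix TEqn
(
    fvm::ddt(T)
  + fvm::div(phi, T)
  - fvm::laplacian(DT, T)
);

TEqn.solve();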

June 1, 2023, 02:26   #2
dlahaye (Domenico Lahaye, Senior Member, Join Date: Dec 2013, Posts: 722)
Please make sure that the linear system is solved to the same accuracy independent of the number of processors.

The block-Jacobi preconditioner that OpenFOAM implements is known to be less performant on larger numbers of processors (cf. the literature on iterative solution methods, see below). More iterations of the Krylov acceleration are thus required to reach the same level of accuracy.
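
One way to enforce that is to make the absolute tolerance the only stopping criterion, so the residual is driven to the same level regardless of the decomposition. A sketch of the corresponding fvSolution entry (the entry name T and the numbers are placeholders):

Code:
// system/fvSolution (fragment) -- solve the added scalar to a fixed
// absolute tolerance; with relTol 0 the solver keeps iterating until
// the absolute tolerance is reached, whatever the number of processors.
T
{
    solver          PBiCG;
    preconditioner  DILU;
    tolerance       1e-12;
    relTol          0;
}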

Good luck.

References

June 1, 2023, 05:18   #3
Simbelmynė (Senior Member, Join Date: May 2012, Posts: 546)
Thank you. Switching from PBiCG to PBiCGStab or to a smooth solver solved the issue.
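
In fvSolution terms the change amounts to something like the following (a sketch; the entry name T and the tolerances are placeholders):

Code:
// system/fvSolution (fragment) -- the two alternatives that cured the divergence.
T
{
    solver          PBiCGStab;       // stabilised bi-conjugate gradient
    preconditioner  DILU;
    tolerance       1e-16;
    relTol          0;
}

// ... or, instead, a smooth solver for the asymmetric T matrix:
// T
// {
//     solver          smoothSolver;
//     smoother        symGaussSeidel;
//     tolerance       1e-16;
//     relTol          0;
// }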

June 1, 2023, 05:20   #4
dlahaye (Domenico Lahaye, Senior Member, Join Date: Dec 2013, Posts: 722)
Really? Why is that?

June 1, 2023, 06:07   #5
Simbelmynė (Senior Member, Join Date: May 2012, Posts: 546)
Quote:
Originally Posted by dlahaye
Really? Why is that?

I have no idea. All solvers use the same convergence criterion (absolute tolerance 1e-16), and they all perform nicely for a few hundred time steps. Depending on how the domain is decomposed, the PBiCG solver crashes at different points; the other solvers mentioned do not. Some message-passing problem, perhaps?



All solvers behave nicely in serial.
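
For context, the decomposition being varied is controlled by something like this (a sketch; the subdomain count and coefficients are placeholders, not the actual case setup):

Code:
// system/decomposeParDict (fragment) -- the two methods tried in this thread.
numberOfSubdomains  4;

method              scotch;         // or: simple

simpleCoeffs
{
    n               (2 2 1);        // subdomains in x, y, z (used by "simple" only)
    delta           0.001;
}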

Last edited by Simbelmynė; June 1, 2023 at 12:19.

June 1, 2023, 12:16   #6
dlahaye (Domenico Lahaye, Senior Member, Join Date: Dec 2013, Posts: 722)
Thanks for the additional information.
