CFD Online (www.cfd-online.com), Forums > OpenFOAM

Open MPI-fork() error

#21, September 2, 2018, 18:37
kkpal (Senior Member, joined Jan 2013, 129 posts)
I also had the same error after our HPC was upgraded; before that, the same case ran fine. I think it might be a problem with the system configuration.

#22, September 21, 2018, 19:01
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by smraniaki View Post
I've had the same problem. This is one of those problems that drives me crazy! I somehow managed to solve it by modifying the fvSolution dictionary (changing the smoother). Another time I got rid of it by changing my decomposition scheme. I still don't know how or why it happens, but apparently it comes from ghost cells that are not identifiable by MPI.

Good luck,
Smran
Thank you, that was very useful. I know this is a 4-year-old post.

But any more insight would be extremely valuable. I have a custom PISO solver with GAMG as the matrix solver and with cyclic BCs.

Right now I use a GaussSeidel smoother; which smoother did you use to mitigate this problem?

Regards,
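For reference, a smoother change of the kind smraniaki describes is made in system/fvSolution. The fragment below is only an illustrative sketch (GAMG, GaussSeidel, and symGaussSeidel are standard OpenFOAM solver/smoother names, but the tolerance values are examples, not a recommendation):

```
// system/fvSolution (fragment) - illustrative only
solvers
{
    p
    {
        solver          GAMG;
        smoother        symGaussSeidel;  // e.g. switched from GaussSeidel
        tolerance       1e-7;
        relTol          0.01;
    }
}
```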

Quote:
Originally Posted by kkpal View Post
I also had the same error after our HPC was upgraded; before that, the same case ran fine. I think it might be a problem with the system configuration.
Did you mitigate it?

Do you have cyclic BCs?

Last edited by wyldckat; September 23, 2018 at 13:38. Reason: merged posts a few minutes apart

#23, September 21, 2018, 19:31
kkpal (Senior Member, joined Jan 2013, 129 posts)
Yes. The HPC upgraded the Open MPI version; I changed it back to the original version and the problem was solved. I do have cyclic BCs.

Quote:
Originally Posted by fedvasu View Post
Did you mitigate it?

Do you have cyclic BCs?

#24, September 24, 2018, 16:32
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by kkpal View Post
Yes. The HPC upgraded the Open MPI version; I changed it back to the original version and the problem was solved. I do have cyclic BCs.
Thanks kkpal,

but may I ask for more specific details?

Which version of OpenFOAM? Did you or your admin compile OF with the system MPI or with the Open MPI from ThirdParty?

Which version of Open MPI gave you this trouble, and which (original) version solved it?
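For anyone else chasing this, one quick way to answer the "system MPI or ThirdParty?" question for an installed build is to inspect the OpenFOAM environment. A minimal sketch (WM_MPLIB is set by OpenFOAM's etc/bashrc; the exact values available depend on the OpenFOAM version):

```shell
# Which MPI layer was this OpenFOAM build configured with?
# SYSTEMOPENMPI = the cluster's own Open MPI; OPENMPI = the ThirdParty build;
# INTELMPI = Intel MPI. Unset usually means the OpenFOAM env is not sourced.
echo "WM_MPLIB=${WM_MPLIB:-unset}"

# Which mpirun is actually on the PATH, and its version:
command -v mpirun >/dev/null 2>&1 && mpirun --version | head -n 1 \
    || echo "mpirun not on PATH"
```

Comparing WM_MPLIB against the mpirun actually found on the PATH quickly exposes a mismatch between the MPI OpenFOAM was compiled with and the one the job launches with.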

#25, November 30, 2018, 07:00
uckmhnds (Gazi Yavuz, New Member, joined Apr 2018, 13 posts)
That issue still exists with OF 6.0, but you may not have any problem with the newer 6.2xx releases. The problem is basically that Open MPI divides the CFD case among the processors so it can be solved simultaneously, and a fork()ed process may try to do the same job as a kind of sub-program, which blocks Open MPI from communicating between processors. It is a bug in the interaction between the Open MPI and OpenFOAM environments.
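For context, Open MPI itself emits a warning for exactly this situation ("An MPI process has executed an operation involving a call to the fork() system call..."), particularly on InfiniBand systems with the openib BTL. The MCA parameters below are real Open MPI switches that were commonly suggested as a diagnostic or workaround on older Open MPI versions; they suppress or enable fork support rather than fix an underlying bug, so treat this command line as a sketch only (the solver name and rank count are placeholders):

```
# Illustrative only; whether these flags are safe depends on the cluster setup.
mpirun --mca mpi_warn_on_fork 0 \
       --mca btl_openib_want_fork_support 1 \
       -np 8 <solver> -parallel
```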

#26, December 2, 2018, 20:46
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by uckmhnds View Post
That issue still exists with OF 6.0, but you may not have any problem with the newer 6.2xx releases. The problem is basically that Open MPI divides the CFD case among the processors so it can be solved simultaneously, and a fork()ed process may try to do the same job as a kind of sub-program, which blocks Open MPI from communicating between processors. It is a bug in the interaction between the Open MPI and OpenFOAM environments.
Thank you for letting us know.

Is there a specific bug report, or reports of this to the OF devs?

It is very difficult for me to reproduce: sometimes the simulation crashes and sometimes it doesn't. Changing my pressure solver from GAMG to PCG helps in some cases, but in others it doesn't.

#27, December 4, 2018, 17:12
uckmhnds (Gazi Yavuz, New Member, joined Apr 2018, 13 posts)
Quote:
Originally Posted by fedvasu View Post
Thank you for letting us know.

Is there a specific bug report, or reports of this to the OF devs?

It is very difficult for me to reproduce: sometimes the simulation crashes and sometimes it doesn't. Changing my pressure solver from GAMG to PCG helps in some cases, but in others it doesn't.

Actually, I did not report a bug to OF; I asked my institution to upgrade the software on the clusters. They installed the release OpenFOAM/OpenFoam.org-6.20180918.git, which is still a 6.x version and therefore compatible with the older 6.0 version. I would recommend you do the same; you would not need to change anything in your case folders, as you had to between major versions (e.g. 2.x to 3.x).

#28, January 31, 2019, 15:55
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by uckmhnds View Post
Actually, I did not report a bug to OF; I asked my institution to upgrade the software on the clusters. They installed the release OpenFOAM/OpenFoam.org-6.20180918.git, which is still a 6.x version and therefore compatible with the older 6.0 version. I would recommend you do the same; you would not need to change anything in your case folders, as you had to between major versions (e.g. 2.x to 3.x).
Yeah, thanks. I now need to run my bigger cases; these segfault crashes are random, and no particular remedy or workaround solves them.

What software did your institute upgrade, apart from you installing OF-dev?

#29, January 31, 2019, 16:15
uckmhnds (Gazi Yavuz, New Member, joined Apr 2018, 13 posts)
Quote:
Originally Posted by fedvasu View Post
Yeah, thanks. I now need to run my bigger cases; these segfault crashes are random, and no particular remedy or workaround solves them.

What software did your institute upgrade, apart from you installing OF-dev?

They just installed "OpenFOAM/OpenFoam.org-6.20180918.git". That solved the problem.

#30, January 31, 2019, 16:18
uckmhnds (Gazi Yavuz, New Member, joined Apr 2018, 13 posts)
If you still have the same problem with the new release, please post the error here so that we might find another solution.

#31, January 31, 2019, 19:20
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by uckmhnds View Post
If you still have the same problem with the new release, please post the error here so that we might find another solution.
Unfortunately, I have to install OF myself; I was unable to use the system MPI and the system Intel compiler, so I used GCC and the ThirdParty Open MPI.

I will try to install OF-dev with the Intel compiler and the system MPI.

If I have the same problem (random crashes of my cases), I will post the errors.

It might take a few days, because sometimes the cases crash quickly and sometimes they don't.

I am going to make a detailed report of why this problem persists and ask my cluster admin to install it on my behalf.

#32, January 31, 2019, 19:23
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by uckmhnds View Post
If you still have the same problem with the new release, please post the error here so that we might find another solution.
And could you please tell me about your cluster environment, specifically:

the OS,
the compiler used to compile the latest OF,
the MPI used (system or ThirdParty)?

This would be very helpful for me.

#33, February 6, 2019, 18:15
olesen (Mark Olesen, Senior Member, Location: http://olesenm.github.io/, joined Mar 2009, 882 posts)
Quote:
Originally Posted by fedvasu View Post
And could you please tell me about your cluster environment, specifically:

the OS,
the compiler used to compile the latest OF,
the MPI used (system or ThirdParty)?

This would be very helpful for me.
I don't know if this is related, but there was an issue with Open MPI + fork() on InfiniBand that was fixed around v1706 (or slightly earlier/later).

If you cannot, or do not wish to, change OpenFOAM versions, you can often avoid problems by making certain that all #calc and #code statements in the dictionaries are evaluated before sending things off in parallel; for example, by running one iteration in serial just to ensure that all dynamic code has been compiled.
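olesen's suggestion can be scripted. A minimal sketch, with the solver name and rank count as placeholders; the idea is simply that the short serial run builds any dictionary-embedded dynamic code (under the case's dynamicCode/ directory) before the parallel run touches it:

```
# 1. One short serial run so #calc / #codeStream snippets get compiled.
#    Either temporarily set a tiny endTime in system/controlDict by hand,
#    or use foamDictionary (OpenFOAM 4+), e.g.:
#      foamDictionary -entry endTime -set <one-timestep> system/controlDict
<solver>

# 2. Restore controlDict, then decompose and launch in parallel as usual.
decomposePar -force
mpirun -np 8 <solver> -parallel
```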

#34, February 6, 2019, 18:24: Compiling with IntelMPI and appropriately including impi headers
fedvasu (Member, joined Oct 2013, 82 posts)
The problem arose from compiling OpenFOAM with the ThirdParty Open MPI; compiling with Intel MPI and the system Intel compiler seems to have alleviated it.

Previously I was unable to install using the Intel C++ compiler and Intel MPI because I wasn't including the IMPI headers correctly.

@olesen: I don't have any cases with #calc or #code in my dictionaries.
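For anyone hitting the same header problem: OpenFOAM's etc/bashrc accepts the compiler and MPI choices as overrides when sourced, which is usually the cleanest way to select the Intel toolchain. A sketch; the install path and the module names are placeholders that vary per cluster:

```
# Load the Intel compiler + MPI environment first (site-specific), e.g.:
#   module load intel intel-mpi      # names vary per cluster

# Source OpenFOAM with the matching settings (real etc/bashrc variables):
source /path/to/OpenFOAM-6/etc/bashrc WM_COMPILER=Icc WM_MPLIB=INTELMPI

# Then rebuild:
./Allwmake
```

With WM_MPLIB=INTELMPI the build picks up the MPI headers from the Intel environment rather than from ThirdParty, which is what the include problem above came down to.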

#35, April 30, 2019, 23:15
abdullahbolek (abdul, New Member, joined Feb 2019, 1 post)
I also ran into the same problem when trying to run my case on an HPC system that has OpenFOAM 5.0 installed. I can run the same case on my desktop with OpenFOAM 6.0 without any problem.

#36, May 14, 2019, 22:50
fedvasu (Member, joined Oct 2013, 82 posts)
Quote:
Originally Posted by abdullahbolek View Post
I also ran into the same problem when trying to run my case on an HPC system that has OpenFOAM 5.0 installed. I can run the same case on my desktop with OpenFOAM 6.0 without any problem.
Try reinstalling OF-6 on your cluster with Intel MPI and icc, or with Open MPI and GCC; I think that is the only way you can solve this.
