
Error while running mpirun command in OpenFOAM



Old   March 28, 2017, 07:27
Default Error while running mpirun command in OpenFOAM
  #1
New Member
 
Tarun
Join Date: Jun 2016
Posts: 6
I am getting the following error message while running the mpirun command on a server:

Code:
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
This message appears only when I run the code in parallel. How can I rectify this problem?
Note: I installed OpenFOAM 3.0.1 on the server from source code.

Last edited by wyldckat; June 4, 2017 at 18:15. Reason: Added [CODE][/CODE] markers

Old   April 30, 2017, 12:39
Default
  #2
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Quick answer/question: Have you managed to solve this problem? I only saw your post today.

If not yet, the problem is likely because the wrong MPI toolbox is being used, specifically the mpirun version being used.

If you Google for the following sentence:
Code:
site:cfd-online.com "wyldckat" "OPAL_SUCCESS"
you should find several situations similar to yours and solutions that worked for others.
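
As a first diagnostic (a minimal sketch, assuming a standard OpenFOAM 3.0.1 installation where the usual environment variables are set), you can check whether the mpirun on the PATH actually belongs to the Open MPI build that OpenFOAM was compiled against:
Code:
# Which mpirun is first on the PATH, and which Open MPI version is it?
which mpirun
mpirun --version

# The Open MPI build OpenFOAM expects to use
echo $FOAM_MPI
echo $MPI_ARCH_PATH
echo $OPAL_PREFIX

# The mpirun reported by 'which' should live under $MPI_ARCH_PATH/bin;
# if it points to a system-wide Open MPI instead, the wrong MPI is being used.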

Old   June 3, 2017, 18:26
Default
  #3
New Member
 
Tarun
Join Date: Jun 2016
Posts: 6
Quote:
Originally Posted by wyldckat View Post
Quick answer/question: Have you managed to solve this problem? I only saw your post today.

If not yet, the problem is likely because the wrong MPI toolbox is being used, specifically the mpirun version being used.

If you Google for the following sentence:
Code:
site:cfd-online.com "wyldckat" "OPAL_SUCCESS"
you should find several situations similar to yours and solutions that worked for others.
Hi Bruno,
Sorry for the late reply. I tried my best to rectify the situation with the help of the forum, but was unable to solve the problem.

The mpirun version used is openmpi-1.10.0, the OpenFOAM version is 3.0.1, and the OS on the cluster is CentOS 6.7. I have compressed the installed OpenFOAM and ThirdParty folders and am sending you a Google Drive link to the archive in a PM. Please have a look at my problem and help me as soon as you can (I have to run cases for my thesis and not much time is left!). Eagerly waiting for your reply.

Thanks & Regards
Tarun

Old   June 4, 2017, 20:13
Default
  #4
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Hi Tarun,

Many thanks for sharing your whole installation! I've tested it in a VM with CentOS 6.8 and was able to reproduce the problem.

Furthermore, I've finally figured out how to solve the problem. If you run the following command before trying to run mpirun:
Code:
export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
you should no longer get that error message.

If this solves the problem, then:
  1. Edit the file "$HOME/OpenFOAM/OpenFOAM-3.0.1/etc/config/settings.sh".
  2. Look for these lines:
    Code:
    OPENMPI)
        export FOAM_MPI=openmpi-1.10.0
        # Optional configuration tweaks:
        _foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config/openmpi.sh`
    
        export MPI_ARCH_PATH=$WM_THIRD_PARTY_DIR/platforms/$WM_ARCH$WM_COMPILER/$FOAM_MPI
    
        # Tell OpenMPI where to find its install directory
        export OPAL_PREFIX=$MPI_ARCH_PATH
  3. After the line with "OPAL_PREFIX", add a new line for "OPAL_LIBDIR", so that those two lines look like this:
    Code:
        export OPAL_PREFIX=$MPI_ARCH_PATH
        export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
  4. Save and close the file.
  5. The next time you start a new terminal window/tab and/or activate this OpenFOAM 3.0.1 environment, it should work as intended (a quick check is sketched after this list).
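
A quick way to verify the fix (a sketch, assuming the edited environment has been re-sourced in a fresh terminal):
Code:
# Confirm the new variable is exported
echo $OPAL_LIBDIR

# Minimal Open MPI smoke test: should print the host name twice, without the opal_init error
mpirun -np 2 hostname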


Curiously enough, I had spotted this issue back in 2013: mpicc link doesn't update when moving OF 2.2.1 installation

Best regards,
Bruno

Old   February 12, 2018, 08:58
Default
  #5
New Member
 
Join Date: Sep 2014
Posts: 9
Thank you, Bruno!


On one machine I had the problem and the solution you described fixed it, but on another machine I am getting a quite different problem...

I want to run the same case, but I am getting this error message:

Code:
mpirun has detected an attempt to run as root.
Running at root is *strongly* discouraged as any mistake (e.g., in
defining TMPDIR) or bug can result in catastrophic damage to the OS
file system, leaving your system in an unusable state.

You can override this protection by adding the --allow-run-as-root
option to your cmd line. However, we reiterate our strong advice
against doing so - please do so at your own risk.
My initial guess is that the main issue is that I am using Ubuntu Server.

Every time I want to launch OpenFOAM I have to call the ./startOpenFOAM command; after this I cannot use any mpirun commands.

My workflow is shown below:

Code:
0. gmsh file.geo -3 -o file.msh
1. gmshToFoam file.msh
2. changeDirectory
3. decomposePar
4. mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel
5. reconstructPar -latesttime
6. rm -r proc*
Looking forward to any ideas or reference links. Thank you in advance.
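
An illustrative sketch only (not part of the original post): the message quoted above names the --allow-run-as-root override, which would look like the line below; running the case as a non-root user is the safer route.
Code:
# Override the root check (at your own risk, as the message itself warns)
mpirun --allow-run-as-root -np 4 buoyantBoussinesqSimpleFoam -parallel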

Old   February 13, 2018, 05:15
Default mpirun not found
  #6
New Member
 
noname
Join Date: Feb 2018
Posts: 1
Hey

I am still a beginner user of OpenFOAM. I am trying to run snappyHexMesh in parallel, but this error comes up:

Code:
$ mpirun -np 4 snappyHeymesh -overwrite -parallel
-bash: mpirun: command not found

If anyone can advise me on how to run it in parallel or tell me what the issue is, I would appreciate it.
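
A minimal sketch of the usual first check (the bashrc paths below are only examples and must be adjusted to the actual installation): "mpirun: command not found" usually just means the OpenFOAM/Open MPI environment has not been sourced in the current shell.
Code:
# Source the OpenFOAM environment first (example paths; use the one matching your install)
source /opt/openfoam5/etc/bashrc
# or
# source $HOME/OpenFOAM/OpenFOAM-5.x/etc/bashrc

# mpirun should now be found
which mpirun
mpirun -np 4 snappyHexMesh -overwrite -parallel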


Thanks

Old   February 14, 2018, 08:03
Default
  #7
New Member
 
Join Date: Oct 2017
Location: Germany
Posts: 26
Seriously? Are you really running 'snappyHeymesh' instead of 'snappyHexMesh'?

Old   March 10, 2019, 15:40
Default mpirun has attempted to run as root.
  #8
Member
 
Maries
Join Date: Mar 2010
Location: Cologne, Germany
Posts: 75
Hi,


I got this problem on my Ubuntu machine too. Does anyone have a solution?


I have installed OpenFOAM-v1812 using Docker on my Ubuntu machine, and I am getting this problem.



Thanks in advance.



Quote:
Originally Posted by gamemakerh View Post
[the "mpirun has detected an attempt to run as root" error message quoted in full in post #5 above]
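
An illustrative sketch only (not from the original thread; the image name below is a placeholder): inside a Docker container the shell often runs as root, which is what triggers the mpirun check quoted above. Starting the container as your own user avoids it; otherwise the --allow-run-as-root override shown after post #5 applies.
Code:
# Start the container as the current host user instead of root
# (image name is a placeholder; use the OpenFOAM-v1812 image you actually installed)
docker run -it --user "$(id -u):$(id -g)" -v "$PWD":/case -w /case openfoam-v1812-image bash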

Old   October 31, 2019, 07:13
Default
  #9
New Member
 
Marina
Join Date: Sep 2019
Posts: 6
Hi,
I got the same error while running in parallel. Has anybody found a solution to this problem?
Thanks in advance.






