Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Running, Solving & CFD

error while running mpirun command in openfoam

March 28, 2017, 07:27
error while running mpirun command in openfoam
#1
New Member
 
Tarun
Join Date: Jun 2016
Posts: 6
I am getting the following error message while running the mpirun command on a server:

Code:
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
This message appears only when I run the code in parallel. How can I rectify this problem?
Note: I installed OpenFOAM 3.0.1 on the server from source code.


April 30, 2017, 12:39
#2
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,125
Blog Entries: 39
Quick answer/question: Have you managed to solve this problem? I only saw your post today.

If not, the problem is likely that the wrong MPI toolbox is being used, specifically the wrong mpirun version.

If you Google the following search query:
Code:
site:cfd-online.com "wyldckat" "OPAL_SUCCESS"
you should find several situations similar to yours and solutions that worked for others.
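A quick way to check this diagnosis is to see which mpirun the shell actually resolves. A minimal sketch, assuming the `$FOAM_MPI` and `$MPI_ARCH_PATH` variables that OpenFOAM's etc/bashrc sets (on a machine without the OpenFOAM environment loaded they will simply print empty):

```shell
# Sketch: check which mpirun the current shell actually picks up.
mpirun_path=$(command -v mpirun || true)
echo "mpirun: ${mpirun_path:-not on PATH}"
echo "FOAM_MPI=$FOAM_MPI"
echo "MPI_ARCH_PATH=$MPI_ARCH_PATH"
# The mpirun found above should live under $MPI_ARCH_PATH/bin; if it
# does not, a system-wide MPI is shadowing the ThirdParty build.
case "$mpirun_path" in
    "$MPI_ARCH_PATH"/bin/*) mpi_check="OK: ThirdParty mpirun" ;;
    *)                      mpi_check="WARNING: not the ThirdParty mpirun" ;;
esac
echo "$mpi_check"
```

If the warning branch fires, the environment is picking up a different MPI than the one OpenFOAM was compiled against, which matches the symptom in this thread.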

June 3, 2017, 18:26
#3
New Member
 
Tarun
Join Date: Jun 2016
Posts: 6
Quote: Originally Posted by wyldckat
Hi Bruno,
Sorry for the late reply. I tried my best to rectify the situation with the help of the forum, but I was unable to solve the problem.

The mpirun version used is openmpi-1.10.0, the OpenFOAM version is 3.0.1, and the OS on the cluster is CentOS 6.7. I have compressed the installed OpenFOAM and ThirdParty folders and am sending you the Google Drive link to that file in a PM. Please have a look at my problem and help as soon as you can (I have to run cases for my thesis and there is not much time left!). Eagerly waiting for your reply.

Thanks & Regards
Tarun

June 4, 2017, 20:13
#4
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,125
Blog Entries: 39
Hi Tarun,

Many thanks for sharing your whole installation! I've tested it in a VM with CentOS 6.8 and was able to reproduce the problem.

Furthermore, I've finally figured out how to solve the problem. If you run the following command before trying to run mpirun:
Code:
export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
you should no longer get that error message.

If this solves the problem, then:
  1. Edit the file "$HOME/OpenFOAM/OpenFOAM-3.0.1/etc/config/settings.sh".
  2. Look for these lines:
    Code:
    OPENMPI)
        export FOAM_MPI=openmpi-1.10.0
        # Optional configuration tweaks:
        _foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config/openmpi.sh`
    
        export MPI_ARCH_PATH=$WM_THIRD_PARTY_DIR/platforms/$WM_ARCH$WM_COMPILER/$FOAM_MPI
    
        # Tell OpenMPI where to find its install directory
        export OPAL_PREFIX=$MPI_ARCH_PATH
  3. After the line with "OPAL_PREFIX", add the new line for "OPAL_LIBDIR", so that those two lines look like this:
    Code:
        export OPAL_PREFIX=$MPI_ARCH_PATH
        export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
  4. Save and close the file.
  5. Next time you start a new terminal window/tab and/or activate this OpenFOAM 3.0.1 environment, it should work as intended.
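To confirm the fix took effect, a minimal sketch, assuming the installation path used in this thread (adjust the path to your own OpenFOAM location):

```shell
# Sketch: after editing settings.sh, open a fresh shell, re-source the
# environment, and confirm OPAL_LIBDIR points at a real directory.
foam_bashrc="$HOME/OpenFOAM/OpenFOAM-3.0.1/etc/bashrc"
if [ -f "$foam_bashrc" ]; then
    . "$foam_bashrc"
    echo "OPAL_PREFIX=$OPAL_PREFIX"
    echo "OPAL_LIBDIR=$OPAL_LIBDIR"
    # OPAL_LIBDIR should be an existing directory holding the Open MPI
    # shared libraries (libopen-pal.so and friends).
    [ -d "$OPAL_LIBDIR" ] && echo "OPAL_LIBDIR exists" || echo "OPAL_LIBDIR missing!"
else
    echo "No OpenFOAM 3.0.1 bashrc found at $foam_bashrc"
fi
```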


Curiously enough, I had spotted this issue back in 2013: mpicc link doesn't update when moving OF 2.2.1 installation

Best regards,
Bruno

February 12, 2018, 09:58
#5
New Member
 
Join Date: Sep 2014
Posts: 8
Thank you, Bruno!

On one machine I had the problem and the solution you described fixed it, but on another machine I'm getting a very different problem. I want to run the same case, but I get this error message:

Code:
mpirun has detected an attempt to run as root.
Running at root is *strongly* discouraged as any mistake (e.g., in
defining TMPDIR) or bug can result in catastrophic damage to the OS
file system, leaving your system in an unusable state.

You can override this protection by adding the --allow-run-as-root
option to your cmd line. However, we reiterate our strong advice
against doing so - please do so at your own risk.
My initial guess is that the main issue is that I'm using Ubuntu Server.

Every time I want to launch OpenFOAM I have to call the ./startOpenFOAM command; after this I can't use any mpirun commands.

My workflow is shown below:

Code:
0. gmsh file.geo -3 -o file.msh
1. gmshToFoam file.msh
2. changeDirectory
3. decomposePar
4. mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel
5. reconstructPar -latesttime
6. rm -r proc*
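The override that the Open MPI message describes can be sketched like this; the solver and core count are the ones from the workflow above, and running as root remains strongly discouraged outside of throwaway containers:

```shell
# Sketch: only add --allow-run-as-root when the shell really is root
# (e.g. a minimal Ubuntu server or container with no other user).
cmd="mpirun -np 4 buoyantBoussinesqSimpleFoam -parallel"
if [ "$(id -u)" -eq 0 ]; then
    # Open MPI refuses to start as root without this explicit override.
    cmd="mpirun --allow-run-as-root -np 4 buoyantBoussinesqSimpleFoam -parallel"
fi
echo "$cmd"    # run it with: eval "$cmd"
```

The cleaner alternative is to create an unprivileged user on the server and run the whole workflow as that user, which avoids the warning entirely.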
Looking forward to any ideas or reference links. Thank you in advance.

February 13, 2018, 06:15
mpirun not found
#6
New Member
 
noname
Join Date: Feb 2018
Posts: 1
Hey,

I am still a beginner OpenFOAM user. I am trying to run snappyHexMesh in parallel, but this error comes out:

Code:
$ mpirun -np 4 snappyHeymesh -overwrite -parallel
-bash: mpirun: command not found

If anyone can advise me on how to run it in parallel, or tell me what the issue is, I would appreciate it.
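"command not found" usually means the shell has no MPI on its PATH, typically because the OpenFOAM environment was not sourced first. A sketch of the usual check; the bashrc path is an assumption for a typical source installation of 3.0.1, so adjust the version and directory to yours:

```shell
# Sketch: load the OpenFOAM environment, then confirm mpirun is visible.
if ! command -v mpirun >/dev/null 2>&1; then
    foam_bashrc="$HOME/OpenFOAM/OpenFOAM-3.0.1/etc/bashrc"   # assumed path
    [ -f "$foam_bashrc" ] && . "$foam_bashrc"
fi
if command -v mpirun >/dev/null 2>&1; then
    echo "mpirun found: $(command -v mpirun)"
else
    echo "mpirun still missing: source your OpenFOAM bashrc or install Open MPI"
fi
```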


Thanks

February 14, 2018, 09:03
#7
New Member
 
Join Date: Oct 2017
Location: Germany
Posts: 20
Seriously? Are you really using 'snappyHeymesh' instead of 'snappyHexMesh'?
