Home > Forums > OpenFOAM Running, Solving & CFD

error while running mpirun command in openfoam


March 28, 2017, 07:27   #1
New Member
Tarun (tarunw)
Join Date: Jun 2016
Posts: 6
I am getting the following error message while running the mpirun command on a server:

Code:
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
This message appears only when I run the code in parallel. How can I rectify this problem?
Note: I installed OpenFOAM 3.0.1 on the server from source code.

Last edited by wyldckat; June 4, 2017 at 18:15. Reason: Added [CODE][/CODE] markers

April 30, 2017, 12:39   #2
Super Moderator
Bruno Santos (wyldckat)
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,036
Quick answer/question: Have you managed to solve this problem? I only saw your post today.

If not yet, the problem is likely that the wrong MPI toolbox is being used, specifically the wrong mpirun version.

If you Google for the following search query:
Code:
site:cfd-online.com "wyldckat" "OPAL_SUCCESS"
you should find several situations similar to yours and solutions that worked for others.
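To narrow this down on your own machine, a diagnostic sketch along these lines can show which mpirun is actually being picked up (the environment variables shown are the ones OpenFOAM's etc/bashrc normally sets; "<unset>" in the output means the OpenFOAM environment has not been sourced in that shell):

```shell
#!/bin/sh
# Diagnostic sketch: show which mpirun is on the PATH and which MPI
# toolbox the OpenFOAM environment believes is active.
mpirun_path=$(command -v mpirun || true)
if [ -n "$mpirun_path" ]; then
    echo "mpirun found at: $mpirun_path"
    mpirun --version 2>&1 | head -n 1
else
    echo "mpirun not found on PATH"
fi
# These variables are set by OpenFOAM's etc/bashrc; "<unset>" means the
# OpenFOAM environment has not been sourced in this shell.
echo "FOAM_MPI=${FOAM_MPI:-<unset>}"
echo "MPI_ARCH_PATH=${MPI_ARCH_PATH:-<unset>}"
echo "OPAL_PREFIX=${OPAL_PREFIX:-<unset>}"
```

For a ThirdParty Open MPI build, mpirun should resolve to a path under the ThirdParty platforms directory, not to a system location such as /usr/bin; if it resolves to a system MPI, the wrong toolbox is being used.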

June 3, 2017, 18:26   #3
Tarun (tarunw)
Quote:
Originally Posted by wyldckat View Post
Quick answer/question: Have you managed to solve this problem? [...]
Hi Bruno,
Sorry for the late reply. I tried my best to rectify the situation with the help of the forum, but I was unable to solve the problem.

The mpirun version used is openmpi-1.10.0, the OpenFOAM version is 3.0.1, and the OS on the cluster is CentOS 6.7. I have compressed the installed OpenFOAM and ThirdParty folders and am sending you a Google Drive link to that file in a PM. Please have a look at my problem, and please help as soon as you can (I have to run cases for my thesis and there is not much time left!). Eagerly waiting for your reply.

Thanks & Regards
Tarun

June 4, 2017, 20:13   #4
Bruno Santos (wyldckat)
Hi Tarun,

Many thanks for sharing your whole installation! I've tested it in a VM with CentOS 6.8 and was able to reproduce the problem.

Furthermore, I've finally figured out how to solve the problem. If you run the following command before trying to run mpirun:
Code:
export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
you should no longer get that error message.

If this solves the problem, then:
  1. Edit the file "$HOME/OpenFOAM/OpenFOAM-3.0.1/etc/config/settings.sh".
  2. Look for these lines:
    Code:
    OPENMPI)
        export FOAM_MPI=openmpi-1.10.0
        # Optional configuration tweaks:
        _foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config/openmpi.sh`
    
        export MPI_ARCH_PATH=$WM_THIRD_PARTY_DIR/platforms/$WM_ARCH$WM_COMPILER/$FOAM_MPI
    
        # Tell OpenMPI where to find its install directory
        export OPAL_PREFIX=$MPI_ARCH_PATH
  3. Add after the line with "OPAL_PREFIX" a new line for "OPAL_LIBDIR", so that those two lines look like this:
    Code:
        export OPAL_PREFIX=$MPI_ARCH_PATH
        export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
  4. Save and close the file.
  5. The next time you start a new terminal window/tab and/or activate this OpenFOAM 3.0.1 environment, it should work as intended.
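The steps above can be verified in a fresh shell with a quick sketch like the following (the fallback path is only an example of where a ThirdParty Open MPI 1.10.0 build typically lands, matching the MPI_ARCH_PATH expansion quoted above; your actual path may differ):

```shell
#!/bin/sh
# After sourcing the OpenFOAM environment, OPAL_PREFIX should point at the
# ThirdParty Open MPI install and OPAL_LIBDIR at its lib64 directory.
# The default below is only an example path for illustration.
: "${OPAL_PREFIX:=$HOME/OpenFOAM/ThirdParty-3.0.1/platforms/linux64Gcc/openmpi-1.10.0}"
export OPAL_LIBDIR=$OPAL_PREFIX/lib64/
echo "OPAL_PREFIX=$OPAL_PREFIX"
echo "OPAL_LIBDIR=$OPAL_LIBDIR"
# opal_shmem_base_select fails when Open MPI cannot load its own
# components, so check that the library directory actually exists:
if [ -d "$OPAL_LIBDIR" ]; then
    ls "$OPAL_LIBDIR" | head -n 5
else
    echo "warning: $OPAL_LIBDIR does not exist; check your ThirdParty build"
fi
```

If the directory listing does not show Open MPI's libraries, the ThirdParty build itself is likely incomplete, and the OPAL_LIBDIR workaround alone will not help.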


Curiously enough, I had spotted this issue back in 2013: mpicc link doesn't update when moving OF 2.2.1 installation

Best regards,
Bruno



