CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   OpenFOAM Installation (http://www.cfd-online.com/Forums/openfoam-installation/)
-   -   mpirun not working in parallel with OpenMPI (http://www.cfd-online.com/Forums/openfoam-installation/97301-mpirun-not-working-paralleoe-openmpi.html)

zordiack February 14, 2012 06:48

mpirun not working in parallel with OpenMPI
 
I'm having a lot of trouble getting mpirun to work in parallel with OpenFOAM 2.1.0. I compiled it from ThirdParty-2.1.0 with Allwmake and it compiled without errors. I then managed to run decomposePar on my case without issue, but when I try to invoke mpirun according to the user guide, it gives me the following error:

$ mpirun -np 2 buoyantBoussinesqSimpleFoam -parallel


--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function UPstream::init(int& argc, char**& argv)
in file UPstream.C at line 37.

FOAM exiting



--> FOAM FATAL ERROR:
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

From function UPstream::init(int& argc, char**& argv)
in file UPstream.C at line 37.

FOAM exiting

--------------------------------------------------------------------------
mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.
--------------------------------------------------------------------------

It also gives the same error message with Test-parallel. mpirun seems to be working otherwise, just not with the -parallel option. For example, I can run the following without any problems:

$ mpirun -np 2 Test-parallel

I'm totally stuck with this problem, which seems to be somehow related to the Pstream library. I haven't made any modifications to the files in ThirdParty-2.1.0 or in the OpenFOAM installation. There seem to be others with the same errors, but no solutions that I could find.
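
For reference, a quick sanity check for this symptom (a minimal sketch, assuming the OpenFOAM environment is sourced so the solver is on the PATH) is to see which libPstream.so the dynamic linker actually resolves for the solver. A path ending in dummy/libPstream.so means the dummy library is the one being picked up:

echo $WM_MPLIB    # the configured MPI layer, e.g. OPENMPI or SYSTEMOPENMPI
ldd $(which buoyantBoussinesqSimpleFoam) | grep -i pstream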

Any help is very much appreciated, as my case could really use more than one core.

zordiack February 14, 2012 14:25

Okay, I'm going to reply to this myself because I think I solved it. I'm not exactly sure what helped, but I'm posting what I think did the trick.

1) Make sure that you have correctly set the MPI implementation in $WM_PROJECT_DIR/etc/bashrc (and re-source it afterwards; see the note after these steps)

export WM_MPLIB=OPENMPI

2) Run Allwmake again in the ThirdParty directory (it's probably best to run Allclean first; if you do, don't forget to compile ParaView again :)

ThirdParty-2.1.0$ ./Allwmake

3) Move the dummy libs away from their default location

cd $FOAM_LIBBIN
mv dummy/ dummy__

4) Go to the OpenFOAM sources and recompile Pstream and dummyThirdParty

cd $FOAM_SRC
cd Pstream
./Allwmake

cd $FOAM_SRC
cd dummyThirdParty
./Allwmake
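
A note on step 1 (my own addition; the post above assumes it implicitly): after changing WM_MPLIB in etc/bashrc, re-source the environment in your current shell before recompiling, otherwise wmake still sees the old setting.

source $WM_PROJECT_DIR/etc/bashrc
echo $WM_MPLIB    # should now print OPENMPI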

I did some other fiddling too, but I believe those steps should do the trick :) Feel free to ask if you have been fighting with this too and still can't get mpirun to run in parallel. Mine now runs happily since Pstream was recompiled :)

mpirun -np 2 buoyantBoussinesqSimpleFoam -parallel > log2 &

There are now 2 processes running and both cores are working :)
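
To confirm that both ranks are actually busy, a quick check with standard tools (nothing OpenFOAM-specific; log2 is the log file from the command above):

ps aux | grep buoyantBoussinesqSimpleFoam    # each MPI rank is its own process
tail -f log2                                 # follow the redirected solver output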

Leech February 16, 2012 15:11

You could also use the foamJob script.
I also had problems running mpirun, but the foamJob script sets up the libraries, etc. automatically.
foamJob -s -p <solver>
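
For example, with the solver from earlier in this thread (assuming decomposePar has already been run), the call looks like this; -p runs the case in parallel using the decomposeParDict settings and -s also echoes the output to the screen:

foamJob -s -p buoyantBoussinesqSimpleFoam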

Luchini August 29, 2013 02:38

Thank you,
this also worked for me.

However, I didn't do steps 1 and 2. In step 1 I left
export WM_MPLIB=SYSTEMOPENMPI
and I didn't recompile the ThirdParty library.
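
That route only works if a system Open MPI is actually installed and on the PATH when Pstream is compiled; a quick check with standard commands:

which mpicc mpirun
mpirun --version    # should report something like "mpirun (Open MPI) x.y.z"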

regards.

einatlev September 29, 2013 22:11

Thanks!! This helped me too!!
 
:):):)
Downloaded ThirdParty-2.2.1, compiled it, moved the dummy folder, etc. Now it's working!!

Anjishnu Choudhury September 9, 2014 11:37

Only rebuilding the dummy files (./Allwmake) did the trick for me!! Thanks! :):)

zfaraday February 2, 2015 21:35

Only recompiling Pstream made it work for me!

Thanks,

Alex

stephie March 4, 2015 04:41

Hello,

today I tried your way of solving the problem. Unfortunately, I got a new error message from the program:

Code:

[stephanie:04480] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file ess_singleton_module.c at line 231
[stephanie:04480] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file ess_singleton_module.c at line 140
[stephanie:04480] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file runtime/orte_init.c at line 128
[stephanie:04481] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file ess_singleton_module.c at line 231
[stephanie:04481] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file ess_singleton_module.c at line 140
[stephanie:04481] [[INVALID],INVALID] ORTE_ERROR_LOG: A system-required executable either could not be found or was not executable by this user in file runtime/orte_init.c at line 128
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_ess_set_name failed
  --> Returned value A system-required executable either could not be found or was not executable by this user (-127) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_ess_set_name failed
  --> Returned value A system-required executable either could not be found or was not executable by this user (-127) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: orte_init failed
  --> Returned "A system-required executable either could not be found  or was not executable by this user" (-127) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: orte_init failed
  --> Returned "A system-required executable either could not be found  or was not executable by this user" (-127) instead of "Success" (0)
--------------------------------------------------------------------------
[stephanie:4480] *** An error occurred in MPI_Init
[stephanie:4480] *** on a NULL communicator
[stephanie:4480] *** Unknown error
[stephanie:4480] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
[stephanie:4481] *** An error occurred in MPI_Init
[stephanie:4481] *** on a NULL communicator
[stephanie:4481] *** Unknown error
[stephanie:4481] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
An MPI process is aborting at a time when it cannot guarantee that all
of its peer processes in the job will be killed properly.  You should
double check that everything has shut down cleanly.

  Reason:    Before MPI_INIT completed
  Local host: stephanie
  PID:        4480
--------------------------------------------------------------------------
--------------------------------------------------------------------------
An MPI process is aborting at a time when it cannot guarantee that all
of its peer processes in the job will be killed properly.  You should
double check that everything has shut down cleanly.

  Reason:    Before MPI_INIT completed
  Local host: stephanie
  PID:        4481
--------------------------------------------------------------------------

===================================================================================
=  BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=  EXIT CODE: 1
=  CLEANING UP REMAINING PROCESSES
=  YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================

Perhaps one of you can tell me where I made the mistake. First I opened the bashrc file and changed export WM_MPLIB=SYSTEMOPENMPI into export WM_MPLIB=OPENMPI.

Then I went into ThirdParty and ran ./Allwmake.

Next I did steps 3) and 4) as written.

Did I forget something?

I would be really grateful if someone could help me.

Thank you.

wyldckat March 14, 2015 07:34

Greetings Stephanie,

I've finally managed to take a better look into your question and I think I've understood what the problem is. As far as I can figure out, you followed these instructions: http://openfoamwiki.net/index.php/In...u#Ubuntu_14.04

Therefore, I suggest you try the following steps:
  1. Go into OpenFOAM's "etc" folder:
    Code:

    foam
    cd etc

  2. Create the file "prefs.sh" and put into it the settings you had in the alias. For example, you can do this by running (the resulting file is shown after these steps):
    Code:

    echo export WM_NCOMPPROCS=4 >> prefs.sh
    echo export WM_MPLIB=OPENMPI >> prefs.sh

  3. Then run:
    Code:

    wmSET
    foam
    ./Allwmake > log.make 2>&1

  4. Then go back to your case folder and launch the parallel run you had tried before.
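
For reference, after step 2 the new "prefs.sh" should contain just these two lines (simply the result of the echo commands above):
Code:

    export WM_NCOMPPROCS=4
    export WM_MPLIB=OPENMPI
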
Best regards,
Bruno

stephie March 18, 2015 10:43

Hello Bruno,

the installation of OpenFOAM on Ubuntu 14.04 was done by a friend of mine, who followed a video from YouTube.
I'm deeply grateful for your help. I tried the code you posted and it worked :)

[Moderator note: text missing here has been copied to a new thread: http://www.cfd-online.com/Forums/ope...out-range.html]

Thank you for your help,
best regards,

Stephie

