[OpenFOAM.com] Pstream library error in parallel mode
January 13, 2010, 11:57   #1
Pstream library error in parallel mode
Patrick Begou (begou)
New Member
Join Date: Mar 2009
Location: Grenoble, France
Posts: 17
Hi,

I've successfully compiled OpenFOAM 1.6.x with the Intel C++ compiler on an SGI Altix, using the SGI MPT implementation of MPI.

I've run a small sequential test of simpleFoam in the pitzDaily directory.

Then I decomposed the case to run it on 2 processors, but launching
mpirun -np 2 simpleFoam -parallel
gives an error message:

MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: --> FOAM FATAL ERROR:
MPI: calcul9sv4: 0x57f000004b093bc7: Trying to use the dummy Pstream library.
MPI: calcul9sv4: 0x57f000004b093bc7: This dummy library cannot be used in parallel mode
MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: From function Pstream::init(int& argc, char**& argv)
MPI: calcul9sv4: 0x57f000004b093bc7: in file Pstream.C at line 40.
MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: FOAM exiting


Maybe I've done something wrong! At compile time the loader was looking for
libPstream.so in ./lib/linuxIA64IccDPOpt/, but I found that this library was only in ./lib/linuxIA64IccDPOpt/dummy/,
so I added a link in ./lib/linuxIA64IccDPOpt/ to allow the compilation to finish.
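
For reference, the workaround amounts to roughly the following (a sketch from memory, run from the OpenFOAM installation root; the exact commands may have differed):

Code:
cd lib/linuxIA64IccDPOpt
ln -s dummy/libPstream.so libPstream.so   # puts the dummy Pstream where the loader expects the real one

In hindsight, this is exactly why the solvers end up loading the dummy Pstream at run time instead of the MPI version.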

Thanks for your help

Patrick

February 2, 2010, 11:21   #2
Patrick Begou (begou)
New Member
Join Date: Mar 2009
Location: Grenoble, France
Posts: 17
Problem solved.

Allwmake had not compiled the parallel part of Pstream. I had added a new MPI configuration for MPT, which is the MPI implementation provided by SGI on my Altix server, and of course I called this new configuration MPT.
But the script used to build Pstream tests for the presence of the three letters "MPI" in the name of the MPI configuration (the $WM_MPLIB variable), as shown in the sketch below.

The solution is either to:
- call the new configuration MPIsomething, or
- remove this test in src/Pstream/Allwmake
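
For context, the test in question is a shell case statement; paraphrased, the relevant part of src/Pstream/Allwmake in 1.6.x looks roughly like this:

Code:
# the dummy library is always built
wmake libso dummy

# the real MPI library is only built if $WM_MPLIB contains the letters "MPI"
case "$WM_MPLIB" in
*MPI*)
    wmake libso mpi
    ;;
esac

Since my configuration was named MPT, the *MPI*) branch never matched and only the dummy library was built.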

September 28, 2011, 02:09   #3
Similar Pstream Error
deji
Senior Member
Join Date: Sep 2009
Posts: 199
Hello Foamers. I am getting a similar error when I attempt to run OpenFOAM 1.6.x in parallel as well. I did check, and Pstream/mpi was compiled. Can anyone give me any advice as to why I would be getting such an error from OpenFOAM? Please help.

January 14, 2015, 03:37   #4
Heliana Cardenas (heliana60)
Member
Join Date: Jul 2013
Posts: 30
Hi guys,

I know this post is a bit old, but I am also having the same problem when I run mpirun. I am getting:

Code:
 --> FOAM FATAL ERROR: 
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

    From function UPstream::init(int& argc, char**& argv)
    in file UPstream.C at line 37.

FOAM exiting
Actually, when I try to compile the Pstream directory I get the following error:

Code:
 + wmake libso dummy
'/home/heliana/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/dummy/libPstream.so' is up to date.
+ case "$WM_MPLIB" in
+ set +x

Note: ignore spurious warnings about missing mpicxx.h headers

wmake libso mpi
SOURCE=UOPwrite.C ;  g++ -m64 -Dlinux64 -DWM_DP -Wall -Wextra -Wno-unused-parameter -Wold-style-cast -Wnon-virtual-dtor -O3  -DNoRepository -ftemplate-depth-100 -DOMPI_SKIP_MPICXX -I/home/heliana/OpenFOAM/ThirdParty-2.3.x/platforms/linux64Gcc/openmpi-1.6.5/include -IlnInclude -I. -I/home/heliana/OpenFOAM/OpenFOAM-2.3.x/src/OpenFOAM/lnInclude -I/home/heliana/OpenFOAM/OpenFOAM-2.3.x/src/OSspecific/POSIX/lnInclude   -fPIC -c $SOURCE -o Make/linux64GccDPOptOPENMPI/UOPwrite.o
UOPwrite.C:29:17: error: mpi.h: No such file or directory
In file included from UOPwrite.C:32:
PstreamGlobals.H:55: error: ‘MPI_Request’ was not declared in this scope
PstreamGlobals.H:55: error: template argument 1 is invalid
PstreamGlobals.H:55: error: invalid type in declaration before ‘;’ token
PstreamGlobals.H:66: error: ‘MPI_Comm’ was not declared in this scope
PstreamGlobals.H:66: error: template argument 1 is invalid
PstreamGlobals.H:66: error: invalid type in declaration before ‘;’ token
PstreamGlobals.H:67: error: ‘MPI_Group’ was not declared in this scope
PstreamGlobals.H:67: error: template argument 1 is invalid
PstreamGlobals.H:67: error: invalid type in declaration before ‘;’ token
UOPwrite.C: In static member function ‘static bool Foam::UOPstream::write(Foam::UPstream::commsTypes, int, const char*, std::streamsize, int, Foam::label)’:
UOPwrite.C:77: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:80: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:81: error: ‘MPI_Bsend’ was not declared in this scope
UOPwrite.C:97: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:100: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:101: error: ‘MPI_Send’ was not declared in this scope
UOPwrite.C:113: error: ‘MPI_Request’ was not declared in this scope
UOPwrite.C:113: error: expected ‘;’ before ‘request’
UOPwrite.C:119: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:122: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:123: error: ‘request’ was not declared in this scope
UOPwrite.C:124: error: ‘MPI_Isend’ was not declared in this scope
UOPwrite.C:131: error: request for member ‘size’ in ‘Foam::PstreamGlobals::outstandingRequests_’, which is of non-class type ‘int’
UOPwrite.C:135: error: request for member ‘append’ in ‘Foam::PstreamGlobals::outstandingRequests_’, which is of non-class type ‘int’
make: *** [Make/linux64GccDPOptOPENMPI/UOPwrite.o] Error 1
I am running this on a cluster, so that might be the reason, but I can't figure out what the problem is.

Thank you for any comments,

Heliana


January 17, 2015, 13:37   #5
Bruno Santos (wyldckat)
Retired Super Moderator
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Greetings Heliana,

The problem is implied by this error message:
Code:
UOPwrite.C:29:17: error: mpi.h: No such file or directory
This means that an installation of Open-MPI is not being found where you configured OpenFOAM to use it.

Run the following commands to diagnose what settings you're currently using:
Code:
echo $FOAM_MPI

echo $MPI_ARCH_PATH

ls -l $MPI_ARCH_PATH

which mpirun
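
For comparison, on a working installation that uses the ThirdParty Open-MPI build, the output should look roughly like this (the example paths are only inferred from the compiler command in your log, not your actual settings):

Code:
openmpi-1.6.5
/home/heliana/OpenFOAM/ThirdParty-2.3.x/platforms/linux64Gcc/openmpi-1.6.5
(a normal directory listing, containing bin, include, lib and share)
/home/heliana/OpenFOAM/ThirdParty-2.3.x/platforms/linux64Gcc/openmpi-1.6.5/bin/mpirun

If $MPI_ARCH_PATH points to a folder that does not exist, or mpirun resolves to a different MPI installation, that is where the problem lies.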
Best regards,
Bruno

January 20, 2015, 03:15   #6
Pstream library error in parallel mode
Heliana Cardenas (heliana60)
Member
Join Date: Jul 2013
Posts: 30
Hi Bruno!

Thank you so much! I checked, and the OpenFOAM settings were trying to use an earlier version of Open-MPI than the one downloaded by ThirdParty. I fixed that and it compiled.

Heliana

[ Moderator note: the full post that was made here was moved to: http://www.cfd-online.com/Forums/ope...escriptor.html - only the relevant answer was left here. ]


March 3, 2015, 12:45   #7
mpi.h: No such file or directory: help...
Uyan
Member
Join Date: Feb 2014
Posts: 62
Dear Bruno and Heliana,

I am trying to install OpenFOAM-2.2.x on a cluster running Bull Linux 6. I can get it to install and work, but only on a single core.

I came across the error
Code:
This dummy library cannot be used in parallel mode
and then I came to this thread. When I wanted to compile Pstream, I got:

Code:
UOPwrite.C:29:17: fatal error: mpi.h: No such file or directory
 #include "mpi.h"
--
As for Bruno's diagnostic questions, I get:
q1)
Code:
   echo $FOAM_MPI
openmpi-1.6.3
q2)
Code:
 echo $MPI_ARCH_PATH
/xxx/OpenFOAM/ThirdParty-2.2.x/platforms/linux64Gcc/openmpi-1.6.3
q3)
Code:
ls -l $MPI_ARCH_PATH
ls: cannot access /xxx/OpenFOAM/ThirdParty-2.2.x/platforms/linux64Gcc/openmpi-1.6.3: No such file or directory
q4)
Code:
which mpirun
/opt/mpi/bullxmpi/1.1.17.1/bin/mpirun
So I think q3 is the cause here, but I have no idea how to solve this problem.
I know Bruno is an expert on this kind of problem. Please, can any one of you help me out here?

Thank you very much.

March 3, 2015, 14:59   #8
Solved
Uyan
Member
Join Date: Feb 2014
Posts: 62
Oh, this is embarrassing: I found the answer by myself.

Change the $WM_PROJECT_DIR/etc/bashrc file:

Code:
export WM_MPLIB=SYSTEMOPENMPI
Then change the $WM_PROJECT_DIR/etc/config/settings.sh file:

Code:
export FOAM_MPI=openmpi-system
Then sourcing the etc/bashrc file again and recompiling Pstream solved my problem:

Code:
cd $FOAM_SRC
cd Pstream
./Allwmake
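
As a quick sanity check (not strictly required, and assuming the standard OpenFOAM environment scripts), after re-sourcing, the variables should now point at the system MPI:

Code:
source $WM_PROJECT_DIR/etc/bashrc
echo $WM_MPLIB        # should now print SYSTEMOPENMPI
echo $FOAM_MPI        # should now print openmpi-system
which mpirun          # should resolve to the system Open-MPI, not a non-existent ThirdParty path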

June 18, 2015, 09:45   #9
Installation error
Cluap
New Member
Join Date: Sep 2014
Posts: 11
Hello !

I have a problem kind of similar to yours.

I'm trying to install OpenFOAM-dev on Ubuntu 12.04, but when I launch Allwmake I get the following error:

Code:
UOPwrite.C:29:17: fatal error: mpi.h: No such file or directory
compilation aborted

I've tried to change the $WM_PROJECT_DIR/etc/bashrc file and the $WM_PROJECT_DIR/etc/config/settings.sh, but it doesn't change anything!

Any help would be really appreciated.

Thanks in advance,

Paul

June 21, 2015, 15:46   #10
Bruno Santos (wyldckat)
Retired Super Moderator
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,975
Blog Entries: 45
Greetings Paul,

OK... I see that you've asked this here as well; the original question was posted in the bug tracker: http://www.openfoam.org/mantisbt/view.php?id=1752

Quoting myself from there:
Quote:
Looks like Open-MPI wasn't built or isn't installed. You can also try running the following command for ascertaining which MPI toolbox you have installed:

mpirun --version

Beyond this, you haven't provided enough details to work with, but my guess is that:
- either you used the ThirdParty package from 2.4.0;
- or used the ThirdParty-dev repository.

Either way, if you check the page for the ThirdParty-dev repository, you'll see what 3rd party packages need to be downloaded and from where: https://github.com/OpenFOAM/ThirdParty-dev/
Quote:
Simply download the files and unpack them directly in the ThirdParty-dev folder. For example, you can do these steps by running:

cd $WM_THIRD_PARTY_DIR
wget https://gforge.inria.fr/frs/download...h_6.0.3.tar.gz
tar -xzf scotch_6.0.3.tar.gz
Please report back on the bug tracker if you have managed to install OpenFOAM-dev or not.

Best regards,
Bruno

June 23, 2016, 22:29   #11
Pstream library error in parallel mode
Harrif (harrif)
New Member
Join Date: Feb 2011
Location: National University of Singapore
Posts: 2
Dear all,

I know this is quite an old thread, I apologise for bringing this up again.

I encountered the same problem when building OpenFOAM 2.3.1 on an HPC cluster (CentOS 6.3) at my university. I followed closely the instructions for CentOS 6.5 as listed here by Bruno: http://openfoamwiki.net/index.php/Installation/Linux/OpenFOAM-2.3.1/CentOS_SL_RHEL#CentOS_5.10. However, since I do not have root access, I started from step #5. Because I skipped the first four steps, I could not include this part: module load openmpi-x86_64 || export PATH=$PATH:/usr/lib64/openmpi/bin. Hence I used the default system OpenMPI, with the output listed below.

I had to build Gcc according to the steps and used that as the compiler for OpenFOAM, with success, except for the parallel communication. The error I mentioned appears when running mpirun with -parallel: "This dummy library cannot be used in parallel mode". mpirun itself can be executed. And when Pstream is compiled, the same error as previously reported occurs: UOPwrite.C:29:17: fatal error: mpi.h: No such file or directory
#include "mpi.h".

I tried many of the suggestions I dug up from the cfd-online forum, but so far I have had no success. This is the output when executing Bruno's diagnostic commands:

echo $FOAM_MPI
openmpi-system

echo $MPI_ARCH_PATH
SYSTEMOPENMPI

ls -l $MPI_ARCH_PATH
/app1/centos6.3/gnu/mvapich2-1.9

which mpirun
/app1/centos6.3/gnu/mvapich2-1.9/bin/mpirun

Please let me know if you know what went wrong with my installation. Appreciate your help.

July 17, 2017, 07:12   #12
Ali Noaman Ibrahim (alinuman15)
Member
Join Date: Sep 2015
Location: US_Chicago
Posts: 97
Quote:
Originally Posted by harrif (post #11 above)
Hi harrif,
I wonder if you have solved your problem. I have met exactly the same problem and have no clue how to solve it.

July 31, 2023, 11:40   #13
Similar error
Francesca (Frensis)
New Member
Join Date: Jul 2023
Location: Milan, Italy
Posts: 2
Hi, I get a similar error while trying to run my simulation on a cluster, in parallel on several cores.


Code:
--> FOAM FATAL ERROR: (openfoam-2212)
The dummy Pstream library cannot be used in parallel mode


    From static bool Foam::UPstream::init(int&, char**&, bool)
    in file UPstream.C at line 49.

FOAM exiting

I also tried running the commands suggested earlier:
Quote:
Run the following commands to diagnose what settings you're currently using:
Code:
echo $FOAM_MPI

echo $MPI_ARCH_PATH

ls -l $MPI_ARCH_PATH

which mpirun

For the first two commands I get no output, the third one simply lists all the files in my current folder, and when running "which mpirun" I get:

/usr/bin/which: no mpirun in (/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/ibutils/bin:/opt/sge/bin/lx-amd64:/opt/dell/srvadmin/bin:/home/mecc/fpal/.local/bin:/home/mecc/fpal/bin)


It can't be an installation issue, as the cluster works fine for many other users.
Any suggestions on how to solve this?

Thanks!
