Pstream library error in parallel mode

January 13, 2010, 12:57
Pstream library error in parallel mode
  #1
New Member
 
Patrick Begou
Join Date: Mar 2009
Location: Grenoble, France
Posts: 17
Hi,

I've successfully compiled OpenFOAM 1.6.x with the Intel C++ compiler on an SGI Altix, using the SGI MPT implementation of MPI.

I've run a small sequential test of simpleFoam in the pitzDaily directory.

Then I decomposed the case to run it on 2 processors, but launching
mpirun -np 2 simpleFoam -parallel
gives this error message:

MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: --> FOAM FATAL ERROR:
MPI: calcul9sv4: 0x57f000004b093bc7: Trying to use the dummy Pstream library.
MPI: calcul9sv4: 0x57f000004b093bc7: This dummy library cannot be used in parallel mode
MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: From function Pstream::init(int& argc, char**& argv)
MPI: calcul9sv4: 0x57f000004b093bc7: in file Pstream.C at line 40.
MPI: calcul9sv4: 0x57f000004b093bc7:
MPI: calcul9sv4: 0x57f000004b093bc7: FOAM exiting
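For reference, the usual parallel workflow for a tutorial case such as pitzDaily is roughly the following (a sketch only; it assumes the case was copied to a run directory and that system/decomposeParDict requests 2 subdomains):

Code:
cd $FOAM_RUN/pitzDaily                # assumed location of the copied tutorial case
decomposePar                          # split mesh and fields into processor0/, processor1/
mpirun -np 2 simpleFoam -parallel     # run both ranks with the -parallel flag
reconstructPar                        # merge the processor results after the run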


Maybe I've done something wrong! At compile time the loader was looking for
libPstream.so in ./lib/linuxIA64IccDPOpt/, but I found that this library was only in ./lib/linuxIA64IccDPOpt/dummy/,
so I added a link in ./lib/linuxIA64IccDPOpt/ to allow the compilation to finish.

Thanks for your help

Patrick

February 2, 2010, 12:21
  #2
New Member
 
Patrick Begou
Join Date: Mar 2009
Location: Grenoble, France
Posts: 17
Problem solved.

Allwmake had not compiled the parallel part of Pstream. I had added a new MPI configuration for MPT, which is the MPI implementation provided by SGI on my Altix server, and naturally I had called this new configuration MPT.
But the script used to build Pstream tests for the presence of the three letters "MPI" in the name of the MPI configuration (the $WM_MPLIB variable).

The solution is either:
- call the new configuration MPIsomething, or
- remove this test in src/Pstream/Allwmake.
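For anyone hitting the same thing: the test in src/Pstream/Allwmake is roughly of the following form (a sketch of the 1.6.x script, not a verbatim copy), which is why a $WM_MPLIB value that does not contain "MPI" only ever produces the dummy library:

Code:
#!/bin/sh
wmake libso dummy        # the serial stub is always built

case "$WM_MPLIB" in      # the real MPI transport is built only if the name contains "MPI"
*MPI*)
    wmake libso mpi
    ;;
esac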

September 28, 2011, 02:09
Similar Pstream Error
  #3
Senior Member
 
n/a
Join Date: Sep 2009
Posts: 198
Hello Foamers. I am getting a similar error when I attempt to run OpenFOAM 1.6.x in parallel. I did check, and Pstream/mpi was compiled. Can anyone give me any advice as to why I would be getting such an error from OpenFOAM? Please help.

January 14, 2015, 04:37
  #4
New Member
 
Heliana Cardenas
Join Date: Jul 2013
Posts: 17
Hi guys,

I know this post is a bit old, but I am also having the same problem when I run mpirun. I am getting:

Code:
 --> FOAM FATAL ERROR: 
Trying to use the dummy Pstream library.
This dummy library cannot be used in parallel mode

    From function UPstream::init(int& argc, char**& argv)
    in file UPstream.C at line 37.

FOAM exiting
In fact, when I try to compile the Pstream directory I get the following error:

Code:
 + wmake libso dummy
'/home/heliana/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/dummy/libPstream.so' is up to date.
+ case "$WM_MPLIB" in
+ set +x

Note: ignore spurious warnings about missing mpicxx.h headers

wmake libso mpi
SOURCE=UOPwrite.C ;  g++ -m64 -Dlinux64 -DWM_DP -Wall -Wextra -Wno-unused-parameter -Wold-style-cast -Wnon-virtual-dtor -O3  -DNoRepository -ftemplate-depth-100 -DOMPI_SKIP_MPICXX -I/home/heliana/OpenFOAM/ThirdParty-2.3.x/platforms/linux64Gcc/openmpi-1.6.5/include -IlnInclude -I. -I/home/heliana/OpenFOAM/OpenFOAM-2.3.x/src/OpenFOAM/lnInclude -I/home/heliana/OpenFOAM/OpenFOAM-2.3.x/src/OSspecific/POSIX/lnInclude   -fPIC -c $SOURCE -o Make/linux64GccDPOptOPENMPI/UOPwrite.o
UOPwrite.C:29:17: error: mpi.h: No such file or directory
In file included from UOPwrite.C:32:
PstreamGlobals.H:55: error: ‘MPI_Request’ was not declared in this scope
PstreamGlobals.H:55: error: template argument 1 is invalid
PstreamGlobals.H:55: error: invalid type in declaration before ‘;’ token
PstreamGlobals.H:66: error: ‘MPI_Comm’ was not declared in this scope
PstreamGlobals.H:66: error: template argument 1 is invalid
PstreamGlobals.H:66: error: invalid type in declaration before ‘;’ token
PstreamGlobals.H:67: error: ‘MPI_Group’ was not declared in this scope
PstreamGlobals.H:67: error: template argument 1 is invalid
PstreamGlobals.H:67: error: invalid type in declaration before ‘;’ token
UOPwrite.C: In static member function ‘static bool Foam::UOPstream::write(Foam::UPstream::commsTypes, int, const char*, std::streamsize, int, Foam::label)’:
UOPwrite.C:77: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:80: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:81: error: ‘MPI_Bsend’ was not declared in this scope
UOPwrite.C:97: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:100: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:101: error: ‘MPI_Send’ was not declared in this scope
UOPwrite.C:113: error: ‘MPI_Request’ was not declared in this scope
UOPwrite.C:113: error: expected ‘;’ before ‘request’
UOPwrite.C:119: error: ‘MPI_BYTE’ was not declared in this scope
UOPwrite.C:122: error: invalid types ‘int[const Foam::label]’ for array subscript
UOPwrite.C:123: error: ‘request’ was not declared in this scope
UOPwrite.C:124: error: ‘MPI_Isend’ was not declared in this scope
UOPwrite.C:131: error: request for member ‘size’ in ‘Foam::PstreamGlobals::outstandingRequests_’, which is of non-class type ‘int’
UOPwrite.C:135: error: request for member ‘append’ in ‘Foam::PstreamGlobals::outstandingRequests_’, which is of non-class type ‘int’
make: *** [Make/linux64GccDPOptOPENMPI/UOPwrite.o] Error 1
I am running this on a cluster, so that might be the reason, but I can't figure out what it is.

Thank you for any comments,

Heliana


January 17, 2015, 14:37
  #5
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,935
Blog Entries: 34
Greetings Heliana,

The problem is indicated by this error message:
Code:
UOPwrite.C:29:17: error: mpi.h: No such file or directory
This means that an installation of Open-MPI is not being found where you configured OpenFOAM to use it.

Run the following commands to diagnose what settings you're currently using:
Code:
echo $FOAM_MPI

echo $MPI_ARCH_PATH

ls -l $MPI_ARCH_PATH

which mpirun
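In particular, $MPI_ARCH_PATH should point to a directory that actually exists and contains the MPI headers and binaries, and mpirun should resolve to the MPI installation that OpenFOAM was configured to use. A quick check along those lines (a suggestion only; the paths assume a ThirdParty-style Open-MPI layout):

Code:
ls $MPI_ARCH_PATH/include/mpi.h
ls $MPI_ARCH_PATH/bin/mpirun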
Best regards,
Bruno

January 20, 2015, 04:15
Pstream library error in parallel mode
  #6
New Member
 
Heliana Cardenas
Join Date: Jul 2013
Posts: 17
Hi Bruno!

Thank you so much! I checked, and the OpenFOAM settings were trying to compile against an earlier version of the Open-MPI that ThirdParty had downloaded. I fixed that and it compiled.

Heliana
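A quick way to spot that kind of version mismatch is to compare what the settings select against what is actually present under ThirdParty, for instance (a suggested check; $WM_THIRD_PARTY_DIR and the linux64Gcc platform directory are the usual defaults, adjust as needed):

Code:
echo $FOAM_MPI $MPI_ARCH_PATH
ls $WM_THIRD_PARTY_DIR/platforms/linux64Gcc/ | grep -i openmpi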

[ Moderator note: the full post that was made here has been moved to this thread: Problems using a qsub (PBS) job-scheduler: "no access to tty (Bad file descriptor)" - only the relevant answer was left here. ]


March 3, 2015, 13:45
mpi.h: No such file or directory: help...
  #7
New Member
 
Join Date: Feb 2014
Posts: 22
Dear Bruno and Heliana,

I am trying to install OpenFOAM-2.2.x on a cluster running Bull Linux 6. I can get it to install and work, but only on a single core.

I came across the error
Code:
This dummy library cannot be used in parallel mode
and that led me to this thread. When I then tried to compile Pstream, I got:

Code:
UOPwrite.C:29:17: fatal error: mpi.h: No such file or directory
 #include "mpi.h"
--
As for Bruno's diagnostic questions, I get:
q1)
Code:
   echo $FOAM_MPI
openmpi-1.6.3
q2)
Code:
 echo $MPI_ARCH_PATH
/xxx/OpenFOAM/ThirdParty-2.2.x/platforms/linux64Gcc/openmpi-1.6.3
q3)
Code:
ls -l $MPI_ARCH_PATH
ls: cannot access /xxx/OpenFOAM/ThirdParty-2.2.x/platforms/linux64Gcc/openmpi-1.6.3: No such file or directory
q4)
Code:
which mpirun
/opt/mpi/bullxmpi/1.1.17.1/bin/mpirun
So I think q3 is the cause here, but I have no idea how to solve this problem.
I know Bruno is an expert on these kinds of problems. Please, can any one of you help me out here?

Thank you very much.

March 3, 2015, 15:59
Solved
  #8
New Member
 
Join Date: Feb 2014
Posts: 22
Oh, this is embarrassing: I found the answer by myself.

change the $WM_PROJECT_DIR/etc/bashrc file

Code:
export WM_MPLIB=SYSTEMOPENMPI
then change the $WM_PROJECT_DIR/etc/config/settings.sh

Code:
export FOAM_MPI=openmpi-system
Then source the etc/bashrc file again and recompile Pstream, which solved my problem:

Code:
cd $FOAM_SRC
cd Pstream
./Allwmake
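After the rebuild, one quick way to confirm that the MPI-backed Pstream is in place is to check for the library and the MPI launcher being used (the paths are only where a typical 2.2.x build puts them):

Code:
ls -l $FOAM_LIBBIN/$FOAM_MPI/libPstream.so   # should exist after a successful Pstream build
which mpirun                                 # should match the MPI that OpenFOAM was configured for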
