Home > Forums > OpenFOAM Installation

mpirun fails


October 19, 2011, 15:45   #1
mpirun fails

New Member
Achim Boemelburg
Join Date: Oct 2011
Posts: 12
I'm trying to install OpenFOAM 1.7.1 on a Red Hat EL 5.5 machine. The ThirdParty-1.7.1 build finishes with
========================================
Done ThirdParty Allwmake
========================================

but when I try to run, e.g., mpirun without any arguments, I get:

mpirun: error while loading shared libraries: .../OpenFOAM/ThirdParty-1.7.1/platforms/linux64Gcc/openmpi-1.4.1/lib/libopen-rte.so.0: ELF file OS ABI invalid

Any idea what went wrong here?

October 20, 2011, 06:45   #2

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Greetings Achim and welcome to the forum!

Check your Linux-machine architecture:
Code:
uname -m
If it shows "i686" or similar (i386/486/586), the problem is that you didn't set the variable WM_ARCH_OPTION to 32 in the file "OpenFOAM-1.7.1/etc/bashrc" or "OpenFOAM-1.7.1/etc/cshrc".
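As a quick self-check, something along these lines (just a sketch; the WM_ARCH_OPTION setting above is the actual fix) prints which case applies to your machine:

```shell
# Sketch: report whether the machine is 32-bit or 64-bit, and which
# WM_ARCH_OPTION setting that implies (the variable is the one in
# OpenFOAM-1.7.1/etc/bashrc).
check_arch() {
    case "$(uname -m)" in
        x86_64)
            echo "64-bit machine: the default WM_ARCH_OPTION is fine" ;;
        i?86)
            echo "32-bit machine: set WM_ARCH_OPTION=32 in etc/bashrc" ;;
        *)
            echo "uncommon architecture: check WM_ARCH_OPTION manually" ;;
    esac
}
check_arch
```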

Best regards,
Bruno

October 20, 2011, 13:20   #3
uname -m

New Member
Achim Boemelburg
Join Date: Oct 2011
Posts: 12
Thanks Bruno,

My uname -m output is x86_64.

Starting a trivial MPI hello-world program ends with:
ipath_wait_for_device: The /dev/ipath device failed to appear after 30.0 seconds: Connection timed out

so it seems that InfiniBand (IB) support was not built into Open-MPI correctly.

October 20, 2011, 17:49   #4

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Hi Achim,

Hmm, that's strange... The error message "ELF file OS ABI invalid" usually indicates something like running a 64-bit application on a 32-bit processor, or vice versa, and the solution I described addresses the most common occurrence of this problem.
The other possibility is that you are building an application on x86_64 to run on IA64 (Itanium), or something like that.

As for IB support, you have two options:
  • Use the Open-MPI version that comes with your Linux distribution, which should already be optimized for your machines. For that, "WM_MPLIB=SYSTEMOPENMPI" should do the trick.
  • Build (again) the Open-MPI that comes with OpenFOAM, but this time edit the file "ThirdParty-*/Allwmake", search for the area that builds Open-MPI, and find this block of code:
    Code:
            # Infiniband support
            # if [ -d /usr/local/ofed -a -d /usr/local/ofed/lib64 ]
            # then
            #     configOpt="$configOpt --with-openib=/usr/local/ofed"
            #     configOpt="$configOpt --with-openib-libdir=/usr/local/ofed/lib64"
            # fi
    The leading hash marks ('#') are what need to be removed, so that Open-MPI will be built with IB support.
If you check that "Allwmake" script, you'll see that removing the files "$MPI_ARCH_PATH/lib/libmpi.*" forces it to rebuild Open-MPI with the new options.
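A minimal sketch of that forced rebuild, wrapped as a function (the path layout shown is the usual ThirdParty one and is an assumption; adjust it to your tree):

```shell
# Sketch: deleting $MPI_ARCH_PATH/lib/libmpi.* makes the ThirdParty
# Allwmake script treat Open-MPI as not yet built, so the next run
# rebuilds it with the newly enabled configure options.
force_openmpi_rebuild() {
    mpi_arch_path="$1"   # e.g. "$WM_THIRD_PARTY_DIR/platforms/linux64Gcc/openmpi-1.4.1"
    rm -f "$mpi_arch_path"/lib/libmpi.*
}

# Usage (hypothetical path):
#   force_openmpi_rebuild "$WM_THIRD_PARTY_DIR/platforms/linux64Gcc/openmpi-1.4.1"
#   cd "$WM_THIRD_PARTY_DIR" && ./Allwmake
```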

This might also come in handy:
Quote:
Originally Posted by pkr View Post
When using MPI_reduce, the OpenMPI was trying to establish TCP through a different interface. The problem is solved if the following command is used:
mpirun --mca btl_tcp_if_exclude lo,virbr0 -hostfile machines -np 2 /home/rphull/OpenFOAM/OpenFOAM-1.6/bin/foamExec interFoam -parallel

The above command stops MPI from using the listed interfaces (lo and virbr0 in this case).
Best regards,
Bruno

October 21, 2011, 04:14   #5
open mpi

New Member
Achim Boemelburg
Join Date: Oct 2011
Posts: 12
Thanks Bruno,

Now, the Open-MPI installation on this machine is in a non-standard place, and I don't have root rights. Is there a way to specify the location of the Open-MPI installation?

I have:
/usr/mpi/gcc/openmpi-1.4.3 $ ls -l
total 28
drwxr-xr-x 2 root root 4096 Aug 10 15:59 bin
drwxr-xr-x 2 root root 4096 Aug 10 15:59 etc
drwxr-xr-x 4 root root 4096 Aug 10 15:59 include
drwxr-xr-x 3 root root 4096 Aug 10 15:59 lib64
drwxr-xr-x 2 root root 4096 Aug 10 15:59 openmpi_gcc-1.4.3
drwxr-xr-x 5 root root 4096 Aug 10 15:59 share
drwxr-xr-x 5 root root 4096 Aug 10 15:59 tests

How can I tell openfoam to use this installation?

Regards
Achim

October 21, 2011, 05:21   #6

New Member
Achim Boemelburg
Join Date: Oct 2011
Posts: 12

So far I edited etc/settings.sh:

...
# Communications library
# ~~~~~~~~~~~~~~~~~~~~~~

unset MPI_ARCH_PATH MPI_HOME

case "$WM_MPLIB" in
OPENMPI)
# mpi_version=openmpi-1.4.1
mpi_version=openmpi-1.4.3
# export MPI_ARCH_PATH=$WM_THIRD_PARTY_DIR/platforms/$WM_ARCH$WM_COMPILER/$mpi_version
export MPI_ARCH_PATH=/usr/mpi/gcc/openmpi-1.4.3

# Tell OpenMPI where to find its install directory
...

and etc/bashrc:

# WM_MPLIB = SYSTEMOPENMPI | OPENMPI | MPICH | MPICH-GM | HPMPI | MPI | QSMPI
# : ${WM_MPLIB:=OPENMPI}; export WM_MPLIB
: ${WM_MPLIB:=SYSTEMOPENMPI}; export WM_MPLIB

and it seems to work! (I can currently test on only one node; I'll get more during the weekend.)
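For anyone repeating this, a small sanity check after sourcing the modified etc/bashrc (just echoes; the variable names are the ones edited above):

```shell
# Sketch: print the MPI-related settings the current shell ended up
# with, plus which mpirun is first on the PATH.
show_mpi_setup() {
    echo "WM_MPLIB=${WM_MPLIB:-<unset>}"
    echo "MPI_ARCH_PATH=${MPI_ARCH_PATH:-<unset>}"
    command -v mpirun || echo "mpirun: not on PATH"
}
show_mpi_setup
```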

Thanks a lot Bruno!



October 21, 2011, 05:29   #7

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Hi Achim,

You're welcome!
But it's interesting: you made a redundant fix. By choosing SYSTEMOPENMPI, OpenFOAM automatically tries to detect where Open-MPI is installed. But since you also modified the OPENMPI entry in "settings.sh", you can now probably use either mode, be it WM_MPLIB=OPENMPI or WM_MPLIB=SYSTEMOPENMPI.

Good luck with setting up all nodes!
Bruno

August 22, 2013, 12:32   #8

Senior Member
Jonathan
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Hi guys,

I know this is an old thread, but I have a related question:

1) I want to run with IB support on a cluster. Is it enough to simply set WM_MPLIB=SYSTEMOPENMPI in $WM_PROJECT_DIR/etc/bashrc, where the system Open-MPI has been compiled with IB support, or do I need to completely recompile OpenFOAM as well?

I know that if I want to use WM_MPLIB=OPENMPI with IB support from ThirdParty, I need to change some switches and recompile; but if I want to use an already-compiled version of Open-MPI with IB support, I shouldn't have to recompile anything, should I?

Many thanks and best regards
Jonathan



August 22, 2013, 12:38   #9

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Greetings Jonathan,

Change the option to "WM_MPLIB=SYSTEMOPENMPI", save the file, start a new terminal, and run Allwmake once again. OpenFOAM's build system will only build the missing pieces, such as "libptscotch" and Pstream. Once it's done building, it should be ready to go!
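The edit itself is a one-liner; here is a sketch, assuming the bashrc still has the stock default line `: ${WM_MPLIB:=OPENMPI}; export WM_MPLIB` (as in the versions quoted earlier in this thread):

```shell
# Sketch: switch the default WM_MPLIB in a given etc/bashrc from
# OPENMPI to SYSTEMOPENMPI. Operates on the stock line:
#   : ${WM_MPLIB:=OPENMPI}; export WM_MPLIB
set_system_openmpi() {
    bashrc="$1"
    sed -i 's/\(WM_MPLIB:=\)OPENMPI/\1SYSTEMOPENMPI/' "$bashrc"
}

# Usage (hypothetical): set_system_openmpi "$WM_PROJECT_DIR/etc/bashrc"
# then open a new terminal, source the file, and rerun Allwmake.
```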

Best regards,
Bruno

August 22, 2013, 12:44   #10

Senior Member
Jonathan
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Hi Bruno,

Firstly, thanks for your reply and your very extensive help to everyone on the forum; it's much appreciated!

May I just clarify, then:

1) So even though the system Open-MPI has already been compiled by the cluster administrators with IB support, I still need to run Allwmake so that OpenFOAM builds some additional libraries?

Is the $WM_MPLIB variable not just an environment variable that tells OpenFOAM to use the system MPI library at runtime, rather than the one built during the OpenFOAM install?

Many thanks
Best regards
Jonathan


August 22, 2013, 12:51   #11

Senior Member
Jonathan
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Hi Bruno,

So, I just answered my own question (and confirmed your post).

Yes, there are missing libraries that you need to build when you want to use SYSTEMOPENMPI.

Out of interest, I wonder why Pstream etc. are not needed when using the bundled MPI libraries. I don't know much about this, so I'm just learning as much as possible at this stage.

Anyway, thanks for your help, and apologies for the double post!

best regards
Jonathan


August 22, 2013, 13:07   #12

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
The idea is that any MPI toolbox you want to build OpenFOAM with needs "Pstream", "libptscotch" and similar libraries built accordingly, because these are the libraries that link directly to each MPI toolbox. OpenFOAM was designed with exactly this situation in mind, namely a single installation that can operate with any number of MPI toolboxes... albeit only one MPI per shell environment.

If you run the following command, you'll see what I mean:
Code:
ls -ld $FOAM_LIBBIN

August 23, 2013, 06:24   #13

Senior Member
Jonathan
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Hi Bruno,

Thanks very much for that explanation; I see it now.

I wonder if you could help me with my build, though?

I ran Allwmake as instructed, but no libraries were produced in:

/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system

Attached is the first part of my build log. As you can see, it can't find ptscotch.h, which is strange: the file does exist, just not where OpenFOAM apparently expects it to be. That is odd, since I simply downloaded and unzipped the source code from the website.

My previous OPENMPI build went along just fine, for what it's worth...

EDIT: I am no longer sure whether my first compile with OPENMPI went OK: in my /lib/openmpi-1.5.3 directory I only have libPstream.so and libptscotchDecomp.so, not libptscotch.so as Bruno suggested. However, I have been able to run jobs over 30 to 60 processors on the cluster, and they ran fine!

Code:
make: Nothing to be done for `all'.

========================================
Start ThirdParty Allwmake
========================================

========================================
Build MPI libraries if required

========================================
Build Scotch decomposition library scotch_5.1.11
    /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11
    scotch header in /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11/include
    scotch libs   in /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64GccDPOpt/lib

========================================
Build PTScotch decomposition library scotch_5.1.11 (uses MPI)
    /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11

+ cd scotch_5.1.11/src
+ prefixDIR=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11
+ libDIR=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system
+ incDIR=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11/include/openmpi-system
+ mkdir -p /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11
+ mkdir -p /home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system
+ configOpt='prefix=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11 libdir=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system includedir=/home/jbergh/OpenFOAM/ThirdParty-2.1.1/platforms/linux64Gcc/scotch_5.1.11/include/openmpi-system'
+ '[' -f ../../etc/wmakeFiles/scotch/Makefile.inc.i686_pc_linux2.shlib-OpenFOAM-64 ']'
+ rm -f Makefile.inc
+ ln -s ../../etc/wmakeFiles/scotch/Makefile.inc.i686_pc_linux2.shlib-OpenFOAM-64 Makefile.inc
+ '[' -f Makefile.inc ']'
+ unset configEnv
+ '[' gcc '!=' gcc ']'
+ make realclean
(cd libscotch ;      make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
rm -f *~ *.o lib*.so parser_yy.c parser_ly.h parser_ll.c *scotch.h *scotchf.h y.output dummysizes
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
(cd scotch ;         make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/scotch'
rm -f *~ *.o acpl amk_ccc amk_fft2 amk_grf amk_hy amk_m2 amk_p2 atst gbase gcv *ggath *gmap gmk_hy gmk_m2 gmk_m3 gmk_msh gmk_ub2 gmtst *gord gotst gout *gpart *gscat *gtst mcv mmk_m2 mmk_m3 mord mtst
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/scotch'
(cd libscotchmetis ; make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotchmetis'
rm -f *~ *.o lib*.so
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotchmetis'
rm -f ../bin/* ../include/* ../lib/*
+ make -j 8 ptscotch
(cd libscotch ;      make VERSION=5 RELEASE=1 PATCHLEVEL=10 ptscotch && make ptinstall)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
rm -f *~ *.o lib*.so parser_yy.c parser_ly.h parser_ll.c *scotch.h *scotchf.h y.output dummysizes
make CFLAGS="-O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH" CC="mpicc"    \
                    scotch.h                            \
                    scotchf.h                            \
                    libptscotch.so                        \
                    libscotch.so                            \
                    libptscotcherr.so                        \
                    libptscotcherrexit.so
make[2]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -DSCOTCH_VERSION=5 -DSCOTCH_RELEASE=1 -DSCOTCH_PATCHLEVEL=10 dummysizes.c -o dummysizes -lz -lm -lrt
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph.c -o bdgraph.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_bd.c -o bdgraph_bipart_bd.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_df.c -o bdgraph_bipart_df.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_ex.c -o bdgraph_bipart_ex.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_ml.c -o bdgraph_bipart_ml.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_sq.c -o bdgraph_bipart_sq.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_st.c -o bdgraph_bipart_st.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_zr.c -o bdgraph_bipart_zr.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_check.c -o bdgraph_check.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_gather_all.c -o bdgraph_gather_all.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_store.c -o bdgraph_store.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c comm.c -o comm.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph.c -o dgraph.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph_allreduce.c -o dgraph_allreduce.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph_band.c -o dgraph_band.o
/usr/lib64/gcc/x86_64-suse-linux/4.3/../../../../x86_64-suse-linux/bin/ld: cannot find -lnuma
collect2: ld returned 1 exit status
make[2]: *** [dummysizes] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
make[1]: *** [ptscotch] Error 2
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
make: *** [ptscotch] Error 2
+ make realclean
(cd libscotch ;      make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
rm -f *~ *.o lib*.so parser_yy.c parser_ly.h parser_ll.c *scotch.h *scotchf.h y.output dummysizes
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
(cd scotch ;         make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/scotch'
rm -f *~ *.o acpl amk_ccc amk_fft2 amk_grf amk_hy amk_m2 amk_p2 atst gbase gcv *ggath *gmap gmk_hy gmk_m2 gmk_m3 gmk_msh gmk_ub2 gmtst *gord gotst gout *gpart *gscat *gtst mcv mmk_m2 mmk_m3 mord mtst
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/scotch'
(cd libscotchmetis ; make realclean)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotchmetis'
rm -f *~ *.o lib*.so
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotchmetis'
rm -f ../bin/* ../include/* ../lib/*

    WARNING: required include file 'ptscotch.h' not found!

========================================
Build Tecio
    optional component was not found

========================================
Done ThirdParty Allwmake
========================================

+ wmakePrintBuild -check
no git description found
+ /bin/rm -f OpenFOAM/Make/linux64GccDPOpt/global.C OpenFOAM/Make/linux64GccDPOpt/global.o
+ wmakeLnInclude OpenFOAM
+ wmakeLnInclude OSspecific/POSIX
+ Pstream/Allwmake
+ wmake libso dummy
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/dummy/libPstream.so' is up to date.
+ case "$WM_MPLIB" in
+ set +x

Note: ignore spurious warnings about missing mpicxx.h headers

wmake libso mpi
/usr/lib64/gcc/x86_64-suse-linux/4.3/../../../../x86_64-suse-linux/bin/ld: cannot find -lnuma
collect2: ld returned 1 exit status
make: *** [/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/openmpi-system/libPstream.so] Error 1
+ OSspecific/POSIX/Allwmake
Found <sys/inotify.h>  --  enabling inotify for file monitoring.
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOSspecific.o' is up to date.
+ wmake libso OpenFOAM
SOURCE=global/global.Cver ; sed -e 's!VERSION_STRING!2.1.1!' -e 's!BUILD_STRING!2.1.1-221db2718bbb!' $SOURCE > Make/linux64GccDPOpt/global.C; g++ -m64 -Dlinux64 -DWM_DP -Wall -Wextra -Wno-unused-parameter -Wold-style-cast -Wnon-virtual-dtor -O3  -DNoRepository -ftemplate-depth-100 -IMake/linux64GccDPOpt -IlnInclude -I. -I/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/src/OpenFOAM/lnInclude -I/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/src/OSspecific/POSIX/lnInclude   -fPIC -c Make/linux64GccDPOpt/global.C -o Make/linux64GccDPOpt/global.o
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so' is up to date.
+ wmake libso fileFormats
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libfileFormats.so' is up to date.
+ wmake libso triSurface
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libtriSurface.so' is up to date.
+ wmake libso meshTools
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libmeshTools.so' is up to date.
+ wmake libso edgeMesh
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libedgeMesh.so' is up to date.
+ wmake libso surfMesh
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/libsurfMesh.so' is up to date.
+ parallel/decompose/AllwmakeLnInclude
+ wmakeLnInclude decompositionMethods
+ wmakeLnInclude metisDecomp
+ wmakeLnInclude scotchDecomp
+ wmakeLnInclude ptscotchDecomp
+ dummyThirdParty/Allwmake
+ wmake libso scotchDecomp
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/dummy/libscotchDecomp.so' is up to date.
+ wmake libso ptscotchDecomp
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/dummy/libptscotchDecomp.so' is up to date.
+ wmake libso metisDecomp
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/dummy/libmetisDecomp.so' is up to date.
+ wmake libso MGridGen
'/home/jbergh/OpenFOAM/OpenFOAM-2.1.1/platforms/linux64GccDPOpt/lib/dummy/libMGridGen.so' is up to date.
Many thanks and regards
Jonathan

August 24, 2013, 20:36   #14

Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Hi Jonathan,

"ptscotch" is only needed for performing decomposition during parallel runs, such as when using snappyHexMesh in parallel, where it can shift cells between processors to keep a good load balance. It is not used directly by decomposePar, and it's not usually used by solvers.

As for the error you got, the block of output that matters is this one:
Quote:
Originally Posted by Jonathan View Post
Code:
+ make -j 8 ptscotch
(cd libscotch ;      make VERSION=5 RELEASE=1 PATCHLEVEL=10 ptscotch && make ptinstall)
make[1]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
rm -f *~ *.o lib*.so parser_yy.c parser_ly.h parser_ll.c *scotch.h *scotchf.h y.output dummysizes
make CFLAGS="-O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH" CC="mpicc"    \
                    scotch.h                            \
                    scotchf.h                            \
                    libptscotch.so                        \
                    libscotch.so                            \
                    libptscotcherr.so                        \
                    libptscotcherrexit.so
make[2]: Entering directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -DSCOTCH_VERSION=5 -DSCOTCH_RELEASE=1 -DSCOTCH_PATCHLEVEL=10 dummysizes.c -o dummysizes -lz -lm -lrt
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph.c -o bdgraph.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_bd.c -o bdgraph_bipart_bd.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_df.c -o bdgraph_bipart_df.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_ex.c -o bdgraph_bipart_ex.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_ml.c -o bdgraph_bipart_ml.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_sq.c -o bdgraph_bipart_sq.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_st.c -o bdgraph_bipart_st.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_bipart_zr.c -o bdgraph_bipart_zr.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_check.c -o bdgraph_check.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_gather_all.c -o bdgraph_gather_all.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c bdgraph_store.c -o bdgraph_store.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c comm.c -o comm.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph.c -o dgraph.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph_allreduce.c -o dgraph_allreduce.o
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -fPIC -c dgraph_band.c -o dgraph_band.o
/usr/lib64/gcc/x86_64-suse-linux/4.3/../../../../x86_64-suse-linux/bin/ld: cannot find -lnuma
collect2: ld returned 1 exit status
make[2]: *** [dummysizes] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
make[1]: *** [ptscotch] Error 2
make[1]: Leaving directory `/home/jbergh/OpenFOAM/ThirdParty-2.1.1/scotch_5.1.11/src/libscotch'
make: *** [ptscotch] Error 2
From it, these lines are the most important:
Quote:
Originally Posted by Jonathan View Post
Code:
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED  -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict  -DSCOTCH_PTSCOTCH -DSCOTCH_VERSION=5 -DSCOTCH_RELEASE=1  -DSCOTCH_PATCHLEVEL=10 dummysizes.c -o dummysizes -lz -lm -lrt

...

/usr/lib64/gcc/x86_64-suse-linux/4.3/../../../../x86_64-suse-linux/bin/ld: cannot find -lnuma

...

make[2]: *** [dummysizes] Error 1
It's the linker invoked by mpicc that is complaining: when building the application dummysizes, it cannot find the library "libnuma.so". This is a bit tricky, because the command line should also have stated where that library is located, rather than merely requesting it with "-lnuma".
You can search your system for the library "libnuma.so" and add the path of the folder it is in, by editing the file
Code:
ThirdParty-2.1.1/etc/wmakeFiles/scotch/Makefile.inc.i686_pc_linux2.shlib-OpenFOAM-64
and adding that path to the variable "LDFLAGS"; for example, if the path were "/usr/mpi/lib", the line would be:
Code:
LDFLAGS         = -lz -lm -lrt -L/usr/mpi/lib
If this doesn't work, you might need to ask your systems administrator for assistance, since this seems to be specific to your cluster or MPI toolbox.
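One way to do that search, sketched here with standard tools (the exact output of ldconfig varies by distribution):

```shell
# Look libnuma.so up in the dynamic linker's cache and print the
# directory that would need to go after -L in LDFLAGS
libpath=$(ldconfig -p 2>/dev/null | awk '/libnuma\.so/ {print $NF; exit}')
if [ -n "$libpath" ]; then
    echo "add to LDFLAGS: -L$(dirname "$libpath")"
else
    echo "libnuma is not in the linker cache; try: find /usr -name 'libnuma.so*'"
fi
```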

Best regards,
Bruno

Old   August 26, 2013, 07:27
Default
  #15
Senior Member
 
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Rep Power: 7
Jonathan is on a distinguished road
Dear Bruno,

thank you very much for your post:

I have tried the changes, and they are reflected in the compile log -

Code:
mpicc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -Drestrict=__restrict -DSCOTCH_PTSCOTCH -DSCOTCH_VERSION=5 -DSCOTCH_RELEASE=1 -DSCOTCH_PATCHLEVEL=10 dummysizes.c -o dummysizes -lz -lm -lrt -L/usr/lib64
however, I still get "-lnuma not found" errors ...

What I have realised is that perhaps I am not supposed to be using mpicc, but rather one of the system versions (there are a couple on the cluster, e.g. mpiCC, mpiCC-vt). I am querying the system admins as to what the differences are; however, if I look in the OF wmake rules and run mpicc --showme:link, it does list -lnuma.

But I don't know if that means it knows where the library is?

By the way, the library libnuma.so is in the directory /usr/lib64, as shown above.

Any more ideas?

Thanks very much again...

best regards and thanks
jonathan

Old   August 26, 2013, 18:52
Default
  #16
Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 7,514
Blog Entries: 33
Rep Power: 74
wyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the roughwyldckat is a jewel in the rough
Hi Jonathan,

Run this command:
Code:
mpicc -showme
It will tell you the complete command line that will be used in place of mpicc.

For example, on my Ubuntu 12.04, the default Open-MPI mpicc gives me this:
Code:
gcc -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi -pthread -L/usr/lib/openmpi/lib -lmpi -lopen-rte -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl
In your case, it will likely mention "/usr/lib" but not "/usr/lib64" before "-lnuma" appears, because only some Linux distributions explicitly use the "lib64" folder, e.g. openSUSE.
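To check quickly whether the wrapper already passes a suitable search path along with "-lnuma", you can filter its link line (guarded here in case mpicc is not on the PATH):

```shell
# Show only the -L search paths and any numa-related entries from
# mpicc's link line; if no -L entry points at the directory holding
# libnuma.so, that is the missing piece
if command -v mpicc >/dev/null 2>&1; then
    mpicc -showme:link | tr ' ' '\n' | grep -E '^-L|numa'
else
    echo "mpicc is not on the PATH" >&2
fi
```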

Based on Open-MPI's man page for mpicc, e.g.: http://manpages.ubuntu.com/manpages/...openmpi.1.html - it gives several hints:
  • We can edit the reference wrapper file that has this list of options. You'll have to find the one for your compiler.
  • We can use the output from mpicc indirectly, like this:
    Code:
    gcc `mpicc -showme:compile` -L/usr/lib64 `mpicc -showme:link`
    The idea is to replace the references to mpicc with the command above.
  • Or simply export the value in the environment variable "LDFLAGS" before running Allwmake:
    Code:
    export LDFLAGS="-L/usr/lib64"
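The third option can be sketched as follows (the path /usr/lib64 comes from this thread; substitute wherever libnuma.so actually lives on your system):

```shell
# Append the extra search path to any LDFLAGS already set, so the
# scotch build's link step can find libnuma.so, then rebuild ThirdParty
export LDFLAGS="${LDFLAGS:+$LDFLAGS }-L/usr/lib64"
echo "LDFLAGS is now: $LDFLAGS"
# afterwards, from the ThirdParty-2.1.1 directory:
#   ./Allwmake > log.Allwmake 2>&1
```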
Best regards,
Bruno

Old   August 27, 2013, 09:00
Default thanks
  #17
Senior Member
 
Join Date: Mar 2010
Location: Cape Town, SA
Posts: 141
Rep Power: 7
Jonathan is on a distinguished road
Dear Bruno,

Thanks very much for your help - I think it's all sorted now.

Adding -L/usr/lib64 seems to have gotten rid of the -lnuma issue, but I think the actual solution was also to run ThirdParty-2.1.1/Allwmake first, and then OpenFOAM-2.1.1/Allwmake.

Previously I was just running ./Allwmake in OpenFOAM-2.1.1, and I was still getting a whole lot of errors.

I'm not sure what changed, but running wmake in ThirdParty ended up building the ptscotch etc. libraries which were missing. These are all now in OpenFoam/platforms/../../openmpi-system/ as well.

Running:
Code:
mpicc -showme
did as you said - and listed the correct lib locations, including /usr/lib64.

So I'm not sure what the change actually was, but as I said above (I don't know if this is possible!), running wmake in ThirdParty first and then in OpenFOAM seemed to do the trick!

I just need to test now and check that the InfiniBand is actually being used in multi-node jobs.

again, thanks very much for your help,
its much appreciated,
best regards
jonathan
