CFD Online Discussion Forums

CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Installation (https://www.cfd-online.com/Forums/openfoam-installation/)
-   -   [OpenFOAM.com] Compiling OF in cluster (https://www.cfd-online.com/Forums/openfoam-installation/232277-compiling-cluster.html)

fedez91 December 7, 2020 20:40

Compiling OF in cluster
 
Hi, I am trying to compile OFv2006 on the PSC Bridges supercomputer. I followed the instructions provided on the official website (https://develop.openfoam.com/Develop...r/doc/Build.md) but I get the following errors:

When I source the etc/config.sh/settings file I get the following:
Code:

gcc: error: unrecognized command line option ‘--showme:link’
And then when I try to compile
Code:

wmake mpi
gcc: error: unrecognized command line option ‘--showme:compile’
    Ctoo: UOPwrite.C
In file included from UOPwrite.C:33:0:
PstreamGlobals.H:42:17: fatal error: mpi.h: No such file or directory
 #include <mpi.h>
                ^
compilation terminated.

The cluster manual (https://www.psc.edu/resources/bridge...g-environment/) says the Intel MPI compiler is loaded by default, but I am not sure if I should change anything in the configuration files.

Thanks a lot for the help.

olesen December 15, 2020 04:19

Quote:

Originally Posted by fedez91 (Post 790028)
Hi, I am trying to compile OFv2006 on the PSC Bridges supercomputer. I followed the instructions provided on the official website (https://develop.openfoam.com/Develop...r/doc/Build.md) but I get the following errors:

When I source the etc/config.sh/settings file I get the following:
Code:

gcc: error: unrecognized command line option ‘--showme:link’
And then when I try to compile
Code:

wmake mpi
gcc: error: unrecognized command line option ‘--showme:compile’
    Ctoo: UOPwrite.C
In file included from UOPwrite.C:33:0:
PstreamGlobals.H:42:17: fatal error: mpi.h: No such file or directory
 #include <mpi.h>
                ^
compilation terminated.

The cluster manual (https://www.psc.edu/resources/bridge...g-environment/) says the Intel MPI compiler is loaded by default, but I am not sure if I should change anything in the configuration files.

Thanks a lot for the help.




If you are using INTELMPI, you need to specify this: either set it in etc/bashrc (the WM_MPLIB value), or provide the same in a prefs.sh file. The '--showme:compile' bits are used for SYSTEMOPENMPI (the default).


If your cluster has it, you might also consider using Spack for your installation.
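As a sketch, a minimal prefs.sh along those lines might look like the following (the install path in the comment is an assumption for a user install; the only required change is WM_MPLIB):

```shell
# Hypothetical $WM_PROJECT_DIR/etc/prefs.sh -- picked up automatically by etc/bashrc.
# Selecting INTELMPI avoids the OpenMPI-specific '--showme:compile' probing.
export WM_MPLIB=INTELMPI

# Afterwards, re-source the environment so the change takes effect, e.g.:
# source "$HOME/OpenFOAM/OpenFOAM-v2006/etc/bashrc"
```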

fedez91 December 15, 2020 12:11

Quote:

Originally Posted by olesen (Post 790692)
If you are using INTELMPI, you need to specify this: either set it in etc/bashrc (the WM_MPLIB value), or provide the same in a prefs.sh file. The '--showme:compile' bits are used for SYSTEMOPENMPI (the default).


If your cluster has it, you might also consider using Spack for your installation.

Thanks, Mark, for the reply. Unfortunately, I do not have Spack on the cluster. I have changed WM_MPLIB to INTELMPI, but now I get the following error:
Code:

Compile OpenFOAM libraries

    ln: OpenFOAM/lnInclude
    ln: OSspecific/POSIX/lnInclude
    found <sys/inotify.h>  --  enabling inotify for file monitoring.
wmake libo (POSIX)
wmake  dummy (mpi=INTELMPI)
wmake dummy
wmake  mpi (mpi=INTELMPI)
wmake mpi
Making dependency list for source file PstreamGlobals.C
Making dependency list for source file UPstream.C
Making dependency list for source file UIPread.C
Making dependency list for source file UOPwrite.C
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -c UOPwrite.C -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UOPwrite.o
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -c UIPread.C -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UIPread.o
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -c UPstream.C -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UPstream.o
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -c PstreamGlobals.C -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/PstreamGlobals.o
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -shared -Xlinker --add-needed -Xlinker --no-as-needed  /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UOPwrite.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UIPread.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UPstream.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/PstreamGlobals.o -L/home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32OptINTELMPI/lib \
    -L/opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/lib -lmpi  -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32Opt/lib/mpi/libPstream.so
/bin/ld: cannot find -lmpi
collect2: error: ld returned 1 exit status
make: *** [/home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32Opt/lib/mpi/libPstream.so] Error 1

Any idea why it is still showing this error?
Thanks for all the help!

olesen December 15, 2020 12:23

Quote:

Originally Posted by fedez91 (Post 790757)
Code:


...
    -L/opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/lib -lmpi  -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32Opt/lib/mpi/libPstream.so
/bin/ld: cannot find -lmpi
collect2: error: ld returned 1 exit status
 make: *** [/home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32Opt/lib/mpi/libPstream.so] Error 1

Any idea why it is still showing this error?
Thanks for all the help!


After sourcing your updated OpenFOAM etc/bashrc, you should also verify that the directories listed by $MPI_ARCH_PATH actually contain an include/ and a lib/ or lib64/ directory, AND that these also contain mpi.h, libmpi.so, etc.
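A check like that can be scripted; here is a sketch. The lib/release candidate is an assumption about how some newer Intel MPI layouts nest the runtime libraries, not something stated in this thread:

```shell
# Sketch: verify that an MPI root has the header and library OpenFOAM links against.
# Usage: check_mpi_paths [root]   (defaults to $MPI_ARCH_PATH)
check_mpi_paths() {
    root="${1:-$MPI_ARCH_PATH}"
    [ -n "$root" ] || { echo "MPI_ARCH_PATH is not set" >&2; return 2; }

    if [ -f "$root/include/mpi.h" ]; then
        echo "found: $root/include/mpi.h"
    else
        echo "missing: $root/include/mpi.h"
    fi

    # lib/release is an assumed extra location used by some Intel MPI versions
    for d in lib lib64 lib/release; do
        if ls "$root/$d"/libmpi.so* >/dev/null 2>&1; then
            echo "found libmpi under: $root/$d"
            return 0
        fi
    done
    echo "no libmpi.so found under $root" >&2
    return 1
}
```

Running it after sourcing etc/bashrc (`check_mpi_paths`) should immediately show whether the `-lmpi` link failure is a missing library or just a mislocated one.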


They are probably there, but could also be in a parallel path. In the meantime, while you are busy digging, you can kick off an OpenFOAM compilation of everything else. Just change WM_MPLIB to "dummy" to skip over the mpi bits. You can come back later, change it back to INTELMPI (or whichever mpi flavour you have), and rebuild. Since this second pass only rebuilds the MPI bits, it will typically run through in a couple of minutes (1-3), with most of the time spent examining all of the things that it does not need to rebuild.


Don't give up, you'll soon get there.

fedez91 December 15, 2020 12:57

Quote:

Originally Posted by olesen (Post 790759)
After sourcing your updated OpenFOAM etc/bashrc, you should also verify that the directories listed by $MPI_ARCH_PATH actually contain an include/ and a lib/ or lib64/ directory, AND that these also contain mpi.h, libmpi.so, etc.


They are probably there, but could also be in a parallel path. In the meantime, while you are busy digging, you can kick off an OpenFOAM compilation of everything else. Just change WM_MPLIB to "dummy" to skip over the mpi bits. You can come back later, change it back to INTELMPI (or whichever mpi flavour you have), and rebuild. Since this second pass only rebuilds the MPI bits, it will typically run through in a couple of minutes (1-3), with most of the time spent examining all of the things that it does not need to rebuild.


Don't give up, you'll soon get there.

Thanks Mark! I will check that and see if there is something missing.

Sorry if this is a silly question, but when you say change WM_MPLIB to "dummy" do you mean:
Code:

>export WM_MPLIB=dummy
?
If I do that I just get a similar error
Code:

gcc=/usr/lib64/ccache/gcc
clang=/usr/lib64/ccache/clang
mpirun=/opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/bin/mpirun
make=/bin/make
cmake=/bin/cmake
wmake=/home/zabaleta/OpenFOAM/OpenFOAM-v2006/wmake/wmake
m4=/bin/m4
flex=/bin/flex

compiler=/usr/lib64/ccache/g++
g++ (GCC) 4.8.5 20150623 (Red Hat 4.8.5-36)

========================================
2020-12-15 12:55:05 -0500
Starting compile OpenFOAM-v2006 Allwmake
  Gcc system compiler
  linux64GccDPInt32Opt, with INTELMPI mpi
========================================

built wmake-bin (linux64Gcc)
Skip ThirdParty (no directory)
========================================
Compile OpenFOAM libraries

    ln: OpenFOAM/lnInclude
    ln: OSspecific/POSIX/lnInclude
    found <sys/inotify.h>  --  enabling inotify for file monitoring.
wmake libo (POSIX)
wmake  dummy (mpi=INTELMPI)
wmake dummy
wmake  mpi (mpi=INTELMPI)
wmake mpi
g++ -std=c++11 -m64 -pthread -DOPENFOAM=2006 -DWM_DP -DWM_LABEL_SIZE=32 -Wall -Wextra -Wold-style-cast -Wnon-virtual-dtor -Wno-unused-parameter -Wno-invalid-offsetof -Wno-attributes -Wno-unknown-pragmas  -O3  -DNoRepository -ftemplate-depth-100 -DMPICH_SKIP_MPICXX -DOMPI_SKIP_MPICXX -isystem /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/include -Wno-old-style-cast -Wno-unused-local-typedefs -Wno-array-bounds -Wno-deprecated-declarations -fpermissive -iquote. -IlnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OpenFOAM/lnInclude -I/home/zabaleta/OpenFOAM/OpenFOAM-v2006/src/OSspecific/POSIX/lnInclude  -fPIC -shared -Xlinker --add-needed -Xlinker --no-as-needed  /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UOPwrite.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UIPread.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/UPstream.o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/build/linux64GccDPInt32OptINTELMPI/src/Pstream/mpi/PstreamGlobals.o -L/home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32OptINTELMPI/lib \
    -L/opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/lib -lmpi  -o /home/zabaleta/OpenFOAM/OpenFOAM-v2006/platforms/linux64GccDPInt32Opt/lib/mpi/libPstream.so
/bin/ld: cannot find -lmpi
collect2: error: ld returned 1 exit status


olesen December 15, 2020 15:33

Quote:

Originally Posted by fedez91 (Post 790767)
Thanks Mark! I will check that and see if there is something missing.

Sorry if this is a silly question, but when you say change WM_MPLIB to "dummy" do you mean:
Code:

>export WM_MPLIB=dummy
?
If I do that I just get a similar error


Code:


wmake libo (POSIX)
wmake  dummy (mpi=INTELMPI)
wmake dummy
wmake  mpi (mpi=INTELMPI)
wmake mpi



Nope, that's not going to do it (you can actually see that from the wmake output, with the mpi=INTELMPI information).


Let's go for the heavy-handed solution.

Edit your etc/bashrc (or etc/prefs.sh) and change WM_MPLIB to have "dummy". Source the OpenFOAM etc/bashrc again. To test that it worked (or not), just try recompiling some mpi bits. For example,


src/Pstream/Allwmake


Should be ok.


BTW: you were almost there with your first approach, but you would also have needed the FOAM_MPI variable. The direct approach (editing the file) seems safer until you get a better idea of how things work.
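The workflow described above can be sketched as follows (the install path is an assumption for a user install; in practice the WM_MPLIB line belongs in etc/bashrc or etc/prefs.sh, as advised, rather than being exported by hand):

```shell
# Step 1: select the dummy MPI layer so everything else can build.
export WM_MPLIB=dummy

# Step 2: re-source the environment and build (path is an assumption):
# source "$HOME/OpenFOAM/OpenFOAM-v2006/etc/bashrc"
# ./Allwmake

# Step 3: once the real MPI paths are sorted out, switch back and
# rebuild only the MPI-dependent bits:
# export WM_MPLIB=INTELMPI
# source "$HOME/OpenFOAM/OpenFOAM-v2006/etc/bashrc"
# "$WM_PROJECT_DIR"/src/Pstream/Allwmake
```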


Cheers,
/mark

olesen December 15, 2020 15:38

@fedez91 - thinking of your experience, wmake for v2012 will get two additional options:
Code:

ENH: add wmake -show-mpi-compile, -show-mpi-link options
   
- useful for diagnosing which MPI paths and flags are being used
  when setting up for a new MPI configuration.


You will still need to figure out if things make sense, but at least this way you can easily see what OpenFOAM is looking for when using a particular MPI.

fedez91 January 12, 2021 15:08

Quote:

Originally Posted by olesen (Post 790785)
Nope, that's not going to do it (you can actually see that from the wmake output, with the mpi=INTELMPI information).


Let's go for the heavy-handed solution.

Edit your etc/bashrc (or etc/prefs.sh) and change WM_MPLIB to have "dummy". Source the OpenFOAM etc/bashrc again. To test that it worked (or not), just try recompiling some mpi bits. For example,


src/Pstream/Allwmake


Should be ok.


BTW: you were almost there with your first approach, but you would also have needed the FOAM_MPI variable. The direct approach (editing the file) seems safer until you get a better idea of how things work.


Cheers,
/mark

I was able to compile the whole thing with the dummy suggestion, but I am still having trouble with the MPI part.

When I checked the Intel MPI installed on the cluster, I could not find mpi.so. I think it was replaced with something else. In particular, I see the following in the installation folder:
Code:

>ls /opt/intel/compilers_and_libraries_2020.4.304/linux/mpi/intel64/lib/
debug    libmpicxx.a  libmpicxx.so.12    libmpicxx.so.12.0.0  libmpifort.so    libmpifort.so.12.0    libmpijava.so    libmpijava.so.1.0        mpi.jar  release_mt
debug_mt  libmpicxx.so  libmpicxx.so.12.0  libmpifort.a        libmpifort.so.12  libmpifort.so.12.0.0  libmpijava.so.1  libmpi_shm_heap_proxy.so  release

The cluster manual (linked in the first post of the thread) says: GNU Compilers -> Intel MPI -> C++ -> Compile with this command: mpicxx

So I guess mpi.so was replaced with libmpicxx.so

I tried to create a soft link to libmpicxx.so called mpi.so in $WM_PROJECT_DIR/libs and added that to $LD_LIBRARY_PATH, hoping that the compiler would find mpi.so there and go directly to libmpicxx.so, but it didn't work. I guess I am close to solving the problem, but I don't know how to continue.

Any help appreciated!

fedez91 January 19, 2021 12:31

I was able to compile the code using OpenMPI, but not IntelMPI.

All I had to do was change WM_MPLIB to SYSTEMOPENMPI and then load the OpenMPI-for-GCC-compilers module using:
Code:

>module load mpi/gcc_openmpi
Everything compiled normally after that.
Thanks for all the help.
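For anyone landing here later, the working sequence boils down to the sketch below (the module name is site-specific to Bridges, and the install path is an assumption; as discussed above, WM_MPLIB is best set in etc/bashrc or etc/prefs.sh):

```shell
# Working recipe from this thread (module name is site-specific):
# module load mpi/gcc_openmpi              # OpenMPI built for the GCC toolchain
export WM_MPLIB=SYSTEMOPENMPI              # normally set in etc/bashrc or etc/prefs.sh

# Then re-source the environment and rebuild:
# source "$HOME/OpenFOAM/OpenFOAM-v2006/etc/bashrc"
# ./Allwmake
```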

