SU2 parallel install build from source failed

Post #1 | July 1, 2020, 07:13 | steadyman (New Member)
Hello everyone,

I use Ubuntu 18.04 LTS as my Linux distribution.

I'm trying to build SU2 (7.0.5 Blackbird version) from source and I have issues with the parallel build. Although I specify the MPICC and MPICXX paths in my .bashrc file (shown in bashrc_mpicc_mpiccx_paths.png), meson.py reports that MPICC and MPICXX were not found (shown in MPICC_and_MPICXX_fail.png).

Here are some facts about the problem and what I have done so far:

1. I have installed the plain parallel version of SU2 (7.0.5 Blackbird) and it works fine in both serial and parallel. In other words, I can run SU2 with parallel_computation.py and with the mpirun command separately, so I don't think there is an MPI problem as such. By the way, I use MPICH and I built it from source; its location is given in mpi_path_in_bashrc.png. Also, when I run the which command in my terminal, it gives me the locations of mpicc and mpicxx, shown in mpicc_locations_seen_from_terminal.png.

2. On my first attempt, meson.py reported that pkg-config was not found. I then installed pkg-config and specified its PATH in my .bashrc (shown in pkg-config_path_in_bashrc.png in the attached thumbnail). I don't think there is a pkg-config problem anymore.

3. On my second attempt, meson.py reported that pkg-config was found. However, it still gave the "MPICC and MPICXX not found" message shown in MPICC_and_MPICXX_fail.png.

To sum up, although I have properly installed MPICH (and added its PATH to .bashrc, see mpicc_locations_seen_from_terminal.png), SU2's meson.py still gives me the "MPICC and MPICXX not found" error (MPICC_and_MPICXX_fail.png).



Could you help me with this issue? How can I solve the MPICC problem in meson.py? I would be very thankful.
Attached Images: pkg-config_path_in_bashrc.png (14.6 KB)


Post #2 | July 3, 2020, 19:39 | pcg (Senior Member, Pedro Gomes)
This is because the pkg-config dependency for mpich is not "mpi" and so the build system is looking for the wrong thing.
I've worked around this kind of issue in two ways:
1 - Specify -Dcustom-mpi=true in your call to meson.py, and do "export CC=$MPICC; export CXX=$MPICXX" before calling meson. (with Intel MPI this works without extra environment variables, I have not tried mpich this way)

2 - Edit the file SU2/meson.build, replace:
mpi_dep = [dependency('mpi', language:'c', required : get_option('with-mpi')),
dependency('mpi', language:'cpp', required : get_option('with-mpi'))]
with:
mpi_dep = [dependency('mpich', required : get_option('with-mpi'))]
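
In case it helps to confirm this, a quick check of what pkg-config actually sees on your system (just a sketch; whether an MPICH built from source installs a .pc file, and under which name, depends on how it was configured and on PKG_CONFIG_PATH):

Code:
# list the MPI-related packages pkg-config knows about
pkg-config --list-all | grep -i mpi
# MPICH typically registers itself as "mpich" rather than "mpi"
pkg-config --cflags --libs mpich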

Post #3 | July 4, 2020, 03:30 | steadyman (New Member)
Quote:
Originally Posted by pcg (the two workarounds above)
Thanks, I'll try it.
I also think OpenMPI is less prone to this kind of issue, so I will try that as well.

Post #4 | July 10, 2020, 06:09 | hyunko (Member, Hyun Ko)
Hi steadyman,

I have the same issue compiling SU2. Do you have any update on it?

Post #5 | July 10, 2020, 11:10 | steadyman (New Member)
OPENMPI update
Quote:
Originally Posted by hyunko (asking for an update on the issue)
Hello hyunko, I cannot run SU2 with OpenMPI. Now I'll try MPICH built from source. You can also visit my other post, where I mention this problem as well, but with MPICH:
SU2 parallel build from source ninja failed on Ubuntu 18.04


Post #6 | July 11, 2020, 00:58 | hyunko (Member, Hyun Ko)
Quote:
Originally Posted by steadyman (now trying MPICH built from source, see the post above)


Hello steadyman,

Do you mean that you still have a problem building SU2 with MPICH?

I have the same problem compiling SU2 with MPICH. I have tried both ways suggested in this thread, but without success.

Post #7 | July 11, 2020, 09:51 | steadyman (New Member)
meson build python script change does not work
Quote:
Originally Posted by pcg (the two workarounds above)

Hello pcg,

I have tried the second option you recommended, but it gives the error shown in meson_prefix.png; there I used the prefix option.
On my second attempt I did not use a prefix, just ./meson.py build, and it gives meson_witout_prefix.png.

Post #8 | July 16, 2020, 06:33 | averis007 (New Member, Avijeet)
Even with Intel MPI, I'm facing the same issue. Can anyone help?
Attached Images: Screenshot from 2020-07-16 16-01-21.jpg (96.2 KB)

Post #9 | July 17, 2020, 08:51 | Zen (Member, Zeno, Delft, The Netherlands)
Hi all,

I had the same issue on Ubuntu 16.04, namely that pkg-config could not find mpicc and mpicxx. I solved the problem following the recommendation from pcg:

Quote:
1 - Specify -Dcustom-mpi=true in your call to meson.py, and do "export CC=$MPICC; export CXX=$MPICXX" before calling meson
Specifically, I have done the following:

Code:
export MPICC=/usr/bin/mpicc    # your path to mpicc may be different
export MPICXX=/usr/bin/mpicxx
export CC=$MPICC
export CXX=$MPICXX
./meson.py build -Dcustom-mpi=true
./ninja -C build install
hope this helps,

Z

Post #10 | July 19, 2020, 06:38 | steadyman (New Member)
Quote:
Originally Posted by Zen (the export/meson commands above)
Hello Zen,
Do you put these exports in your .bashrc?
Thanks in advance.

Post #11 | July 19, 2020, 07:38 | steadyman (New Member)
Quote:
Originally Posted by Zen (the export/meson commands above)
Thanks Zen,
it worked.

Post #12 | July 21, 2020, 02:07 | averis007 (New Member, Avijeet)
Hi, I followed the instructions given by Zen and the installation was successful. I also included the required paths in my .bashrc file. But unfortunately, my parallel cases are still not running as desired.

Specifically, when I do

parallel_computation.py -n 10 -f turb_ONERAM6.cfg

on the test case (turbulent ONERA M6), it does everything 10 times (once on each of the 10 cores) instead of distributing the work over 10 cores. For reference, this is what I put in my .bashrc file:

export SU2_RUN="/usr/local/bin"
export SU2_HOME="/home/avi/Downloads/SU2/su2code-SU2-d0e10f8"
export PATH=$PATH:$SU2_RUN
export PYTHONPATH=$PYTHONPATH:$SU2_RUN

Am I missing something here?

Post #13 | July 21, 2020, 06:41 | steadyman (New Member)
Quote:
Originally Posted by averis007 (parallel cases running the same computation 10 times, see the post above)
Suggestion: when I used OpenMPI I faced this issue; maybe installing mpi4py will help. If you use Ubuntu, I recommend installing OpenMPI through the Synaptic package manager.
Summary: I could not solve this incorrect parallelization issue with OpenMPI. I installed MPICH and the problem was solved with Zen's feedback.
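
On the mpi4py suggestion, a quick check along these lines (only a sketch; pip3 is just one way to install it) can tell whether the Python that runs parallel_computation.py can import mpi4py, and which MPI library it wraps:

Code:
# check whether mpi4py is importable and which MPI it was built against
python3 -c "from mpi4py import MPI; print(MPI.Get_library_version())"
# install it if missing (one option)
pip3 install mpi4py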


Post #14 | July 21, 2020, 06:47 | pcg (Senior Member, Pedro Gomes)
The "repeated output" problem is due to one of two problems:
You did not actually compile the parallel version,
The version of mpi used to run the code is different from the one used to compile it.
Given that you specified "custom-mpi" the latter is more likely.

Try running the code with /full/path/to/mpirun -n 10 SU2_CFD ...
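
To see which of the two it is, a small diagnostic sketch along these lines may help (the /opt/mpich path is only an example, and ldd only reveals MPI if it is linked dynamically):

Code:
# which mpirun is first on the PATH, and which MPI does it belong to?
which mpirun
mpirun --version
# which MPI library is the SU2 binary actually linked against?
ldd $(which SU2_CFD) | grep -i mpi
# if they disagree, launch with the matching mpirun explicitly, e.g.
/opt/mpich/bin/mpirun -n 10 SU2_CFD turb_ONERAM6.cfg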

Post #15 | July 21, 2020, 07:48 | averis007 (New Member, Avijeet)
Quote:
Originally Posted by pcg (run with the full path to mpirun, see the post above)

Yes, the second option worked for me. Thanks a ton.

Post #16 | April 24, 2021, 06:35 | Kusy (New Member, Jędrzej)
Quote:
Originally Posted by averis007
Yes, the second option worked for me. Thanks a ton.



So what version of OpenMPI should I use for SU2 v7 Blackbird? I've got mpirun --version 3.4.1 and gcc 4.8.3, and the same issue...

Post #17 | April 24, 2021, 16:33 | pcg (Senior Member, Pedro Gomes)
You need to upgrade your compiler, namely to something with complete C++11 support: gcc 4.8.5 or newer, but since you are upgrading you might as well get the newest you can.
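
For reference, a sketch of how one might check this (the -show flag is the MPICH wrapper's way of printing the underlying compiler, OpenMPI uses --showme, and gcc-9 below is only an example of a newer version):

Code:
# check the default compilers
gcc --version
g++ --version
# check which compiler the MPI wrapper actually invokes
mpicxx -show          # MPICH; for OpenMPI use: mpicxx --showme
# if the wrapper still points at the old gcc, rebuild/reconfigure MPI with the newer one, e.g.
# ./configure CC=gcc-9 CXX=g++-9 ...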

Post #18 | April 25, 2021, 11:25 | Kusy (New Member, Jędrzej)
I installed gcc 4.9.3 but it still starts many solvers at once... Anything else? Are there other options to solve it?
I'm using the prebuilt SU2 binaries; I can't build from source because meson.py doesn't work offline. Maybe I've got the wrong paths to MPICH?

Post #19 | April 25, 2021, 16:13 | pcg (Senior Member, Pedro Gomes)
If you are using a precompiled binary you need MPICH, not OpenMPI (see https://su2code.github.io/docs_v7/SU...-parallel-mode).

Once you install MPICH you need to make sure "mpirun" is the mpirun of MPICH, for example by using the full path.
I'll try to make meson work offline.
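
A quick sanity check for this (the install path below is just an example, and your_case.cfg is a placeholder):

Code:
which mpirun
mpirun --version      # MPICH's Hydra launcher reports "HYDRA build details: ..."
# or bypass the PATH entirely by using the full path, e.g.
/usr/local/mpich/bin/mpirun -n 4 SU2_CFD your_case.cfg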

Post #20 | April 25, 2021, 16:50 | Kusy (New Member, Jędrzej)
I did that; I've got the full PATH to MPICH, SU2_RUN, etc.
I don't know where else to look for the issue in combining SU2 with MPICH.
Maybe it's an old Python version? I've got Python 3.4.1 as the default.

