Home > Forums > Software User Forums > SU2 > SU2 Installation

SU2 parallel install build from source failed


July 1, 2020, 07:13, post #1
steadyman (New Member)
Hello everyone,

I use Ubuntu 18.04 LTS as my Linux distribution.

I'm trying to build SU2 (7.0.5, "Blackbird") from source and I have a problem with the parallel build. Although I specify the MPICC and MPICXX paths in my .bashrc (shown in bashrc_mpicc_mpiccx_paths.png), meson.py reports that MPICC and MPICXX are not found, as shown in MPICC_and_MPICXX_fail.png.

Some facts about the problem and what I have done so far:

1. I have already installed the plain parallel version of SU2 (7.0.5 Blackbird), and it works fine in both serial and parallel; that is, I can run SU2 either through parallel_computation.py or directly with the mpirun command. So I don't think MPI itself is the problem. By the way, I use MPICH, built from source; its location is given in mpi_path_in_bashrc.png. Also, running the which command in my terminal shows the locations of mpicc and mpicxx (mpicc_locations_seen_from_terminal.png).

2. On my first attempt, meson.py reported that pkg-config was not found. I then installed pkg-config and added its PATH to my .bashrc (shown in pkg-config_path_in_bashrc.png). So I don't think pkg-config is the problem either.

3. On my second attempt, meson.py reported that pkg-config was found, but it still gave the "MPICC and MPICXX not found" message shown in MPICC_and_MPICXX_fail.png.

To sum up: although I have properly installed MPICH and set its PATH in .bashrc (mpicc_locations_seen_from_terminal.png), SU2's meson.py still gives the "no MPICC and MPICXX found" error (MPICC_and_MPICXX_fail.png).

Could you help me with this issue? How can I solve the MPICC problem in meson.py? I would be very thankful.
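[Editor's note: a few commands can help pinpoint where such a lookup fails. This is a diagnostic sketch, not from the original post; paths and package names depend on how MPICH was installed.]

```shell
# Where do the MPI compiler wrappers resolve from the current shell?
which mpicc mpicxx

# MPICH's wrappers can print the underlying compiler command they would run
mpicc -show

# Meson's dependency('mpi') lookup goes through pkg-config, so list what
# pkg-config can actually see; MPICH registers its metadata as "mpich"
pkg-config --list-all | grep -i mpi || echo "no MPI .pc files visible to pkg-config"
```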
Attached Images
File Type: png pkg-config_path_in_bashrc.png (14.6 KB, 10 views)

Last edited by steadyman; July 1, 2020 at 07:29. Reason: wrong attachment

July 3, 2020, 19:39, post #2
pcg (Pedro Gomes, Senior Member)
This is because the pkg-config dependency name for MPICH is not "mpi", so the build system is looking for the wrong thing.
I've worked around this kind of issue in two ways:

1 - Specify -Dcustom-mpi=true in your call to meson.py, and do "export CC=$MPICC; export CXX=$MPICXX" before calling meson. (With Intel MPI this works without extra environment variables; I have not tried MPICH this way.)

2 - Edit the file SU2/meson.build and replace:
Code:
mpi_dep = [dependency('mpi', language:'c', required : get_option('with-mpi')),
           dependency('mpi', language:'cpp', required : get_option('with-mpi'))]
with:
Code:
mpi_dep = [dependency('mpich', required : get_option('with-mpi'))]
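[Editor's note: before editing meson.build, you can confirm which dependency name applies on your machine by querying pkg-config directly. A sketch; the PKG_CONFIG_PATH shown is an example path only.]

```shell
# Is there a package literally named "mpi"? This is what dependency('mpi') asks for.
pkg-config --exists mpi && echo "mpi: found" || echo "mpi: not found"

# MPICH usually registers itself under the name "mpich" instead
pkg-config --exists mpich && echo "mpich: found" || echo "mpich: not found"

# A from-source MPICH may also need its .pc directory added to the search path,
# e.g. (example path only):
# export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/opt/mpich/lib/pkgconfig
```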

July 4, 2020, 03:30, post #3
steadyman (New Member)
Quote:
Originally Posted by pcg (post #2 above)
Thanks, I'll try it.
I also think Open MPI is less problematic for this kind of thing, so I will try that as well.

July 10, 2020, 06:09, post #4
hyunko (Hyun Ko, New Member)
Hi steadyman,

I have the same issue compiling SU2.
Do you have any update on it?

July 10, 2020, 11:10, post #5: OpenMPI update
steadyman (New Member)
Quote:
Originally Posted by hyunko (post #4 above)
Hello hyunko. I cannot run SU2 in parallel with Open MPI. Now I'll try MPICH built from source. You can visit my other post, where I describe this problem as well, but with MPICH:
SU2 paralllel Build from source ninja failed on ubuntu 18.04

Last edited by steadyman; July 10, 2020 at 11:11. Reason: Extra Information

July 11, 2020, 00:58, post #6
hyunko (Hyun Ko, New Member)
Quote:
Originally Posted by steadyman (post #5 above)


Hello steadyman,

Do you mean that you still have a problem building SU2 with MPICH?

I have the same problem compiling SU2 with MPICH. I have tried both ways suggested in this thread, but without success.

July 11, 2020, 09:51, post #7: meson build python script change does not work
steadyman (New Member)
Quote:
Originally Posted by pcg (post #2 above)

Hello pcg,

I performed the second option you recommended, but it gives the error shown in meson_prefix.png; there I used the prefix option.
In my second attempt I did not use the prefix, just ./meson.py build, and it gives meson_witout_prefix.png.

July 16, 2020, 06:33, post #8
averis007 (Avijeet, New Member)
Even with Intel MPI, I'm facing the same issue. Can anyone help?
Attached Images
File Type: jpg Screenshot from 2020-07-16 16-01-21.jpg (96.2 KB, 7 views)

July 17, 2020, 08:51, post #9
Zen (Zeno, Member, Delft, The Netherlands)
Hi all,

I had the same issue on Ubuntu 16.04, namely that pkg-config could not find mpicc and mpicxx. I solved the problem following the recommendation from pcg:

Quote:
1 - Specify -Dcustom-mpi=true in your call to meson.py, and do "export CC=$MPICC; export CXX=$MPICXX" before calling meson
Specifically, I did the following:

Code:
export MPICC=/usr/bin/mpicc     # your path to mpicc may be different
export MPICXX=/usr/bin/mpicxx
export CC=$MPICC
export CXX=$MPICXX
./meson.py build -Dcustom-mpi=true
./ninja -C build install
Hope this helps,

Z

July 19, 2020, 06:38, post #10
steadyman (New Member)
Quote:
Originally Posted by Zen (post #9 above)
Hello Zen,
Do you put these "export" lines in your .bashrc?
Thanks in advance.

July 19, 2020, 07:38, post #11
steadyman (New Member)
Quote:
Originally Posted by Zen (post #9 above)
Thanks Zen,
it worked.

July 21, 2020, 02:07, post #12
averis007 (Avijeet, New Member)
Hi, I followed Zen's instructions and the installation was successful. I also added the required paths to my .bashrc file. But unfortunately, my parallel cases are still not running as desired.

Specifically, when I run

Code:
parallel_computation.py -n 10 -f turb_ONERAM6.cfg
on the turbulent ONERA M6 test case, it does everything 10 times (once on each of the 10 cores) instead of distributing the work over 10 cores. For reference, this is what I put in my .bashrc file:

Code:
export SU2_RUN="/usr/local/bin"
export SU2_HOME="/home/avi/Downloads/SU2/su2code-SU2-d0e10f8"
export PATH=$PATH:$SU2_RUN
export PYTHONPATH=$PYTHONPATH:$SU2_RUN

Am I missing something here?

July 21, 2020, 06:41, post #13
steadyman (New Member)
Quote:
Originally Posted by averis007 (post #12 above)
Suggestion: when I used Open MPI I faced this issue; maybe installing "mpi4py" will help. If you use Ubuntu, I recommend installing Open MPI through the Synaptic package manager.
Summary: I could not solve this wrong-parallelization issue with Open MPI. I installed MPICH and the problem was solved thanks to Zen's feedback.
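[Editor's note: independently of SU2, mpi4py offers a quick sanity check of whether a launcher actually distributes ranks. A sketch; it assumes mpi4py is installed for python3 and that mpirun is the launcher being tested.]

```shell
# With a matched launcher and MPI library, this prints ranks 0..3, each "of 4".
# If it prints "0 of 1" four times instead, the mpirun on PATH does not match
# the MPI library in use - the same symptom as SU2 running every step N times.
mpirun -n 4 python3 -c "from mpi4py import MPI; c = MPI.COMM_WORLD; print(c.Get_rank(), 'of', c.Get_size())"
```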

Last edited by steadyman; July 21, 2020 at 06:43. Reason: Extra Info

July 21, 2020, 06:47, post #14
pcg (Pedro Gomes, Senior Member)
The "repeated output" problem is due to one of two things:
1. You did not actually compile the parallel version, or
2. The version of MPI used to run the code is different from the one used to compile it.
Given that you specified "custom-mpi", the latter is more likely.

Try running the code with /full/path/to/mpirun -n 10 SU2_CFD ...
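[Editor's note: one way to check for the mismatch described above. A sketch; the final launch line uses an example path.]

```shell
# Which launcher is first on PATH, and which MPI does it belong to?
which mpirun
mpirun --version | head -n 1   # MPICH prints HYDRA info, Open MPI prints "mpirun (Open MPI) x.y.z"

# Which MPI was SU2 compiled against? The wrapper used at build time shows it:
mpicc -show

# If the two disagree, launch with the matching mpirun explicitly, e.g.:
# /path/to/mpich/bin/mpirun -n 10 SU2_CFD turb_ONERAM6.cfg
```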

July 21, 2020, 07:48, post #15
averis007 (Avijeet, New Member)
Quote:
Originally Posted by pcg (post #14 above)

Yes, it was the second case; running with the full path to mpirun worked for me. Thanks a ton.

