Parallel execution of SU2

#1 | August 15, 2016, 05:59
kowalski (marco.fossati), New Member, Join Date: Jul 2016, Posts: 9

Hi all,

Even though SU2 is compiled with parallel support (see the configuration summary below), when I try:

mpirun -n 8 SU2_CFD config.cfg

I get 8 independent serial jobs instead of one actual parallel job. Is there anything I am missing?

Thanks
Marco

Here's the output of the configuration script:

checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking for style of include used by make... GNU
checking whether the C++ compiler works... yes
checking for C++ compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C++ compiler... yes
checking whether /usr/bin/mpicxx accepts -g... yes
checking dependency style of /usr/bin/mpicxx... gcc3
checking whether we are using the GNU C compiler... yes
checking whether /usr/bin/mpicc accepts -g... yes
checking for /usr/bin/mpicc option to accept ISO C89... none needed
checking dependency style of /usr/bin/mpicc... gcc3
checking whether /usr/bin/mpicc and cc understand -c and -o together... yes
checking for ranlib... ranlib
checking whether we are using the GNU C++ compiler... (cached) yes
checking whether /usr/bin/mpicxx accepts -g... (cached) yes
checking dependency style of /usr/bin/mpicxx... (cached) gcc3
checking how to run the C preprocessor... /usr/bin/mpicc -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking size of short int... 2
checking size of int... 4
checking size of unsigned int... 4
checking size of long int... 8
checking size of float... 4
checking size of double... 8
checking size of void *... 8
checking X11/Intrinsic.h usability... no
checking X11/Intrinsic.h presence... no
checking for X11/Intrinsic.h... no
<<< Configuring library with Metis support >>>
<<< Configuring library with Parmetis support >>>
checking for /usr/local/lib/libcgns.a... yes
checking for /usr/local/include/cgnslib.h... yes
checking that generated files are newer than configure... done
configure: creating ./config.status
config.status: creating externals/tecio/Makefile
config.status: creating externals/metis/Makefile
config.status: creating externals/parmetis/Makefile
config.status: creating Makefile
config.status: creating externals/Makefile
config.status: creating Common/lib/Makefile
config.status: creating SU2_CFD/obj/Makefile
config.status: creating SU2_DOT/obj/Makefile
config.status: creating SU2_MSH/obj/Makefile
config.status: creating SU2_DEF/obj/Makefile
config.status: creating SU2_SOL/obj/Makefile
config.status: creating SU2_GEO/obj/Makefile
config.status: creating SU2_PY/Makefile
config.status: executing depfiles commands


-------------------------------------------------------------------------
|    ___ _   _ ___                                                      |
|   / __| | | |_  )   Release 4.2.0 'Cardinal'                          |
|   \__ \ |_| |/ /                                                      |
|   |___/\___//___|   Suite                                             |
|                                                                       |
-------------------------------------------------------------------------
| SU2 Lead Dev.: Dr. Francisco Palacios, Francisco.D.Palacios@boeing.com|
|                Dr. Thomas D. Economon, economon@stanford.edu          |
-------------------------------------------------------------------------
| SU2 Developers:                                                       |
|  - Prof. Juan J. Alonso's group at Stanford University.               |
|  - Prof. Piero Colonna's group at Delft University of Technology.     |
|  - Prof. Nicolas R. Gauger's group at Kaiserslautern U. of Technology.|
|  - Prof. Alberto Guardone's group at Polytechnic University of Milan. |
|  - Prof. Rafael Palacios' group at Imperial College London.           |
-------------------------------------------------------------------------
| Copyright (C) 2012-2016 SU2, the open-source CFD code.                |
|                                                                       |
| SU2 is free software; you can redistribute it and/or                  |
| modify it under the terms of the GNU Lesser General Public            |
| License as published by the Free Software Foundation; either          |
| version 2.1 of the License, or (at your option) any later version.    |
|                                                                       |
| SU2 is distributed in the hope that it will be useful,                |
| but WITHOUT ANY WARRANTY; without even the implied warranty of        |
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU      |
| Lesser General Public License for more details.                       |
|                                                                       |
| You should have received a copy of the GNU Lesser General Public      |
| License along with SU2. If not, see <http://www.gnu.org/licenses/>.   |
-------------------------------------------------------------------------

Build Configuration Summary:

Source code location: /home/marco/Useless/solver-su2/SU2
Install location: /home/marco/Useless/solver-su2
Version: 4.2.0
C++ Compiler: /usr/bin/mpicxx
C Compiler: /usr/bin/mpicc
Preprocessor flags: -DHAVE_MPI
Compiler flags: -O3
Linker flags:
MPI support: yes
Metis support: yes
Parmetis support: yes
TecIO support: no
CGNS support: yes
HDF5 support: no
SZIP support: no
ZLIB support: no
Mutation++ support: no
Jsoncpp support: no
LAPACK support: no
Datatype support:
    double        yes
    complex       no
    codi_reverse  no
    codi_forward  no

External includes: -DHAVE_PARMETIS -I$(top_srcdir)/externals/parmetis/include -DHAVE_METIS -I$(top_srcdir)/externals/metis/include
External libs: $(top_builddir)/externals/parmetis/libparmetis.a $(top_builddir)/externals/metis/libmetis.a

Build SU2_CFD: yes
Build SU2_DOT: yes
Build SU2_MSH: yes
Build SU2_DEF: yes
Build SU2_SOL: yes
Build SU2_GEO: yes

Please be sure to add the $SU2_HOME and $SU2_RUN environment variables,
and update your $PATH (and $PYTHONPATH if applicable) with $SU2_RUN.

#2 | August 15, 2016, 06:24
asonda, New Member, Join Date: Jun 2012, Posts: 19

Hi Marco,

try this: parallel_computation.py -f config.cfg -n 8

Cheers,

Alberto

#3 | August 15, 2016, 06:29
kowalski (marco.fossati), New Member, Join Date: Jul 2016, Posts: 9

Thanks Alberto, but still no dice... same behaviour as before.

I am checking the logs of make and make install, but so far I cannot see anything wrong.

Cheers
Marco

#4 | August 15, 2016, 06:38
asonda, New Member, Join Date: Jun 2012, Posts: 19

Stupid question: you say they are running in serial because they are all running on the same core (and because killing one process does not kill the others)?
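You can also check which core each process sits on with something like this (a rough sketch, assuming Linux and an executable named SU2_CFD):

Code:
# list each SU2_CFD process with the processor it was last scheduled on
ps -eo pid,psr,comm | grep SU2_CFD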

#5 | August 15, 2016, 06:50
kowalski (marco.fossati), New Member, Join Date: Jul 2016, Posts: 9

What I see is that the same exact lines get printed 8 times, i.e.

101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.
101217 triangles.
66956 quadrilaterals.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.

------------------------- Geometry Preprocessing ------------------------
Setting point connectivity.
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Recomputing point connectivity.
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Renumbering points (Reverse Cuthill McKee Ordering).
Recomputing point connectivity.
Recomputing point connectivity.
Recomputing point connectivity.
Recomputing point connectivity.
Recomputing point connectivity.
Recomputing point connectivity.

I believe this is the typical output of a code that is not properly running in parallel. Other posts indicate that this is due to an incorrect configuration (without METIS/ParMETIS), which is not my case (see the original post), and I get no errors or warnings while compiling, except for this one:

ar: `u' modifier ignored since `D' is the default (see `U')

which seems related to the way the libraries are updated. But since I removed everything and made a clean installation from scratch, this is probably not relevant to my case.

Thanks
Marco

#6 | August 18, 2016, 12:08
talbring (Tim Albring), Super Moderator, Join Date: Sep 2015, Posts: 195

Hi Marco,

this might solve your problem: https://github.com/su2code/SU2/wiki/...ne-for-example

Tim

#7 | August 18, 2016, 12:14
kowalski (marco.fossati), New Member, Join Date: Jul 2016, Posts: 9

Hi Tim,

this was my first thought, but unless the printout from the configure script and make install (see my previous posts) is wrong, I am compiling with METIS, ParMETIS, and mpicc/mpicxx, which to the best of my knowledge means I am building in parallel.

Any other suggestion would be welcome.

Marco

#8 | August 18, 2016, 13:08
hlk (Heather Kline), Senior Member, Join Date: Jun 2013, Posts: 309

Quote (originally posted by marco.fossati):
Hi Tim,

this was my first thought, but unless the printout from the configure script and make install (see my previous posts) is wrong, I am compiling with METIS, ParMETIS, and mpicc/mpicxx, which to the best of my knowledge means I am building in parallel.

Any other suggestion would be welcome.

Marco
Based on what you've described (compiling from source a few times, and changing configuration options), you may have multiple versions of the SU2 executables on your system. Try 'which SU2_CFD' to check the location of the executable you are calling.
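For example, something like this (a sketch; the paths will differ on your system):

Code:
which SU2_CFD                         # executable found first in your PATH
which mpirun                          # MPI launcher found first in your PATH
ldd "$(which SU2_CFD)" | grep -i mpi  # MPI library the executable links against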

#9 | August 23, 2016, 17:12
LaSerpe (Giulio), New Member, Join Date: Apr 2014, Location: Milano, Posts: 17

It could also be related to which MPI implementation you are using on your machine. Do you have OpenMPI or MPICH?
If you have both of them, you may have to specify the right one (the one you used to compile SU2) in order to run SU2 on multiple cores.
Try using either "mpirun.openmpi" or "mpirun.mpich".
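To find out which implementation the default mpirun belongs to, something like this usually works (a sketch; the exact flags differ between implementations):

Code:
mpirun --version   # OpenMPI reports "(Open MPI)"; MPICH reports a HYDRA build
mpicxx -show       # MPICH: print the underlying compiler command
mpicxx --showme    # OpenMPI equivalent of -show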
__________________
Giulio Gori
PhD candidate, Politecnico di Milano

#10 | August 25, 2016, 17:56
RcktMan77 (Zach Davis), Senior Member, Join Date: Jan 2010, Location: Los Angeles, CA, Posts: 101

I would have to agree with LaSerpe. This behavior suggests that the mpirun that your shell environment is pointing to is not from the same MPI installation that was used to compile SU2.

#11 | January 21, 2023, 06:49
PhilipA (Philip Agenmonmen), New Member, Join Date: Oct 2022, Posts: 10

Hello,

Were you able to solve this problem?

Best Regards

Philip

#12 | January 21, 2023, 09:03
bigfootedrockmidget (bigfoot), Senior Member, Join Date: Dec 2011, Location: Netherlands, Posts: 615

If you run SU2 in parallel on N cores and you see SU2 output messages repeated N times, then your MPI is not set up correctly. Make sure that MPI is installed correctly, then make sure that SU2 was compiled with the correct option to enable MPI. Please read the installation instructions carefully:

https://su2code.github.io/docs_v7/Installation/

And if you still have problems, please open a new issue and give us the details of your setup (Windows/Linux, MPICH or OpenMPI, etc.).
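For a source build, the MPI option is roughly the following (a sketch; see the linked docs for the complete instructions):

Code:
# configure with MPI enabled, then build and install
./meson.py build -Dwith-mpi=enabled
./ninja -C build install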

#13 | June 12, 2024, 08:52
Sakun, Senior Member, Join Date: Nov 2019, Location: United Kingdom, Posts: 114

Quote (originally posted by bigfootedrockmidget):
If you run SU2 in parallel on N cores and you see SU2 output messages repeated N times, then your MPI is not set up correctly. Make sure that MPI is installed correctly, then make sure that SU2 was compiled with the correct option to enable MPI. Please read the installation instructions carefully:

https://su2code.github.io/docs_v7/Installation/

And if you still have problems, please open a new issue and give us the details of your setup (Windows/Linux, MPICH or OpenMPI, etc.).

Hi,

Sorry to reply on this thread.
I have installed SU2 from source (SU2 version 8.0.1 "Harrier"). For this I used the following commands:
Code:
./meson.py build
./ninja -C build install
Then I tried to run the "QuickStart" case and got the error shown in the attached picture.

I have installed MPICH and specified its path as well.

I would appreciate it if you could guide me on this.

Attached Images: Run error.png (44.1 KB)

#14 | June 12, 2024, 11:25
bigfootedrockmidget (bigfoot), Senior Member, Join Date: Dec 2011, Location: Netherlands, Posts: 615

Try this for MPI, or set it to disabled for a single-core build:

Code:
./meson.py build --optimization=2 -Dwith-mpi=enabled --prefix=/home/Codes/su2
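After configuring, build and install as before and make sure the installed binaries are on your PATH (a sketch; the paths follow the --prefix above, so adjust them to your setup):

Code:
./ninja -C build install
export SU2_RUN=/home/Codes/su2/bin
export PATH=$SU2_RUN:$PATH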

#15 | June 13, 2024, 06:30
Sakun, Senior Member, Join Date: Nov 2019, Location: United Kingdom, Posts: 114

Quote (originally posted by bigfootedrockmidget):
Try this for MPI, or set it to disabled for a single-core build:

Code:
./meson.py build --optimization=2 -Dwith-mpi=enabled --prefix=/home/Codes/su2
Hi bigfoot,

Thank you very much for your reply and the suggestion.

I did some further digging into my case and found that, with the source-compiled SU2, mpirun works (I used
Code:
mpirun -n 10 SU2_CFD inv_NACA0012.cfg
) but a single-core run did not (I used
Code:
SU2_CFD inv_NACA0012.cfg
).

As for the precompiled SU2 binaries, a single-core run worked, but when I ran in parallel with mpirun the output was just repeated 10 times and looked quite chaotic.
Since I will be working with a 5.5 million cell mesh on the HPC, I will go with the source-compiled SU2 package.

Also, is there a way to tell whether the cores allocated in the mpirun command were actually used in the simulation? I tested with
Code:
mpirun -n 10 SU2_CFD inv_NACA0012.cfg
and
Code:
mpirun -n 6 SU2_CFD inv_NACA0012.cfg
but I could not tell a difference.

Thank you very much for your time.

#16 | June 13, 2024, 16:39
bigfootedrockmidget (bigfoot), Senior Member, Join Date: Dec 2011, Location: Netherlands, Posts: 615

Hi,
Well, if you use 10 cores instead of 5, it should finish roughly 2 times faster. If you print the time per iteration,
Code:
SCREEN_OUTPUT= WALL_TIME, ...
then you can see how much time each iteration takes.
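Another quick check (a sketch, assuming a Unix shell and the QuickStart case from the earlier posts): time the same case at two core counts and compare the wall-clock times.

Code:
time mpirun -n 5 SU2_CFD inv_NACA0012.cfg
time mpirun -n 10 SU2_CFD inv_NACA0012.cfg
# if MPI works, the 10-core run should take roughly half the wall time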

Tags: metis, parallel job, parmetis
