Home > Forums > Software User Forums > SU2

MPI issues on Debian 11 bullseye


Old   August 26, 2022, 06:48
Question MPI issues on Debian 11 bullseye
  #1
New Member
 
Flavio Giannetti
Join Date: Mar 2021
Location: Italy
Posts: 13
Rep Power: 3
Hi guys,
I have compiled SU2 on a new machine running Debian GNU/Linux 11 (bullseye). I installed Open MPI 4.1.0 with apt-get and compiled everything. Whenever I try to run the program I get an error concerning MPI_Win_create.
I tried both SU2 versions 7.3 and 7.4. I also tried both Open MPI and MPICH and got the same problem.
The versions of the libraries installed on the machine are

libmpi.so.40.30.0
libmpich.so.12.1.10
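For reference, here is how one can check which MPI shared library the binary actually resolves to at run time, and which launcher comes first in the PATH (the SU2_CFD name/path is an assumption; adjust it to your install):

```shell
# Show which MPI library SU2_CFD links against, if the binary is on PATH.
bin=$(command -v SU2_CFD || true)
if [ -n "$bin" ]; then
  ldd "$bin" | grep -Ei 'libmpi(ch)?\.so'
fi
# Show which mpirun is first in PATH (harmless no-op if MPI is absent).
mpirun --version 2>/dev/null | head -n 1
```

If the library reported by ldd does not match the mpirun you invoke, the two MPI stacks are being mixed.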

This is weird! I have another machine running Linux Mint 19 (Tessa) on which I compiled SU2 version 7.3 without problems. The only difference I can see is the MPI version: the old machine runs Open MPI version 3.

Has anyone seen similar behaviour? Any hints on how to solve the problem? Thanks in advance for any help you can give me.
Flavio

Here is the message I get


flavio@cfd1 ~/prova $ mpirun -n 2 SU2_CFD inv_ONERAM6.cfg
[cfd1:151413] *** An error occurred in MPI_Win_create
[cfd1:151413] *** reported by process [1424949249,0]
[cfd1:151413] *** on communicator MPI_COMM_WORLD
[cfd1:151413] *** MPI_ERR_WIN: invalid window
[cfd1:151413] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[cfd1:151413] *** and potentially your MPI job)
[cfd1:151409] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[cfd1:151409] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
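For anyone debugging the same thing: the last log line itself says how to get the full, per-process error output rather than the aggregated summary. A guarded sketch (config file name as in the run above):

```shell
# Disable Open MPI's help-message aggregation, as the log suggests,
# so every process prints its own error instead of a summary.
if command -v mpirun >/dev/null; then
  mpirun --mca orte_base_help_aggregate 0 -n 2 SU2_CFD inv_ONERAM6.cfg \
    || echo "run failed; check that SU2_CFD is on PATH"
fi
```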



PS
I also tried the pre-compiled version of SU2, which uses MPICH. The program starts but then it always crashes!

Old   August 27, 2022, 14:08
Default
  #2
pcg
Senior Member
 
Pedro Gomes
Join Date: Dec 2017
Posts: 419
Rep Power: 11
Hello,

I've had some problems on Ubuntu 22 with Open MPI 4, related to HWLOC and something about 32-bit PCI devices *shrug*.
Maybe it's the same for you, but with the warnings silenced; see here https://github.com/open-mpi/hwloc/issues/354

With MPICH 4 I get the warnings but the code runs fine. How did you build SU2 with MPICH? Be careful if you have Open MPI installed alongside MPICH.
This is my build command for mpich:
export CC=mpicc.mpich
export CXX=mpicxx.mpich

export CXXFLAGS="-march=native -funroll-loops -ffast-math -fno-finite-math-only"

./meson.py build --optimization=2 --warnlevel=3 --prefix=$PWD/build -Dcustom-mpi=true

If you find out the issue with Open MPI please update this thread.
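And one detail that bites people with both stacks installed: launch with the launcher that matches the library you linked against. On Debian/Ubuntu the MPICH launcher is installed as mpirun.mpich; the SU2_CFD path below is an assumption, adjust it to your build tree:

```shell
# Binaries built with mpicc.mpich/mpicxx.mpich should be launched with
# mpich's own mpirun, not Open MPI's.
if command -v mpirun.mpich >/dev/null; then
  mpirun.mpich -n 2 ./build/bin/SU2_CFD inv_ONERAM6.cfg \
    || echo "run failed; check the SU2_CFD path"
else
  echo "mpirun.mpich not found; is the mpich package installed?"
fi
```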

Old   August 28, 2022, 07:26
Default mpich ok !
  #3
New Member
 
Flavio Giannetti
Join Date: Mar 2021
Location: Italy
Posts: 13
Rep Power: 3
Hi Pedro thanks a lot for your support.

I tried again, following your hints. I had no success with Open MPI: I always get the same output, with no additional hints. However, I used your example to recompile SU2 with MPICH and it's now working !!!

One last question concerning the -Dwith-omp option: can I recompile the code with MPICH and -Dwith-omp=true, or is that option just for Open MPI?

Thanks a lot for your help

Flavio


Old   August 29, 2022, 20:10
Default
  #4
pcg
Senior Member
 
Pedro Gomes
Join Date: Dec 2017
Posts: 419
Rep Power: 11
Hi Flavio,
Glad it works. Yes, you can use MPICH and OpenMP together.
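A sketch of the hybrid build and run, reusing the MPICH flags from the build command above; note that the exact thread-count option may vary by SU2 version, so OMP_NUM_THREADS is used here as the portable choice:

```shell
# Hybrid MPI+OpenMP build: same mpich wrappers as before, plus -Dwith-omp.
export CC=mpicc.mpich
export CXX=mpicxx.mpich
./meson.py build --optimization=2 --warnlevel=3 --prefix=$PWD/build \
    -Dcustom-mpi=true -Dwith-omp=true

# Run with 2 MPI ranks and 4 OpenMP threads per rank.
OMP_NUM_THREADS=4 mpirun.mpich -n 2 SU2_CFD inv_ONERAM6.cfg
```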

Old   November 29, 2022, 21:04
Default
  #5
New Member
 
Brandon Gleeson
Join Date: Apr 2018
Posts: 24
Rep Power: 6
Just tagging onto this thread: I observe the same fatal error when running the Quickstart in serial mode, but it runs just fine in parallel.

  • SU2 v7.4.0
  • Ubuntu 22.04
  • Open MPI 4.1.2
--Brandon--

Tags
su2 and openmpi gcc
