slower performance when running more than one job

April 6, 2021, 03:06   #1

Jose Daniel (jdp810)
Member
Join Date: Jun 2020
Posts: 36
Hello all,

I am running several jobs using:
Code:
parallel_computation.py -n 6 -f solver_settings > solver_out

but when I run more of these (making sure I don't exceed the number of available cores), each job slows down drastically.

Do you know how to make the cores independent from each other?

Thank you,
JD

April 6, 2021, 04:57   #2

Pedro Gomes (pcg)
Senior Member
Join Date: Dec 2017
Posts: 466
What do you mean by drastically? Do you get more throughput (simulations per hour, let's say) or less?

Assuming that drastically is really drastic, you probably need to look at how MPI binds processes to cores; look into the --bind-to option.

Otherwise, you will never get the kind of perfect scaling where two jobs run on the same machine in the same time it takes to run one.
CFD in general is bound by memory bandwidth, not by compute power.
Once you get to 2-3 cores per memory channel you will not be able to go much faster; you may even go slower overall, because there will be more pressure on the CPU cache and the CPU will start running at a lower frequency.
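If you want to see what you have to work with, you can inspect the machine topology. A minimal Linux sketch (numactl may need to be installed, and dmidecode usually needs root):
Code:
# NUMA nodes and which CPUs/memory belong to each
numactl --hardware
# Populated DIMM slots, from which the number of memory
# channels in use can usually be inferred
sudo dmidecode -t memory | grep -E 'Locator|Size'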

April 7, 2021, 03:00   #3

Jose Daniel (jdp810)
Member
Join Date: Jun 2020
Posts: 36
Maybe drastically is a little dramatic. The test I've been running (200k cells, 2D, steady, incompressible RANS, SA with transition, Euler implicit) on 6 cores has an iteration time of around 0.58 s. If I start running another one with the same characteristics, the iteration time goes up to 1.35 s... That is about 1/0.58 ≈ 1.7 iterations/s for one job versus 2/1.35 ≈ 1.5 iterations/s combined for two, so total throughput does go down.

I am using MPICH, as recommended on the SU2 page, compiling the code with:
Code:
./meson.py build -Denable-autodiff=true -Dwith-mpi=enabled
and changing the mpi_dep line in the meson build files to
Code:
mpi_dep = [dependency('mpich', required : get_option('with-mpi'))]
How do I use the --bind-to option? Or can you point me to a page where it is explained?

Thanks!

April 7, 2021, 06:08   #4

Pedro Gomes (pcg)
Senior Member
Join Date: Dec 2017
Posts: 466
It is a CLI argument of "mpirun", not an SU2 option.
https://www.open-mpi.org/doc/v3.0/man1/mpirun.1.php

I don't know how to pass it through parallel_computation.py, but that script is not really needed with v7 anyway.
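For example, you can launch SU2_CFD through mpirun directly. A minimal sketch, using the Open MPI flag spellings from the man page above (MPICH's mpiexec has a similar -bind-to option; the job1.cfg/job2.cfg names are just placeholders):
Code:
# One MPI rank per physical core, ranks pinned to cores
mpirun -n 6 --bind-to core SU2_CFD solver_settings > solver_out

# Two concurrent jobs kept on disjoint cores via --cpu-set
mpirun -n 6 --bind-to core --cpu-set 0,1,2,3,4,5   SU2_CFD job1.cfg > out1 &
mpirun -n 6 --bind-to core --cpu-set 6,7,8,9,10,11 SU2_CFD job2.cfg > out2 &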

Something else that may explain the slowdown is if you are using virtual threads (hyper-threading).
That is generally not good for CFD; you should only use physical cores.
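A quick way to check on Linux: if lscpu reports more than one thread per core, hyper-threading is on and the logical CPU count overstates what you can use for MPI ranks.
Code:
# "Thread(s) per core: 2" means each physical core appears as
# two logical CPUs; aim for one MPI rank per physical core
lscpu | grep -E 'Thread|Core|Socket'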
There are good discussions about these hardware aspects in the hardware section of CFD-online.
