
AMD Epyc hardware for ANSYS HPC



Old   October 26, 2021, 07:44
Question AMD Epyc hardware for ANSYS HPC
  #1
New Member
 
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 4
Dear Members,


I want to build a 128-core machine to run ANSYS solvers (mainly Fluent, and a bit of Mechanical) with 3 HPC Pack licenses (allowing up to 128 cores on a single calculation).
As I understand it, AMD Epyc Milan processors are currently much more interesting than the Intel Xeon Ice Lake ones (better performance/price ratio).

My company's IT supplier is Dell. Dell offers a 2-socket rack server that can be configured with, for example, 2x Epyc 7763 processors, 16x 32 GB of DDR4 RAM (populating the 2x8 memory channels of the CPUs), and 2x 6.4 TB NVMe mixed-use SSDs.

On the CFD Online forum I have never seen an Epyc 7763 configuration recommended, and I don't know whether that is a matter of price or of performance.

One could also build 2 compute nodes with 2 CPUs each, for example 2x2 Epyc 7543 CPUs (128 cores in total, to use the 3 ANSYS HPC Pack licenses simultaneously). The drawback is that we would have to interconnect the two nodes, and we have no experience managing a cluster architecture (with a single node with 2x 7763 there is nothing to manage).


Can you advise whether a 2x 7763 configuration is a good choice?


PS: My plan is to invest in next year's Epyc Genoa generation (DDR5, 12 memory channels, ...); in the meantime, Milan still looks like a good option.

Thank you in advance for your help!

Old   October 26, 2021, 08:13
Default
  #2
Super Moderator
 
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,399
Rep Power: 46
The reason why 64-core Epyc CPUs are rarely recommended here is the usual: with 8 cores per memory channel, scaling to all 128 cores of a dual-socket machine will be far from ideal.
Which means you would make better use of those rather expensive Ansys licenses by using two nodes with two 32-core CPUs each. These 32-core CPUs don't even have to be from the top of the stack in order to be faster.
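
A quick back-of-the-envelope sketch of that argument (the DDR4-3200 figure of ~25.6 GB/s per channel and the assumption that the solver is essentially memory-bandwidth-bound are mine, for illustration only):

Code:
# Back-of-the-envelope: cores per memory channel and bandwidth per core.
# Assumes DDR4-3200 (~25.6 GB/s per channel) and a memory-bandwidth-bound solver.

GBPS_PER_CHANNEL = 25.6  # DDR4-3200: 3200 MT/s * 8 bytes per transfer

configs = {
    "1 node,  2x Epyc 7763 (64 cores each)": {"cores": 128, "channels": 2 * 8},
    "2 nodes, 2x Epyc 7543 (32 cores each)": {"cores": 128, "channels": 4 * 8},
}

for name, c in configs.items():
    cores_per_channel = c["cores"] / c["channels"]
    gbps_per_core = c["channels"] * GBPS_PER_CHANNEL / c["cores"]
    print(f"{name}: {cores_per_channel:.0f} cores/channel, ~{gbps_per_core:.1f} GB/s per core")

Spreading the same 128 cores over twice as many memory channels roughly doubles the memory bandwidth available per core, which is the main reason the two-node option tends to make better use of per-core licenses.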

If it has to be 128 cores in a single system, I guess the 7763 is the best money can buy.
For connecting two systems, you can either go with 10 Gigabit Ethernet and see how that goes, or with directly connected InfiniBand. No expensive IB switch is required for only 2 nodes. Setting up such a mini-cluster isn't terribly complicated, but it sure does take some time. I can't make that decision for you. You are basically trading convenience for performance.
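
To get a rough feel for whether the interconnect matters, one can compare the halo-exchange traffic per iteration against the link bandwidth. All numbers below (mesh size, boundary fraction, bytes per halo cell) are invented for illustration, not measurements from any benchmark:

Code:
# Crude estimate of inter-node traffic per iteration vs. link speed.
# All inputs are illustrative assumptions, not measured values.
cells = 50e6           # assumed total mesh size
halo_fraction = 0.02   # assumed share of cells on the inter-node partition boundary
bytes_per_cell = 400   # assumed data exchanged per halo cell per iteration

traffic_gb = cells * halo_fraction * bytes_per_cell / 1e9  # GB moved between the nodes
for link, gb_per_s in [("10 GbE", 10 / 8), ("100 Gb/s InfiniBand", 100 / 8)]:
    print(f"{link}: ~{traffic_gb / gb_per_s * 1000:.0f} ms of pure transfer per iteration")

This ignores latency, which matters at least as much as bandwidth for the many small messages of an implicit solver, so treat it only as a sanity check. It is consistent with the advice above: try Ethernet and see how it goes, and move to InfiniBand if scaling across the two boxes disappoints.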

Quote:
PS: My plan is to invest in next year's Epyc Genoa generation (DDR5, 12 memory channels, ...); in the meantime, Milan still looks like a good option.
Same plan here. My personal workstation is still 1st gen Epyc. I decided to postpone the next upgrade until DDR5 comes around. Epyc Genoa does look promising. But with the shortages we have already, that's still far away.

Old   October 26, 2021, 08:32
Default
  #3
New Member
 
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 4
Thank you very much for your answer!

Quote:
Originally Posted by flotus1
For connecting two systems, you can either go with 10 Gigabit Ethernet and see how that goes, or with directly connected InfiniBand. No expensive IB switch is required for only 2 nodes. Setting up such a mini-cluster isn't terribly complicated, but it sure does take some time. I can't make that decision for you. You are basically trading convenience for performance.
I'll check that with our IT department.
Are there any benchmarks comparing a high-core-count single machine with 2 machines connected via InfiniBand?

Thank you!

Old   October 26, 2021, 09:35
Default
  #4
Super Moderator
 
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,399
Rep Power: 46
Ansys used to publish benchmarks with intra-node scaling. Not anymore, unfortunately. But the fact that most of their benchmarks use the 32-core Epyc 75F3 tells us a lot.
All I can offer is the OpenFOAM scaling found here: General recommendations for CFD hardware [WIP]
And we have this intra-node benchmark for Fluent with 16-core Epyc CPUs: Xeon Gold Cascade Lake vs Epyc Rome - CFX & Fluent - Benchmarks (Windows Server 2019)
You can already see that scaling is lower than ideal: only ~75% parallel efficiency on 32 cores.
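
For reference, parallel efficiency here means speedup divided by core count. A quick sketch with made-up wall-clock times (not taken from the linked benchmark) shows how a ~75% figure on 32 cores comes about:

Code:
# Parallel efficiency = (t_1 / t_n) / n, shown with invented example timings.
t1 = 960.0                                 # hypothetical wall time on 1 core [s]
timings = {8: 130.0, 16: 70.0, 32: 40.0}   # hypothetical wall times on n cores [s]

for n, tn in timings.items():
    speedup = t1 / tn
    print(f"{n:3d} cores: speedup {speedup:5.1f}x, efficiency {speedup / n:4.0%}")

By that measure, only about 24 of the 32 cores are doing useful work, and pushing to 128 cores behind the same 16 memory channels would likely drop the efficiency further.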

Old   October 26, 2021, 11:51
Smile
  #5
New Member
 
Chefbouza
Join Date: Oct 2021
Posts: 10
Rep Power: 4
Quote:
Originally Posted by flotus1
Ansys used to publish benchmarks with intra-node scaling. Not anymore, unfortunately. But the fact that most of their benchmarks use the 32-core Epyc 75F3 tells us a lot.
All I can offer is the OpenFOAM scaling found here: General recommendations for CFD hardware [WIP]
And we have this intra-node benchmark for Fluent with 16-core Epyc CPUs: Xeon Gold Cascade Lake vs Epyc Rome - CFX & Fluent - Benchmarks (Windows Server 2019)
You can already see that scaling is lower than ideal: only ~75% parallel efficiency on 32 cores.

Thank you very much, Alex, for all this help!


Tags
amd epyc, ansys fluent, hardware, hpc



