
Server on 2 CPU - AMD EPYC 7313

June 10, 2021, 08:39   #1
Server on 2 CPU - AMD EPYC 7313
Rec
New Member
 
Sergey
Join Date: Jan 2018
Posts: 18
I have selected the following configuration for a compute cluster; I plan to purchase 3 compute servers.

Configuration per server:
Motherboard: Supermicro H12DSi-N6 - 1 pc.
RAM: 8GB DDR4-3200 Samsung ECC Reg OEM - 16 pcs.
CPU: AMD EPYC 7313 - 2 pcs.
CPU cooler: Noctua NH-U9 TR4-SP3 - 2 pcs.
SSD: 500GB Samsung 970 EVO Plus (MZ-V7S500BW) - 1 pc.
Case fans: Be Quiet Silent Wings 3 140mm High-Speed - 4 pcs.
PSU: Be Quiet Straight Power 11 Platinum 1000W - 1 pc.
Case: Be Quiet Dark Base 900 Black - 1 pc.

Questions:
1. Are the computer parts selected correctly?
2. I didn't find 8GB dual-rank memory at 3200 MHz; how much performance is lost compared to 2666 MHz dual-rank?
3. How reasonable is it to use InfiniBand? If it is justified, what equipment (manufacturer and model) would you recommend? Is it a good idea to connect the nodes directly without a switch? What cables should I use?
4. Does the master server need to be faster than the slave nodes? A faster processor, disk, or memory?

June 11, 2021, 01:19   #2
Super Moderator
 
flotus1
 
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,399
Quote:
1. Are the computer parts selected correctly?
According to the specs, the case you picked can't hold SSI-EEB motherboards. Maybe you can squeeze it in with some modifications, maybe you can't. I can recommend the Phanteks Enthoo Pro for this kind of board.
If you are using large workstation cases anyway, you can pick larger CPU coolers, e.g. the Noctua NH-U14S TR4-SP3.
1000W for the PSU is total overkill for the system as configured. You could save some money here if there are no plans to upgrade with more power-hungry components later.
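As a rough sanity check on the PSU sizing (a back-of-the-envelope estimate, assuming the EPYC 7313's default 155 W TDP and a few watts per RDIMM, not a measured figure):

\[
P_{\text{load}} \approx 2 \times 155\ \text{W (CPUs)} + 16 \times 4\ \text{W (RDIMMs)} + 60\ \text{W (SSD, fans, board)} \approx 435\ \text{W}
\]

so the system as configured should peak well below half of the 1000 W rating.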
Quote:
2. I didn't find 8GB dual-rank memory at 3200 MHz; how much performance is lost compared to 2666 MHz dual-rank?
AFAIK, 8GB DDR4-3200 reg ECC is only sold in single-rank variants. Dropping down to DDR4-2666 just to get dual-rank memory is not recommended; it would end up slower overall.
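For a rough sense of the theoretical gap (a back-of-the-envelope comparison, assuming the 7313's 8 memory channels per socket and the 8-byte DDR4 bus width):

\[
BW_{3200} = 8 \times 8\ \text{B} \times 3.2\ \text{GT/s} \approx 204.8\ \text{GB/s}, \qquad
BW_{2666} = 8 \times 8\ \text{B} \times 2.666\ \text{GT/s} \approx 170.6\ \text{GB/s}
\]

That is roughly 17% less peak bandwidth per socket at DDR4-2666; the dual-rank advantage typically recovers only part of that in memory-bound CFD workloads, which is why staying at single-rank 3200 is usually the better trade.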
Quote:
3. How reasonable is it to use InfiniBand? If it is justified, what equipment (manufacturer and model) would you recommend? Is it a good idea to connect the nodes directly without a switch? What cables should I use?
If you are still debating whether to use Infiniband or not, I would recommend the "NT" variant of your motherboard, which comes with 10-Gigabit LAN onboard. Maybe you don't need Infiniband for your applications; 10G Ethernet can be fine for small CFD clusters if you can sacrifice some strong-scaling capability at the low cell count end.
Quote:
4. Does the master server need to be faster than the slave nodes? A faster processor, disk, or memory?
Totally depends on what the head node is supposed to be used for.
Just another compute node that also handles login to the cluster? -> No additional requirements
Housing some fancy storage system on top of being a regular compute node? Enabling GUI access for pre- and post-processing? -> You'll need some more/better parts, and some CPU performance to spare.

June 18, 2021, 04:05   #3
Rec
New Member
 
Sergey
Join Date: Jan 2018
Posts: 18
Are the components selected correctly in terms of price vs. speed? Or would you choose other components?

I have decided to try using IB; what cards and switches would you recommend?

If I make a simple computer the master and the 3 servers do the calculations, won't this be a bottleneck?

For the calculations I use Ansys CFX 17.2. Which will be faster: a 16-core processor at 3200 MHz, or a 24-core processor at 2650 MHz?

If you have come across Ansys products, does it make sense to upgrade to the latest Ansys version? Will it work faster?

June 18, 2021, 18:12   #4
Infiniband without router
Senior Member
 
wkernkamp (Will Kernkamp)
Join Date: Jun 2014
Posts: 308
If you get a very cheap ConnectX-3 dual-port FDR InfiniBand card for each of your nodes, you will have very good performance thanks to a direct interconnect between all three nodes. A router makes set-up easier, but mine is very noisy, so I try not to use it. Download the latest drivers, IB tools and opensm from Mellanox. The cards also give you SR-IOV, so you can set up virtual interfaces in case you run the nodes as virtual machines.

Avoid the "Pro" cards, because they provide only 40 Gb/s Ethernet over InfiniBand, instead of 56 Gb/s InfiniBand (as well as Ethernet over IB). Look for 354 in the number designation of the card, if I remember correctly. There are faster versions of InfiniBand nowadays, but with just three nodes it shouldn't make much difference, because you will already have very good bandwidth and very low latency. MPI is optimized for InfiniBand with direct memory access, so there is no need for tuning. Non-Mellanox cards can be flashed with mstflint to the latest Mellanox firmware. It is not hard.
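To verify that MPI traffic is really going over the InfiniBand links once everything is cabled up and opensm is running, a minimal ping-pong test between two nodes is enough. The sketch below is illustrative only (hostnames and defaults are placeholders, not a tuned benchmark): run it with a small message size to see latency and a large one to see bandwidth. On a healthy FDR link you should get a few microseconds and something on the order of 5-6 GB/s respectively, versus roughly 1 GB/s for 10G Ethernet.

Code:
/* Minimal MPI ping-pong sketch: measures one-way time and bandwidth
 * between ranks 0 and 1. Build and run (Open MPI syntax; "node1"/"node2"
 * are placeholder hostnames):
 *   mpicc pingpong.c -o pingpong
 *   mpirun -np 2 --host node1,node2 ./pingpong [message_bytes]
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    const int iters  = 1000;
    const int nbytes = (argc > 1) ? atoi(argv[1]) : (1 << 20); /* default 1 MiB */
    char *buf = malloc(nbytes);
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++) {
        if (rank == 0) {                      /* send, then wait for the echo */
            MPI_Send(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else if (rank == 1) {               /* echo back to rank 0 */
            MPI_Recv(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Send(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    double t1 = MPI_Wtime();

    if (rank == 0) {
        double one_way = (t1 - t0) / (2.0 * iters);   /* seconds per one-way trip */
        printf("avg one-way time: %.2f us, bandwidth: %.2f GB/s\n",
               one_way * 1e6, nbytes / one_way / 1e9);
    }

    free(buf);
    MPI_Finalize();
    return 0;
}

The same binary run over the onboard Ethernet gives a quick A/B comparison of the two interconnects.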



Caution: If you need to get this up and running for a time-critical application at work, get a router, because it will do the routing automatically and make most of it plug and play.

July 2, 2021, 04:50   #5
Rec
New Member
 
Sergey
Join Date: Jan 2018
Posts: 18
Quote:
Originally Posted by wkernkamp
Caution: If you need to get this up and running for a time-critical application at work, get a router, because it will do the routing automatically and make most of it plug and play.
Thank you for your answer.

Could you recommend a router for this task, with an optimal price/performance ratio?


