May 14, 2024, 08:46 |
Switch for new cluster.
#1 |
New Member
Join Date: Jun 2023
Posts: 3
Rep Power: 2
Hi,
I am in the process of specifying a small HPC cluster for our team based in the UK, this will be mostly used for some FEA and FSI but mostly CFD, meshes in the region of 40M cells. We have some Ansys licenses, but I think the bulk of the heavy work will be done in OpenFOAM. Initially we had a quote for 2 PowerEdge nodes from Dell, where one would work as a head node as well as solving. We are now changing the configuration to 1 separate lower spec node for a head node and the 2 PowerEdge nodes for solving. I have added in a switch for connecting everything and future expansion. My question is would a 25Gbe switch bottleneck this configuration? We are trying to keep the costs down so trying to stay away from 100Gbe and Infiniband. The nodes are based around 2x 9384X CPUs and 24x 16GB RDIMMs in each node. Thanks. |
May 14, 2024, 13:55 |
#2 |
Senior Member
Will Kernkamp
Join Date: Jun 2014
Posts: 343
Rep Power: 12
With just two nodes you should be fine, even with less bandwidth than that. Remember that most domain boundaries will be internal to each node, which limits the bandwidth needed for boundary transfers between cores on different machines. I would say go dirt cheap; you can always upgrade later. Even 2.5 GbE is probably good enough.
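To see why two-node traffic is modest, here is a rough back-of-envelope sketch. All the rates in it (fields per halo swap, swaps per iteration, iterations per second) are illustrative assumptions, not measurements from any real solver; only the 40M-cell mesh size comes from the original post.

```python
# Back-of-envelope estimate of inter-node halo-exchange traffic for a
# 2-node CFD run. The parameter defaults are assumptions for illustration.

def internode_traffic_gbit_per_s(
    n_cells=40e6,        # total mesh cells (from the original post)
    n_fields=10,         # assumed values exchanged per face per halo swap
    bytes_per_value=8,   # double precision
    swaps_per_iter=5,    # assumed halo exchanges per solver iteration
    iterations_per_s=2,  # assumed solver throughput
):
    # Splitting a roughly cubic 3-D mesh between 2 nodes creates one
    # internal interface of about n_cells**(2/3) faces (surface scales
    # with the 2/3 power of volume).
    interface_faces = n_cells ** (2 / 3)
    bytes_per_swap = interface_faces * n_fields * bytes_per_value
    bits_per_second = bytes_per_swap * swaps_per_iter * iterations_per_s * 8
    return bits_per_second / 1e9

print(f"~{internode_traffic_gbit_per_s():.2f} Gbit/s")
```

With these assumptions the sustained traffic comes out well under 1 Gbit/s, which is why even modest Ethernet can carry a two-node run. Bandwidth is not the whole story, though: linear-solver iterations also involve many small latency-sensitive MPI messages, so scaling to more nodes is usually where cheap Ethernet starts to hurt.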