Looking for a new workstation or small cluster, max 12k€
June 19, 2018, 03:49 | #1
Senior Member
Join Date: Oct 2013
Posts: 397
Rep Power: 18
Hello,
I finally have the budget to buy a new machine for running a custom OpenFOAM-based CFD solver. Cell counts will typically be below 1M, since a lot of equations are solved transiently. As mentioned in the title, I have a budget of 12k€.

A small cluster might be viable, but I'm not sure whether it would get me much better results than a single strong workstation. Is there much to be gained from multiple nodes, i.e. higher latency between nodes but more aggregate memory bandwidth in return?

As far as I know, Epyc is a serious alternative right now in terms of price/performance. Is that likely to hold for the cell counts mentioned above, given Intel's somewhat higher single-core performance?

I've been looking at the https://www.deltacomputer.com offerings, which were once suggested to me by an admin of an academic CFD group. If I max out a workstation I could get:

In the end I'm not really sure how to get the most performance out of my budget, so I would be glad for some pointers. If you can suggest other suppliers from Europe, preferably Germany, I would be glad as well.
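For what it's worth, here is roughly how I would test the scaling question myself, as a minimal sketch (host names, core count and the solver name are placeholders; this assumes a stock OpenFOAM install with OpenMPI):

Code:
# decompose the case into one subdomain per core
# (set numberOfSubdomains accordingly in system/decomposeParDict)
decomposePar
# run across both nodes; hosts.txt lists node1 and node2 with their slots
mpirun -np 32 --hostfile hosts.txt myCustomSolver -parallel
reconstructPar
# raw inter-node latency can be checked beforehand with the
# OSU micro-benchmarks
mpirun -np 2 --host node1,node2 osu_latency

Comparing the wall-clock time of the same case on one node against two nodes would then show directly whether the extra memory bandwidth outweighs the interconnect latency.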
June 19, 2018, 04:41 | #2
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,400
Rep Power: 47
There is quite a lot to go over here...
1) The Epyc 7601 is by no means good value for CFD: too many cores for too little memory bandwidth. With the Epyc 7251 the opposite is true: not enough cores to saturate the memory bandwidth, so the overhead cost for motherboards, cases, memory, power supplies and interconnects weighs too heavily. The sweet spot for bandwidth-limited codes are the 16-core variants; for more general-purpose nodes, 24 cores per CPU are also an option.

2) Your cost estimates seem to be missing a lot of things like cases, motherboards and power supplies.

3) "SuperMicro AOC-MGP-i2M": what exactly would this be used for? All motherboards have built-in Gigabit Ethernet, some even have 10G. For a node interconnect, better to use decommissioned Infiniband gear from eBay. For only two nodes you might get away with 10G Ethernet, but then again, two Infiniband cards and a cable cost barely more than 100€. That is a good investment when you have low cell counts, which increase the communication/computation ratio.

4) Broadwell Xeons (v4) should no longer be bought new. Skylake-SP provides more than 50% more peak memory bandwidth for about the same price.

Buying new, you will barely be able to cram more than 2-3 nodes into your budget. I think I would go with two nodes using AMD Epyc 7351 and use the leftover budget for a decent amount of fast SSD storage. You surely want to save some of the results during transient simulation runs.
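To put rough back-of-the-envelope numbers on point 1 (channel counts and memory speeds are the official platform maxima; the per-core figures are simple division):

Code:
Epyc, 8 channels DDR4-2666: 8 x 21.3 GB/s = ~170 GB/s per socket
  7601, 32 cores: 170 / 32 = ~5.3 GB/s per core
  7351, 16 cores: 170 / 16 = ~10.7 GB/s per core
Broadwell-EP, 4 channels DDR4-2400: 4 x 19.2 GB/s = ~77 GB/s per socket
Skylake-SP, 6 channels DDR4-2666: 6 x 21.3 GB/s = ~128 GB/s per socket

A bandwidth-limited solver scales with the per-socket figure, which is why the 16-core parts hit the sweet spot, and why Skylake-SP has roughly two thirds more peak bandwidth than Broadwell.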
June 19, 2018, 10:20 | #3
Super Moderator
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,400
Rep Power: 47
Bear in mind that the prices from the site you quoted don't include taxes. |