
Looking for a new workstation or small cluster, max 12k€

June 19, 2018, 03:49  #1
Looking for a new workstation or small cluster, max 12k€
Senior Member
 
Join Date: Oct 2013
Posts: 397
Rep Power: 18
chriss85 will become famous soon enough
Hello,

I finally have the budget to buy a new machine for running a custom OpenFOAM-based CFD solver. Cell count will typically be < 1M cells since a lot of equations are solved transiently.

As mentioned in the title, I have a budget of 12k€. A small cluster might be viable, but I'm not sure whether it would perform much better than a single strong workstation. Is there much to be gained from multiple nodes, i.e. accepting more latency between nodes in return for more total memory bandwidth?

As far as I know, Epyc is a serious alternative right now in terms of price/performance. Is this likely to hold for the cell counts mentioned above, given Intel's somewhat higher single-core performance?

I've been looking at the offerings at https://www.deltacomputer.com, which were once suggested to me by an admin of an academic CFD group.

If I max out a workstation I could get:

Quote:
https://www.deltacomputer.com/d20z-uln-zn.html
Chassis
1 x Fractal Design Define XL R2 - 2x GPU + 765.90 €
CPU
2 x AMD EPYC 7601 + 8159.32 €
RAM
16 x Micron MTA18ASF1G72PDZ-2G6 + 1448.32 €
Disk
1 x Crucial MX500 CT500MX500SSD1 + 97.87 €
1 x WD Ultrastar DC HC320 HUS728T8TALN6L4 + 231.66 €
GPU
1 x NVIDIA GeForce GT 1030 + 87.03 €
11,465.06 €
As for clusters, I could try to get

Quote:
https://www.deltacomputer.com/d20-4z-m2-zn.html
CPU
8 x AMD EPYC 7251 + 3691.12 €
RAM
64 x Micron MTA18ASF1G72PDZ-2G6 + 5793.28 €
Disk
4 x Crucial MX500 CT250MX500SSD1 + 260.96 €
SIOM
4 x SuperMicro AOC-MGP-i2M + 201.16 €
14,219.81 €
which is somewhat over my budget but might be possible, or a Xeon E5-v4 build with 2 nodes:

Quote:
CPU
4 x Intel Xeon E5-2690 v4 + 8021.32 €
RAM
16 x Micron MTA18ASF1G72PDZ-2G6 + 1448.32 €
Disk
2 x Crucial MX500 CT250MX500SSD1 + 130.48 €
11,440.74 €
or 4 nodes:

Quote:
CPU
8 x Intel Xeon E5-2637 v4 + 7611.52 €
RAM
32 x Micron MTA18ASF1G72PDZ-2G6 + 2896.64 €
Disk
4 x Crucial MX500 CT250MX500SSD1 + 260.96 €
14,030.23 €
The 4-node Intel solution doesn't seem very attractive, or is the larger number of nodes more important here?

In the end I'm not really sure how to get the most performance out of my budget, so I would be glad for some pointers. Suggestions for other suppliers in Europe, preferably Germany, are also welcome.

Last edited by chriss85; June 19, 2018 at 06:34.

June 19, 2018, 04:41  #2
Super Moderator
 
flotus1's Avatar
 
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,400
Rep Power: 47
flotus1 has a spectacular aura about
There is quite a lot to go over here...

1) Epyc 7601 is by no means good value for CFD: too many cores for too little memory bandwidth. With Epyc 7251 the opposite is true: not enough cores to saturate the memory bandwidth, so the platform overhead for motherboards, cases, memory, power supplies and interconnects weighs too heavily. The sweet spot for bandwidth-limited codes is the 16-core variants; for more general-purpose nodes, 24 cores per CPU are also an option (see the bandwidth sketch after this list).

2) Your cost estimates seem to be missing a lot of stuff like cases, motherboards and power supplies.

3) "SuperMicro AOC-MGP-i2M" What exactly should this be used for? All motherboards have built-in Gigabit Ethernet, some even have 10G. For a node-interconnect, better use decommissioned Infiniband gear from ebay. For only two nodes, you might get away with 10G Ethernet. But then again, two Infiniband cards and a cable cost barely more than 100€. A good investment when you have low cell counts which increases the communication/computation ratio.

4) Broadwell Xeons (v4) should no longer be bought new. Skylake-SP provides more than 50% more peak memory bandwidth for about the same price.
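To put rough numbers on points 1) and 4), here is a back-of-the-envelope sketch of theoretical peak memory bandwidth per core. A minimal sketch: the channel counts and DDR4 speeds are the published platform specs, the Skylake entry is an assumed 16-core SKU such as the Gold 6130, and real sustained bandwidth will be lower.

Code:
# Peak DDR4 bandwidth per channel: transfer rate (MT/s) x 8 bytes.
def channel_bw_gbs(mt_s):
    return mt_s * 8 / 1000  # GB/s per channel

# name: (cores, memory channels per socket, DDR4 speed in MT/s)
cpus = {
    "Epyc 7601 (32C)":       (32, 8, 2666),
    "Epyc 7351 (16C)":       (16, 8, 2666),
    "Epyc 7251 (8C)":        ( 8, 8, 2400),  # 7251 tops out at DDR4-2400
    "Xeon E5-2690 v4 (14C)": (14, 4, 2400),  # Broadwell-EP
    "Skylake-SP (16C)":      (16, 6, 2666),  # e.g. Xeon Gold 6130

}

for name, (cores, channels, mts) in cpus.items():
    socket_bw = channels * channel_bw_gbs(mts)
    print(f"{name:<22} {socket_bw:6.1f} GB/s per socket, "
          f"{socket_bw / cores:5.1f} GB/s per core")

The 32-core Epyc lands around 5 GB/s per core, about the same as Broadwell, while the 16-core Epyc doubles that. The Skylake comparison (6 x DDR4-2666 = ~128 GB/s vs 4 x DDR4-2400 = ~77 GB/s) is where the "more than 50%" figure comes from.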

Buying new, you will barely be able to cram more than 2-3 nodes into your budget. I think I would go with two nodes using AMD Epyc 7351 and use the leftover budget for a decent amount of fast SSD storage. You surely want to save some of the results during transient simulation runs.
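Regarding storage, a quick sizing sketch; the cell and field counts below are made-up illustrative numbers, not anything from this thread:

Code:
# Data volume of saving transient results, binary double precision.
cells = 1_000_000   # upper end of the cell counts mentioned here
fields = 8          # scalar-equivalent fields per write (U counts as 3)
snapshot_mb = cells * fields * 8 / 1e6  # 8 bytes per value

print(f"{snapshot_mb:.0f} MB per snapshot")                      # ~64 MB
print(f"{snapshot_mb * 10_000 / 1e3:.0f} GB for 10k snapshots")  # ~640 GB

Even a single SATA SSD (~500 MB/s) writes such a snapshot in a fraction of a second; over a long transient run it is the total capacity, not the write speed, that you end up paying for.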

June 19, 2018, 06:32  #3
Senior Member
 
Join Date: Oct 2013
Posts: 397
Rep Power: 18
chriss85 will become famous soon enough
Quote:
Originally Posted by flotus1
1) Epyc 7601 is by no means good value for CFD: too many cores for too little memory bandwidth. With Epyc 7251 the opposite is true: not enough cores to saturate the memory bandwidth, so the platform overhead for motherboards, cases, memory, power supplies and interconnects weighs too heavily. The sweet spot for bandwidth-limited codes is the 16-core variants; for more general-purpose nodes, 24 cores per CPU are also an option.
Thanks, it's good to know what kind of CPUs work best.

Quote:
2) Your cost estimates seem to be missing a lot of stuff like cases, motherboards and power supplies.
This is copied from the webpage; those items are included in the configuration. I will link the configuration pages for each system in the first post.

Quote:
3) "SuperMicro AOC-MGP-i2M" What exactly should this be used for? All motherboards have built-in Gigabit Ethernet, some even have 10G. For a node-interconnect, better use decommissioned Infiniband gear from ebay. For only two nodes, you might get away with 10G Ethernet. But then again, two Infiniband cards and a cable cost barely more than 100€. A good investment when you have low cell counts which increases the communication/computation ratio.
I just checked and it looks to be a modular networking system for Ethernet/Infiniband adapters: https://www.supermicro.com/white_pap...paper_SIOM.pdf There are also network cards with various speeds to choose from, from Ethernet to Infiniband solutions, but what kind of speed is suitable here? 25 Gbit/s? I'm not sure a custom-built cluster is the right solution for me.

Quote:
4) Broadwell Xeons (v4) should no longer be bought new. Skylake-SP provides more than 50% more peak memory bandwidth for about the same price.

Buying new, you will barely be able to cram more than 2-3 nodes into your budget. I think I would go with two nodes using AMD Epyc 7351 and use the leftover budget for a decent amount of fast SSD storage. You surely want to save some of the results during transient simulation runs.
That's a good point regarding the Xeons; I've only looked at this supplier so far, and they don't seem to have Skylake on their webpage yet. I'll try to get an offer for a 4-node chassis with only 2 or 3 nodes populated to see what is realistic.

June 19, 2018, 10:20  #4
Super Moderator
 
flotus1's Avatar
 
Alex
Join Date: Jun 2012
Location: Germany
Posts: 3,400
Rep Power: 47
flotus1 has a spectacular aura about
Quote:
There are also network cards with various speeds to choose from, from Ethernet to Infiniband solutions, but what kind of speed is suitable here? 25 Gbit/s? I'm not sure a custom-built cluster is the right solution for me.
The slowest, i.e. cheapest, Infiniband solution will do. While Ethernet transfer speeds are slowly getting into the same range as Infiniband, latency is still much higher.
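To illustrate the latency point with a crude time-per-message model, t = latency + size/bandwidth; the latency and bandwidth figures below are typical ballpark MPI values, not measurements:

Code:
# One small MPI message, e.g. a halo exchange at low cell counts per rank.
interconnects = {
    # name: (latency in s, bandwidth in bytes/s) -- rough typical values
    "Gigabit Ethernet": (50e-6, 0.12e9),
    "10G Ethernet":     (15e-6, 1.2e9),
    "QDR Infiniband":   (1.5e-6, 3.2e9),
}

size = 4096  # bytes

for name, (lat, bw) in interconnects.items():
    t = lat + size / bw
    print(f"{name:<17} {t * 1e6:5.1f} us per message "
          f"(latency share: {lat / t:4.0%})")

At this message size the wire speed barely matters: the fixed per-message latency dominates, which is why used Infiniband beats even fast Ethernet when the cell count per node is low.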
Bear in mind that the prices from the site you quoted don't include taxes.
