
Planning a (EPYC?) workstation for thermal/reacting simulations (OpenFOAM)

#1 | April 17, 2020, 16:13 | Harris Snyder (hsnyder), New Member
Hi everyone. I'm considering a workstation build, since the commodity hardware I currently use (a desktop i7, etc.) can take weeks to finish some of the heavier simulations I run, typically combustion simulations. Most of my simulations involve heat transfer in one way or another, and some also include chemical reactions. Right now I exclusively use OpenFOAM (mainly rhoPimpleFoam and rhoReactingFoam), but I'm interested in expanding, for certain applications, to Nek5000 (spectral element method) and various lattice Boltzmann solvers. So I'd like to make the right hardware choices with those factors in mind. I have no plans to use commercial software at this point.

Let's start with the CPU(s). I'm thinking a two-socket Epyc workstation. Here are the processors that I'm considering right now, but I am totally open to other options.

Epyc 7352 (24c, 2.3-3.2 GHz)
Epyc 7302 (16c, 3.0-3.3 GHz)

Basically I'm weighing the cost and base-clock advantage of the 7302 against the higher core count of the 7352. I understand the conventional wisdom is that most OpenFOAM work is limited by memory bandwidth, so Epycs in the 16-24 core range tend to be the sweet spot. Does the fact that I sometimes run rhoReactingFoam simulations, or my interest in spectral element / lattice Boltzmann methods, influence that at all? For example, does clock speed become more important given any of those considerations?
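For reference, here's the back-of-envelope bandwidth-per-core arithmetic I've been working from. Just a sketch in Python; I'm assuming 8 channels of DDR4-3200 per socket, which I believe is Rome's spec:

Code:
# Rough memory bandwidth per core for one Epyc Rome socket.
# Assumptions: 8 DDR4-3200 channels per socket, 8 bytes per transfer.
channels = 8
mt_per_s = 3200                    # DDR4-3200 = 3200 mega-transfers/s
bytes_per_transfer = 8             # 64-bit channel

bw_gb_s = channels * mt_per_s * bytes_per_transfer / 1000
print(f"per-socket bandwidth: {bw_gb_s:.1f} GB/s")    # ~204.8 GB/s

for cores in (16, 24, 32):
    print(f"{cores} cores: {bw_gb_s / cores:.1f} GB/s per core")

By that math the 7302 gets ~12.8 GB/s per core and the 7352 ~8.5 GB/s, which is partly why I'm unsure how much the extra cores buy.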

I'd be curious for any and all advice regarding CPU selection. Thanks.

#2 | April 17, 2020, 17:49 | Alex (flotus1), Senior Member
I would say that with your applications in mind, the 7352 is hands down the better choice compared to the 7302. And when factoring in the total system cost, the price difference between the CPUs becomes less pronounced.
While the base clocks make it look like the 7352 has a clock speed disadvantage, keep in mind that these CPUs never really operate at base frequency. When running 16 cores, both will turbo to very similar frequencies. And the 7352 still has 50% more cores for situations where core count matters more than frequency.
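If you want to see what the cores actually run at under load rather than trusting the spec sheet, watch the cpufreq counters while a job is running. A quick sketch, Linux only, and it assumes your kernel exposes scaling_cur_freq via sysfs:

Code:
# Print the current frequency of every core; run this while the solver is loaded.
import glob

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
    cpu = path.split("/")[5]          # e.g. "cpu17"
    with open(path) as f:
        khz = int(f.read().strip())   # sysfs reports kHz
    print(f"{cpu}: {khz / 1e6:.2f} GHz")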

#3 | April 17, 2020, 18:17 | Harris Snyder (hsnyder), New Member
Hey, thanks for the quick reply and the advice. What's your reasoning, just so I understand the relevant factors? I'm guessing it has something to do with the increased workload per iteration compared with incompressible/isothermal flow, but I'm not well versed in these sorts of assessments. Would going to a 32-core processor be a good idea?

#4 | April 17, 2020, 18:46 | Alex (flotus1), Senior Member
My reasoning for higher core-count CPUs when running open-source parallel CFD codes: workloads are rarely 100% memory-bandwidth-limited at two cores per memory channel. Scaling might get worse, but there is still some more performance to be had with higher core counts. And the more computationally expensive your models get, the more those extra cores pay off.
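And if you get the chance, run a quick strong-scaling sweep on your own cases; that tells you more than any rule of thumb. A rough driver sketch, assuming an OpenFOAM case in the current directory, foamDictionary on the PATH, and rhoPimpleFoam standing in for whatever solver you use:

Code:
# Strong-scaling sweep: time the same case at several core counts. Sketch only.
import subprocess, time

for n in (8, 16, 24, 32, 48):
    # set the decomposition count, then re-decompose the mesh
    subprocess.run(["foamDictionary", "system/decomposeParDict",
                    "-entry", "numberOfSubdomains", "-set", str(n)], check=True)
    subprocess.run(["decomposePar", "-force"], check=True)
    t0 = time.time()
    subprocess.run(["mpirun", "-np", str(n), "rhoPimpleFoam", "-parallel"], check=True)
    print(f"{n} cores: {time.time() - t0:.0f} s")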

If I had to buy a new AMD Epyc setup now, with new retail CPUs, my choice would probably be the 7452, with the 7352 a close second. At least if your budget allows for the 32-core CPUs without compromising on other aspects of the machine.

#5 | April 17, 2020, 19:16 | Harris Snyder (hsnyder), New Member
Ah, I see. I don't have to buy Epyc, nor new retail CPUs; I'm open to alternatives if there are better ones. I've just been hearing good things about the Epyc Rome chips. Is something else more advisable?

Budget-wise, I could manage 7452s if I start with one socket filled and populate the second one a few months down the road.

#6 | April 18, 2020, 04:00 | Alex (flotus1), Senior Member
Don't get me wrong, Epyc Rome is the best choice. I just meant that you don't necessarily have to pay new-retail prices for these CPUs when you are building the system yourself: https://www.ebay.com/itm/100-0000000...cAAOSwnTBeZjF3

#7 | April 19, 2020, 18:43 | Harris Snyder (hsnyder), New Member
Excellent point. So...

CPU: 2x Epyc 7452

Motherboard: the only appropriate dual-socket Epyc board I know of is the Supermicro H11DSi, so I guess it'll be that.

RAM: What I understand from reading elsewhere on the forum is that I should be looking for DDR4-3200 dual-rank memory... perhaps this: https://www.newegg.com/p/1X5-003Z-017J5 (I would of course need one set per CPU; rough capacity math after this list).

CPU coolers: I've seen the Noctua NH-U14S recommended for Epyc processors in this TDP range elsewhere on the forum, so I figured I'd go with that for CPU cooling.

GPU: Nvidia GTX 1070 (I already have it).

Storage: Various SATA SSDs I have lying around.


Case and case fans: Unsure, still looking into what fits.
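On the RAM capacity (the math I mentioned above): a rough sketch, leaning on the rule of thumb I've seen on this forum of very roughly 1 GB per million cells in OpenFOAM, with reacting cases needing more for the extra species fields:

Code:
# Rough capacity check for 2 sockets x 8 DIMMs x 16 GB.
# The 1 GB per million cells figure is a rule of thumb, not a measurement;
# rhoReactingFoam with a large mechanism will need considerably more.
sockets, dimms_per_socket, gb_per_dimm = 2, 8, 16
total_gb = sockets * dimms_per_socket * gb_per_dimm
print(f"total RAM: {total_gb} GB")                     # 256 GB
print(f"ballpark ceiling: ~{total_gb} million cells")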



I'm still drawn to the idea of filling one socket now and adding the second CPU down the road, from a budget standpoint. If I do that, I'm curious what the power supply situation would be. Should I get a ~750 W power supply right out of the gate and run it at fairly low load until I fill the second socket, or use a lower-power unit until I get the second CPU? I have a 500 W 80+ Bronze supply lying around; would that be acceptable for one CPU, or is that too sketchy?
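For what it's worth, my rough power budget for the finished build (sketch only; the CPU and GPU numbers are TDP/board-power specs, the RAM and drive figures are my guesses):

Code:
# Very rough load estimate for the dual-socket build.
cpus = 2 * 155    # Epyc 7452 is a 155 W TDP part
gpu  = 150        # GTX 1070 board power
ram  = 16 * 4     # ~4 W per DIMM under load (guess)
misc = 40         # SSDs, fans, board overhead (guess)

total = cpus + gpu + ram + misc
print(f"full build: ~{total} W")                           # ~564 W
print(f"one socket, half the RAM: ~{total - 155 - 32} W")  # ~377 W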

#8 | April 20, 2020, 03:22 | Alex (flotus1), Senior Member
Sounds like a plan, though I would be a bit sketched out by the memory manufacturer. I know the situation seems dire on Newegg; maybe you can find a more reputable brand on a different site. 16 GB DDR4-3200 reg ECC should cost in the range of $100 per DIMM:
https://geizhals.eu/?cat=ramddr3&xf=...=p#productlist

Starting with only one CPU and half the memory should be possible. You could probably use your existing power supply in the beginning, but with all the savings from memory and CPU, I personally would not bother. Get a proper power supply with two 8-pin EPS connectors.

The Noctua NH-U14S TR4-SP3 CPU coolers are fine for this.
For the case, my default recommendation is the Phanteks Enthoo Pro. It is one of the very few PC cases that can fit SSI-EEB motherboards without any hassle, and it doesn't look too gamer-y. For fans, Noctua is always a good choice; Arctic has some good-value alternatives: https://www.arctic.ac/de_en/f14-pwm.html

Edit: be sure to get a rev. 2 of the H11DSi. Apparently, there are ways to make Epyc Rome work on rev. 1, but you can easily avoid that hassle.

#9 | April 20, 2020, 12:26 | Harris Snyder (hsnyder), New Member
Thanks again Alex, I'll keep all of that in mind.

The case seems like a great pick, and affordable too.

Regarding the memory: keep in mind that what I linked was a set of 8 DIMMs, not 16, so it's about $96 per DIMM... still sketchy? Either way, I'll look around for alternatives, as per your suggestion.

#10 | April 20, 2020, 13:07 | Alex (flotus1), Senior Member
It wasn't about the price; that's about what you would expect.
The thing is, Nemix puts whatever ICs they want on their memory modules, which is why they will never end up on a supported-memory list for server boards, or on a list of "reputable" memory manufacturers for that matter.
If you can return the memory in case it is not compatible, and you have an easy way to handle a return in case of premature failure, you can give it a try.

#11 | April 20, 2020, 13:12 | Harris Snyder (hsnyder), New Member
Got it. I'll avoid them. Thank you.
