
Info needed: HP ML350 Gen9 and Dell T7910


May 7, 2020, 22:40   #1
Info needed: HP ML350 Gen9 and Dell T7910
New Member
JJ (gnwt4a)
Join Date: Sep 2019
Posts: 21
Hello,
I have been asking around for info on the systems in the title, but the response has been meager. I am looking into buying one or the other second-hand, with 2x 16-core v3 CPUs and 256 GB DDR4 @2133. Both are out of production, and there is the question of whether it is possible to install later versions of Linux -- specifically Ubuntu 18.04 and CentOS 7.6. These two distros have drivers (tested by myself) for the Radeon VII cards, which I use as OpenCL accelerators for doing DNS.



The basic question is: would Ubuntu 18.04 and/or CentOS 7.6 install on either the HP ML350 G9 or the Dell T7910? If yes, is there a way to know whether the AMD drivers for the Radeon VII will work on them?


Btw, these two systems were chosen because they have 4x PCIe 3.0 x16 slots, with at least 3 slots having full x16 bandwidth.
Thanks.

--

Last edited by gnwt4a; May 7, 2020 at 22:41. Reason: typos

May 8, 2020, 05:07   #2
Senior Member
Alex (flotus1)
Join Date: Jun 2012
Location: Germany
Posts: 2,500
I don't have any experience with these two models in particular. But I have never had any trouble installing current Linux versions on somewhat recent OEM workstations.
I see a bigger problem with the power supply and cooling. Radeon VII cards have a TDP of 295W, and require two 8-pin connectors each. Not sure if the power supplies in these workstations have the connectors and the power for 3 or even 4 of these cards.
And in addition to that, Radeon VII have a cooler design with axial fans, dumping all heat into the case. The cooling of these workstations is not designed to handle that. Instead, they rely on blower-style GPUs that exhaust heat directly through the back.
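As a rough sanity check, the combined draw can be sketched like this (assumptions: 295 W nominal board power per Radeon VII, ~135 W nominal TDP per E5-2698 v3, and a guessed 100 W for RAM, drives, and fans; real draw varies with load):

```python
# Back-of-envelope power budget for the build under discussion.
# TDP figures are nominal vendor numbers, not measured draw.
GPU_TDP_W = 295   # Radeon VII typical board power
CPU_TDP_W = 135   # Xeon E5-2698 v3 nominal TDP

def peak_draw(n_gpus, n_cpus=2, platform_overhead_w=100):
    """Worst-case simultaneous draw; overhead is a guess for RAM, drives, fans."""
    return n_gpus * GPU_TDP_W + n_cpus * CPU_TDP_W + platform_overhead_w

if __name__ == "__main__":
    for n in (1, 2, 3, 4):
        print(f"{n} GPU(s): ~{peak_draw(n)} W")
```

Even three cards already push past what most OEM workstation power supplies are specified for, and that is before per-rail limits come into play.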
__________________
Please do not send me CFD-related questions via PM

May 8, 2020, 09:00   #3
New Member
JJ (gnwt4a)
Join Date: Sep 2019
Posts: 21
Thanks. Helpful as usual.

Power and thermal issues can be overcome by the user. In general, not all components run at full clip at the same time: when the GPUs are busy, the load on the CPUs may not be large, and vice versa.

By contrast, if the system gets bricked because of software problems, that is very serious for EOL systems. Power usage and noise are also very difficult to deal with. With these uncertainties in mind, who would risk ~3K euro on such systems?
--

May 8, 2020, 13:10   #4
Senior Member
Alex (flotus1)
Join Date: Jun 2012
Location: Germany
Posts: 2,500
Quote:
With these uncertainties in mind, who would risk ~3K euro on such systems?
I would not

It is not just the total power consumption you have to keep an eye on by balancing load between CPUs and GPUs. A power supply has individual rails for each of these, with individual power limits.
It is hard to find exact specifications for these OEM parts, so let me give you a different example: an HP Z840 workstation with the largest PSU has a total of three 6-pin PCIe power connectors. In contrast to the usual specification for this type of connector, each can carry up to 150W and can thus be split into two 6-pin connectors. If I wanted to connect a Radeon VII with its 8-pin connectors, I would have to use two rails from the power supply. So without exceeding the specs and risking triggering OCP, one Radeon VII card is the maximum for this workstation, despite the PSU's 1450W rating @230V.
You could probably undervolt the GPUs, or limit their power consumption. But I don't know how easy that is on Linux.
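For what it's worth, the amdgpu driver on Linux typically exposes a power cap through the hwmon sysfs interface. A minimal sketch of finding and using it (the card name, paths, and the 220 W figure are assumptions; writing the cap requires root, and not every driver/kernel combination exposes it):

```python
# Sketch: capping an amdgpu card's power limit via sysfs.
# Assumes the amdgpu driver exposes a hwmon node; paths vary per system.
import glob

def power_cap_path(card="card0"):
    """Find the hwmon power1_cap file for a given DRM card, if present."""
    hits = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_cap")
    return hits[0] if hits else None

def watts_to_microwatts(w):
    # power1_cap is expressed in microwatts
    return int(w * 1_000_000)

if __name__ == "__main__":
    path = power_cap_path()
    if path is None:
        print("no amdgpu hwmon power cap found on this machine")
    else:
        # e.g. to cap the card at a hypothetical 220 W (needs root):
        # with open(path, "w") as f:
        #     f.write(str(watts_to_microwatts(220)))
        print(path)
```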

Long story short: personally, with the intent of using several Radeon VII GPUs, I would not go with an OEM workstation. Instead, I would cobble something together from aftermarket parts. With a special focus on a large case with plenty of airflow, and a decent power supply with all the 8-pin connectors these cards need. It would probably land below 3000$ with Xeon E5 v3 CPUs, excluding the GPUs.

May 11, 2020, 02:25   #5
New Member
JJ (gnwt4a)
Join Date: Sep 2019
Posts: 21
The following prices are from eBay, in euros:

Asus Z10PE-D16 WS ~410, 2x E5-2698 v3 ~1500, 16x 16GB DDR4-2133 (PC4-17000, 2Rx4) RDIMM ~900.

That is ~2800 before you add a PSU and case, so 3K for an OEM box sounds about right -- unless you know better.

May 11, 2020, 05:14   #6
Senior Member
Alex (flotus1)
Join Date: Jun 2012
Location: Germany
Posts: 2,500
3000 dollars for an OEM workstation with these exact specs is about the usual market rate; I won't argue with that. But I think we already established that you would run into some severe limitations with such a box, so it is not just about the spec sheet.

You have not yet disclosed what specs you really need.
Let's say you really benefit from the 16 cores per CPU of a Xeon E5-2698 v3. Then one can still be had for less than $500 each on eBay.
Comparing prices for these used Xeons, you will notice that the 2698 v3 has a horrible price/performance ratio: 12-core variants like the 2678 v3 start as low as $100. The only reason to buy a system with Haswell Xeons in 2020 is price; expensive CPUs defeat the purpose.
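The gap is easy to quantify in cost per core, using the rough eBay prices just mentioned (actual listings vary):

```python
# Rough cost-per-core comparison for used Haswell Xeons,
# using the approximate eBay prices from the discussion above.
cpus = {
    "E5-2698 v3": {"cores": 16, "price_usd": 500},
    "E5-2678 v3": {"cores": 12, "price_usd": 100},
}

for name, c in cpus.items():
    print(f"{name}: {c['price_usd'] / c['cores']:.2f} $/core")
```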
Same goes for paying $3.50/GB for used DDR4-2133 reg ECC; there are cheaper offers, even on eBay.
And be careful which motherboard you choose. The PCIe slot spacing on the Asus Z10PE-D16 WS will only allow you to fit three dual-slot GPUs; the D8 variant can fit four cards.

Last edited by flotus1; May 11, 2020 at 07:30.


Tags
dns, workstations
