CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   Hardware (https://www.cfd-online.com/Forums/hardware/)
-   -   OpenFOAM machine recommendation (https://www.cfd-online.com/Forums/hardware/135152-openfoam-machine-recommendation.html)

snowygrouch May 9, 2014 13:43

OpenFOAM machine recommendation
 
Hi,
I'm running OpenFOAM, mostly the pisoFoam solver, for car-body external aerodynamics.
Unstructured mesh, cell count around 10 million.

I want to spend under £3500 (about $5,900 USD) on the whole thing. (I have looked over the various existing threads on this forum, but wasn't quite satisfied by what I found.)

4 Options:

1) Single quad-socket board with four 16-core Opteron 6274 CPUs, £3500
64 cores, 256 GB RAM, Supermicro H8QGi-F motherboard

http://www.pugetsystems.com/featured...orkstation-100

Downside = only one FPU shared between each pair of cores (one per module)
Upside = all-in-one pre-built system, no messing around

2) Build a Helmer-style cluster for the £3500, with perhaps 10 boards,
each carrying an 8-core AMD FX-8350 (one board + CPU + 8 GB RAM is about £250).

Upside = can afford more cores
Downside = have to set up the networking etc., and it would be Ethernet
with a Netgear 16-port GS116 switch.

3) Rent cluster time from Penguin, Amazon, or Rescale.com,
who all claim to offer OpenFOAM-enabled systems.

Upside = no hardware to buy
Downside = can be expensive for all the setup runs when developing
new simulations, and the performance of these kinds of services is not quantified in publications.


4) Something I didn't consider, for under $5,900 USD...


I'm well aware that Intel CPUs are in many ways the better option, but since I have no parallel licence cost issues and can run as many cores as I can buy, from a cost-per-FLOP perspective I'm struggling to see a winning combination from Intel. If I go Intel, I roughly halve the number of CPU cores I can buy.
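
As a rough illustration of that cost-per-core argument, here is a small sketch comparing the two AMD builds above with a hypothetical dual-socket Intel alternative. The Intel price and core count are invented placeholders, and £/core is of course only a crude proxy for £/FLOP:

Code:
# Rough cost-per-core comparison. Prices for the two AMD options are the
# ballpark figures from this thread; the Intel entry is a hypothetical
# placeholder, not a real quote. Memory bandwidth per core matters far
# more for CFD throughput than raw core count (see the replies below).

options = {
    "Quad-socket Opteron 6274 box":        {"price_gbp": 3500,     "cores": 64},
    "10x FX-8350 Helmer-style cluster":    {"price_gbp": 10 * 250, "cores": 10 * 8},
    "Hypothetical dual-socket Intel Xeon": {"price_gbp": 3500,     "cores": 2 * 8},
}

for name, o in options.items():
    print(f"{name}: {o['cores']} cores, £{o['price_gbp'] / o['cores']:.0f} per core")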

Current workstation spec:
Dual-socket Supermicro X7DWA-N motherboard
Two quad-core 3.33 GHz Xeon X5470s
Quadro FX 3700 graphics card

Any advice much appreciated.

Regards

Calum Douglas
www.calumdouglas.ch
www.scorpion-dynamics.com

CapSizer May 12, 2014 05:31

I would suggest that you shouldn't get too hung up on the number of compute cores the AMD processors offer. Rather, look at what you can get in terms of memory system performance; some benchmarks published on this forum quantify the benefit of memory performance quite nicely. The quad-socket AMD solution is not a bad option (because you get 16 memory channels on a single board), but you won't benefit that much from spending the money on the 16-core variants of the CPU. The downside of the single 4-socket board is that it limits you to 1600 MHz RAM, whereas significantly faster RAM is possible on a single-socket board, but then you also need to take care of the networking, preferably InfiniBand.

Core i7 machines are favoured because you can get 4 memory channels per socket, which you can't get on other single-socket systems.
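
To put rough numbers on the memory-channel argument, here is a back-of-envelope sketch of theoretical peak DDR3 bandwidth per core. The DDR speeds and core counts are assumptions for illustration, and sustained bandwidth is always well below the theoretical peak, but the ratios show why channels per core matter:

Code:
# Theoretical peak DDR3 bandwidth: transfers/s x 8 bytes x number of channels.
# Speeds and core counts below are assumed for illustration; sustained
# bandwidth in practice is considerably lower than these peaks.

def peak_gb_per_s(mega_transfers_per_s, channels):
    return mega_transfers_per_s * 8 * channels / 1000.0

configs = [
    # (description,                      MT/s, channels, cores)
    ("Quad Opteron 6274, 16 ch @ 1600",  1600, 16, 64),
    ("Single FX-8350, 2 ch @ 1866",      1866, 2,  8),
    ("Core i7 quad-channel @ 1866",      1866, 4,  6),
]

for name, mts, ch, cores in configs:
    bw = peak_gb_per_s(mts, ch)
    print(f"{name}: {bw:.0f} GB/s total, {bw / cores:.1f} GB/s per core")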

snowygrouch May 12, 2014 15:34

network protocol
 
Ah yes, I was afraid the network speed would be the issue.

Does anyone have an opinion on the newer 10 Gb Ethernet or
Thunderbolt networking protocols?

If I can't afford networking of sufficient speed, I will probably just go
for the 4-socket motherboard.

Thanks for the info

Calum

CapSizer May 12, 2014 16:02

10 Gb Ethernet is a strange animal. The cards and switches are virtually as expensive as InfiniBand, but much slower, with FDR IB able to hit 56 Gb/s. What makes it even worse for 10 Gb Ethernet is that its latency is much worse than IB's. The cheap trick is to look for used IB cards, cables and a switch on eBay. The stuff is surprisingly cheap, but you will be on your own making it work.
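
To see why the latency matters so much, a crude message-cost model helps: the time for one point-to-point exchange is roughly latency plus message size over bandwidth. The latency and bandwidth figures below are assumed ballpark values for illustration, not measurements:

Code:
# Crude point-to-point message cost: time = latency + size / bandwidth.
# Latency/bandwidth figures are assumed ballpark values, not benchmarks.

interconnects = {
    "Gigabit Ethernet": {"latency_us": 50.0, "bw_gb_per_s": 0.125},
    "10 Gb Ethernet":   {"latency_us": 10.0, "bw_gb_per_s": 1.25},
    "FDR InfiniBand":   {"latency_us": 1.0,  "bw_gb_per_s": 7.0},
}

def msg_time_us(size_bytes, latency_us, bw_gb_per_s):
    return latency_us + size_bytes / (bw_gb_per_s * 1e3)

for size in (1_000, 100_000):  # small and large halo messages, in bytes
    print(f"Message size {size} bytes:")
    for name, p in interconnects.items():
        print(f"  {name}: {msg_time_us(size, **p):.1f} us")

For the kilobyte-sized halo exchanges you get at high core counts, the cost is latency-dominated, which is where 10 GbE loses badly to IB despite the headline bandwidth.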

ghost82 May 16, 2014 11:52

Hi, I agree with Charles: you can get cheap IB cards and cables on eBay.
I recently bought two refurbished Mellanox MHGS18-XTC cards (20 Gb/s) from the USA and a 5 m Mellanox cable (MCC4L28-005, 20 Gb/s) to connect them.

Each Mellanox card cost $32.99
The cable cost $19

Shipping to Italy: $25.38 (IB cards) + $21.23 (cable)

VAT (yes... we have VAT here in Italy... :( ): $11.66 for the cards + $0 for the cable (luckily it was cheap enough that no VAT was applied)

Conclusion: two 20 Gb/s Mellanox IB cards + a 5 m cable for $143.25, which I think is very cheap.

Daniele

derekm May 17, 2014 08:28

My approach was to go even cheaper:

H8QME boards with 4 × Opteron 8381 HE processors, giving only 16 cores per board.

Unfortunately this means building custom cases.

Board: £100
4 CPUs: £36
16 GB memory: £70
PSU: £30
Fans: £15
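
If the £36 covers all four CPUs (which is how I read it), the totals work out as below; the per-core figure is just the node cost divided by 16:

Code:
# Quick tally of the parts list above (assuming £36 covers all four CPUs).
parts_gbp = {"board": 100, "4x Opteron 8381 HE": 36, "16 GB RAM": 70, "PSU": 30, "fans": 15}
node = sum(parts_gbp.values())
print(f"Per 16-core node: £{node}, per core: £{node / 16:.1f}")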

CapSizer May 17, 2014 14:37

Daniele, did you have any trouble getting the IB stuff to work, and what operating system are you running?

Quote:

Originally Posted by ghost82 (Post 492404)
Hi, I agree with Charles: you can get cheap IB cards and cables on eBay.
I recently bought two refurbished Mellanox MHGS18-XTC cards (20 Gb/s) from the USA and a 5 m Mellanox cable (MCC4L28-005, 20 Gb/s) to connect them.

Daniele


ghost82 May 17, 2014 16:54

I'm running Windows 7 Professional 64-bit on both nodes.
No problems at all.
As suggested by Erik in another post, the only thing to do is install the correct drivers (I downloaded WinOF 2.1.2) and enable the Winsock Direct protocol during installation.

PS: I'm using Fluent, not OpenFOAM.

http://www.cfd-online.com/Forums/har...d-win7x64.html

derekm May 19, 2014 18:17

For OpenFOAM, when is it worth going to IB?

With 20 × Opteron 8381 (quad-core 2.5 GHz) processors spread across 7 boards, is it worthwhile going to IB rather than Gb Ethernet? How substantial is the performance difference?

CapSizer May 20, 2014 03:32

You should absolutely go with InfiniBand. Don't even think of doing such a cluster with Gb Ethernet. Unfortunately I don't have the results with me here, but when testing a 6-node, dual-socket, hex-core Opteron system a few years ago, the results were startling. For nominal test cases (something like 5 million cells), when using anything more than about 20 cores on Gb Ethernet the performance was essentially flat, and even got worse with more cores. With QDR IB the improvement was more or less linear all the way up to 72 cores. You are looking at 80 cores for your system ... it definitely must be IB. To put it differently, you would probably do better with 40 cores and IB than with 80 on GbE.

Quote:

Originally Posted by derekm (Post 492957)
With 20 × Opteron 8381 (quad-core 2.5 GHz) processors spread across 7 boards, is it worthwhile going to IB rather than Gb Ethernet? How substantial is the performance difference?
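
A toy strong-scaling model reproduces that qualitative picture. The two penalty constants below are invented purely to illustrate the shape of the curves (GbE flattening past roughly 20 cores, IB staying close to linear up to ~72); they are not fitted to any benchmark:

Code:
# Toy strong-scaling model: ideal parallel compute plus a communication
# term that grows with the number of partitions. The penalty constants
# are invented for illustration only, not fitted to any measurement.

def speedup(n_cores, comm_penalty):
    compute = 1.0 / n_cores          # perfectly divided compute time
    comm = comm_penalty * n_cores    # communication grows with partition count
    return 1.0 / (compute + comm)

for n in (8, 16, 24, 40, 72):
    gbe = speedup(n, comm_penalty=0.002)    # assumed GbE penalty
    ib  = speedup(n, comm_penalty=0.0001)   # assumed QDR IB penalty
    print(f"{n:3d} cores: GbE speedup {gbe:5.1f}, IB speedup {ib:5.1f}")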


derekm May 20, 2014 05:18

As I thought: with GbE, two nodes/boards are OK, and three nodes/boards is pushing it.

That means I can get to 32 or 48 cores if I only use the quad-socket boards
with GbE, but the remaining cores will need InfiniBand.

More expense... the InfiniBand cards are quite cheap on eBay, but the switches are still much, much bigger money than GbE.

