CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   Hardware (https://www.cfd-online.com/Forums/hardware/)
-   -   Why not install cluster by connecting workstations together for CFD application? (https://www.cfd-online.com/Forums/hardware/139045-why-not-install-cluster-connecting-workstations-together-cfd-application.html)

Anna Tian July 16, 2014 09:59

Why not install cluster by connecting workstations together for CFD application?
 
For CFD applications, what about building a cluster by connecting 8 workstations together instead of building a rack-mounted cluster? I just don't understand why people usually go for rack-mounted clusters.

It seems that if I set up 8 workstations and then build a cluster by connecting them, I can manage them easily, and the cluster can also be expanded easily. The hardware involved is almost the same, and so is the cost. With a rack-mounted cluster, I can't easily increase the box size to hold more motherboards. It also seems that the cooling for the '8 workstations cluster' would be better than for a rack-mounted cluster. Does anyone have any comments on that?

I'm wondering what the disadvantages of the '8 workstations cluster' would be compared with a rack-mounted installation. Would that limit anything for CFD applications?

By the way, I'm currently building my cluster for CFD applications. I haven't purchased any hardware yet, and I'm deciding which installation approach to choose.

CapSizer July 17, 2014 09:01

Quote:

Originally Posted by Anna Tian (Post 501753)
For CFD applications, what about building a cluster by connecting 8 workstations together instead of building a rack-mounted cluster?

It also seems that the cooling for the '8 workstations cluster' would be better than for a rack-mounted cluster. Does anyone have any comments on that?

I'm wondering what the disadvantages of the '8 workstations cluster' would be compared with a rack-mounted installation. Would that limit anything for CFD applications?

You can certainly do that; I think most of the guys on this forum who have been around for a while have done it. First, your advantages:

  1. It can be a bit cheaper
  2. You can do it with off-the-shelf commodity hardware
  3. Because the individual systems are larger and reasonably quietly cooled, the noise is not normally too bad. If you use high-quality power supplies and good CPU coolers (perhaps even liquid cooling), the cluster could live in your office from a noise perspective. Typical rack-mount server nodes or blades are far too noisy to have anywhere near humans.
Disadvantages:
  1. You need quite a lot of space
  2. The noise may be acceptable, but the heat may make it impractical to have inside your office anyway
  3. It easily gets to be quite untidy
  4. There's normally not that much real cost advantage. You can get very cheap commodity-based single-socket blades if you want them.
When you look at it this way, there is no great disadvantage to the clustered workstations, apart from space. However, there is a hidden factor, which is that you really want a cluster to sit in a separate well-cooled, clean, sound-insulated server room, if that is at all possible. If you build a cluster out of rack-mounted nodes or blades, there is no temptation to put it in your office.

Anna Tian July 18, 2014 03:52

Quote:

Originally Posted by CapSizer (Post 501976)
You can certainly do that; I think most of the guys on this forum who have been around for a while have done it. First, your advantages:

  1. It can be a bit cheaper
  2. You can do it with off-the-shelf commodity hardware
  3. Because the individual systems are larger and reasonably quietly cooled, the noise is not normally too bad. If you use high-quality power supplies and good CPU coolers (perhaps even liquid cooling), the cluster could live in your office from a noise perspective. Typical rack-mount server nodes or blades are far too noisy to have anywhere near humans.
Disadvantages:
  1. You need quite a lot of space
  2. The noise may be acceptable, but the heat may make it impractical to have inside your office anyway
  3. It easily gets to be quite untidy
  4. There's normally not that much real cost advantage. You can get very cheap commodity-based single-socket blades if you want them.
When you look at it this way, there is no great disadvantage to the clustered workstations, apart from space. However, there is a hidden factor, which is that you really want a cluster to sit in a separate well-cooled, clean, sound-insulated server room, if that is at all possible. If you build a cluster out of rack-mounted nodes or blades, there is no temptation to put it in your office.

The cluster will be placed in my office; we don't have a separate room for it. So I do care about the noise it will make. The space should be enough. My office can reach 40 °C during the summer, as the air conditioner is turned off during the night. So I will install a liquid cooling CPU cooler for each 130 W CPU. In my case, should I go for a workstation cluster or a rack-mounted cluster?

By the way, I'm wondering: will a closed-loop liquid CPU cooler always be more efficient than an air cooler? I don't know much about liquid coolers. I just imagine that if the CPU runs for a long time, the coolant temperature will also become very high, and it will then be hard to dissipate the heat out of the water as well.
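[Editor's note: the coolant-runaway worry above can be answered with a simple steady-state model. A closed loop does not accumulate heat indefinitely; the coolant warms only until the radiator rejects exactly the CPU's power. A minimal sketch, where the radiator conductance UA is an illustrative assumption for a typical 240 mm radiator, not a datasheet figure:]

```python
# Steady-state coolant temperature of a closed liquid-cooling loop.
# The loop equilibrates when radiator heat rejection equals CPU power:
#   P = UA * (T_coolant - T_room)  =>  T_coolant = T_room + P / UA
# UA (overall radiator conductance) is an illustrative guess, not a spec.
P = 130.0       # CPU heat load, W (the 130 W CPU from the thread)
UA = 15.0       # radiator conductance, W/K (assumed)
T_room = 40.0   # worst-case office temperature, deg C (from the thread)

T_coolant = T_room + P / UA
print(f"Steady-state coolant temperature: {T_coolant:.1f} C")
```

The coolant settles a fixed offset (P / UA, here under 10 K) above room temperature rather than climbing without limit; the catch is that the offset is added to whatever the room temperature already is.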

CapSizer July 18, 2014 04:04

There are two good reasons for using liquid coolers. One is that they can extract a lot of heat very effectively, so you can keep the CPU nice and cool despite heavy use. The second is that, because they use nice big fans to blow air through the heat exchangers, they are quieter. However, you are still stuck with 8 × 130 W coming from your CPUs, and probably another 8 × 150 W (or whatever) coming from the rest of each system. So no matter what type of cooler you use, you are going to be dumping more than 2 kW of heat into your office all the time. You have to find a way of dealing with that, and I don't think switching the A/C off at night is an option.
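[Editor's note: the "more than 2 kW" estimate above follows from quick arithmetic on the hedged per-node figures in the post (130 W per CPU, roughly 150 W for the rest of each system):]

```python
# Back-of-envelope heat load for the proposed 8-node cluster.
# Per-node figures (130 W CPU, ~150 W rest-of-system) are the rough
# assumptions from the post above, not measured values.
nodes = 8
cpu_watts = 130       # TDP of each CPU
other_watts = 150     # motherboard, RAM, PSU losses, fans, etc.

total_watts = nodes * (cpu_watts + other_watts)
print(f"Total heat dumped into the room: {total_watts} W "
      f"({total_watts / 1000:.2f} kW)")

# Running 24 h/day, the daily energy (all of which ends up as room heat):
kwh_per_day = total_watts / 1000 * 24
print(f"Energy per day: {kwh_per_day:.1f} kWh")
```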

Anna Tian July 18, 2014 04:21

Quote:

Originally Posted by CapSizer (Post 502104)
There are two good reasons for using liquid coolers. One is that they can extract a lot of heat very effectively, so you can keep the CPU nice and cool despite heavy use. The second is that, because they use nice big fans to blow air through the heat exchangers, they are quieter. However, you are still stuck with 8 × 130 W coming from your CPUs, and probably another 8 × 150 W (or whatever) coming from the rest of each system. So no matter what type of cooler you use, you are going to be dumping more than 2 kW of heat into your office all the time. You have to find a way of dealing with that, and I don't think switching the A/C off at night is an option.

Is there a way to avoid running the air conditioner during the night? The A/C cost is much higher than the energy cost of the CPUs and the rest of the system.

CapSizer July 18, 2014 14:32

Use forced ventilation (like an extractor fan) to suck hot air out of the room. Make provision for cooler air to enter from somewhere.
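[Editor's note: a rough sketch for sizing that extractor fan. The volumetric airflow needed to carry away a heat load P with an allowed air temperature rise ΔT follows from Q̇ = P / (ρ · c_p · ΔT). The ~2.24 kW load and the 10 K rise below are illustrative assumptions based on the figures discussed earlier in the thread:]

```python
# Rough extractor-fan sizing: volumetric airflow needed to remove a
# given heat load with a chosen exhaust-air temperature rise.
# All figures are illustrative assumptions, not measurements.
P = 2240.0     # heat load, W (8 nodes x ~280 W, from the thread)
rho = 1.2      # air density, kg/m^3 (near sea level, ~20 C)
cp = 1005.0    # specific heat of air at constant pressure, J/(kg*K)
dT = 10.0      # allowed temperature rise of the exhaust air, K

vol_flow = P / (rho * cp * dT)   # m^3/s
print(f"Required airflow: {vol_flow:.3f} m^3/s "
      f"(~{vol_flow * 3600:.0f} m^3/h, ~{vol_flow * 2118.88:.0f} CFM)")
```

Under these assumptions the fan needs to move on the order of a few hundred CFM, which is well within the range of ordinary inline duct fans; a smaller allowed ΔT raises the required flow proportionally.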


All times are GMT -4.