CFD Online Discussion Forums

CFD Online Discussion Forums (http://www.cfd-online.com/Forums/)
-   Hardware (http://www.cfd-online.com/Forums/hardware/)
-   -   Hardware specifications for CFD simulations! (http://www.cfd-online.com/Forums/hardware/93297-hardware-specifications-cfd-simulations.html)

parisa- October 11, 2011 05:01

Hardware specifications for CFD simulations!
 
Hello all,

My friend is doing a PhD and needs to order a new computer for his CFD simulations, and he wants to know the details of the system specifications. How can he figure out what the most efficient system is? Since the computer is ordered by the department, I think the price is not very restrictive!

The simulations are mostly concentrated on flow and heat transfer. The mesh consists of around 2 million cells, but higher resolution might be needed in further simulation steps. Several UDFs and UDSs are also used. The simulation program is ANSYS FLUENT (ANSYS Workbench).

I would appreciate it if someone could give me detailed information about the number and type of CPUs and cores, RAM, graphics card, etc., and whatever else he should think of. :confused:
Thanks a lot!

/Parisa

abdul099 October 11, 2011 17:54

For a case as small as 2M cells, you don't need a huge amount of memory. Anything above 8GB will be sufficient for sure, even if the cell count doubles. Anyway, it's comfortable to have more memory; it gives you the option to open more than one simulation at a time, even when running a much bigger case.
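The "8GB is plenty for 2M cells" claim can be put into numbers with a back-of-the-envelope sketch. The figure of roughly 2 GB per million cells and the 1.5x safety factor are assumptions for illustration, not FLUENT specifics; actual usage depends heavily on the physics models, UDFs/UDSs, and solver settings:

```python
# Rough solver memory estimate, assuming (hypothetically) ~2 GB of RAM
# per million cells. Real usage varies with models and solver settings.

GB_PER_MILLION_CELLS = 2.0

def ram_needed_gb(n_cells, safety_factor=1.5):
    """Estimated RAM in GB, with headroom for the OS and pre/post-processing."""
    return n_cells / 1e6 * GB_PER_MILLION_CELLS * safety_factor

print(ram_needed_gb(2e6))  # 2M cells  -> 6.0 GB, so 8GB is comfortable
print(ram_needed_gb(4e6))  # doubled mesh -> 12.0 GB
```

Under these assumptions, even a doubled mesh stays within a modest workstation's memory, which matches the advice above.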

When you need to solve the case on this machine, look for a dual-CPU machine; a total of 8 or 12 cores should be fine. Intel CPUs perform much better than the ones from AMD.

Also, don't forget the hard disk; CFD usually produces huge amounts of data. I don't think a huge hard disk is necessary, as external hard disks are available for nearly no money. But it's annoying to wait ages until a file is saved or read, so your friend should look for a fast one.

All together, your questions are easy to answer: the more and the faster, the better.

parisa- October 12, 2011 05:38

Thank you for replying, Abdul. I have been talking to him, and I realized that the mesh size will increase to 40M later in the calculations! I wonder how the required computational power (especially the number of cores) changes with the mesh size and the models used; is there a rule of thumb for that? I think there must be an optimal point, and the relation between computer power and demand does not change linearly!
Can you help me with more information?

Thanks!

LuckyTran October 15, 2011 19:50

If he wants to do 40M cells, then there are very few options. Basically, he will need all the RAM he can get.

A personal desktop computer is unlikely to support enough RAM for this task. Consider a workstation with dual Intel Xeon CPUs (quad-core, or preferably hex-core) on triple-channel DDR3 memory, and get the fastest RAM you can find (bandwidth will be limited by overall capacity).

Otherwise, start looking for a high performance compute cluster.

Once you have purchased a computer, your computational power is fixed. Regardless of how your mesh/problem changes, there is no way to change your computational power. Don't worry about optimizing the configuration; there isn't an optimum configuration. You need all you can get/whatever you can afford.

parisa- October 16, 2011 05:11

Thanks a lot for the help!

abdul099 October 20, 2011 23:16

Well, I don't think your friend needs to buy all the RAM he can get. You can easily get a workstation with up to 192GB of RAM (or even more), which would heavily overshoot his requirements.

For a 40M cell case I would recommend a workstation with at least 48GB, or better 64GB of RAM. For a 40M cell case, that's enough for sure. As LuckyTran said, a PC will narrowly miss the target; the current maximum RAM for PCs is, I think, 32GB, which is not enough in this case.
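Those numbers can be sanity-checked with a per-cell estimate. The ~1.5 kB-per-cell figure below is an assumption chosen for illustration, not a FLUENT specification; the point is only that 40M cells lands between a 32GB PC and a 64GB workstation:

```python
# Check whether an estimated solver footprint fits in a machine's RAM.
# Assumption (hypothetical): each cell costs ~1.5 kB of solver memory.

KB_PER_CELL = 1.5

def fits_in_ram(n_cells, ram_gb):
    """True if the estimated solver footprint fits in ram_gb of memory."""
    needed_gb = n_cells * KB_PER_CELL / 1e6  # kB -> GB
    return needed_gb <= ram_gb

print(fits_in_ram(40e6, 32))  # 40M cells in a 32GB PC: False
print(fits_in_ram(40e6, 64))  # 40M cells in a 64GB workstation: True
```

Under this assumption the 40M case needs about 60GB, which is consistent with the 48 to 64GB recommendation above.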

Your friend will need at least one machine with this configuration for meshing, as long as no parallel meshing is available in FLUENT (I don't use it, so I'm not aware of the available options).

But to solve a 40M cell case, he will need more than just a 12-core workstation. He will need either a small cluster or a long, long time to wait, plus a good shaver to get rid of the long beard that grows while he is waiting for results :D
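This also answers the earlier question about how core count scales with mesh size: a common sizing heuristic is a fixed number of cells per core, below which communication overhead starts to dominate. The 100,000 cells-per-core figure here is a hypothetical round number for illustration; the real sweet spot depends on the solver and the interconnect:

```python
# Sketch of the cells-per-core rule of thumb for sizing a parallel run.
# Assumption (hypothetical): ~100,000 cells per core keeps communication
# overhead reasonable; the true optimum depends on solver and interconnect.
import math

CELLS_PER_CORE = 100_000

def cores_needed(n_cells):
    """Core count suggested by the cells-per-core rule of thumb."""
    return math.ceil(n_cells / CELLS_PER_CORE)

print(cores_needed(2_000_000))   # 20 cores: a dual-socket workstation
print(cores_needed(40_000_000))  # 400 cores: cluster territory
```

Under this rule the demand scales linearly with cell count until the per-core partition gets too small, which is why a 2M case fits a workstation while a 40M case calls for a cluster.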

An option would be to buy some small workstations or even PCs and build a small Beowulf cluster using off-the-shelf hardware. For sure that will be cheaper and more flexible than buying the high-performance cluster mentioned by LuckyTran. The disadvantage is that it will also be slower, due to the slower interconnect. The main advantages, besides the price, are the option to upgrade the system by adding an additional machine, and the option to use a machine for something else.

But it's all up to your friend's budget.
If I had 100 thousand bucks left, I would buy a cluster, designed as a cluster, using cluster hardware. If I had only 10 thousand bucks left, I would buy some PCs, not even expensive workstations. (Both in addition to the one workstation with enough memory.)

parisa- October 24, 2011 04:08

Meshing!
 
Thanks for the advice. I found out that there are really good cluster computers available at the university for the simulations, so this computer will mainly be used for meshing. Do you know what is important for a good meshing computer?!

Thanks,
Parisa

abdul099 October 24, 2011 22:57

For a meshing computer, I would say enough memory and a fast CPU.

Memory is important because the process will slow down a lot when the machine starts swapping. But there will be no further speedup once memory is no longer an issue, so it doesn't make sense to buy a ridiculous amount of memory. As mentioned before, for a 40M cell case 48GB should be fine, but 64GB would provide some safety margin in case the mesh gets a little bigger than expected.

A fast CPU is also helpful, as meshing is a serial process in most software. So for meshing it would be enough to purchase a single-socket machine. But if money is no issue, I would go for a dual-socket machine anyway. In my experience, one often wants to solve a test case with only a few million cells, or run a bigger case for a few iterations before putting it on the cluster, and that's much more comfortable with twice the CPU cores available.
