TR/Epyc vs Xeon Scalable vs Xeon W vs i9 Ansys CFD workstation
Hello,
I'm trying to determine the best CPU for CFD in Ansys Fluent. The simulations this build is intended for are 2D, with up to half a million cells. The system needs to run these simulations very fast and continuously. The budget is up to $5500. If you can, please attach a benchmark that supports your claim. |
What kind of benchmark would you accept as proof?
In my opinion, you would need to provide more details about the type of simulation you want to run if you need benchmark-level certainty that a given setup is the best for your cases. But I doubt anyone would go out of their way to re-create those benchmarks for you. What you could do instead, in addition to adding more information about your simulation setup, is run a strong scaling analysis on your current hardware, list that hardware's specs, and tell us how many cores you can use in parallel given your license constraints. From that data, it would be possible to extrapolate the best hardware setup with a low margin of error. |
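A strong scaling analysis just means running the same case at increasing core counts and comparing wall-clock times. A minimal Python sketch of the bookkeeping, using made-up timings (replace them with your own Fluent wall-clock numbers):

```python
# Hypothetical strong-scaling data: wall-clock seconds for the SAME case
# run on 1, 2, 4, and 8 cores. These numbers are illustrative only.
timings = {1: 1000.0, 2: 520.0, 4: 280.0, 8: 165.0}

t1 = timings[1]  # single-core baseline
for cores in sorted(timings):
    speedup = t1 / timings[cores]          # how much faster than 1 core
    efficiency = speedup / cores           # 1.0 would be perfect scaling
    print(f"{cores:2d} cores: speedup {speedup:5.2f}, efficiency {efficiency:6.1%}")
```

Where the efficiency column starts dropping steeply is where your current platform stops scaling, which is exactly the information needed to judge how many cores per simulation are worth buying.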
Hello,
thank you for replying. I would accept any comparative benchmark of a 2D simulation with half a million cells or more, but not more than a few million cells, using the transient solver. It would also be fine to give me your recommendation without a benchmark. Edit: I can use up to 16 cores for one simulation, but I would like to run several simulations at once. How can you extrapolate from 4 cores/8 threads up to 32 cores/64 threads, across different CPU technologies? |
If the goal here was to extrapolate absolute performance numbers, your doubts would be well justified. But picking the platform that will deliver best performance is somewhat easier. And if you can run several of these simulations at once with lower thread counts, it gets even easier, because strong scaling becomes less of an issue. |
I'm using a student Ansys license, which is limited to 16 cores
but isn't limited in the number of simulations. I'm planning to run two simulations at once. Never mind the benchmark; what is your recommendation for a system based on everything I have told you so far? |
Given your budget of $5500, the best way to spend it is on a system with two AMD Epyc 7302 CPUs, accompanied by the cheapest DDR4-3200 registered ECC memory that fills all 16 DIMM slots, most likely 16x8GB. You don't need anything close to this much RAM, but there are no 4GB modules at that frequency, at least none that I am aware of.
And an NVMe SSD for the working directory would probably be a good idea when running several small transient simulations at once. |
Why do I need to fill all the RAM slots?
|
Because of memory channels and bandwidth. Each Epyc CPU has 8 memory channels, so with two of them you need 16 DIMMs (one per channel) for maximum memory bandwidth.
|
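The channel arithmetic behind that advice is straightforward. A quick sketch of the theoretical peak memory bandwidth of the recommended dual-socket Epyc Rome setup (real-world sustained bandwidth will be lower):

```python
# Theoretical peak memory bandwidth for a dual-socket Epyc Rome system
# with all channels populated with DDR4-3200.
sockets = 2
channels_per_socket = 8
transfer_rate_mt_s = 3200   # DDR4-3200: 3200 mega-transfers/second
bytes_per_transfer = 8      # 64-bit-wide channel

bw_gb_s = sockets * channels_per_socket * transfer_rate_mt_s * bytes_per_transfer / 1000
print(f"Peak bandwidth: {bw_gb_s:.1f} GB/s")  # 409.6 GB/s
```

Leaving channels empty cuts this number proportionally, which is why filling all 16 slots matters for a bandwidth-bound workload like CFD even when the total capacity is overkill.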
thank you very much!
|
I have read an interesting post regarding the memory speed: https://forums.servethehome.com/inde...5/#post-261881
I don't know if this is true, but certainly makes a lot of sense to me. Just throwing it out there. |
You have to take statements like these with a grain of salt, especially when they are made on STH. The people over there are server guys, and they often don't know or don't care how applications other than their own perform, yet they are still eager to make generalizations. "Tellerrand" for our German readers ;) (a German idiom for not seeing past the rim of one's own plate)
I haven't come across a single real-world benchmark showing the performance impact of DDR4-3200 vs. DDR4-2933 on Epyc Rome. For CFD applications, my money is still on higher bandwidth rather than lower latency. The benefit of buying DDR4-3200 is that it doesn't cost much more than 2933, and you can still simply try whether your application performs better at DDR4-2933 speeds. The other way round doesn't always work: memory overclocking on AMD Epyc, even within the specs of the CPU, is not the most reliable thing in the world. |
Isn't the ANSYS student license limited to 512K cells?
|
Yes, it is. OP mentioned half a million cells in his requirements.
|