CFD Online Discussion Forums


mariachi April 9, 2014 07:47

How to guess/predict max mesh size that can be run on a certain hardware!
 
Hi Dear All,

I was wondering if anybody could tell me how to guess/predict the upper limit for a mesh size that can be run/simulated on a given PC?

To explain it further,

1. System 'A' is an Intel(R) Core(TM) i7-2600 CPU @ 3.40 GHz with 16.0 GB of RAM.

2. System 'B' is an Intel(R) Xeon(R) CPU E5520 @ 2.27 GHz (2 processors) with 24.0 GB of RAM.

What is the maximum mesh size that can be run on each system?


Thanks in advance!

brunoc April 9, 2014 08:51

There is no single answer for that. The maximum case size you'll be able to run will depend on many things:

- Whether the solver is single or double precision, 2D or 3D
- The number of equations being solved
- The number of phases you're using (relates to the number of equations)
- The number of interfaces in your model
- The number of processors you're using
- And so on...

What you can do is run some cases that are representative of your usual models and check the memory usage for them. Divide it by the mesh size and you'll get an estimate for the memory usage per cell ratio. Then compare this with the amount of RAM available on your system (not the amount installed, since the OS and other software will use some of that). This will give you an idea of the largest problem you'll be able to solve.
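A minimal sketch of that arithmetic (the numbers below are placeholders, not measurements; take the RAM figure from Task Manager or the solver's own memory report for a case that is representative of your models):

Code:

# Rough sizing estimate from one representative run (all values are illustrative placeholders).
measured_ram_bytes = 6.0e9      # RAM used by the representative case (placeholder)
representative_cells = 4.0e6    # cell count of that case (placeholder)

bytes_per_cell = measured_ram_bytes / representative_cells

total_ram_bytes = 24.0e9        # installed RAM on the target machine
os_overhead_bytes = 3.0e9       # rough allowance for the OS and other software
available_bytes = total_ram_bytes - os_overhead_bytes

max_cells = available_bytes / bytes_per_cell
print(f"~{bytes_per_cell:.0f} bytes/cell -> roughly {max_cells/1e6:.1f} million cells")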

Cheers

ghost82 April 9, 2014 10:37

As Bruno said, the limit on the cases you can run on a PC depends on the amount of RAM; with that much RAM you can load meshes with several million cells. But that only means you can load the case: the time to reach a solution depends on your CPU(s) and on your hardware in general.

mariachi April 10, 2014 02:26

Thanks Bruno!

I got your point. One confusion though, as Daniele said,
"limit to run cases on a pc depends on the amount of ram; with that amount of ram you can load meshes with several milion cells"

So what I'm getting out of this statement is that the RAM's job is only to load the case/mesh (and hold it for the subsequent calculations). All the other calculation-related activities, such as those mentioned by Bruno in his post, are processor-based, so they only affect the simulation convergence time and that's all...

Daniele, thanks to you as well, and same question! :)

(The background to all this discussion is that I have a mesh with 25 million cells. When I try to load it on a system with 24.0 GB of RAM, it loads just fine, no issues with that. But during iterations it takes a very long time per iteration, which hints that maybe the mesh is bigger than what the available RAM can hold, so the system takes additional memory from the hard disk, which induces a lot of delay per iteration.

Hence the question:

What mesh size should I run on THIS system so as to stay within the limits of the RAM and at the same time get the most out of the available computational power?) :confused:

A rough guess I would make is:

'X' GB of RAM = 'X' million cells
i.e.
a system with 24.0 GB of RAM should be able to handle a 24 million cell mesh!
(comments on this please)

Thanks...

ghost82 April 10, 2014 02:44

If you are on Windows you can right-click on the taskbar, open Task Manager, and see how much RAM is being used by the system.
25 million cells is a lot, and even with a lot of RAM that system will be very slow.
Daniele

brunoc April 10, 2014 13:09

Yes, you're correct. The amount of RAM determines the size of the problem you'll be able to run; the number and speed of the processors determine how fast it'll run.

I don't know exactly how much RAM you'll need to run a 25 million cell problem, but I am positive that 24 GB is not enough. So one reason why each iteration is taking so long is that your computer is probably swapping/paging. Since it has to constantly read/write data from disk, you're limited by your disk's R/W speed, which can be orders of magnitude slower than RAM R/W speed.

You can either:
1. Use a smaller mesh
2. Use a computer with more RAM
3. Run a distributed parallel job

For the 3rd option, you'll use the RAM from both computers and offload some of the work to other CPUs, so your simulation will run a lot faster. Check the FLUENT documentation to see how to do that.
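As a rough check on whether a distributed run will fit your case in memory, you can extend the per-cell estimate sketched earlier across several machines (again with placeholder numbers; the bytes-per-cell figure has to come from your own measurement of a representative case):

Code:

import math

# Placeholder inputs: replace with your own measured values.
bytes_per_cell = 1700.0          # from a representative run (assumed figure, not measured here)
target_cells = 25.0e6            # the 25 million cell mesh
ram_per_node_bytes = 24.0e9      # installed RAM per machine
os_overhead_bytes = 3.0e9        # rough allowance per machine for OS + other software

usable_per_node = ram_per_node_bytes - os_overhead_bytes
nodes_needed = math.ceil(target_cells * bytes_per_cell / usable_per_node)
print(f"Roughly {nodes_needed} machine(s) of this size to hold the case in RAM")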

mariachi April 11, 2014 02:04

Thanks Bruno and Daniele, for your time!

mariachi May 6, 2014 09:33

An update to the topic under discussion...
 
OK, so I was able to run a 25 million cell problem across two machines whose combined RAM is 24 + 24 GB. The taskbar indicated a RAM usage of 21 + 21 GB.

My problem is 3D, double precision, with a two-equation turbulence model, and static, i.e. no moving or rotating parts involved.

So this gives us another benchmark for the relationship between RAM usage and mesh size :)
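For reference, the reported numbers work out to roughly the following per-cell figure (taking the 21 + 21 GB Task Manager reading at face value, which slightly overstates the solver's own usage since it includes the OS, as discussed below):

Code:

# Back-of-the-envelope figure from the reported run (OS usage included, so the
# true solver-only number is somewhat lower).
reported_ram_gb = 21.0 + 21.0    # Task Manager readings on the two machines
cells = 25.0e6

gb_per_million_cells = reported_ram_gb / (cells / 1.0e6)
print(f"~{gb_per_million_cells:.2f} GB per million cells for this 3D double-precision case")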

brunoc May 6, 2014 12:47

Do you know if the RAM used by the OS is included in this? My guess is that it is, so the actual RAM usage should be a bit smaller.

mariachi May 7, 2014 03:40

Yes, it should be included, because these numbers are what I see once I open Task Manager, click the Performance tab, and look at the memory usage shown as green vertical bars. So I bet that's the total memory usage, everything included. I'll try to post a snapshot of the Task Manager usage later for clarity...

