60 million cell case loading in OpenFOAM
I have to solve a case with 60 million cells. I have generated a .cas file and am loading it into OpenFOAM using fluentMeshToFoam. While loading the case, the RAM consumption is too high; I could not load it with 32 GB of RAM. I would like to know how to tackle this issue, and whether there is a specific strategy I should adopt, e.g. a memory optimization parameter, a sparse matrix storage parameter, or anything else.
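For reference, my conversion step looks roughly like the sketch below (the case path and file name are placeholders, not my actual ones). I have also seen decomposePar suggested for large cases, though as I understand it the fluentMeshToFoam conversion itself still runs in serial:

```shell
# Placeholder paths/names; fluentMeshToFoam converts the Fluent mesh in serial,
# which is where the memory consumption spikes for me.
cd /path/to/case
fluentMeshToFoam mesh.cas

# After conversion, the mesh can be decomposed for a parallel solver run
# (requires a system/decomposeParDict in the case).
decomposePar
```

Is there a way to reduce the memory footprint of the conversion step itself, or is a machine with more RAM the only option for the serial conversion?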
I searched through the forum but could not find much information other than the post-processing thread on the billion-cell problem by Dr Hrvoje Jasak.
What strategy should I follow for the solver and data handling? I am interested in running LES and want to study the effect of grid density on the energy spectrum for jet mixing.
What is the maximum problem size that has been solved using OpenFOAM?
Thanks and regards