The growth of CFD mesh sizes over the years
Hi everyone,
I was wondering if I could find data somewhere about the growth of CFD mesh sizes over the years. I remember, for example, performing a CFD run with a mesh size of around two hundred thousand cells back in 1997, and I was impressed at the time. Any suggestions about sources of such information? Thanks
Hey,
In my opinion, mesh size follows the development of computers and supercomputers (vector, then parallel). Do you think it follows Moore's law? For example, in 2000 I was impressed by the 129^3 mesh points used for a DNS of the lid-driven cavity with spectral methods (E. Leriche, Physics of Fluids, Vol. 12, No. 6): about ten times more in three years. And in 2002 I was astonished by the Japanese team who computed a DNS on a 4096^3 grid to model turbulence in a cube. So I don't think there is a limit on mesh size, only a question of computation time, memory size, and storage.
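To put rough numbers on that growth, here is a quick back-of-the-envelope sketch in Python. The 200,000-cell figure is from the first post; note that DNS point counts and industrial cell counts are not directly comparable, so this is only illustrative of the trend, not a measured statistic:

```python
import math

def doubling_time_years(n_start, n_end, years):
    """Equivalent doubling period implied by growth from n_start to n_end."""
    return years / math.log2(n_end / n_start)

# Figures quoted in this thread.
cases = [
    ("1997 run (2e5 cells) -> 2000 DNS (129^3 points)", 200_000, 129**3, 3),
    ("2000 DNS (129^3)     -> 2002 DNS (4096^3)",       129**3, 4096**3, 2),
]
for label, n0, n1, dt in cases:
    months = doubling_time_years(n0, n1, dt) * 12
    print(f"{label}: doubling every {months:.1f} months")
```

Both implied doubling times come out well under the 18-24 months usually quoted for Moore's law, which suggests these headline DNS runs grew faster than transistor counts alone would allow (parallelism and algorithms did the rest).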
Hi Thomas,
Thanks for the reply. Indeed, what you say is right, but I am actually looking for real, typical industrial CFD mesh sizes and how they have grown since, say, the mid-90s (or even earlier if possible) up to today. Vangelis
This is all I can think of. From ntrs.nasa.gov
" 20 Plus Years of Chimera Grid Development for the Space Shuttle. STS-107, Return to Flight, End of the Program" http://hdl.handle.net/2060/20100033498 |
There are mainly two types of CFD calculations: (1) runs on regular Cartesian meshes with explicit fractional-step-type treatments (the 4096x4096x4096 case falls into that category), and (2) runs with SIMPLE-type algorithms, which you can use on any type of mesh and which is what industry mostly uses. In the Cartesian category, very large meshes are fairly common, but for SIMPLE-type algorithms my guess (I might be wrong) is that a few billion cells is the maximum people have tried. At the time of writing this message, I am running a calculation with 1.5 billion cells and intend to run a few with roughly 5 billion cells with a SIMPLE-type algorithm in the coming days. The 5 billion one might be the biggest SIMPLE-type run to date.
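To get a feel for why memory and storage become the constraint at these sizes, here is a rough sketch. The per-cell field counts below are assumptions made for illustration only (a Cartesian fractional-step solver might carry something like five double-precision arrays; an unstructured SIMPLE solver typically needs considerably more per cell for coefficients, gradients, and face data), not measurements from any particular code:

```python
def field_memory_tb(n_cells, n_fields, bytes_per_value=8):
    """Rough memory (TiB) to store n_fields double-precision values per cell."""
    return n_cells * n_fields * bytes_per_value / 1024**4

# Assumed field counts, for illustration only.
print(f"4096^3 Cartesian, 5 fields:  {field_memory_tb(4096**3, 5):.2f} TiB")
print(f"5e9-cell SIMPLE, 30 fields:  {field_memory_tb(5_000_000_000, 30):.2f} TiB")
```

Even with optimistic assumptions, both cases land in the terabyte range just for the solution fields, before counting solver working memory or the output files that then have to be post-processed.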
Thank you Martin, interesting presentation from NASA.
Arjun, are the 5 billion element simulations that you are planning for an automotive application?
At the moment, though, the biggest issue is post-processing such a large dataset; I will be busy planning for it. We tried ParaView, but it seems ParaView is not up to the challenge. I have never been a fan of ParaView, and after working with it for the last few days my dislike has only increased.
I know about other alternatives: for the same data set, FieldView takes five times less memory than ParaView. So if FieldView can do it, why not ParaView? This is why I am frustrated with ParaView. It is good for small things, but it is not good for big things unless you have unlimited resources.
1. My problem size was smaller (only up to 150 million cells).
2. I had access to a very large supercomputer for the entire CFD cycle, so computing resources were never a constraint.
Good to know that FieldView offers a better and more efficient alternative. Raashid