
No one talking about FluidX3D???

November 25, 2022, 02:52   #41
Senior Member

Joern Beilke
Join Date: Mar 2009
Location: Dresden
Posts: 501
Quote:
Originally Posted by ProjectPhysX View Post
But unlike other software, it has the option to render directly in VRAM, which allows for very fast previews of simulations, and at lower resolutions even interactive ones.

Post-processing on the fly is not really new. StarCCM+ has been able to do this from the very beginning, and I'm sure other codes can do it as well, because it is the only way to post-process larger transient cases efficiently.

November 25, 2022, 13:03   #42
Senior Member

Sayan Bhattacharjee
Join Date: Mar 2020
Posts: 495
Quote:
Originally Posted by ProjectPhysX View Post
I think you misunderstood that. FluidX3D can of course export the data and do IO like any other software.

But unlike other software, it has the option to render directly in VRAM, which allows for very fast previews of simulations, and at lower resolutions even interactive ones.

In some of the YouTube videos, the grid resolution is so large (10 billion cells) that a single frame of the velocity field alone is 120 GB, and every ~40 seconds a new frame (180 LBM time steps) is computed and ready for rendering/export. You can imagine how fast that would fill all available hard drives; this is simply not feasible without built-in rendering. Other software could not even handle such cases.
Oh okay, I understand it now.

So in large cases it simulates once and throws the data away.

I would still prefer to be able to store some of the data somehow, maybe by averaging the data from the high-resolution mesh onto a low-resolution mesh, or by doing something as simple as collecting surface pressure data, or data across a certain slice of the volume.

IMHO we need to be able to review the data many times to make any sense of it. Since your team is targeting such large-scale simulations, it would be helpful to your users if they could save the data they want.


Maybe saving 120 GB of data per frame is not possible, but saving 20 GB per frame is. If I could extract data from slices of the volume and store it on a low-resolution mesh, I would do it.
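
As a rough illustration of what I mean (just a sketch in plain C++, not FluidX3D code; the Vec3 struct, the flat x-y-z indexing and the coarsening factor are made up for the example), block-averaging the velocity field onto a coarser grid before writing it out could look like this:

Code:
#include <vector>
#include <cstddef>

struct Vec3 { float x, y, z; };

// Average factor^3 fine cells into one coarse cell.
// Any remainder cells (if factor does not divide the grid size) are dropped.
std::vector<Vec3> downsample(const std::vector<Vec3>& u,
                             std::size_t nx, std::size_t ny, std::size_t nz,
                             std::size_t factor) {
    const std::size_t cx = nx / factor, cy = ny / factor, cz = nz / factor;
    std::vector<Vec3> coarse(cx * cy * cz, Vec3{0.0f, 0.0f, 0.0f});
    const float w = 1.0f / float(factor * factor * factor);
    for (std::size_t z = 0; z < cz * factor; ++z)
        for (std::size_t y = 0; y < cy * factor; ++y)
            for (std::size_t x = 0; x < cx * factor; ++x) {
                const Vec3& f = u[(z * ny + y) * nx + x];                             // fine cell
                Vec3& c = coarse[((z / factor) * cy + y / factor) * cx + x / factor]; // its coarse cell
                c.x += w * f.x; c.y += w * f.y; c.z += w * f.z;                       // accumulate average
            }
    return coarse;
}
With a coarsening factor of 2 this already cuts a 120 GB frame down to 15 GB, and a factor of 4 brings it below 2 GB.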

The rendering capability that you're providing can obviously be useful.

Though to be honest, if someone has the capacity to run such large simulations with 10 billion cells, they also have enough money to buy plenty of hard drives for storing the data.

November 28, 2022, 07:43   #43
Senior Member

Arjun
Join Date: Mar 2009
Location: Nurenberg, Germany
Posts: 1,278
As I mentioned earlier, we have run this type of calculation with up to 3 to 4 billion cells, and this is what we did:

1. One does not need to save the whole data set. We were interested in probe line plots and in vector and contour plots on the mid-plane, so we exported only those (the same options are available in Wildkatze too). The final data saved was therefore quite lightweight.

2. As far as iso-surfaces of the Q criterion go, those were generated by my C++ code. Rather than saving the whole data set, we just saved the iso-surfaces, that is, a set of triangles (a rough sketch of the per-cell Q computation is given after this list). Since the final animation was also generated by my C++ code (no ParaView etc. was used), we could have generated it in the solver too if we had wanted. We exported instead because, after the simulation was done, we could orient the view any way we wished; had we rendered during the simulation, we would have lost that freedom.

3. My C++ code did not allocate any memory and generated the animation while just reading the solver-exported files, so it hardly mattered whether the simulation had 4 billion or 40 billion cells.

4. We did save the restart files so that simulations could be resumed.

5. The company could afford the hard disks, and this is how we transferred the data: when the simulations were run on the K supercomputer, we just drove there, since our office was a 10-minute drive away; when they were run in Tokyo, they simply posted us the hard disks. We avoided downloading the files.
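
For reference, here is a minimal sketch of the per-cell Q value behind point 2 (plain C++, not my actual code; the velocity-gradient tensor g[i][j] = du_i/dx_j is assumed to come from, e.g., central differences on the exported field). Cells above a chosen threshold would then be passed to a marching-cubes step, and only the resulting triangles get written to disk:

Code:
// Q-criterion of one cell: Q = 0.5 * (||Omega||^2 - ||S||^2),
// with S the symmetric (strain-rate) and Omega the antisymmetric
// (rotation-rate) part of the velocity gradient g[i][j] = du_i/dx_j.
double q_criterion(const double g[3][3]) {
    double s2 = 0.0, o2 = 0.0;                            // squared Frobenius norms
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            const double S = 0.5 * (g[i][j] + g[j][i]);   // strain-rate component
            const double O = 0.5 * (g[i][j] - g[j][i]);   // rotation-rate component
            s2 += S * S;
            o2 += O * O;
        }
    return 0.5 * (o2 - s2);
}
Storing only the triangles of one or two iso-surfaces per frame is orders of magnitude smaller than storing the full 3D field.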



Quote:
Originally Posted by aerosayan View Post
Oh okay, I understand it now.

So in large cases it simulates once and throws the data away.

I would still prefer to be able to store some of the data somehow, maybe by averaging the data from the high-resolution mesh onto a low-resolution mesh, or by doing something as simple as collecting surface pressure data, or data across a certain slice of the volume.

IMHO we need to be able to review the data many times to make any sense of it. Since your team is targeting such large-scale simulations, it would be helpful to your users if they could save the data they want.


Maybe saving 120 GB of data per frame is not possible, but saving 20 GB per frame is. If I could extract data from slices of the volume and store it on a low-resolution mesh, I would do it.

The rendering capability that you're providing can obviously be useful.

Though to be honest, if someone has the capacity to run such large simulations with 10 billion cells, they also have enough money to buy plenty of hard drives for storing the data.

December 2, 2022, 02:19   #44
Senior Member

Arjun
Join Date: Mar 2009
Location: Nurenberg, Germany
Posts: 1,278
Now that I have a little bit of time, let me point out three areas where lattice Boltzmann methods have advantages over traditional methods. This can be useful because most of these points do not readily come to mind and are not obvious.

1. Acoustics: the traditional SIMPLE method suffers from a noise-reflection problem when the mesh is coarsened. The origin of this noise can be traced to the Rhie-Chow dissipation. LBM does not use this type of dissipation and thus does not suffer from this issue (there can still be some noise in LBM, but it should not be as severe as with SIMPLE).

2. Decoupling at very small time-step sizes: the Rhie-Chow terms vanish as the time-step size goes to zero, and pressure and velocity decouple (see the sketch after this list). Some applications, such as combustion, require very small time-step sizes, and this causes pressure and velocity to decouple. LBM, on the other hand, thrives at small time-step sizes; that is where it works best.

3. Flows at higher Knudsen numbers, where Navier-Stokes becomes invalid. LBM methods are easy to extend into that realm.
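
To make point 2 a little more concrete, here is the usual collocated-grid argument, written as a LaTeX sketch (one common form of the Rhie-Chow interpolation; the exact coefficients vary between codes):

Code:
% Rhie-Chow face velocity on a collocated grid (one common form):
\[
  u_f = \bar{u}_f - d_f\left[(\nabla p)_f - \overline{(\nabla p)}_f\right],
  \qquad d_f = \overline{\left(\frac{V_P}{a_P}\right)}_f .
\]
% With an implicit transient term, a_P contains \rho V_P / \Delta t, so
\[
  \Delta t \to 0 \;\Rightarrow\; a_P \to \infty \;\Rightarrow\; d_f \to 0 ,
\]
% i.e. the pressure-smoothing term vanishes and pressure and velocity on the
% collocated grid decouple (checkerboarding).
In standard LBM there is no such term to begin with, since pressure follows directly from density through the equation of state (p = c_s^2 * rho in the common athermal models), so nothing degenerates as the time step shrinks.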

December 2, 2022, 02:57   #45
Super Moderator

Alex (flotus1)
Join Date: Jun 2012
Location: Germany
Posts: 3,400
Quote:
Originally Posted by arjun View Post
3. Flows at higher Knudsen numbers, where Navier-Stokes becomes invalid. LBM methods are easy to extend into that realm.
Yes, LB can be extended to higher flow regimes than Navier-Stokes. But I respectfully disagree that doing so could be called "easy".

December 2, 2022, 04:29   #46
Senior Member

Arjun
Join Date: Mar 2009
Location: Nurenberg, Germany
Posts: 1,278
Quote:
Originally Posted by flotus1 View Post
Yes, LB can be extended to higher flow regimes than Navier-Stokes. But I respectfully disagree that doing so could be called "easy".
When I say easy here, I mean that it comes more naturally in this framework. As far as numerical difficulty goes, I am not sure, because that depends on the implementation.

May 15, 2023, 05:40   #47
New Member

Luis VL
Join Date: Jul 2022
Location: Toulouse
Posts: 18
I'm testing this project and I'm still not sure whether we are able to export some of the data or not. Is it possible to export, say, the velocity over a line probe inside the domain for all calculated time steps?
