Rendering on GPU?
Hello everybody,
Is it possible to render on the GPU instead of on the CPU? Or is ParaView able to render with the GPU at all? In my case, all rendering is done on a single CPU core... is that normal? The problem shows up on big cases (>> 4,000,000 cells). Or is my ParaView just set up wrong? Thanks in advance for your reply. Tobi |
Hi Tobi,
ParaView does use the GPU if the drivers are properly installed and if ParaView is correctly using the system's OpenGL... which depends on the graphics card drivers. But keep in mind that the GPU is used for rendering, not for data crunching. If you want to speed up data crunching, try switching on the "Multi-core" option: http://www.paraview.org/Wiki/ParaVie...Guide/Settings As for which version of ParaView to use, try in this order:
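A quick way to check whether ParaView will pick up the real GPU driver or a software fallback is to look at the OpenGL renderer string before launching it. A minimal sketch, assuming the `glxinfo` tool (from the mesa-utils package on Debian/Ubuntu; not part of ParaView itself) is available:

```shell
# Print the OpenGL renderer the system reports. With a working Nvidia
# driver you should see something like "GeForce GTX 560/PCIe/SSE2";
# "llvmpipe" or "softpipe" means Mesa is rendering on the CPU.
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo | grep -i "opengl renderer"
else
    echo "glxinfo not found; install mesa-utils (Debian/Ubuntu) to query OpenGL"
fi
```

If the renderer string names your graphics card, ParaView's interactive rendering should already be hardware-accelerated; the single busy CPU core you see on large cases is more likely data processing (filters, readers) than rendering.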
Bruno PS: and yes, I did copy-paste the text above from the PM I sent you a few minutes ago ;) |
Hi Bruno,
I installed the Nvidia graphics card driver for my 560 GTX, so do I also need Mesa 3D to compile ParaView? |
Hi Tobi,
Mesa is used for performing GPU operations on the CPU, so you only need it if you want to perform graphics operations off-screen on a machine that doesn't have a decent graphics card, or if you want to compare the performance of the two approaches, namely Mesa+CPU vs. a real GPU. Best regards, Bruno |
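For such a comparison on a single machine, one common trick is Mesa's `LIBGL_ALWAYS_SOFTWARE` environment variable, which forces the software rasterizer. A hedged sketch (this only takes effect when the libGL in use is Mesa's; the proprietary Nvidia driver ships its own libGL and ignores the variable):

```shell
# Force Mesa's software rasterizer for the commands that follow,
# so the same scene can be timed with CPU-only rendering.
export LIBGL_ALWAYS_SOFTWARE=1

# Then start ParaView as usual and compare interactive frame rates:
#   paraview
echo "LIBGL_ALWAYS_SOFTWARE=$LIBGL_ALWAYS_SOFTWARE"
```

Running ParaView once with and once without the variable set gives a rough feel for how much the real GPU is actually contributing.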
Thanks |
Dear Tobi,
I am experiencing exactly the same problem: very slow rendering for a big mesh in ParaView. I have tried a lot of binaries and I always see that the GPU is not used. My problem is time-dependent (300 time steps) and the mesh is not even that big (around 1 million nodes); it just takes so long to go from one time step to the next that it cannot be used for interactive visualization. Have you found a solution to this problem? Is there one? Best regards, Rodrigo |