Graphics card for Paraview

Post #1, June 3, 2016, 08:31
flotus1 (Super Moderator; Alex, Germany, member since Jun 2012)
I am currently finalizing a new workstation and wanted to double-check one specific topic: what is the best GPU for ParaView?

In my opinion, since ParaView only uses standard OpenGL instructions in single precision (?), "professional" graphics cards like Quadro and FirePro gain nothing from the driver optimizations that make them superior in some other professional applications. Especially not since I am running Linux.
To be more precise, the Quadro M4000 8GB currently costs around 800€ and delivers a raw performance of 2572 GFLOPS (single precision) and 107 GFLOPS (double precision).
The new GTX 1080, which also comes with 8GB of VRAM, delivers roughly three times the single-precision GFLOPS, has more shading units and faster memory... for a lower price. Even its double-precision performance is much higher.

Am I missing something, or is there really no reason to use a Quadro/FirePro graphics card if ParaView is the only program used for visualization?
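As a rough sanity check of the price/performance argument, here is a small comparison sketch. The Quadro M4000 figures are the ones quoted above; the GTX 1080 price and FP32 throughput are my own assumptions (circa mid-2016), not from the post.

```python
# Rough single-precision GFLOPS-per-euro comparison.
# M4000 figures are from the post; the GTX 1080 values are assumptions.
cards = {
    "Quadro M4000": {"price_eur": 800, "gflops_sp": 2572},
    "GTX 1080": {"price_eur": 700, "gflops_sp": 8228},  # assumed values
}

for name, spec in cards.items():
    ratio = spec["gflops_sp"] / spec["price_eur"]
    print(f"{name}: {ratio:.2f} GFLOPS per euro")
```

Even with generous error bars on the assumed numbers, the consumer card comes out several times ahead on this metric.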

Post #2, June 10, 2016, 04:07
mdgowhar (Mohammed Gowhar, member since Feb 2014)
One word: reliability. Rendering with Iray for Autodesk Inventor (or similar) drives the card very hard, especially over extended periods. Quadros are designed to withstand that kind of usage; consumer cards aren't.
It really depends on what you need to do with your card. The 1080 is a great card at a good price, but if you need reliability and intend to render for days on end, then a Quadro may well be the better bet in the long run.

Post #3, June 10, 2016, 04:54
flotus1 (Super Moderator)
Thanks for sharing your opinion. But invoking "reliability" sounds like the usual marketing Nvidia and Intel use to advertise their professional product lines. I can say with absolute certainty that this argument is invalid for CPUs, and I highly doubt it holds true for GPUs, especially since almost every consumer partner graphics card has a better cooling system than the corresponding Quadro card.
Were there ever any studies to support the claim about reliability, or in case of consumer cards, the lack thereof?
I don't think folding@home or bitcoin mining a few years ago would have been a thing if consumer cards could not handle the workload.

Post #4, June 10, 2016, 05:13
JBeilke (Joern Beilke, Dresden, member since Mar 2009)
Are you sure that ParaView performance depends much on the GPU? It might be CPU-bound.

Post #5, June 10, 2016, 05:36
flotus1 (Super Moderator)
In my experience it is CPU-bound during tasks like loading the model or new time steps.
But interactively manipulating the model and waiting for the "non-decimated" geometry to be rendered does use the GPU. I tested this with different graphics cards; using a faster one definitely helped.

Post #6, June 10, 2016, 11:50
evcelica (Erik)
Quote (flotus1, post #3): Were there ever any studies to support the claim about reliability, or in case of consumer cards, the lack thereof?
I agree, especially since the hardware is identical between many Quadro and GeForce cards; only the firmware and drivers differ.
If you are using a program that doesn't have its graphics tailored specifically for "professional" graphics cards, like a lot of CAD programs do, then I see no reason to use one.

Post #7, June 13, 2016, 04:04
JBeilke (Joern Beilke, Dresden)
@evcelica

There are good reasons to use a Quadro card. The smaller ones (Quadro 600) are usually enough for most CFD users. They are cheap, small, and don't use much energy (around 45 W).

Cooling and noise are the biggest challenges when you have a workstation under your desk.

Post #8, June 13, 2016, 04:38
flotus1 (Super Moderator)
Cheap, small, and low energy consumption are not unique properties of low-end Quadro cards. The consumer cards they are based on have identical properties but cost even less.
In addition, the Quadro cards available now are usually one generation behind the consumer cards. This means the latest "Maxwell" Quadro cards are still based on 28 nm chips, while the latest "Pascal" consumer cards use 16 nm chips with much higher energy efficiency.


Post #9, October 23, 2017, 13:28
complexFlow (Pablo, member since Oct 2017)
Did anything come out of the final choice of GPU? Was there a significant improvement in the final workstation?
I recently ran ParaView 5.4 on a PC with a GTX 1080 but did not observe any significant improvement compared to a GeForce GT 530... at least not in the streamline-generating filter.
I was wondering how your final workstation turned out.

Post #10, October 23, 2017, 13:48
flotus1 (Super Moderator)
Due to budget restrictions, I settled for a GTX 1060 6GB.
So far I have not run into any issues that could be pinned down to the choice of GPU. ParaView does a good job of enabling lower-end hardware, for example by "decimating" geometries while you interact with the model.
I honestly don't know whether PV can use the GPU for the actual computation of filters; I have never needed that.
My conclusion, based on a bit of research and the experience with the workstations in our lab: Quadro/FirePro: unnecessary. A decent amount of GPU VRAM: nice to have.
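For readers unfamiliar with the term: "decimation" reduces the number of points or triangles actually drawn while you rotate or pan the model. As a toy illustration of the idea (this is not ParaView's actual mesh-decimation algorithm, just an even thinning of a point list by a target reduction factor):

```python
def decimate(points, target_reduction):
    """Keep roughly (1 - target_reduction) of the input points,
    sampled evenly. target_reduction=0.9 keeps about 10% of them."""
    keep = max(2, round(len(points) * (1.0 - target_reduction)))
    step = len(points) / keep
    return [points[int(i * step)] for i in range(keep)]

# A dense "geometry" of 10000 points reduced to ~1000 for interaction
coarse = decimate(list(range(10000)), 0.9)
print(len(coarse))  # 1000
```

The full-resolution geometry is then only rendered once interaction stops, which is why a slower GPU mainly hurts when waiting for the non-decimated view.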

Post #11, October 23, 2017, 13:56
complexFlow (Pablo)
Were there any steps you took to better integrate the GPU with ParaView? I have updated OpenGL and the graphics drivers in general. I believe ParaView can see that the GPU is installed, but I am not able to determine whether any restrictions were set up automatically that need changing.
Was there anything else you had to do before you had it running nicely?

Post #12, October 23, 2017, 15:42
flotus1 (Super Moderator)
I installed the Nvidia driver as usual, that's it. No additional tinkering required. We are using openSUSE, by the way.
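One way to verify which renderer ParaView will actually use on Linux is to check the OpenGL renderer string. A small sketch; the parsing function is tested on an illustrative sample string, and the commented-out call assumes `glxinfo` (from mesa-utils) is installed:

```python
import re
import subprocess  # only needed for the commented-out live query below

def opengl_renderer(glxinfo_output: str) -> str:
    """Extract the OpenGL renderer string from glxinfo output.
    A software renderer (e.g. 'llvmpipe') means the GPU is not used."""
    match = re.search(r"OpenGL renderer string:\s*(.+)", glxinfo_output)
    return match.group(1).strip() if match else "unknown"

# On a real machine with mesa-utils installed:
#   out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
#   print(opengl_renderer(out))

sample = "OpenGL renderer string: GeForce GTX 1060 6GB/PCIe/SSE2"
print(opengl_renderer(sample))
```

If the renderer string names your GPU rather than a software rasterizer, no further ParaView-side configuration should be needed.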

Post #13, October 24, 2017, 13:22
complexFlow (Pablo)
Thanks for the insight flotus1. Much appreciated.

Post #14, December 5, 2018, 06:50
samuel_rff (Samuel, member since Sep 2018)
Do you think a graphics card is mandatory with a powerful configuration like a dual EPYC?
It seems to me that ParaView can use multiple cores.

Samuel

Post #15, December 5, 2018, 17:01
flotus1 (Super Moderator)
The processing in ParaView can be done in parallel, i.e. the calculations that PV performs on the fields, such as computing the Q-criterion and the like.
But this does not reduce the requirements for the graphics adapter. Maybe more cores help if you are using software rendering, a topic I know virtually nothing about.
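For reference, the Q-criterion mentioned above is computed per point from the velocity gradient tensor, which is the kind of field calculation ParaView can parallelize on the CPU. A minimal NumPy sketch, assuming the standard definition Q = ½(‖Ω‖² − ‖S‖²) with the Frobenius norm:

```python
import numpy as np

def q_criterion(grad_u):
    """Q-criterion from a 3x3 velocity gradient tensor du_i/dx_j.
    Q > 0 indicates rotation dominates strain (a vortex region)."""
    S = 0.5 * (grad_u + grad_u.T)      # strain-rate tensor (symmetric part)
    Omega = 0.5 * (grad_u - grad_u.T)  # rotation-rate tensor (antisymmetric part)
    return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

# Solid-body rotation about the z-axis: pure rotation, so Q > 0
grad = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 0.0]])
print(q_criterion(grad))  # 1.0
```

Note that this filter work is independent of rendering: computing Q over millions of cells scales with CPU cores, while drawing the resulting isosurfaces still falls to the GPU.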

Post #16, January 27, 2019, 14:22
CSMDakota (Brandon Gleeson, member since Apr 2018)
Another aspect worth considering when choosing a GPU is getting one with CUDA compute capability 3.0 or higher, to allow the use of Nvidia's IndeX ParaView plugin. I'm currently using an AMD card, so I don't have any experience with it, but it looks interesting from what I've watched.

Is anybody using IndeX?

This is from the readme under the ParaView 5.6 IndeX plugin directory:
The NVIDIA IndeX for ParaView Plugin is compatible with:

* ParaView 5.5.0 and later (depending on the downloaded package, Windows 64-bit)
* OpenMPI 1.7.4 (if running in client-server mode)
* NVIDIA IndeX 2.0 (installed with the ParaView plugin)
* NVIDIA GPU(s) supporting CUDA compute capability 3.0 or higher, i.e. Kepler GPU architecture generation or later
* NVIDIA display driver version 387.26 or later on Linux, 391.03 or later on Windows
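The compute-capability requirement above is easy to check against your card. A trivial helper; the capability strings below are illustrative examples, not queried from hardware:

```python
def index_compatible(compute_capability: str, minimum=(3, 0)) -> bool:
    """True if a CUDA compute capability string like '6.1' meets the
    IndeX plugin's stated minimum (3.0, i.e. Kepler or newer)."""
    major, minor = (int(part) for part in compute_capability.split("."))
    return (major, minor) >= minimum

print(index_compatible("2.1"))  # Fermi: False
print(index_compatible("6.1"))  # Pascal: True
```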
