
HPC cluster market

June 30, 2013, 08:47   #1
Drell-Yan
New Member
Join Date: Jun 2013
Posts: 1
Hi,

First of all, allow me to introduce myself: I'm a fifth-year engineering student, born and living in Spain. I finished my fourth year a couple of weeks ago, and this year I started my final-year project, which is essentially CFD. I usually read the CFD Online forums and/or wiki, but have never posted before.

Now let's talk about the cluster. I run a company with a couple of friends, and we are exploring whether there are people who would buy cluster time for high-performance computing workloads, such as rendering or CFD.

Do any of you think that selling cluster time (GHz·hour) is a good idea for CFD? Since "render farms" exist, I know we can sell cluster time for rendering, but I have no idea whether the same model works for CFD.
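To make the GHz·hour unit concrete, here is a minimal sketch of how a job could be metered under that model. The function name, the example job size, and the per-unit rate are all made up for illustration; nothing here reflects our actual pricing.

```python
def ghz_hours(cores, clock_ghz, hours):
    """Total GHz·hours consumed by a job: cores x clock speed x wall-clock time."""
    return cores * clock_ghz * hours

# Hypothetical CFD run: 64 cores at 2.5 GHz for 12 hours of wall-clock time
units = ghz_hours(cores=64, clock_ghz=2.5, hours=12)
rate = 0.05  # assumed rate in EUR per GHz·hour (illustrative only)
print(units)          # 1920.0 GHz·h
print(units * rate)   # 96.0 EUR
```

One catch with this unit for CFD specifically: unlike rendering, CFD scales imperfectly with core count, so two jobs consuming the same GHz·hours can do very different amounts of useful work.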

Regards,
Drell-Yan.
