[GTALUG] NVidia GTX 1080 announcement

Lennart Sorensen lsorense at csclub.uwaterloo.ca
Mon May 9 00:39:55 EDT 2016


On Fri, May 06, 2016 at 10:43:54PM -0400, William Park wrote:
> Howdy,
> 
> I only caught the last 30 min of the live stream.  Made me wonder... Is
> there a way to use its 2560 GPU cores as a substitute for the usual 4 CPU
> cores?

For general purpose computing?  No.

For specialized stuff that can be run as thousands of parallel streams,
with lots of loops over massive amounts of data, especially floating
point data?  Yes.  So video compression and decompression, sure; 3D
graphics, sure; Folding@home, certainly; and various other tasks like that.
But not your x86 code running your web browser or your compiler.
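
To make that concrete, here is a minimal CUDA sketch (my own illustration,
nothing from the announcement): one thread per array element, scaling a
million floats in parallel.  This is the shape of workload those 2560
cores are built for.

// Hypothetical example: a data-parallel, floating-point loop,
// the kind of work that maps well onto thousands of GPU cores.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;                      // a million floats
    float *d;
    cudaMallocManaged(&d, n * sizeof(float));   // visible to both CPU and GPU
    for (int i = 0; i < n; i++)
        d[i] = 1.0f;

    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();

    printf("d[0] = %f\n", d[0]);                // expect 2.000000
    cudaFree(d);
    return 0;
}

The serial equivalent on a CPU would walk the array one element at a time;
the GPU version hands each element to its own thread, which is why it only
pays off when the problem decomposes that way.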

Reminds me of the days when Beowulf clusters were new and people would
come on IRC and ask "How do I make a Beowulf cluster?" and then be asked
"You do realize it won't make your web browser go faster, it only works
with software written to take advantage of it?" and they would go "Oh,
never mind then".

-- 
Len Sorensen

