[GTALUG] I guess I should buy a new video card

Lennart Sorensen lsorense at csclub.uwaterloo.ca
Sat Aug 10 22:26:24 EDT 2024


On Sat, Aug 10, 2024 at 02:47:04PM -0400, Evan Leibovitch via talk wrote:
> In this decade, focusing solely on OS hardware drivers is a mistake when
> evaluating current GPU offerings ... depending on your intended use.

If it can run X or Wayland and maybe do OpenGL, I think it will do fine.
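A quick way to check that is to ask glxinfo which driver and GL version the card actually exposes. A minimal sketch, assuming the mesa-utils package (which provides glxinfo) is installed; the helper name is mine, not from the thread:

```python
import shutil
import subprocess

def gl_status():
    """Report the OpenGL renderer/version, or why we can't tell."""
    if shutil.which("glxinfo") is None:
        return "glxinfo not found; install mesa-utils to run this check"
    # -B prints just the brief summary, not the full extension list
    result = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True)
    if result.returncode != 0:
        return "glxinfo present, but no GL context (headless session?)"
    lines = [l for l in result.stdout.splitlines()
             if "OpenGL renderer string" in l or "OpenGL version string" in l]
    return "\n".join(lines) or "glxinfo ran, but reported no GL strings"

print(gl_status())
```

On any recent Intel, AMD, or nVidia card the renderer string it prints is enough to confirm the desktop side is covered.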

> For gaming and video manipulation these are still a factor (and in general
> pretty solid for both companies, Wayland issues notwithstanding). If that's
> all you want, there are many good Internet sites that will help you find
> the current (and ever-changing) sweet spot that balances budget and
> performance. (Here's my favourite pricing site
> <https://ca.pcpartpicker.com/products/video-card/>, and a good at-a-glance
> comparison of benchmarks
> <https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html>.) However,
> nVidia hasn't become one of the world's most valuable companies because its
> cards run Davinci Resolve so well. GPUs these days are increasingly being
> used directly for apps from crypto mining to AI and beyond, to the point
> where some of the most expensive GPU cards are sold without display
> connectors.

Well, crypto mining and AI are definitely things I don't care about at all.
Not that I would have considered calling LLMs AI in the first place,
but that seems to be the trendy thing to call them.  They look a lot like
AI, though, so many people think they are much more than they really are.

My gaming is currently on my laptop running Windows, although the
graphics card is starting to have issues with a few newer things I have
tried (the Quadro K2000M is perhaps getting a bit dated).  That's not
counting the gaming done on the Switch or Xbox One X.  Or my phone,
I suppose.

Maybe I should see which Steam games will run under Linux these days
and whether any of them work on the other system.

> Increasingly, apps use direct access to GPUs using toolkits such as AMD's
> ROCm and nVidia's CUDA, and in this realm nVidia has a massive head start.
> AMD is making good progress, but if you're buying a card based on possible
> future AI and non-raster-video uses, it's still way behind. I have one
> system with an RX6600 and another with an RTX3060; they're pretty similar
> for day-to-day use but I have encountered many apps and platforms that just
> don't support AMD GPUs the way they do the nVidia ones ... if at all.

Yes, CUDA is definitely used for some things, though nothing I do.
I don't do video editing; it's a busy year if I edit down even one
video to cut some pieces out of it.
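For anyone who does care whether their card's compute stack is visible to these toolkits, a sketch of a quick check, assuming PyTorch is installed (PyTorch's ROCm builds expose AMD cards through the same torch.cuda API, so one check covers both vendors; the function name here is mine, not from the thread):

```python
def gpu_compute_status():
    """Say whether a CUDA or ROCm compute device is usable from PyTorch."""
    try:
        import torch
    except ImportError:
        return "PyTorch not installed; 'pip install torch' to run this check"
    # On ROCm builds this same call reports AMD devices, e.g. an RX6600.
    if torch.cuda.is_available():
        return "GPU compute available: " + torch.cuda.get_device_name(0)
    return "PyTorch installed, but no CUDA/ROCm device is visible"

print(gpu_compute_status())
```

An RTX3060 typically shows up here out of the box; an RX6600 needs the ROCm build of PyTorch, which is roughly the head start being described above.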

-- 
Len Sorensen
