[GTALUG] Running Dell branded Nvidia gtx 1060 in non-dell system

Alex Volkov subscriptions at flamy.ca
Thu Oct 3 11:27:42 EDT 2019


Hey Stewart,

I've been using the card for mostly video encoding thus far. I haven't 
had the time to do a lot of ML on it.

The short answer is yes, but it can be a bit of a pain to set up. Try one 
of the systems supported by Nvidia to see if you can jump through fewer 
hoops than I did.

I didn't run any benchmarks, so I can only speak subjectively -- sped-up 
editing in kdenlive feels a lot faster. Before this I was using a build 
of the program with HW acceleration disabled, and my experience with it 
was a lot more frustrating.

Final video encoding speed doesn't seem to be that much faster, but I 
don't think I've tuned all of the parameters yet.

As I think I mentioned before, gamers are upgrading to the GTX 1080, so 
there are a lot of used GTX 1060 6GB cards (6GB is important; the 
original version came with only 3GB) on the market for less than $200. 
If you have a desktop computer with a decent power supply and a PCIe 
(v2 or v3) slot, this is a pretty reasonable option.

I did a talk on how to get the card working back in August -- 
https://youtu.be/eMu7ynAwECY?t=2m53s

You need to jump through some hoops configuring CUDA, so it might be 
worthwhile to install it on a supported system, e.g. Ubuntu LTS.
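Once the driver and toolkit are installed, a few quick checks like these 
can tell you whether the basics are in place before you start debugging 
anything higher up the stack (this is just a rough sketch of what I'd 
look at, not an official checklist):

```shell
# Driver loaded and card visible to the system?
nvidia-smi

# CUDA toolkit installed and on the PATH?
nvcc --version

# Does your ffmpeg build include the Nvidia encoders?
ffmpeg -hide_banner -encoders 2>/dev/null | grep nvenc
```

If nvidia-smi fails, fix the driver before touching anything CUDA-related; 
if only the ffmpeg check comes up empty, it's the ffmpeg build, not the 
driver, that needs attention.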

There are also a lot of libraries with different licensing terms from 
Nvidia, so even when most things work, some might still not. The most 
recent example for me was scale_npp, the accelerated video-rescaling 
filter I used for downscaling 4K and creating proxies -- 
https://developer.nvidia.com/ffmpeg . Everything was working except for 
that one filter, and I had to recompile ffmpeg myself to get it. This is 
one of the operations that seems to be at least twice as fast as doing 
it on an 8-core CPU.
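For reference, a proxy-generation command using scale_npp looks roughly 
like this (filenames are placeholders, and it assumes an ffmpeg built 
with --enable-nonfree --enable-libnpp as described on the Nvidia page 
above):

```shell
# Decode, downscale 4K -> 1080p, and re-encode entirely on the GPU.
# scale_npp only accepts frames already in GPU memory, hence the
# -hwaccel_output_format cuda option.
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i input.mp4 \
       -vf scale_npp=1920:1080 -c:v h264_nvenc proxy.mp4
```

If your build lacks libnpp, the stock scale filter works as a CPU 
fallback, just without the speedup.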

So if something isn't working or isn't giving you the expected 
performance, look in the logs.

Alex.


On 2019-10-02 8:55 p.m., Stewart C. Russell via talk wrote:
> On 2019-07-20 2:47 p.m., Alex Volkov via talk wrote:
>>
>> I'm looking to buy a used Nvidia GeForce GTX 1060 to run some ML 
>> tutorials.
>
> Did this work out for you? I find myself in the market for a 
> CUDA-capable card to run Meshroom — https://alicevision.org/#meshroom  
> — a well-regarded photogrammetry suite. It only works on CUDA-equipped 
> systems.
>
> I don't need to spend much. Technically, the package will run on my 
> 2013 Samsung Chronos ultrabook with a GT 640M graphics card, but it's 
> so slow and hot that it's not worth the bother.
>
> cheers,
>  Stewart
> ---
> Post to this mailing list talk at gtalug.org
> Unsubscribe from this mailing list 
> https://gtalug.org/mailman/listinfo/talk



