QuickSync vs Nvidia NVDEC

BlueWave

Jan 12, 2018
I see in the wiki and several other spots that Intel QuickSync is the best hardware decoder for reducing CPU usage on larger setups. I'm running BI on an i7-6900K, which doesn't support QuickSync, but I do have a GTX 1070 in the machine. Is there any benefit to enabling "Nvidia NVDEC" on the BI settings page to offload decoding to the graphics card instead of the CPU, or is there some reason you wouldn't want a graphics card doing the decoding? I'm also curious whether a constant load would wear the card out or have some other adverse effect I'm not thinking of, so I'd like to hear what others say.
 
I have an i7-8700 and have been using an EVGA Nvidia card for over a year and a half with no issues. I currently have 19 cams using it for video decoding.
 
Hardware-accelerated video decoding has two main benefits:
1) It reduces CPU usage.
2) It can usually handle high frame rate 4K video better than software decoding.

It won't hurt a properly functioning graphics card to decode video 24/7. Just be aware that an Nvidia card draws more power than your CPU to decode the same video, so don't enable Nvidia decoding needlessly unless your house needs extra heating.
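If you want to see what the decoder is actually doing, here's a minimal sketch that polls NVDEC utilization and board power through NVML. It assumes the nvidia-ml-py (pynvml) package is installed, and GPU index 0 is just an assumption; adjust for your card.

```python
# Minimal sketch: poll NVDEC utilization and board power via NVML.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py) and an
# Nvidia driver are installed; GPU index 0 is an assumption.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        dec_util, _sampling_us = pynvml.nvmlDeviceGetDecoderUtilization(handle)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        print(f"NVDEC: {dec_util:3d}%   power: {power_w:5.1f} W")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Watching those two numbers while you add cameras makes the power trade-off pretty obvious.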
 
I know you already have the 1070, but when I tested a 970 against a 1660 SUPER, the newer card had drastically lower power draw (about half) and could handle more decode threads.

From my research, the card's generation and the amount of VRAM are the most significant factors in NVDEC decoding performance. In my testing I maxed out at 12 video streams per card (6 GB 1660 SUPER), which I believe is a driver limit.
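If you want to sanity-check that limit on your own card, here's a rough sketch that spins up several simultaneous NVDEC decodes with ffmpeg and reports how many finish cleanly. It assumes an ffmpeg build with CUDA/NVDEC support on your PATH; the clip name and stream count are placeholders.

```python
# Rough sketch: launch N simultaneous NVDEC decodes with ffmpeg and see
# how many complete without error. Assumes an ffmpeg build with CUDA/NVDEC
# support on the PATH; "sample.mp4" and the stream count are placeholders.
import subprocess
import sys

N = int(sys.argv[1]) if len(sys.argv) > 1 else 12
CLIP = "sample.mp4"

procs = [
    subprocess.Popen(
        ["ffmpeg", "-hide_banner", "-loglevel", "error",
         "-hwaccel", "cuda",   # decode on the GPU (NVDEC)
         "-i", CLIP,
         "-f", "null", "-"],   # throw away the decoded frames
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    for _ in range(N)
]

ok = sum(1 for p in procs if p.wait() == 0)
print(f"{ok}/{N} concurrent NVDEC decodes finished without error")
```

Run it with increasing counts and watch for failures or dropped throughput to find the practical ceiling for your card.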

On a positive note, that CPU has quad-channel memory, which in my testing gives you a lot of long-term capacity to grow the system.
 