Upgraded build and Blue Iris version - Current best practices? Especially with Hardware Decode

IReallyLikePizza2

Known around here
May 14, 2019
2,370
5,621
Houston
My old machine had so many issues and I was running an old BI version from 2022. I'm now running 5.9.9.14 on my very upgraded machine.

All my cams are set to hardware decode default like this

[screenshot: per-camera video setting with hardware decode set to "default"]

And then the default is set to Intel + VPP. Is this still good?

[screenshot: global hardware decoding default set to Intel + VPP]

And under Web Server Encoder settings, I have Intel QSV set. Is that good?

[screenshot: Web Server Encoder set to Intel QSV]
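As a side note, one way to sanity-check whether Quick Sync (QSV) is even exposed on the box is to ask ffmpeg which hardware accelerators it was built with (`ffmpeg -hwaccels`). A minimal sketch of parsing that output, using canned sample text so it runs anywhere (in practice you'd capture the real output with `subprocess.run`; the sample values are assumptions, not from this thread):

```python
# Parse `ffmpeg -hwaccels` output to see whether QSV is listed.
# SAMPLE is canned example text standing in for the real command output.
SAMPLE = """Hardware acceleration methods:
cuda
qsv
d3d11va
dxva2
"""

def hwaccels(text):
    """Return the set of accelerator names, skipping the header line."""
    lines = text.strip().splitlines()[1:]
    return {line.strip() for line in lines if line.strip()}

accels = hwaccels(SAMPLE)
print("qsv" in accels)  # True when Quick Sync decode is available to ffmpeg
```

If `qsv` is missing from the real output, the Intel driver or iGPU isn't visible to the system, and no Blue Iris setting will bring it back.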
 
Hopefully you added the cameras from scratch, or you may have brought the problems along by exporting from the old system and importing into your new one.

Around the time AI was introduced in BI, many here had their systems become unstable with hardware acceleration (hardware decode / Quick Sync) turned on, even if they weren't using DeepStack or CodeProject. Some have also been fine. I started to see errors with hardware acceleration several updates after AI was added.

This hits everyone at a different point. Some had their system go wonky immediately, some after a specific update, and some still don't have a problem, but the trend shows that running hardware acceleration will cause a problem at some point.

However, with substreams introduced, the CPU% spent shuttling video to a GPU (internal or external) is more than the CPU% saved by decoding on the GPU. Especially beyond about 12 cameras, CPU usage actually goes up with hardware acceleration. The wiki points this out as well.
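The substream argument mostly comes down to pixel throughput: the decoder only has to chew through the low-resolution substream most of the time. A rough back-of-envelope sketch (the resolutions and frame rates are illustrative assumptions, not measurements from this thread):

```python
# Rough decode-workload comparison: main stream vs. substream.
# All numbers are illustrative assumptions, not Blue Iris measurements.

def pixel_rate(width, height, fps):
    """Pixels per second the decoder must process for one stream."""
    return width * height * fps

main = pixel_rate(3840, 2160, 15)  # 4K main stream @ 15 fps
sub = pixel_rate(704, 480, 15)     # D1 substream @ 15 fps

print(f"main stream: {main:,} px/s")   # 124,416,000 px/s
print(f"substream:   {sub:,} px/s")    # 5,068,800 px/s
print(f"ratio: {main / sub:.1f}x")     # 24.5x
```

With a roughly 25x smaller decode load per camera, the CPU can handle the substreams directly, and the fixed overhead of handing frames to a GPU stops paying for itself.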

My CPU % went down by not using hardware acceleration.

Here is a recent thread where someone turned off hardware acceleration based on my post; their CPU dropped 10-15% and BI became stable.

But if you do use HA, use plain Intel and not the variants.

Some still don't have a problem, but eventually it may result in a problem.

Here is a sampling of recent threads where turning off HA fixed the issues they were having....

No hardware acceleration with subs?


Hardware decoding just increases GPU usage?


Can't enable HA on one camera + high Bitrate


And as always, YMMV.
 
Very interesting. I should have posted before I upgraded my system; I could have just gone with a beefy Ryzen system instead of Intel! Oh well

I'll switch from Intel + VPP to plain Intel and see what happens, then maybe test with it disabled

For my main cameras, I am not using substreams
 
Yeah I forget you are hard-headed and refuse to use substreams :lmao:

Spend $20k on computer parts and your time when a freebie 4th gen from the county auction would do :lmao:

Just yanking your chain LOL.

But yep, substreams opened up a lot of lower-powered Intel computers, as well as non-Intel systems like Ryzen and others.
 
So far Intel + VPP seems to be working best; zero issues with DeepStack stability

So many settings!