4.7.8 - August 28, 2018 - Support for Nvidia CUDA hardware decoding

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
BI does not do clustering and failover.
 

Netwalker

Getting the hang of it
Joined
Aug 8, 2017
Messages
46
Reaction score
28
And as much as it hurts, buying a cheap $200-$300 i5 or i7 system is probably worth it in the long run. My 9-camera system only uses 81 W (UPS + PC + PoE + cameras) with 24x7 motion detection and recording on all cameras.
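For a sense of scale, here is a quick back-of-the-envelope sketch of what 81 W works out to per year. The wattage is from the post above; the electricity rate is an assumed figure, not from the post.

```python
# Annual running cost of an always-on 81 W system (figure from the post).
# The $0.12/kWh rate is an assumed average residential price; use your own.
watts = 81
rate_per_kwh = 0.12  # USD, assumption

kwh_per_year = watts / 1000 * 24 * 365       # about 709 kWh
cost_per_year = kwh_per_year * rate_per_kwh  # about $85

print(f"{kwh_per_year:.0f} kWh/year, roughly ${cost_per_year:.0f}/year")
```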
 

ryan99alero

Young grasshopper
Joined
Sep 2, 2018
Messages
38
Reaction score
22
Location
Earth
I'll have to see if I can find a calculator to estimate CPU requirements for BI. I currently have 8 8MP 4K cameras and plan to add 4 more 12MP cameras and 2 of the 7K Avigilon cameras. I also record 24x7, not just when an event happens. I'm not sure my brain will let me do a desktop PC: it has to run vSphere so I can back it up to AWS for backup and disaster recovery. My other issue is that I also store camera feeds on an iSCSI array hidden in another part of the house, so that if they steal the cameras and the rack server, I still have footage of what happened. Another factor is that I also have several Axis cameras, and Intel acceleration doesn't work on them, so I'd either need a more powerful CPU or an Nvidia card; the server-platform cards aren't limited in the number of streams like the desktop ones are.

I first ran this on a 2-year-old Dell i7 and it was hitting 98% CPU usage, which is why I moved it over to my server, where it went down to 80% on 4 cores. The desktop also dropped the iSCSI connection a couple of times and stopped recording video, which is another reason. I've been up and running on the DL380 for about 2 months with zero issues; I'm just not sure about CPU once I start expanding to more cameras. I have a comparable server at the office with 6 Axis PTZs where the CPU usage is around 28%, but that is on Aimetis, and those are running a couple of different analytics as well.

The more I talk, the more I think I may be outgrowing the use case for BI and may just need to go ahead and migrate to either Aimetis or Milestone, since the use case repeated on this forum seems to be an Intel desktop CPU only, with no Nvidia GPU. If I have to use more than one PC, I'm not sure the experience would be clean. I don't even know if the iOS app supports multiple servers.
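On the CPU-calculator question above: there is no official one, but a common rough proxy is total megapixels per second across all decoded streams. A minimal sketch, assuming 15 FPS on all cameras and roughly 30 MP for the 7K Avigilon models (both assumptions, not figures from the post):

```python
# Rough decode-load estimate: load scales approximately with total
# megapixels/second across all streams (codec and bitrate also matter).
# Counts are from the post; the 15 fps and ~30 MP "7K" figures are assumed.
cameras = [
    (8, 8, 15),    # (count, megapixels, fps): existing 8 MP / 4K cameras
    (4, 12, 15),   # planned 12 MP cameras
    (2, 30, 15),   # planned 7K cameras, assumed ~30 MP each
]

total_mps = sum(count * mp * fps for count, mp, fps in cameras)
print(f"Total decode load: {total_mps:,} MP/s")  # 2,580 MP/s
```

Comparing that figure against a configuration whose CPU usage you already know (for example, the ~80% on 4 cores mentioned above) gives a crude scaling estimate for the expanded system.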
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
Look at Nx Witness / DW Spectrum IPVMS in North America: $70 per camera with free upgrades for life.
Also note that you can have Blue Iris not decode the video at all by using the limit decoding setting and having the cameras generate the alerts. You have to implement this properly or you will have issues; see the wiki.
Blue Iris, like any other VMS, should be run on bare metal; you always get the most stable results that way. No one is stealing your PC and your cameras. You can easily hide a cheap NVR if you're that paranoid.
 

ryan99alero

Young grasshopper
Joined
Sep 2, 2018
Messages
38
Reaction score
22
Location
Earth
Per the attachment, I have all cameras set up to record direct-to-disk, which I believe is what you're getting at regarding not decoding the feed. The system still has high CPU usage even doing that. While you do seem very knowledgeable about BI and video, I will still have to respectfully disagree that bare metal is a requirement for a VMS, unless you're saying that BI is written in such a way that it is incompatible with virtualization. I went to a slower CPU with a VM and my CPU usage actually went down, likely because of a better storage array and more, faster memory. Then again, I don't know how the program uses memory and/or virtual memory, but I can learn more just by running it virtualized, since that gives you a lot of insight into resource usage that you don't easily get on a bare-metal Windows machine.

At work I built a fully virtualized environment spanning 20 rack servers, and I have certifications in VMware virtualization. I've been in the IT industry dealing with CPU-intensive workloads for the better part of 25 years, and managing others in it for the last 10: everything from VMs running a simple Linux update server, a VoIP phone solution, 3 SQL servers in a cluster, web servers, AD servers, and certificate servers, to some seriously GPU-, CPU-, and memory-intensive applications for the graphic arts industry, with a rough tally of about 15 virtual machines per rack server. I can fully understand the everyday person likely not being able to do virtualization correctly, as you usually have to tweak some vSphere settings to get optimal performance.

I will take a look at Nx Witness, as that is way cheaper than Aimetis at around $260 a camera with analytics. I'm not so concerned with the price as I am with things performing well and being fluid to use, since my family all need to be able to use this system easily and quickly.
 

Attachments: [screenshot of the direct-to-disk recording settings referenced above]

Netwalker

Getting the hang of it
Joined
Aug 8, 2017
Messages
46
Reaction score
28
In case of theft, I have a camera on the server that sends all of its events to a cheap NAS hidden elsewhere in the house.

In regards to the rest, there’s an easy way and a hard way to do things. You’re definitely taking the hard route, but with your qualifications, if anyone can do it you should be able to. Good luck and make sure to report your findings back to the forum so it can help others.
 

Netwalker

Getting the hang of it
Joined
Aug 8, 2017
Messages
46
Reaction score
28
In the per-camera video options, there’s a “limit decoding unless required” checkbox, which is what fenderman is talking about. If you turn it on, you need to offload your motion detection to the cameras, so it’s not for everyone, but it can save a ton of CPU. Your cameras have to support “pullpoint subscription” in Blue Iris (on the Video, then Configuration tab, I think), and you would also uncheck “motion detection” on the trigger page and check “get events from camera”.

Check out this link
https://ipcamtalk.com/wiki/optimizing-blue-iris-s-cpu-usage/#limit-decoding-unless-required
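For the curious, “get events from camera” presumably works along these lines under the hood: the software creates an ONVIF PullPoint subscription on the camera and then polls it for notifications. A minimal sketch using the python-onvif-zeep package; the host, port, and credentials are placeholders:

```python
# ONVIF PullPoint event polling sketch (pip install onvif-zeep).
# Replace host/port/credentials with your camera's values.
from datetime import timedelta
from onvif import ONVIFCamera

cam = ONVIFCamera('192.168.1.100', 80, 'admin', 'password')

# Ask the camera to create a pull point that we can poll for events.
events = cam.create_events_service()
events.CreatePullPointSubscription()

# Poll for notifications; motion events typically arrive under topics
# like tns1:RuleEngine/CellMotionDetector/Motion.
pullpoint = cam.create_pullpoint_service()
req = pullpoint.create_type('PullMessages')
req.MessageLimit = 10
req.Timeout = timedelta(seconds=10)

result = pullpoint.PullMessages(req)
for msg in result.NotificationMessage:
    print(msg.Topic, msg.Message)
```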
 

ryan99alero

Young grasshopper
Joined
Sep 2, 2018
Messages
38
Reaction score
22
Location
Earth
Thanks for the additional clarification on what fenderman was referring to. I don't have that checked, but all my cameras do support "Get events with PullPointSubscription", and that is how I have most of them configured, with the cameras doing the actual analytics. I will investigate a little further. I'm going to deploy Nx Witness and Qognify today and test both of them out. I do like that it has a Linux version, as I seriously dislike Windows these days.

Just want to say thanks for everyone's input. I think I'm going to go ahead and buy
 

Netwalker

Getting the hang of it
Joined
Aug 8, 2017
Messages
46
Reaction score
28
That’s good news. Once it’s all configured correctly, your server load should drop to just above whatever it takes for basic I/O whenever you’re not viewing a stream. When you’re viewing a stream, it will only require the CPU resources for that stream. If you tile all cameras at once, however, you may still have CPU issues, so try to keep them in smaller cycle groups if you need to monitor all of them at once.
 

technet

Getting the hang of it
Joined
Dec 25, 2014
Messages
136
Reaction score
17
Is it possible to have both GPUs, Intel and Nvidia, decoding at the same time?
 

awsum140

Known around here
Joined
Nov 14, 2017
Messages
1,254
Reaction score
1,128
Location
Southern NJ
You can set each camera individually on the Video tab of the camera's configuration. The one you mention is a global setting for all cameras.
 

technet

Getting the hang of it
Joined
Dec 25, 2014
Messages
136
Reaction score
17
Maybe a global load balance setting would be a welcome feature.
 

awsum140

Known around here
Joined
Nov 14, 2017
Messages
1,254
Reaction score
1,128
Location
Southern NJ
I have multiple CUDA cards and do load balancing manually very easily. All my cameras run the same frame rate, resolution, and bitrate, which makes it very easy to do.
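With identical cameras, an even split across cards is trivial. For a mixed camera set, the same manual balancing amounts to a least-loaded assignment; here is a toy sketch of that idea. The camera names and bitrates are made-up examples, and the real assignment happens per camera in Blue Iris's video settings.

```python
# Greedy least-loaded assignment of cameras to GPUs by bitrate (Mbps).
# With identical cameras, as above, this reduces to an even split.
cameras = {'front': 8.0, 'driveway': 8.0, 'back': 8.0, 'side': 8.0}
num_gpus = 2

loads = [0.0] * num_gpus
assignment = {}
# Place the heaviest streams first so the split stays balanced.
for name, mbps in sorted(cameras.items(), key=lambda kv: -kv[1]):
    gpu = loads.index(min(loads))  # least-loaded GPU so far
    assignment[name] = gpu
    loads[gpu] += mbps

print(assignment)  # {'front': 0, 'driveway': 1, 'back': 0, 'side': 1}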
 

cdoublejj

Young grasshopper
Joined
Apr 19, 2019
Messages
62
Reaction score
4
Location
MO
I've updated my post above several times, adding new information.

Some conclusions I am able to draw from this are:

1) Both types of hardware acceleration (Intel / Nvidia) reduce CPU usage by a similar amount.
2) The GT 1030 (2GB GDDR5) card could only handle about half of my cameras.
3) Nvidia CUDA acceleration raised memory usage more than Intel Quick Sync.

Maybe a faster graphics card would be able to handle more video before maxing out. However, faster GPUs also consume a lot of power, so it could end up costing more to run the GPU than to run without it. Modern GPUs are not especially cheap, either, so based on what I've seen today, it is not a good option for a low-budget Blue Iris build.
I see there are a ton of pages here that unfortunately I don't have time to go through right now, but FYI: Nvidia artificially limits the number of simultaneous NVENC encoder sessions on its consumer cards. There are supposedly patches to overcome that limit. Some Nvidia cards even have more than one NVENC encode/decode chip.

I have multiple CUDA cards and do load balancing manually very easily. All my cameras run the same frame rate, resolution, and bitrate, which makes it very easy to do.
I hadn't thought of that. So it makes use of both, or can if configured?
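On the consumer-card NVENC session limit mentioned above: if nvidia-smi is on your PATH, you can check how many encoder sessions each card has active. A small sketch, assuming a driver recent enough to expose the encoder.stats query fields (see `nvidia-smi --help-query-gpu`):

```python
# Print active NVENC sessions per GPU via nvidia-smi.
import subprocess

out = subprocess.run(
    ['nvidia-smi',
     '--query-gpu=index,name,encoder.stats.sessionCount',
     '--format=csv,noheader'],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    idx, name, sessions = (part.strip() for part in line.split(','))
    print(f"GPU {idx} ({name}): {sessions} active NVENC session(s)")
```

Note that the session cap applies to NVENC encoding; Blue Iris decoding uses NVDEC, which is not session-limited in the same way, so the cap mostly matters if you re-encode.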
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
I hadn't thought of that. So it makes use of both, or can if configured?
It can be configured on a per-camera basis. The GUI for it kind of sucks, though, telling you nothing but a GPU index number, which I guess you need to correlate with the GPU index numbers in Task Manager's Performance tab. And hope they don't ever change.

[Attachment: 1572274036071.png]
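If you want something more stable than a bare index, nvidia-smi can list each card along with its UUID, which does not change between reboots. Whether its index order matches the one Blue Iris shows is worth verifying against Task Manager, as described above. A quick sketch:

```python
# List GPUs with index, name, and stable UUID, e.g.:
#   GPU 0: GeForce GT 1030 (UUID: GPU-xxxxxxxx-...)
# Cross-check these against Task Manager's GPU numbers, since the two
# orderings are not guaranteed to match.
import subprocess

print(subprocess.run(['nvidia-smi', '-L'],
                     capture_output=True, text=True, check=True).stdout)
```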
 

SirAce135

n3wb
Joined
Mar 18, 2018
Messages
25
Reaction score
5
Is anyone here using, or has anyone tested, a Quadro card in their system?

I found some brief discussion about it online but nothing conclusive from anyone with hands on knowledge.

I am speccing out a rig for a 28-camera 4K 15 FPS system.
I plan on using an i9-9900X and potentially a Quadro RTX 4000.

Would love some insight.

Thanks! :D
 