DeepStack Case Study: Performance from CPU to GPU version

mercer2

n3wb
Joined
Jan 7, 2020
Messages
24
Reaction score
8
Location
houston
With the amount of CPU I have, I thought using main streams wouldn't matter.
I'm also using VBR.
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,690
Location
New Jersey
I use sub streams, CBR at 10240 kbps on 4MP and 5120 kbps on 2MP cameras. All are set to 15/15 for frame and i-frame rates. The CPU, an i7-6700K, rarely gets over 30%, and then only for a few seconds at a time. Throughput as reported by BI is around 200 kb/s and 200 MP/s. Sub streams will reduce the CPU load by a factor of five or more; in fact, when using sub streams, hardware acceleration isn't needed at all.
 

mercer2

n3wb
Joined
Jan 7, 2020
Messages
24
Reaction score
8
Location
houston
So I want to record continuously at full resolution quality. Is there any way to use the Quadro to drop CPU usage?
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,690
Location
New Jersey
You can record at full resolution, although exactly why should be more compelling than "I just want to". BI will record the sub stream until motion is detected and then switch, automagically, to the main stream for the duration of the alert. This significantly reduces video file size and increases retention time without having to add more drive space. Look in the BI help file; it's all covered in there.
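To put rough numbers on the file-size difference, here is a minimal back-of-the-envelope sketch in Python. The bitrates, camera count, drive size and the fraction of the day spent on alerts are all made-up example values, not figures from this thread:

```python
# Rough storage/retention estimate: main stream 24/7 vs. sub stream + main-stream alert clips.
# All constants below are example assumptions; substitute your own camera settings.

MAIN_KBPS = 10240      # main stream CBR, kbps (example 4MP setting)
SUB_KBPS = 1024        # sub stream CBR, kbps (example D1 setting)
CAMERAS = 7
DRIVE_TB = 4
ALERT_FRACTION = 0.05  # assume ~5% of the day is covered by triggered alerts

def gb_per_day(kbps, cameras=1):
    """Convert a constant bitrate in kbps to gigabytes of video per day."""
    return kbps * 1000 / 8 * 86400 / 1e9 * cameras

continuous = gb_per_day(MAIN_KBPS, CAMERAS)
hybrid = gb_per_day(SUB_KBPS, CAMERAS) + gb_per_day(MAIN_KBPS, CAMERAS) * ALERT_FRACTION

drive_gb = DRIVE_TB * 1000
print(f"Main stream 24/7      : {continuous:5.0f} GB/day -> ~{drive_gb / continuous:4.1f} days retention")
print(f"Sub + alert main clips: {hybrid:5.0f} GB/day -> ~{drive_gb / hybrid:4.1f} days retention")
```

With those example numbers, the hybrid approach stretches the same drive from a few days of retention to roughly a month, which is the point of letting BI switch streams on alerts.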
 

TBurt

Getting the hang of it
Joined
Aug 14, 2021
Messages
68
Reaction score
97
Location
Houston
I use sub streams, CBR at 10240 kbps on 4MP and 5120 kbps on 2MP cameras. All are set to 15/15 for frame and i-frame rates. The CPU, an i7-6700K, rarely gets over 30%, and then only for a few seconds at a time. Throughput as reported by BI is around 200 kb/s and 200 MP/s. Sub streams will reduce the CPU load by a factor of five or more; in fact, when using sub streams, hardware acceleration isn't needed at all.
What bitrate do you run your substreams at?
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,690
Location
New Jersey
I raised bit rates recently. It did add more detail to both the 4MP and 2MP cameras. The CPU load increase was not significant at all.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,451
Reaction score
47,588
Location
USA
If your CPU is spiking to 100%, then obviously you don't have enough CPU.

Substreams are your friend. If you are using them for the main workhorse of BI (motion detection, alerts, etc.), then you can record the mainstream 24/7 without much CPU overhead, but you still need to use substreams or the CPU will spike.

While this thread was directed towards LPR, the same principles apply regarding resolution. You would be surprised how good a D1 substream can be at saving on overall storage requirements:

 

hajalie24

Getting the hang of it
Joined
May 20, 2021
Messages
51
Reaction score
36
Location
Colorado
Has anyone tried mining while also running Deepstack off the GPU? Wondering if it will significantly slow things down or just a bit. Sometimes I game while mining and it still runs fine, but my hashrate drops.
 

T_Tronix

Young grasshopper
Joined
Sep 6, 2017
Messages
47
Reaction score
11
Does the amount of GB on the GPU matter? For example, would a GTX 1060 3GB suffice, or would a 6GB be a better choice for about 7 HD cameras?
 

bqz

Young grasshopper
Joined
Jul 14, 2016
Messages
36
Reaction score
16
Location
Avellaneda, Buenos Aires, Argentina
Hi.

Yes, I think so. Every task uses memory. If I enable Blue Iris to use hardware video decode, video RAM usage increases. I'm currently using an Nvidia GT 1030 (2 GB of VRAM). It works OK with DeepStack and CodeProject.AI, although I disabled all modules except custom object detection. Current video RAM usage is about 1 GB.

[Screenshot attachment: GPU memory usage]
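If anyone wants to watch that number for themselves, one easy option is to poll nvidia-smi. This is just a small sketch, assuming the Nvidia driver is installed and nvidia-smi is on the PATH:

```python
# Poll GPU memory usage every few seconds via nvidia-smi (Ctrl+C to stop).
# Assumes the Nvidia driver is installed and nvidia-smi is on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=name,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    for line in out.strip().splitlines():
        name, used, total = (field.strip() for field in line.split(","))
        print(f"{name}: {used} MiB / {total} MiB used")
    time.sleep(5)
```

Run it while BI and the AI are busy and you can see how much headroom the card has left.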


Does the amount of GB on the GPU matter? For example, would a GTX 1060 3GB suffice for about 7 HD cameras?
 

TBurt

Getting the hang of it
Joined
Aug 14, 2021
Messages
68
Reaction score
97
Location
Houston
Does the amount of GB on the GPU matter? For example, would a GTX 1060 3GB suffice, or would a 6GB be a better choice for about 7 HD cameras?
Get the 6GB version. I have the 3GB 1060 and I will get the occasional "AI not responding" if I have too many custom models loaded/being used with DeepStack. Or it just would not start DS at all, and/or Blue Iris would stop responding. I found it works best to load maybe two or three custom ones. Also, if you find those custom ones do what you need, you can turn off the built-in DS detection model to save even more resources. It is not as big of a problem now that I have moved over to CodeProject.AI; it seems to go easier on memory for both the computer and the GPU, although I have been able to push it to the point where it stops responding as well. The 6GB model should not be too much more and should keep the problems down.
 

T_Tronix

Young grasshopper
Joined
Sep 6, 2017
Messages
47
Reaction score
11
I ended up getting a used GTX 1060 6GB. I tried to get CodeProject.AI working but couldn't, so I went back to DeepStack. So far it's very stable and light on resources. I'm not using custom models, just the default ones.

Get the 6GB version. I have the 3GB 1060 and I will get the occasional "AI not responding" if I have too many custom models loaded/being used with DeepStack. Or it just would not start DS at all, and/or Blue Iris would stop responding. I found it works best to load maybe two or three custom ones. Also, if you find those custom ones do what you need, you can turn off the built-in DS detection model to save even more resources. It is not as big of a problem now that I have moved over to CodeProject.AI; it seems to go easier on memory for both the computer and the GPU, although I have been able to push it to the point where it stops responding as well. The 6GB model should not be too much more and should keep the problems down.
 

Pentagano

Getting comfortable
Joined
Dec 11, 2020
Messages
576
Reaction score
269
Location
Uruguay
I ended up getting a used GTX 1060 6GB. I tried to get CodeProject.AI working but couldn't, so I went back to DeepStack. So far it's very stable and light on resources. I'm not using custom models, just the default ones.
What inference speeds are you getting with the GTX 1060? I have the GTX 970 and it is fairly respectable, between 50 and 80 ms.
 

Pentagano

Getting comfortable
Joined
Dec 11, 2020
Messages
576
Reaction score
269
Location
Uruguay
There is a possibility that I might be upgrading my motherboard. I'm looking at the X570 series for AMD. I need more SATA ports and just generally more options, as mine is a very basic motherboard.

Most have double slots for GPU cards.

Has anyone tried SLI mode with two cheaper cards?
I wonder how DeepStack works with two cards in SLI mode.
I'm happy with my GTX 970, but curiosity always strikes.
 

T_Tronix

Young grasshopper
Joined
Sep 6, 2017
Messages
47
Reaction score
11
I could test it, but I'm not sure what tool is best to show the ms I'm getting.

What inference speeds are you getting with the GTX 1060? I have the GTX 970 and it is fairly respectable, between 50 and 80 ms.
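One simple way to measure it is to time requests against DeepStack's detection endpoint directly. Here is a minimal sketch, assuming DeepStack is listening on its default local port and you have a camera snapshot saved as test.jpg (adjust both to match your setup); note it measures the full HTTP round trip, so it will read slightly higher than pure inference time:

```python
# Time DeepStack object-detection requests to estimate inference latency.
# Assumes DeepStack is reachable at the URL below and "test.jpg" exists locally.
import time
import requests

URL = "http://localhost:80/v1/vision/detection"  # adjust host/port to your install
IMAGE = "test.jpg"                               # any snapshot from a camera

timings = []
for _ in range(10):
    with open(IMAGE, "rb") as f:
        start = time.perf_counter()
        resp = requests.post(URL, files={"image": f}, timeout=30)
        elapsed_ms = (time.perf_counter() - start) * 1000
    resp.raise_for_status()
    timings.append(elapsed_ms)
    print(f"{elapsed_ms:.0f} ms, {len(resp.json().get('predictions', []))} objects")

print(f"average: {sum(timings) / len(timings):.0f} ms over {len(timings)} requests")
```

Blue Iris also reports a detection time on alerts, so you can sanity-check against that.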
 

pbc

Getting comfortable
Joined
Jul 11, 2014
Messages
1,024
Reaction score
156
I'd been happily running DeepStack via AITool until a couple of months ago, when upgrading BI apparently broke the AITool trigger/flagging into BI. So I'm now looking at trying to figure out how to get what I was doing in AITool done in BI without the added app.

I had the DeepStack GPU version installed, as I installed an Nvidia GTX 1050 card (640 CUDA cores but only 2GB).

Is this only for DeepStack running on Windows, vs. the Docker GPU version?
 