[tool] [tutorial] Free AI Person Detection for Blue Iris

SyconsciousAu

Getting comfortable
Joined
Sep 13, 2015
Messages
872
Reaction score
825
From what I have read, it would seem that AWS is taking around 1500ms per image to analyse.
Out of curiosity, what is "good" when it comes to DeepStack doing its thing? I'm averaging in the 500ms-750ms range, though occasionally it will blow out to 1500ms-2000ms. That's processing 720p images on an i7-8700 BI box with OpenALPR running as well.
 

cscoppa

Getting the hang of it
Joined
Dec 14, 2019
Messages
50
Reaction score
26
Out of curiosity, what is "good" when it comes to DeepStack doing its thing? I'm averaging in the 500ms-750ms range, though occasionally it will blow out to 1500ms-2000ms. That's processing 720p images on an i7-8700 BI box with OpenALPR running as well.
Here's a sampling of my numbers, latest version of Deepstack:

Code:
[GIN] 2020/12/16 - 19:52:48 | 200 | 440.1914ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:52:50 | 200 | 412.3246ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:52:52 | 200 | 349.1985ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:52:54 | 200 | 378.7322ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:52:56 | 200 | 420.3993ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:52:58 | 200 | 394.851ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:23 | 200 | 183.1901ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:27 | 200 | 210.3503ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:40 | 200 | 196.2532ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:44 | 200 | 181.4243ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:54 | 200 | 174.5811ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:53:58 | 200 | 182.4597ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:56:29 | 200 | 425.209ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 19:56:34 | 200 | 512.897ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:26:54 | 200 | 532.6159ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:26:58 | 200 | 412.207ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:27:02 | 200 | 537.0633ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:28:42 | 200 | 357.8743ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:29:01 | 200 | 316.5681ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:29:47 | 200 | 509.3337ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:29:59 | 200 | 442.3388ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:30:03 | 200 | 220.8143ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:31:49 | 200 | 397.5408ms | 172.17.0.1 | POST /v1/vision/detection
[GIN] 2020/12/16 - 20:35:40 | 200 | 390.4652ms | 172.17.0.1 | POST /v1/vision/detection
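
For anyone who wants to check their own numbers outside of AI Tool, the detection endpoint can be timed directly. This is only a quick sketch; it assumes the container is published on port 80 (as in the run commands quoted later in this thread), that you have a test.jpg snapshot handy, and that you are in a Linux shell or WSL:

Code:
# time a single POST to DeepStack's detection endpoint
time curl -s -X POST -F image=@test.jpg http://localhost:80/v1/vision/detection

The response is JSON with the detected labels and confidence values, and the wall-clock time should roughly line up with the figures in the [GIN] log above.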
 

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
147
Reaction score
23
Location
TX
It's version cpu-2020.12 from 7 days ago.
As long as you are using .......deepstack deepquestai/deepstack:latest, it should always pull the newest version. That is my understanding. I use it on Windows, however, so if you are doing anything else that may be incorrect.
 

cscoppa

Getting the hang of it
Joined
Dec 14, 2019
Messages
50
Reaction score
26
As long as you are using .......deepstack deepquestai/deepstack:latest, it should always pull the newest version. That is my understanding. I use it on Windows, however, so if you are doing anything else that may be incorrect.
Yep, that's what I use in my Docker Pull command.
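
For reference, on a typical Docker setup the :latest tag is only re-checked when you actually run docker pull again, so an update cycle usually looks something like the sketch below. The container name deepstack is just an example; the -e flag and port mapping mirror the usual DeepStack run command:

Code:
# grab the newest image, then recreate the container from it
docker pull deepquestai/deepstack:latest
docker stop deepstack && docker rm deepstack
docker run -d --name deepstack -e VISION-DETECTION=True \
  -v localstorage:/datastore -p 80:5000 deepquestai/deepstack:latest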
 

SyconsciousAu

Getting comfortable
Joined
Sep 13, 2015
Messages
872
Reaction score
825
Yep, that's what I use in my Docker Pull command.
Haven't been using Docker up to this point, but it's time to learn a new thing apparently. It doesn't look like there is an installer for Windows with the latest versions.
 

gawainxx

n3wb
Joined
Mar 7, 2018
Messages
28
Reaction score
2
Got this up and running, thank you! Took me a bit of effort to figure out the terminology for the mask portion, but I got that sorted out.

I did however notice a HUGE performance difference between the DeepStack Windows and DeepStack Docker installs. The Windows DeepStack would continuously max out the CPU (I'd jump from 12% idle to 100% CPU utilization on 6 cores @ 1.8GHz) and was also kind of sluggish processing images, whereas the Docker CPU container is quick and rarely exceeds 20% of 2 cores @ 1.8GHz.
 

cscoppa

Getting the hang of it
Joined
Dec 14, 2019
Messages
50
Reaction score
26
Got this up and running, thank you! Took me a bit of effort to figure out the terminology for the mask portion, but I got that sorted out.

I did however notice a HUGE performance difference between the DeepStack Windows and DeepStack Docker installs. The Windows DeepStack would continuously max out the CPU (I'd jump from 12% idle to 100% CPU utilization on 6 cores @ 1.8GHz) and was also kind of sluggish processing images, whereas the Docker CPU container is quick and rarely exceeds 20% of 2 cores @ 1.8GHz.
I saw the same thing; the Windows version pegged the CPU so hard that it was messing up the recordings. They would get stuck on one frame for the entire duration of the actual processing.
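
If anyone hits the same CPU pegging with the container version, Docker itself can cap how much of the processor the container is allowed to use. A rough sketch, where the name and the 2-core limit are just examples and the rest mirrors the usual DeepStack run command:

Code:
# cap the DeepStack container at the equivalent of 2 CPU cores
docker run -d --name deepstack --cpus="2.0" -e VISION-DETECTION=True \
  -v localstorage:/datastore -p 80:5000 deepquestai/deepstack:latest

Detections may take a little longer with the cap in place, but the recordings shouldn't get starved.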
 

shannondalebreaux

Getting the hang of it
Joined
Jun 6, 2018
Messages
82
Reaction score
29
Location
louisana
How many cores and what kind of processor are you guys running for it to max the CPU out when not using Docker? I just recently got a custom PC built; I'm curious to know if I will have the same issues with the CPU getting maxed out.

Sent from my SM-G965U using Tapatalk
 

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
147
Reaction score
23
Location
TX
Okie dokie. Believe it or not, I am trying...trying SO HARD, to do some custom models for DeepStack. I KNOW RIGHT- WTH is this guy thinking?! Seriously, here is where I am at, and I hope someone here is already doing this and can help me.
I have fumbled through the following-
Preparing Your Dataset
Step 1: Install LabelImg
Step 2: Organize Your Dataset
Step 3: Run LabelImg
Change Annotation to YOLO Format
Step 4: Annotate Your Dataset
Annotate Your Train/Test Dataset
Cloned the Google Colab notebook,
copied it to my Google Drive, mounted it, and uploaded and unzipped my dataset folders.

Now I am stuck. Do I need to create a new cell and run some code or something so it "trains"? I am not seeing any model files or .pth files, so either I only think I did everything right to this point, or I just don't understand where to go from here.

As always TIA for any help.
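
For what it's worth, here is roughly what the annotated dataset looks like on disk once LabelImg is set to YOLO format. The file name is just an example; each line is one box, with coordinates normalised to the image size, and the class IDs map to the classes.txt file that LabelImg writes out:

Code:
# train/ and test/ each hold the images plus one .txt label file per image
# example: train/front_door_001.txt
#   <class_id> <x_center> <y_center> <width> <height>
0 0.512 0.437 0.210 0.655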
Bumping my own post to say THANKS John Olafenwa for the help in getting the DeepStack custom training model up and running. It's currently humming away, training on my ugly mug.
For those that are interested, John has put out a video on the process: DeepStack Object Detection Guide - YouTube
 

cscoppa

Getting the hang of it
Joined
Dec 14, 2019
Messages
50
Reaction score
26
How many cores and what kind of processor are you guys running for it to max the CPU out when not using Docker? I just recently got a custom PC built; I'm curious to know if I will have the same issues with the CPU getting maxed out.

Sent from my SM-G965U using Tapatalk
In my case it was a 4th-gen i7-4770K. The native Windows version would peg it; the Docker for Windows version doesn't.
 

shannondalebreaux

Getting the hang of it
Joined
Jun 6, 2018
Messages
82
Reaction score
29
Location
louisana
Thanks, glad to hear that; money well spent! I was trying to future-proof as much as possible and keep the performance level as high as possible. I was thinking of running all my home automation and security cameras on it, and wanted to make sure it could handle a good number of cameras at high MP per camera; still have to be careful I guess. Just on the fence about what system to go with. Looked at Blue Iris, still looking though. I'm all about AI keeping these false alarms down. Had cameras in the past with terrible false alarms, even with expensive cameras.

Sent from my SM-G965U using Tapatalk
 

cscoppa

Getting the hang of it
Joined
Dec 14, 2019
Messages
50
Reaction score
26
Yeah, can't go wrong with that CPU. 10 cores / 20 threads makes it very good for this task. I am also running Blue Iris. Having run both versions, I would still run Docker for Windows and install one of the Docker builds of DeepStack. It's updated far more frequently.
 

shannondalebreaux

Getting the hang of it
Joined
Jun 6, 2018
Messages
82
Reaction score
29
Location
louisana
@cscoppa thanks for the advice. I haven't used Docker for Windows, so I may need your guidance for that. Seems like Docker has grown greatly. Works great for some things, I guess it depends who you ask.

Sent from my SM-G965U using Tapatalk
 

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
147
Reaction score
23
Location
TX
@Tinbum / @Village Guy, in the Windows version of AI Tool you can select one or all of the different detection models from the DeepStack tab. When you run the Docker version that tab is gone, so how would you tell it to run more than one model? Do you just run
sudo docker run -e XXXX-DETECTION=True -v localstorage:/datastore -p 80:5000 \ deepquestai/deepstack
multiple times and change XXXX to each model you want it to use?
Thanks.
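
From the DeepStack docs it looks like one container can expose several of the endpoints at once: you pass an extra -e flag per model rather than running the command multiple times. A sketch, not tested here:

Code:
# one container with the detection, face and scene APIs all enabled
sudo docker run -d \
  -e VISION-DETECTION=True \
  -e VISION-FACE=True \
  -e VISION-SCENE=True \
  -v localstorage:/datastore -p 80:5000 \
  deepquestai/deepstack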
 
Joined
May 1, 2019
Messages
2,215
Reaction score
3,504
Location
Reno, NV
About to jump into the DeepStack AI and AI Tool game to help knock down false alerts, above and beyond what simple motion detection false alerts create.
Some questions.
Currently, my Blue Iris server runs on an i5-9700K CPU, direct-to-disk 24/7 continuous for 14-16 cameras, an 8TB video storage HDD, a 1TB M.2 SSD main drive, and 16 GB of optimized RAM, leaving me with around 35%-45% CPU usage.
What CPU usage could one expect with the use of DeepStack, and for what duration, when it is located on the same machine?
Windows vs Docker. Which version is recommended when it comes to ease of use, resources used, upgradeability, and functionality? I've never had issues with Windows 10 for my server, so I am comfortable with its use. I have actually never dabbled with Docker in any sense or form. The above YouTube video of FamilyTechExec... he runs with the Windows version without batting an eye. I bring this up due to an earlier post mentioning that the Docker version might get more updates than the Windows version?
 

cjowers

Getting the hang of it
Joined
Jan 28, 2020
Messages
107
Reaction score
36
Location
AUS
Currently, my Blue Iris server runs on an i5-9700K CPU, direct-to-disk 24/7 continuous for 14-16 cameras, an 8TB video storage HDD, a 1TB M.2 SSD main drive, and 16 GB of optimized RAM, leaving me with around 35%-45% CPU usage.
What CPU usage could one expect with the use of DeepStack, and for what duration, when it is located on the same machine?
Windows vs Docker. Which version is recommended when it comes to ease of use, resources used, upgradeability, and functionality? I've never had issues with Windows 10 for my server, so I am comfortable with its use. I have actually never dabbled with Docker in any sense or form. The above YouTube video of FamilyTechExec... he runs with the Windows version without batting an eye. I bring this up due to an earlier post mentioning that the Docker version might get more updates than the Windows version?
What GPU do you have? The integrated 630? Adding a discrete GPU or a dedicated Jetson could probably help a lot with so many cameras.
I haven't tried your setup, but that is not a very bulky CPU for AI vision tasks, and integrated GPUs tend to be weak as well... Maybe almost approaching 1 sec response time for sequential 2MP images?? (Just a guess.)

But you might as well wait a couple of days for them to release the updated native Windows version and try that first. It should have similar speeds to the Docker version, I would think (<1s per image, possibly much less). If the speed is no good, try the Docker version. Configure them to max the processor usage during the 'busy' motion times (adjust resolution, seconds per frame, number of cameras, etc.) and see if that is acceptable for you, or if you need a GPU / Jetson Nano / etc. to meet your needs.

It sounds like Windows is not a huge market for them, and obviously hasn't been a major focus of their development (so fewer updates are likely), but there is reason to be excited about the soon-to-be-released version with Windows GPU support, for simplicity with BI.
 