Should I have a graphics card for DeepStack or is onboard graphics good enough?

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
I have no added graphics card; I have an HP SFF PC with an Intel i7-7700 CPU and its onboard Intel HD Graphics 630. I have 7 cameras, which is almost enough here; an 8th camera is a vague possibility for next year. The average resolution across my 7 cameras is around 2.7MP.
Current CPU load is mostly under 10%, occasionally going up briefly to 16%; the highest spike I've seen was around 25%, which I suspect was when DeepStack was active.
What's your opinion, should I have a dedicated graphics card with GPU processing for DeepStack? If so, what would be the advantages? I see the disadvantages as a likely increase in both power consumption and noise from the cooling fan(s).
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,884
Reaction score
48,535
Location
USA
The CPU version of DeepStack does not use the onboard graphics; it runs solely on the CPU.

The GPU version will use an NVIDIA GPU if one is installed, and it is faster.

What are your times for detection using the CPU? If they are under 100-150ms, there's not much advantage unless you have a lot of cameras.
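If you want to sanity-check the figure outside of BI, you can time a request to DeepStack's detection endpoint directly. A minimal sketch in Python; the port and image path are assumptions, so adjust them to your setup:

```python
# Minimal sketch: time a single DeepStack detection request.
# Assumptions: DeepStack is listening on localhost:82 (match this to
# your BI AI settings) and test.jpg is a snapshot from a camera.
import time
import requests

DEEPSTACK_URL = "http://localhost:82/v1/vision/detection"  # port is an assumption

with open("test.jpg", "rb") as f:
    start = time.perf_counter()
    response = requests.post(DEEPSTACK_URL, files={"image": f})
    elapsed_ms = (time.perf_counter() - start) * 1000

for pred in response.json().get("predictions", []):
    print(f"{pred['label']}: {pred['confidence']:.2f}")
print(f"Round-trip time: {elapsed_ms:.0f} ms")
```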

And then of course a good GPU is expensive.

If your current times are low and the CPU isn't maxing out, there's no real advantage for the cost.
 

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
Thanks wittaj.
With this SFF PC the maximum recommended power for an add-on graphics card is 35W, and the PSU is 180W from memory, so there's not much room for a high-performance card; even the 40W of an Nvidia Quadro P620 or T600 is pushing things. I suspect, though, that HP have allowed a slightly generous margin.
It's so easy to get sidetracked. How would I actually notice the effects of slow DeepStack CPU processing in practice, rather than just by looking in the System Status log file?
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,884
Reaction score
48,535
Location
USA
Go into the BI logs and see what it shows for timing in ms of object detection.
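If you'd rather not eyeball them, something along these lines can pull the figures out and average them. A minimal sketch; the log location and line format are assumptions, so adjust the path and regex to whatever your BI version writes:

```python
# Minimal sketch: pull DeepStack timing figures out of a Blue Iris log
# and average them. The log path and line format are assumptions --
# adjust the regex to match what your BI version actually writes.
import re

LOG_PATH = r"C:\BlueIris\log\log.txt"  # assumed location
times_ms = []

with open(LOG_PATH, errors="ignore") as log:
    for line in log:
        if "DeepStack" in line:
            match = re.search(r"(\d+)ms", line)
            if match:
                times_ms.append(int(match.group(1)))

if times_ms:
    print(f"{len(times_ms)} detections, "
          f"avg {sum(times_ms) / len(times_ms):.0f} ms, "
          f"max {max(times_ms)} ms")
```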

I was able to put a 1030 GPU in my 4th gen SFF and it brought the time down by a factor of 5-8, but if your ms times are under 150ms, there's probably not much reason to change.
 

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
Yes, my t_ave = 313ms, so rather slow then?
wittaj, what difference did you actually notice after you added your 1030 GPU?
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,884
Reaction score
48,535
Location
USA
I was in the 350-450ms range and it dropped to the 50-75ms range with the $100ish 1030 GPU.

You are still under half a second, so not terrible.
 

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
Progress but no progress here. Two days ago a Quadro T600 graphics card went into my SFF PC (i7-7700 CPU with onboard Quick Sync-enabled Intel HD Graphics 630), and the appropriate Nvidia driver was installed.
The result so far is that I see no improvement in my Blue Iris Status DeepStack analysis figures: at the moment it's around 350ms (it's night time and quiet), about the same as before I added the Quadro T600. Was it unreasonable to expect a sub-100ms figure?
Task Manager shows total CPU at 8% and total GPU at 21%; that's with the 7 cameras mentioned in my first post.
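Task Manager's default GPU graphs apparently track the 3D and video engines rather than CUDA compute, so that 21% may not be DeepStack at all; nvidia-smi should show the actual compute load and which processes are on the card. A minimal sketch in Python (assuming the driver has put nvidia-smi on the PATH):

```python
# Minimal sketch: confirm DeepStack is actually running on the GPU by
# polling nvidia-smi, which reports CUDA compute load and lists the
# processes using the card (the default Task Manager graphs may not).
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,utilization.gpu,memory.used",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())

# Full process table -- look for a DeepStack/python process here:
print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)
```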

I've no real idea whether the installation went well, but there seemed to be no problems. Visual Studio: do I need it or don't I? Do I have to install VS separately, or will the Express install of CUDA include just the parts of VS I need? I tried a separate install of VS 2019 at one stage, to no effect. I'd earlier copied over the cuDNN files as required.
Nvidia's CUDA install docs also want zlib.dll, with a link to an insecure http site, and there are warnings here that zlib.dll downloads can be a source of malware. I did a scan and found nothing, but I haven't installed it.
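If I do install it, checking the DLL's hash against a checksum from a source I trust seems the minimum precaution. A minimal sketch in Python (the filename follows the cuDNN docs; the expected digest is deliberately left as a placeholder):

```python
# Minimal sketch: check the SHA-256 of a downloaded zlib DLL before
# installing it. The expected digest below is a placeholder -- compare
# against a checksum published by a source you trust.
import hashlib

EXPECTED_SHA256 = "<paste a known-good digest here>"  # placeholder, not a real value

with open("zlibwapi.dll", "rb") as f:  # filename per the cuDNN install docs
    digest = hashlib.sha256(f.read()).hexdigest()

print(digest)
print("match" if digest == EXPECTED_SHA256 else "MISMATCH - do not install")
```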

To test whether CUDA is working, the documentation indicates that you need Visual Studio; Nvidia's documentation on using their samples for testing is outdated too, and finding deviceQuery.cu is needle-in-a-haystack stuff.
I believe I have correctly followed what advice there is on IPCT and elsewhere, but as I say, some of it seems outdated. I went with CUDA 11.5 and the matching cuDNN, the latest available on the Nvidia website.
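For what it's worth, a quicker smoke test than building the Nvidia samples under Visual Studio is possible with a spare Python install and a CUDA-enabled PyTorch build. A minimal sketch (this exercises the driver and CUDA stack generally, not DeepStack's own bundled runtime):

```python
# Minimal sketch: confirm the driver/CUDA stack works without building
# the NVIDIA samples. Assumes a Python install with a CUDA-enabled
# PyTorch build; it does not test DeepStack's own bundled runtime.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # Tiny matmul on the GPU to confirm kernels actually launch
    a = torch.rand(1024, 1024, device="cuda")
    b = torch.rand(1024, 1024, device="cuda")
    print("Matmul OK, checksum:", (a @ b).sum().item())
```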

I congratulate those who successfully migrated DeepStack from CPU to Nvidia GPU processing and got low ms figures, but so far I haven't joined your club.
I've had enough for now; that said, just for a brief moment I did wonder whether DeepStack might really be better with GPU processing.
 

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
The vs_community.exe installer (2019 IDE download) asks which components you want to install. Which should I choose, or should I just go with the basic install, the Visual Studio core editor?

edit: A web search suggests the component I need to select in the Visual Studio install is Desktop development with C++; we'll see if that helps.
edit2: VS is now installed. Do I have to open VS 2019 and configure it so it's associated with DeepStack? My average speed after 45 analyses is currently 4044ms in BI Status. With VS closed down it's 315ms after 70.
edit3: It seems to be improving over time; after 603 analyses the average is 223ms, which is better than I got using the CPU for DeepStack. I'll see what I get in the morning.
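On the CUDA side of the install, a quick way to confirm the toolkit itself landed cleanly (independently of Visual Studio) is to ask nvcc for its version. A minimal sketch, assuming the CUDA installer put nvcc on the PATH:

```python
# Minimal sketch: confirm the CUDA toolkit installed cleanly by asking
# nvcc for its version. Assumes the CUDA installer added nvcc to PATH.
import subprocess

try:
    out = subprocess.run(["nvcc", "--version"],
                         capture_output=True, text=True, check=True)
    print(out.stdout.strip())
except FileNotFoundError:
    print("nvcc not found - the CUDA toolkit may not be on the PATH")
```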
 
Last edited:

CAL7

Getting the hang of it
Joined
Nov 26, 2020
Messages
64
Reaction score
26
Location
Florida
Thanks for your progress reports, toastie. My project for the holidays is to see what I can do to lower AI recognition times, so I'm following your odyssey with interest. I've had some concerns about driver and app installation to support an Nvidia card.
 

toastie

Getting comfortable
Joined
Sep 30, 2018
Messages
254
Reaction score
82
Location
UK
Thanks for your interest, CAL7. Unfortunately, today's average across DeepStack's 5304 analyses is 339ms, so my NVIDIA T600 with its 640 CUDA cores is doing no better than the i7-7700 CPU with its onboard Intel HD 630 and Quick Sync did. If I make any positive progress I'll report back.
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,692
Location
New Jersey
The more CUDA cores the better, and 640 on the T600 is a little on the low side. Have a look at a Tesla model from NVidia; a number of users have installed one with excellent results. It'll fit a low-profile case and takes only one slot. It's probably a good idea to add a fan to it, though not strictly necessary; the fans are third-party add-ons.
 

iwanttosee

Pulling my weight
Joined
Dec 27, 2020
Messages
203
Reaction score
186
Location
US
toastie said:
"Task Manager shows total CPU at 8% and total GPU at 21%; that's with the 7 cameras mentioned in my first post."
Zero need for a graphics card, unless you like throwing money away.
 

Pentagano

Getting comfortable
Joined
Dec 11, 2020
Messages
584
Reaction score
272
Location
Uruguay
I have a Dell 3420 with an i7-7700 and decided to go down the GPU route.

The i7 handled DS OK, 350-400+ms, but the Dell's CPU fan got very noisy when the CPU went above 60-70% analysing many photos in sequence (I also run a custom model).
Also, it's my daily work PC, which I didn't want on 24/7, and it just had the motherboard replaced! An SFF PC can get dusty inside quickly if the fans are blowing hard all the time.

I asked around a lot here and decided to go for a used GTX 970 (same as @sebastiantombs).

As my Dell is an SFF PC, I put the large Asus GPU into my Ryzen (5 3400G) ATX case and added 3 extra fans.

I really don't regret spending the money. The GPU works brilliantly: 45-85ms depending on the camera (I have one on its high-quality main stream just because of its location; all the others are on sub streams), running between 6-10 fps, so the fastest is about 40ms.

The Ryzen CPU stays near idle and the GPU analyses frames like a machine gun. Love it.

I also use a custom model at night for a small, fast cat, so I needed something faster than the i7-7700.
It wasn't money thrown away for me at all, btw.
 