5.5.8 - June 13, 2022 - Code Project’s SenseAI Version 1 - See V2 here https://ipcamtalk.com/threads/codeproject-ai-version-2-0.68030/

MikeLud1

IPCT Contributor
Joined
Apr 5, 2017
Messages
2,210
Reaction score
4,251
Location
Brooklyn, NY
OK, here is where I am.
Changing the driver for the GT 710 had no effect.
Updating to CPAI version 2.07 had no effect.
Any time I Google the CUDA error, I keep coming across the fact that the PyTorch build for CUDA 10.2 only supports compute capability 3.7 or higher; I would have to recompile PyTorch for a lower compute capability. I was seeing the same thing more than a year ago when I was trying to run DeepStack GPU on this machine.
So:
Changed YOLOv5 3.1 back to CPU and started testing with test images.
My favorite, pexels-ngrh-mei-5975635, gives me a round trip of between 480 and 500 ms.
I tried YOLOv5 6.2 with the same images and was getting 2+ seconds, which is comparable to what I was seeing when running with Blue Iris.
So I am going to use Yolov5 3.1 in CPU mode and change Blue Iris back to using CPAI.
Bottom line: I think the GT 710 is a lost cause.
If anyone has actually used the GPU in a GT 710 successfully with CPAI, I would like to know.
Using CPAI with Yolov5 3.1 in CPU mode.
Done for now.
Post all the details of the issues in the CodeProject.AI forum; Chris should be able to help you, and maybe he needs to make some code changes.
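For anyone else hitting this wall, a quick way to confirm the mismatch is to compare the card's compute capability against what the installed torch wheel was compiled for. A minimal command-line sketch, assuming you run it in the same Python environment the YOLOv5 module uses and that the torch build is recent enough to have torch.cuda.get_arch_list():

:: Report the GPU's compute capability as the driver sees it
:: (the compute_cap query field needs a reasonably new driver; the GT 710 is 3.5)
nvidia-smi --query-gpu=name,compute_cap --format=csv
:: Report what the installed torch wheel was built for; if the card's capability
:: is below every sm_ target listed, CUDA inference will fail on that card.
python -c "import torch; print(torch.cuda.get_device_capability(0)); print(torch.cuda.get_arch_list())"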

 

AlphaBlueIris

Getting the hang of it
Joined
Jan 17, 2022
Messages
125
Reaction score
27
Location
Boston
I have the same card. At one point I was talking with the CodeProject team, and they suggested testing 2.4 at that time. I was never able to get it running on the GPU; Blue Iris would see it, but CodeProject would not. Like you, I gave up and got another GPU card, a Tesla P4. Hopefully that won't give me any trouble either, because due to Blue Iris I decided to switch to a more powerful system. I'm running the Blue Iris software on both systems in parallel, and they are giving me issues: the one with the P4 sees it as a GPU, but right now CodeProject cannot process with it. I know they are actively working on it, so hopefully it's only a matter of time before everything is working again.

 

Dixit

Getting the hang of it
Joined
Dec 14, 2015
Messages
75
Reaction score
36
Man, I don't know what this 2.0.7 did, but it definitely broke it on mine. I used to use the yolov5l.pt file and dumped it in the custom models folder. Now, with the latest BI 5.6.9.4, going to the AI tab and clicking on ... next to custom models never lists that yolov5l.pt, and it seems to refuse to use it even though it's clearly in that directory.

I tried moving it to default object detection on HIGH (which is what it was set to before), and it seems to be partially working, but it's now slamming my CPU/GPU at 100%; it never got there before. And it seems like I keep crashing CodeProject (and even crashed BI) within 10 minutes of messing around. I completely uninstalled CodeProject, nuked the directories, and tried again; same issue. I'm coming from the original CodeProject, going back to when it was released, so it probably isn't the optimal setup.

Going to keep messing with this some more. One reason for even going to 2.0.7 is that 2.0.6 never showed any log info on the browser dashboard; before, on 1.6.8, it always showed the data live as it was processing footage, and it's hard to analyze anything without that.

EDIT:
I'm hoping this is the correct way to use the YOLOv5l model: I just turned off the custom models and checked Default object detection, and I noticed it kicked in. I haven't yet changed anything on the camera side, where all of the cameras still have the custom model set to yolov5l. Still seeing higher-than-normal GPU usage, but I'm leaving it for now. Does HIGH next to Default Object Detection under AI mean it's going to use the 5l file? If not, what do Tiny, Low, Med, and High represent? (I thought at one point this thread said it was how large a resolution it sends to the model.)
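If anyone wants to double-check where the .pt files actually ended up, here is a rough command-line sketch; the install path below is an assumption based on a default CodeProject.AI install and may differ on your system, and the custom models entry on BI's AI tab needs to point at whatever folder the files really sit in:

:: Find every custom .pt model under the CodeProject.AI install (path is an assumption; adjust to your install)
cd /d "C:\Program Files\CodeProject\AI"
dir /s /b *.pt | findstr /i "custom"
:: yolov5l.pt should show up in one of the custom-models folders listed above;
:: if Blue Iris's AI tab points somewhere else, either move the file or repoint BI.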
 

biggen

Known around here
Joined
May 6, 2018
Messages
2,567
Reaction score
2,842
Why don’t you guys roll back to 1.6? I’m staying with that until these bugs get ironed out.
 

105437

BIT Beta Team
Joined
Jun 8, 2015
Messages
2,040
Reaction score
946
I did a couple of days ago in the v2.0.4 thread.

View attachment 151288
This is the procedure I followed to resolve my issue.

My WMI repository was corrupt. I ran the following procedure and repaired the WMI. Once that was done, v2.0.7 installed and runs successfully.

Procedure to repair corrupted WMI that worked for me
• Disable and stop the winmgmt service
• Remove or rename C:\Windows\System32\wbem\repository
• Enable and start the winmgmt service
• Open Command Prompt as Administrator
• Run the following commands:
cd C:\Windows\System32\wbem\
for /f %s in ('dir /b *.mof') do mofcomp %s
NOTE: This will take a minute or so to complete.
for /f %s in ('dir /b en-us\*.mfl') do mofcomp en-us\%s

Hope this helps others with the same issue!
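For anyone who prefers to run the whole thing from one elevated Command Prompt, here is the bulleted procedure above as a single session; it is only a sketch of the same steps, assuming the default C:\Windows location, and the for-loop % signs must be doubled (%%s) if you save this as a .bat file instead of typing it interactively:

:: Stop WMI and keep it from restarting while the repository is moved aside
sc config winmgmt start= disabled
net stop winmgmt /y
:: Rename (rather than delete) the old repository so it can be restored if needed
ren C:\Windows\System32\wbem\repository repository.old
:: Re-enable and restart WMI, which rebuilds an empty repository
sc config winmgmt start= auto
net start winmgmt
:: Recompile the MOF/MFL definitions back into the new repository
cd /d C:\Windows\System32\wbem\
for /f %s in ('dir /b *.mof') do mofcomp %s
for /f %s in ('dir /b en-us\*.mfl') do mofcomp en-us\%s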
 

spammenotinoz

Getting comfortable
Joined
Apr 4, 2019
Messages
345
Reaction score
276
Location
Sydney
And this is the invisible person ? ;)
Think I will try a different model than ipcam.combined.pt

View attachment 152597
I personally have gone back to the default model, which is now YOLOv5 6.2. Excellent performance compared to the earlier default models; the performance saving from the custom models is now so small that it's not worth the inaccuracy of the IPcam models, which, unless I am mistaken, have not been retrained for years.
 

spammenotinoz

Getting comfortable
Joined
Apr 4, 2019
Messages
345
Reaction score
276
Location
Sydney
These are fun, but a bit annoying ;)
Using CP.AI 2.0.7 Yolov5 6.2 with ipcam-combined.pt

View attachment 152467 View attachment 152468
Why both? That essentially doubles processing time. If you're using custom models like IPcam, you should disable the default models.
Both are fine for testing; then you can use AI Diagnostics to see the times associated with each model.
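If you want to see the per-model cost outside of BI, one option is to time the two routes directly; this is only a sketch, assuming the server is on the default 127.0.0.1:32168, the DeepStack-compatible routes are in use, and test.jpg is any sample image (double the % signs if you put these lines in a .bat file):

:: Time the default object-detection route
curl -s -o NUL -w "default: %{time_total}s\n" -F "image=@test.jpg" http://127.0.0.1:32168/v1/vision/detection
:: Time the ipcam-combined custom-model route for the same image
curl -s -o NUL -w "ipcam-combined: %{time_total}s\n" -F "image=@test.jpg" http://127.0.0.1:32168/v1/vision/custom/ipcam-combined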
 

Futaba

Pulling my weight
Joined
Nov 13, 2015
Messages
227
Reaction score
167
I started a new thread for all topics on CodeProject.AI version 2.0.

I guess this thread is not dying... :rofl:
 

ReXX

Young grasshopper
Joined
Dec 28, 2018
Messages
47
Reaction score
15
Location
Denmark
I actually switched to using ipcam-dark. It seems a lot better, even in daylight.
The default models are still in the 500-600 ms range for me, and 200-300 ms for the ipcam models.
 

ReXX

Young grasshopper
Joined
Dec 28, 2018
Messages
47
Reaction score
15
Location
Denmark
Why both? That essentially doubles processing time. If you're using custom models like IPcam, you should disable the default models.
Both are fine for testing; then you can use AI Diagnostics to see the times associated with each model.
I don't use both? Not sure I understand...

I have disabled the default models in BI and specified the ipcam-dark.pt file (now; before it was ipcam-combined) in the camera settings.
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
You need to use the Object Detection (YOLOv5 .NET) module for Intel GPU support; make sure that Object Detection (YOLOv5 6.2) is disabled.
Can you tell me why I can't stop the YOLOv5 6.2 module? I press the three dots and select 'stop', but the log shows:
'objectdetection_queue: [Exception] : Error retrieving command [RuntimeError]: Session is closed'

I've tried rebooting, restarting the service - nothing.

All I want to do is get it working with my integrated Intel UHD 750 iGPU.
 

kwil2001

n3wb
Joined
Jan 3, 2021
Messages
3
Reaction score
0
Hi. I've just installed CP.AI. When I go into the AI part of BI and click on "Open AI dashboard", I get an error that says:
This site can't be reached
127.0.0.1 refused to connect
If I go to the Start menu in Windows and click on CodeProject.AI dashboard, it opens the dashboard and says it's online.

If I go into BI and click on "Start Now" or "Stop Now", I can see the other dashboard change to online or offline.
If I go to a command prompt, I can see that 127.0.0.1:32168 is listed as established.

Why can't I connect to the dashboard through BI?
Thanks
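If it helps with debugging, you can check from a command prompt whether anything is actually listening on that port and whether the dashboard answers locally; this assumes the default port 32168 (curl ships with Windows 10/11):

:: Is anything listening on the CodeProject.AI default port?
netstat -ano | findstr ":32168"
:: Does the dashboard answer locally? (should print HTTP 200, not a connection error)
curl -s -o NUL -w "HTTP %{http_code}\n" http://127.0.0.1:32168/
:: Also compare the IP/port Blue Iris has on its AI settings tab; if it is not
:: 127.0.0.1:32168, that mismatch is the likely reason BI cannot open the dashboard.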
 

slidermike

Getting the hang of it
Joined
Aug 4, 2022
Messages
47
Reaction score
57
Location
USA
Post a screen cap of this page so we can see the AI IP info listed.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,902
Reaction score
21,274
Assuming you installed Version 2, there is a separate thread for that here: CodeProject.AI Version 2.0
 

jaydeel

BIT Beta Team
Joined
Nov 9, 2016
Messages
1,132
Reaction score
1,240
Location
SF Bay Area
I stopped BI as a service, started BI locally and, as you wrote, BI recognized the modules. I then restarted the server, changed BI to run as a service and now BI is “talking” with CPAI. Before I did this, BI was not receiving the response from CPAI.
I have experienced this as well (running BI v5.6.9.8, CP.AI 2.07-beta) ... see this post for a full description.
My testing suggested that the 'CodeProject.AI Server' service must have the status "Running" when the Blue Iris Service starts.
To ensure this, you might consider changing the Blue Iris Service 'Startup Type' to 'Automatic (Delayed Start)', then reboot your PC.
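If you prefer to make that change from the command line, here is a sketch of the same thing; it assumes the Blue Iris service's real name is BlueIris, so check the SERVICE_NAME returned by the first query and substitute it if it differs:

:: Find the exact service name for Blue Iris (use the SERVICE_NAME line, not the display name)
sc query state= all | findstr /i "blue"
:: Switch that service to delayed automatic start so CodeProject.AI Server is running first
sc config BlueIris start= delayed-auto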
 

nhs128

Young grasshopper
Joined
Jul 8, 2017
Messages
91
Reaction score
21
I've got everything running decently smoothly for now. However, all I am getting is DayPlate and NightPlate for ALPR; I never seem to get the actual plates. I believe I've got this set up incorrectly but am still learning. Is anyone able to tell me what to check/change?
 

AlphaBlueIris

Getting the hang of it
Joined
Jan 17, 2022
Messages
125
Reaction score
27
Location
Boston
I was going to write a similar post; I have the same problem. I'm running 2.2. Are there any settings that I need in Blue Iris? I try to keep myself updated with all these posts, but I'm getting confused.

 