CodeProject.AI Version 2.0

This is what I presently have, since I am always playing with this:

1683326533006.png
1683326591195.png
1683326662977.png
 
Haha, beat this:

1683327671805.png

I just don't remember what I did to fix it, but it was around the time I switched to .NET and started playing with custom models.
 
I have also been playing with the Delivery model. This one is way kewl. I have Amazon, FedEx, UPS & USPS:


This is the Clone:
1683327934601.png

1683328028299.png
1683328084198.png
1683328126946.png
1683328177522.png
 
I live in the UK. It won't work for me unless there's an AI preset for 'Fake palm leaves floor with spike pit underneath' to catch these fools rocking up on my drive
 
I guess I need to activate the analysis on all cameras to make that work, so I record the .dat files?
So my thought was getting the AI info on your alert of the cat. When you double-click the Alert you shared, does it start the video of that event/alert? If so, launch the Log window and go to AI, then go back to Alerts and double-click the one with the two cats; that way you can see what the AI did.
 
I've enabled the AI analysis on that indoor cam. I'm sure they'll roam about in the night, so I can compare tomorrow morning.
 
Quick follow up.

It's displaying those two different percentage detections in BI because it's parsing all models since I don't have a specific model defined on that camera.

models.png
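A rough sketch of what "parsing all models" amounts to, if you want to reproduce it outside BI: the same image is posted to each custom-model endpoint, each model reports its own confidence, and that is why two different percentages can show up for one detection. The host/port are the CodeProject.AI server defaults, the model names are the ones discussed in this thread, and the image path is a stand-in; adjust all of them to your setup.

```python
# Sketch of "parsing all models": one image, several custom-model endpoints,
# each returning its own confidence. Host/port are CodeProject.AI defaults;
# model names and the image path are examples from this thread.

def best_per_label(responses):
    """Merge predictions from several models, keeping the highest
    confidence seen for each label (and which model produced it)."""
    best = {}
    for model, preds in responses.items():
        for p in preds:
            label, conf = p["label"], p["confidence"]
            if label not in best or conf > best[label][0]:
                best[label] = (conf, model)
    return best


def query_models(image_path, models=("ipcam-general", "ipcam-animal"),
                 base="http://localhost:32168/v1/vision/custom"):
    """POST one image to each custom-model endpoint and collect predictions."""
    import requests  # pip install requests

    image = open(image_path, "rb").read()
    responses = {}
    for model in models:
        r = requests.post(f"{base}/{model}", files={"image": image}, timeout=30)
        responses[model] = r.json().get("predictions", [])
    return best_per_label(responses)
```

With two models both detecting a cat, `best_per_label` keeps whichever confidence is higher, which mirrors BI picking one of the two percentages to display.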
 
Makes sense. What I am testing now is Cloning my CAMs: one for the Delivery model, one for Persons, and one for Vehicles (this last one I may abandon since it makes no sense). Love how all my times are in sync...
1683374986689.png

So maybe you should only use general and animal for your indoor CAM. I like that all you do is separate them with a comma in Custom models:

1683375266565.png
 
If you do Clone, be sure to check this off...

1683375708451.png

Also, don't worry about Cloning causing the camera to overwork; the setting above stops any further streaming from the camera for the Clones (zero bitrate):
1683375849899.png
 
BI 5.7.4.2, upgraded CP.AI from 2.0.8 to 2.1.8. Enabled ALPR and YOLO.NET only, and with GPU.

Before, I was getting <100 ms prediction times. Now it's 500 ms or worse.
Are the prediction time calculations incorrect in 2.1.8? (I did not change any AI configs in BI, so the images sent to CP.AI are the same sizes as before, and I'm leveraging mostly the custom model ipcam-general. Not using default detection.)

I am watching Windows Task Manager and I can see spikes in GPU usage, so I know CP.AI is using it.
And yes, I did reboot the Windows server several times just to be sure.

cpai logs.png
cpai2.1.8.png
server info.png
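One way to check whether the ~500 ms is the module itself or BI's reporting is to time the custom-model endpoint directly, bypassing Blue Iris. A minimal sketch, assuming the default CodeProject.AI host/port and the ipcam-general endpoint; the image path is a stand-in for any test snapshot you have on disk.

```python
# Sketch: time the custom-model endpoint directly, bypassing Blue Iris, to
# see whether the slowdown comes from the module or from BI's reporting.
# Host/port are CodeProject.AI defaults; the image path is hypothetical.
import statistics
import time


def summarize(times_ms):
    """Return (min, median, max) of a list of timings in milliseconds."""
    return (min(times_ms), statistics.median(times_ms), max(times_ms))


def benchmark(image_path, runs=10,
              url="http://localhost:32168/v1/vision/custom/ipcam-general"):
    """POST the same image `runs` times and summarize round-trip latency."""
    import requests  # pip install requests

    image = open(image_path, "rb").read()
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.post(url, files={"image": image}, timeout=30)
        times.append((time.perf_counter() - start) * 1000)
    return summarize(times)
```

Note this measures round-trip time including HTTP overhead, so it should sit a little above the server's own inference figure; a big gap between this and what BI reports would point at BI's side.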
 
Try YOLOv5 6.2

Seems many here are running it with their GPUs
 
Noticed your Server is Offline. Do you know if this is common? Mine goes Offline too, but I am presently working on an Internet Rule/Connection which may be the cause.
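For telling a genuinely offline server apart from one that is just slow, a quick reachability probe helps. A sketch assuming the default CodeProject.AI host/port; any HTTP response at all counts as "online" here, even an error page.

```python
# Sketch: a quick reachability probe for the CodeProject.AI server, to tell
# a down server apart from a slow one. Host/port are the defaults; any HTTP
# response at all counts as "online" for this purpose.

def classify(reachable, status_code=None):
    """Map a probe result to a short verdict string."""
    if not reachable:
        return "offline (no HTTP response)"
    return f"online (HTTP {status_code})"


def probe(url="http://localhost:32168/", timeout=5):
    """GET the server root and classify the outcome."""
    import requests  # pip install requests

    try:
        r = requests.get(url, timeout=timeout)
        return classify(True, r.status_code)
    except requests.RequestException:
        return classify(False)
```

Running this on a schedule while your Internet rule is active would show whether the "Offline" state lines up with the connection changes.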
 
What do the detection times show if you run a test using the Explorer?

1683387371032.png
 