Love the GPU times, though. I tried the Google Coral and it's not there yet, but hopefully it can get down to those times, because its power consumption is a lot better than a GPU's (even though the GPU doesn't use a ton more power). Currently I have to use a 1080 with 3 fans, so that's a little more power. My 1060 only has 6GB, so I'm on the hunt for a low-power GPU with enough RAM.
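For anyone weighing a Coral against a discrete GPU on power alone, a rough back-of-the-envelope helps. This is only a sketch: the wattages and the electricity price below are assumptions for illustration, not measurements of either device.

```python
# Rough annual electricity cost of running inference hardware 24/7.
# All wattages and the price per kWh are assumed values, not measurements.
def annual_cost(watts: float, price_per_kwh: float = 0.15) -> float:
    """Cost of drawing `watts` continuously for one year."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * price_per_kwh

coral_watts = 4        # assumed: an M.2 Coral draws only a few watts
gtx1080_watts = 60     # assumed: average draw with a bursty AI load,
                       # well under the card's 180 W TDP

print(f"Coral:    ${annual_cost(coral_watts):.2f}/yr")
print(f"GTX 1080: ${annual_cost(gtx1080_watts):.2f}/yr")
```

Even with generous assumptions for the GPU, the gap is tens of dollars a year, which is why the Coral's times are worth watching.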
> Do you find 6GB is not enough memory? With these issues I have been unable to see what kind of CPU/GPU usage I'd get in challenging conditions (think a garden when it's windy). I'm currently running 10 cameras through AI, ranging from 3MP to 5MP (most are 4MP, two are 5MP) at 15fps, and using substreams for AI as well. It's an old i7-4790K with 24GB RAM, but it's only running BI and CP AI. Before getting the GTX 1060 I was not running AI on the onboard GPU, and having removed hardware decoding in BI it was okay: not fast for AI, but OK, and CPU usage was rarely above 20%. I think it will be substantially faster now, once these issues are resolved.

My bad, I meant to say GTX 1060 3GB. 3GB was enough for most AI detection, but not for ALPR.
> My bad, I meant to say GTX 1060 3GB. 3GB was enough for most AI detection, but not for ALPR.

I agree with you about memory for GPUs. In gaming I have compared video cards, for example one card with 2GB of memory and another with 4GB, and there was a big difference in the resolution and other settings a game could run at without issues.
> I tried the Google coral and it's not there yet...

Did you try the dual edge Coral?
> did you try the dual edge coral?

Nope, I was using an M.2 Coral. I think once they implement custom models in CP.AI we will see better times.
Just a quick update on the topic. I started off on CP AI 2.0.8 with no GPU and the latest version of BI, all working fine. I decided to improve performance, so I obtained and installed a GTX 1060 6GB card and started fidgeting with CP AI... naturally, that's when the 403 error started appearing.
After various attempts with CP AI 2.1.10, 2.1.11, 2.2 and, this morning, 2.2.1, CP AI is finally installed, working fine standalone and correctly using GPU CUDA, with 25-35ms times on the Explore tab using test images from my camera.
BI, however, does not like this version:
- it does not recognize the path to the AI service, despite recognizing that it is up and running
- it does not accept any path to custom models
- testing clips with AI results in error 200 being thrown
I suppose either CP AI will need further fixes and/or BI will need changes to cater for this new version...
View attachment 172512
I suppose I picked the wrong time to upgrade, but CP AI is still in beta, so it's to be expected.
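When BI and CP AI disagree like this, it helps to hit the server directly and rule one side out. Here is a minimal sketch of such a check, assuming the default CodeProject.AI port 32168 and the `/v1/vision/detection` route; adjust both if your install differs.

```python
# Quick standalone check of CodeProject.AI, bypassing Blue Iris entirely.
# Assumes the default server port 32168 and the /v1/vision/detection route.
import json
import urllib.request
import uuid

def build_multipart(image_bytes: bytes, field: str = "image"):
    """Build a multipart/form-data body carrying one JPEG for the endpoint."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="test.jpg"\r\n'
        "Content-Type: image/jpeg\r\n\r\n"
    ).encode() + image_bytes + f"\r\n--{boundary}--\r\n".encode()
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type

def detect(image_path: str,
           url: str = "http://localhost:32168/v1/vision/detection"):
    """POST a snapshot to the detection route and return the parsed reply."""
    with open(image_path, "rb") as f:
        body, content_type = build_multipart(f.read())
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": content_type})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Point this at any saved camera snapshot on the BI machine.
    print(detect("snapshot.jpg"))
```

If this returns detections while BI still throws errors, the problem is on the BI side of the handoff rather than in CP AI itself.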
> Just installed the 2.2.2 update. Same issues as 2.2.1 for BI.

There is an issue with Blue Iris not using the standard JSON format when receiving the response from CodeProject.AI. Blue Iris support has been made aware of the issue, and it should be fixed in the next release of Blue Iris.
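For anyone debugging this on their own box, it can help to see what a well-formed detection reply looks like and parse it strictly. The field names below (`success`, `predictions`, `label`, `confidence`, `x_min`...) follow the DeepStack-style API that CP.AI exposes; treat the exact shape as an assumption and compare it against your own server's output.

```python
# Sketch of strictly parsing a CodeProject.AI-style detection response.
# Field names are assumed from the DeepStack-style API; verify locally.
import json

def parse_detections(raw: str, min_confidence: float = 0.4):
    """Return (label, confidence, box) tuples above a confidence floor."""
    data = json.loads(raw)
    if not data.get("success", False):
        return []  # server answered, but reported a failure
    return [
        (p["label"], p["confidence"],
         (p["x_min"], p["y_min"], p["x_max"], p["y_max"]))
        for p in data.get("predictions", [])
        if p.get("confidence", 0.0) >= min_confidence
    ]

# Example payload shaped like a typical detection reply:
sample = json.dumps({
    "success": True,
    "predictions": [
        {"label": "person", "confidence": 0.91,
         "x_min": 10, "y_min": 20, "x_max": 110, "y_max": 220},
        {"label": "cat", "confidence": 0.22,
         "x_min": 0, "y_min": 0, "x_max": 5, "y_max": 5},
    ],
})
print(parse_detections(sample))  # the low-confidence "cat" is filtered out
```

A parser this strict fails loudly on anything non-standard, which is exactly the kind of mismatch the BI fix is meant to address.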
Thanks. Thought so. So I just tested as you suggested: the .NET DirectML version uses GPU0 (my onboard Intel) and the 6.2 version uses GPU1 (the GTX 1060).
I guess there is no way of making the .NET version use GPU1. I thought I read somewhere that performance was improved when using DirectML vs CUDA.
Anyway, with YOLOv5 6.2 using CUDA and times around 25-35ms, I think it is okay as is.
PS - what a beast of a card you are running... with 56GB memory and 300+ operations/second!
> I guess there is no way of making the .NET version use GPU1.

If you are not using GPU0, you can try disabling it in the BIOS.
> PS - what a beast of a card you are running... with 56GB memory and 300+ operations/second!

That's how he's training all these models with that 4090, lol.