5.5.8 - June 13, 2022 - Code Project’s SenseAI Version 1 - See V2 here https://ipcamtalk.com/threads/codeproject-ai-version-2-0.68030/

spuls

Getting the hang of it
Joined
May 16, 2020
Messages
89
Reaction score
68
Location
at
mouser.com was able to deliver my orders from November (the dual-TPU version). I would recommend the USB version only for testing/development, since it adds some latency and can misbehave under high load.

As far as I can tell, CodeProject only supports PyTorch => so no Coral support for now?
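For anyone who wants to sanity-check the Coral itself while waiting on CodeProject support, a minimal sketch using the pycoral library (the model filename is just a placeholder; the Edge TPU only runs .tflite models compiled with edgetpu_compiler, not PyTorch .pt files):

# Requires the Edge TPU runtime (libedgetpu) and: pip install pycoral
from pycoral.utils.edgetpu import list_edge_tpus, make_interpreter

# Confirm the driver actually sees the device(s), e.g. [{'type': 'usb', 'path': ...}]
print(list_edge_tpus())

# The Edge TPU executes Edge-TPU-compiled .tflite models only; a PyTorch .pt
# model would have to be converted/re-exported before it could run here.
interpreter = make_interpreter("ssd_mobilenet_v2_coco_quant_edgetpu.tflite")  # placeholder model file
interpreter.allocate_tensors()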
 

tech101

Known around here
Joined
Mar 30, 2015
Messages
1,475
Reaction score
2,131
Location
SF BayArea, USA
mouser.com was able to deliver my orders from November (the dual-TPU version). I would recommend the USB version only for testing/development, since it adds some latency and can misbehave under high load.

As far as I can tell, CodeProject only supports PyTorch => so no Coral support for now?

Ah, good to know, thank you so much for that info. I was able to place at least a back order from Mouser for under $80 with shipping and taxes.

 

CCTVCam

Known around here
Joined
Sep 25, 2017
Messages
2,674
Reaction score
3,504
Out of stock, with an estimated 81-week lead time for the UK. Therein lies the issue: many shops are still out, and those with stock are scalping.

I can't believe Coral are allowing official distributors such as Amazon to inflate prices.
 

tech101

Known around here
Joined
Mar 30, 2015
Messages
1,475
Reaction score
2,131
Location
SF BayArea, USA
It looks like there is an in-system alternative, but without confirmation from Mike, it's hard to know if it will work / be supported:


No visible pricing or availability though. There also seem to be some questions about slot and power compatibility with motherboards.
I wonder how much more performance it will give vs the USB-C one. Only thing is, at least for now, I don't think the motherboard in my current Lenovo machine has a spare M.2 PCIe slot :(..
 

CCTVCam

Known around here
Joined
Sep 25, 2017
Messages
2,674
Reaction score
3,504
I wonder how much more performance it will give vs the USB-C one. Only thing is, at least for now, I don't think the motherboard in my current Lenovo machine has a spare M.2 PCIe slot :(..
It's got 2 TPUs, so at least twice. It may be faster still if it can parallel-process like dual-channel memory; I don't know whether it does or doesn't, so I'm going to assume 2x the performance. It's also going to be 2x the wattage, of course, but at 4 watts it hardly breaks the bank!
 

tech101

Known around here
Joined
Mar 30, 2015
Messages
1,475
Reaction score
2,131
Location
SF BayArea, USA
It's got 2 TPUs, so at least twice. It may be faster still if it can parallel-process like dual-channel memory; I don't know whether it does or doesn't, so I'm going to assume 2x the performance. It's also going to be 2x the wattage, of course, but at 4 watts it hardly breaks the bank!
Do you know if we can use two USB Corals on the same machine?
 

CrazyAsYou

Getting comfortable
Joined
Mar 28, 2018
Messages
247
Reaction score
263
Location
England, Near Sheffield
Do you know if we can use two USB Corals on the same machine?
That would normally depend on the software and drivers, but it's very doubtful that the same process could access two independent devices unless some really clever stuff was implemented at the driver layer. What's more likely is running two separate software instances (CP.AI or Deepstack on two different ports), with one configured to use one Coral and the second using the other, at the cost of double the RAM usage but not double the performance gain.
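If anyone goes the two-instance route, here is a rough sketch of the client side, assuming two Deepstack-style detection servers already listening on ports 5000 and 5001 (both the ports and the endpoint path are assumptions to adjust for your own setup):

import itertools
import requests

# Round-robin between the two detection instances, one pinned to each Coral.
servers = itertools.cycle(["http://localhost:5000", "http://localhost:5001"])

def detect(image_path):
    url = next(servers) + "/v1/vision/detection"
    with open(image_path, "rb") as f:
        return requests.post(url, files={"image": f}, timeout=10).json()

print(detect("snapshot.jpg"))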
 

spuls

Getting the hang of it
Joined
May 16, 2020
Messages
89
Reaction score
68
Location
at
The driver provides access to the dedicated TPU chips, which means it's no problem to use two or more USB TPUs, or one or more of the dual-TPU version. There are also some quite expensive M.2 E-key PCIe adapters with an active cooling solution on the market that handle up to 16 TPU chips (e.g. from ASUS or Blackbox). I guess for home users I would recommend the B-key version with a single TPU, since you can then use quite cheap PCIe M.2 adapters.
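A small illustration of that point, assuming the pycoral library is installed: each chip shows up as its own device, so you can enumerate them and pin one interpreter per chip (the model filename is a placeholder and must be an Edge-TPU-compiled .tflite):

from pycoral.utils.edgetpu import list_edge_tpus, make_interpreter

MODEL = "ssdlite_mobiledet_coco_qat_postprocess_edgetpu.tflite"  # placeholder model file

interpreters = []
for index, tpu in enumerate(list_edge_tpus()):
    print(f"TPU {index}: {tpu}")                       # USB sticks and dual-TPU m.2 cards each list their chips here
    itp = make_interpreter(MODEL, device=f":{index}")  # bind this interpreter to chip N
    itp.allocate_tensors()
    interpreters.append(itp)
# Each interpreter can now run inference independently, e.g. one per worker thread.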
 

harleyl7

Pulling my weight
Joined
Jun 4, 2021
Messages
260
Reaction score
223
When I installed this with Docker, it put me on the 2.0 beta. Does this version already have the @MikeLud1 custom models? I noticed the directory of my Docker container has:

[attached screenshot of the container's model files]

So all I need to do is put ipcam-combined.pt into my Blue Iris custom model folder?
 

jrbeddow

Getting comfortable
Joined
Oct 26, 2021
Messages
374
Reaction score
489
Location
USA
When I installed this with Docker, it put me on the 2.0 beta. Does this version already have the @MikeLud1 custom models? I noticed the directory of my Docker container has:

[attached screenshot of the container's model files]

So all I need to do is put ipcam-combined.pt into my Blue Iris custom model folder?
You may have pulled the trigger on upgrading the Docker version too soon; see this post from @MikeLud1:

As far as I can tell, unless you are part of the inside Beta testing team, we are advised to stay on the 1.6.x releases.
 

gwithers

Getting the hang of it
Joined
May 18, 2016
Messages
49
Reaction score
38
I've been using 2.0-Beta since it was released over a month ago, and it has worked fine and without issue, so there is nothing really wrong with that version from the object-detection perspective. Maybe there are some other concerns (for example, the module install/uninstall functions don't seem to work), but nothing I use with BI. It is, after all, the Docker container they have pushed as the default one you would pull. If it were a problem, I assume they would have removed it weeks ago.
 

harleyl7

Pulling my weight
Joined
Jun 4, 2021
Messages
260
Reaction score
223
You may have pulled the trigger on upgrading the Docker version too soon; see this post from @MikeLud1:

As far as I can tell, unless you are part of the inside Beta testing team, we are advised to stay on the 1.6.x releases.
Lol, I saw that, and, well, I guess you can call me a beta tester then, because I'm definitely testing it.

I realized that they are included (I think), and then I also dropped them into a folder called custom-models on the desktop of my Blue Iris server and mapped it on the Blue Iris AI tab.

The only issue I was having was that Blue Iris was trying to use IPcam-combinded, and CodeProject couldn't find it because it's called ipcam-combined.pt in the Docker container. So I changed the name of it on the container, when I honestly should've just changed the name on my Blue Iris server. It's working now, but I'm getting AI times of 1000 ms or so.
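For anyone hitting the same mismatch: the model name Blue Iris sends has to match the .pt filename (minus the extension) that the server finds in its custom models folder. A rough sketch of the call Blue Iris effectively makes, assuming the usual Deepstack/CodeProject.AI custom-model endpoint and a port of 32168 (adjust both for your install):

import requests

model_name = "ipcam-combined"   # must match ipcam-combined.pt, minus the extension
url = f"http://localhost:32168/v1/vision/custom/{model_name}"

with open("snapshot.jpg", "rb") as f:
    print(requests.post(url, files={"image": f}, timeout=10).json())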
 