Is a Core i5 6500 Rig Still Considered a Feasible BI Platform?

Pogo

Getting the hang of it
Joined
Apr 26, 2022
Messages
148
Reaction score
49
Location
Reportedly in the Area
Yeah, I know. A pretty loaded question. And of course, 'it depends'. LOL

I'm currently running a dedicated Optiplex 7020 SFF: i7-4790, 16GB RAM, 256GB SSD, 4TB spinner, 20-odd cameras from 1080p to 4K, desktop mode almost 24/7, basic residential surveillance/observation application, no integrated AI or any planned beyond what's available in camera firmware, mostly motion-triggered recording with only a couple of 24/7 cameras going, averaging 18~20% CPU and similar GPU, exclusively H.264..., until recently, which is the basis for the i5-6500 question, since Skylake was the first gen to provide QuickSync support for HEVC.

I realize most folks would just jump to an 8500 or better. So would I if the budget allowed, but that currently isn't the case, and I'm not quite sure basic IP camera HEVC technology has arrived at a point of actually requiring anything more than an i5-6500 in a setup as modest as mine anyway. While not a complete fixed-function H.265/HEVC 10-bit 4:2:0 platform, it's still a big jump from dumping HEVC duties onto the CPU, even for just a couple of cameras..., at least from what I understand.
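Just to show my homework: before pulling the trigger on any box, this is roughly how I'd verify that its QuickSync path actually decodes HEVC. A minimal sketch, assuming ffmpeg is on the PATH and was built with QSV support, using a short test clip saved off one of the cameras (the filename is a placeholder):

```python
# Minimal sketch: check whether a box can hardware-decode HEVC via QuickSync.
# Assumes ffmpeg is on PATH and built with QSV support; "sample_hevc.mp4" is a
# placeholder for a short clip pulled from one of the cameras.
import subprocess

def qsv_hevc_decode_works(sample="sample_hevc.mp4"):
    # Decode a few seconds with the hevc_qsv decoder and discard the frames.
    # Exit code 0 means the QSV path is alive: a 4th gen (Haswell) should fail,
    # a 6th gen (Skylake) should pass for 8-bit HEVC.
    cmd = [
        "ffmpeg", "-v", "error",
        "-hwaccel", "qsv",
        "-c:v", "hevc_qsv",
        "-i", sample,
        "-t", "5",
        "-f", "null", "-",
    ]
    return subprocess.run(cmd).returncode == 0

if __name__ == "__main__":
    print("QSV HEVC decode:", "OK" if qsv_hevc_decode_works() else "not available")
```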

So anyway, the current system is working flawlessly and has been since my last supported 5.7.4.2 upgrade last May. I've been wanting to upgrade to a simple Gen6 platform for the H.265 HA support, but the prices have been (and still are) ridiculous. I found an Opti SFF 5040 i5-6500 Win10 Pro box locally for $75 that I can probably snag for $50, and I'm strongly considering it before it disappears. The only issue that sorta bothers me is the DDR3 RAM, though I've read it will also take DDR4. Jury is out on that.

The main problem I'm hoping to remedy is the jerkiness and artifact crap currently all over the place when viewing either live or recorded video from the couple of 4K H.265 cameras that don't have a 4K H.264 option. (FWIW, they work fine through my Dahua 4108 NVR.)

Is an i5 6500 upgrade a likely fix for this or should I not even bother until I can afford a Gen7 or 8 solution?

As usual, TIA for any enlightenment or productive insight. I always appreciate it.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,028
Reaction score
48,788
Location
USA
Keep in mind your 4th gen can still handle H265, you just can't offload it to the internal GPU for hardware acceleration.

Have you tried turning off Hardware Acceleration in your 4th gen? I have the same 4th gen with more cameras than you and the same range of resolution and I don't have a problem.

A 6th gen would still be the recommended CPU if it were not for Win11, as an 8th gen is the first one that can accept Win11. So if that is important to you, you should consider an 8th gen or newer if you can't make your 4th gen work.


Keep in mind, though, that before substreams were introduced, HA was needed because BI was decoding mainstream video, which would choke a CPU. Offloading it produced tremendous CPU savings, but not as great as switching to substreams does.

Substreams use so little CPU that the round trip costs more than it saves: the CPU spends more cycles sending the video to the GPU and bringing it back than it would spend just decoding the substreams itself.

Try it and see. Most of us have since turned off HA and see either no difference or a slight decrease in CPU with HA off when using substreams. Once you are above about 14 cameras, CPU usage will be lower without it.

I have well over 14 cameras, and my video actually got better once I stopped offloading it. I used to see the occasional jitter; now I don't. Pushing too much bandwidth to the internal GPU was causing problems.
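If you want an actual number instead of eyeballing Task Manager, here's one way to run the comparison. A rough sketch, assuming psutil is installed (pip install psutil) and the process is named BlueIris.exe, the default on a standard install; run it once with HA on and once with it off:

```python
# Rough sketch of the "try it and see" test: average Blue Iris CPU% over a few
# minutes so runs with HA on and off can be compared. Assumes psutil is
# installed and the process is named BlueIris.exe (default install).
import time
import psutil

def average_bi_cpu(seconds=180, interval=5):
    proc = next(p for p in psutil.process_iter(["name"])
                if p.info["name"] == "BlueIris.exe")
    proc.cpu_percent(None)                      # prime the counter
    samples = []
    for _ in range(seconds // interval):
        time.sleep(interval)
        # cpu_percent() is summed across cores; divide by the core count
        # to get a Task-Manager-style overall percentage.
        samples.append(proc.cpu_percent(None) / psutil.cpu_count())
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average Blue Iris CPU: {average_bi_cpu():.1f}%")
```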

And like I said, we get posts like yours all the time where someone updates, the system becomes unstable, and turning off HA fixes it.

Some still don't have a problem, but eventually it will cause one.

Here is a sampling of recent threads where turning off HA fixed it:

No hardware acceleration with subs?


Hardware decoding just increases GPU usage?


Can't enable HA on one camera + high Bitrate


Amcrest T2599ew Lagging really bad
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
If you can't afford an extra 60-70 dollars for an i5-8500 system, then just stick with what you have.
 

tigerwillow1

Known around here
Joined
Jul 18, 2016
Messages
3,849
Reaction score
8,519
Location
USA, Oregon
I don't know how QuickSync support for HEVC plays into it, but by raw benchmark scores you'd be stepping "up" to a slower system with the i5-6500. There wasn't much performance improvement from gen 4 to gen 6, so by moving from an i7 to an i5, even though it's newer, you lose CPU performance. The big change from gen 4 to gen 6 was reduced power consumption. Gen 8 gives an improvement in both performance and power consumption.
 

Pogo

Getting the hang of it
Joined
Apr 26, 2022
Messages
148
Reaction score
49
Location
Reportedly in the Area
'Preciate the links and everyone's input.

I sorta know the general evolution of the processor families -- and what QuickSync can actually do to conserve CPU horsepower under high-utilization conditions, particularly in the 6th and 7th gen versions as H.265 emerged more aggressively -- at least from what I've read. Blue Iris doesn't seem to figure into leading-edge development considerations for any of them until AI becomes a major element.

Benchmark scores aside, how QuickSync plays into it is pretty much the question at hand: whether an i5-6500 will improve 4K H.265 performance over a 4th gen 4790..., at a third of the cost of an 8th gen upgrade. And as resources go, there's plenty of headroom even in my existing setup -- just no (or not enough) H.265 support for what Blue Iris seems to need for effective processing/rendering of clean UHD streams.

And wouldn't eliminating HA be irrelevant as a possible solution to the erratic 4K HEVC behavior on the 4790, since it doesn't support HEVC in hardware and the CPU already handles it? BTW, I killed the Intel HA support, rebooted all cameras, and immediately saw nearly a 20% increase in CPU utilization. Admittedly just a quick and dirty test, but it bears out what I've come to experience. No change whatsoever in the 4K H.265 performance.

I also gather that the main consideration for going straight to the 8500 is a practical one -- Win11 conformity and the associated future-proofing -- rather than any appreciable performance improvement in Blue Iris's H.265 handling over, say, an i7-7700 or maybe even an i5-6500?

Bottom line is my 4K H265 looks like shit on Blue Iris and I'm hoping for a practical and economical way to make it happier than it currently is on a Gen4 box.

You've given me some things to think about..., and a wonderful wish list to peruse.

Again, the input is appreciated.
 

Pogo

Getting the hang of it
Joined
Apr 26, 2022
Messages
148
Reaction score
49
Location
Reportedly in the Area
Hi again. If you'll indulge me a brief revival of the thread with a different angle on the same basic topic -- trying to achieve any level of usable 4K/H.265 performance out of Blue Iris -- I'd appreciate any additional input.

I'll preface this by stating up front that the same cameras (Amcrest, and yes, even a Reolink 810A) work fine through my Dahua NV4108 and VLC at full 4K resolution and frame rates. At present, I seem to be dealing with a Blue Iris or server hardware issue -- or the combined effect of both to one degree or another.

So I ended up snagging the 5040 i5-6500 for fifty bucks as a test box for now -- with Win11 Pro installed and functioning on 8GB of RAM and a 500GB 2.5" SATA 5400 RPM spinner. Sweet, eh? LOL

Managed to extract the OEM license (Win10 Home) and got that installed straight to the HDD for a basic fire-up. Trimmed the fat, added BI straight into the cauldron, and started stirring with one camera to see if the GPU was even active for starters. It wasn't, and one camera full screen was sucking nearly 35% out of the CPU as reported by BI. The substream alone was 10~12%. All the other normal stuff was in place, blah, blah, blah. An hour of driver updates, another 16GB of RAM, and getting QuickSync fired up brought things into a more reasonable range full screen on the main stream, but still about the same just idling on the substream.

Now I realize the obvious shortcomings of the HDD, and to some extent probably even the RAM if the odd stick is enough of a mismatch, but I was at least expecting QuickSync to smooth out the H.265 performance somewhat. It reduced CPU utilization as expected, but the H.265 decoding was still erratic and staggered, with block shearing, half-screen freezes, and the other assorted artifacts that occur when H.265 doesn't play nice. And yes, it was pretty much the same with both the Amcrest and the Reolink. The Amcrest has an H.264 option at 4K, which cleaned things up immensely, though some jerkiness still remained. No such option with the Reolink short of reducing resolution to 2K, which defeats the purpose. And FWIW, both cameras perform better on my main i7-4790 box with 20 other cameras and no HA available.

So that brings me to hardware improvement options for the Gen6 box, particularly an SSD for the OS/BI/db. There was never any intent to run the HDD for anything other than seeing what worked and what didn't. Yeah, I got a little carried away being over-anxious with the Gen6/H.265 thing and just pissed up a rope with all that. Dumb-ass me again.

Now I need a better understanding of the performance impact I should expect on both Blue Iris and the CPU/GPU from an M.2 NVMe SSD (and matching up the RAM) to get things moving a bit more smoothly on the test box -- specifically with respect to the decoding process itself. And is an SSD with DRAM a significant consideration in this particular application, given the existing RAM is only DDR3? Lots of SSDs to choose from out there, but fewer and fewer with DRAM, which seems like it would be desirable for our purposes.

I keep reading references to 'less jitter', 'higher frame rates', 'smoother rendering', and schmega-fast transfer rates, particularly read speeds, but aren't we just as concerned with write-speed efficiency between the DB and the storage drives, while also trying to wring greater efficiency and performance out of the CPU/GPU? I just can't wrap my head around how a video gaming or graphics production comparison maps onto how Blue Iris works day in and day out, processing and recording incoming video streams.
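Just to put rough numbers on my own question, here's the back-of-the-envelope write-load math (camera counts and bitrates below are rough guesses off my configs, not measured values):

```python
# Back-of-the-envelope write load for a BI recording box. Counts and bitrates
# are illustrative guesses, not measured values -- adjust to taste.
cameras = {
    "4K H.265 mains": (5, 8192),    # (count, bitrate in kbps)
    "1080p H.264":    (15, 4096),
}

total_kbps = sum(count * kbps for count, kbps in cameras.values())
mb_per_sec = total_kbps / 8 / 1024              # kilobits/s -> megabytes/s
gb_per_day = mb_per_sec * 86400 / 1024          # MB/s -> GB/day, 24/7 recording

print(f"Aggregate stream rate: {total_kbps / 1024:.0f} Mbps")
print(f"Sustained write load:  {mb_per_sec:.1f} MB/s")
print(f"Worst-case storage:    {gb_per_day:.0f} GB/day if everything records 24/7")
```

If those guesses are in the ballpark, that's only about 12-13 MB/s of sustained, mostly sequential writes, which even a spinner can manage -- so I suspect my stutter is a decode problem rather than a disk bandwidth problem, which is partly why I'm asking whether the SSD will actually matter.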

Realizing anything is better than what I've got now in this test rig, I'm curious how much better it gets with the right SSD. SSDs seem to be quite a bit more relevant to the overall performance scheme of things these days than just being small and fast.

As usual, TIA for any input or enlightenment.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,028
Reaction score
48,788
Location
USA
As I said above, QuickSync/hardware acceleration may or may not be your friend.

And don't get too fascinated with H265 either.

Everyone likes to try digital zooming. That is where you can see the differences.

Plus, when/if you need to export video for police or whatnot, H265 can be problematic for other viewers. Once you have to start reprocessing and converting, you lose resolution.

In addition, just because the camera offers H265 doesn't mean it can do it efficiently. Some cams are better than others.

This will explain H264 versus H265 a little better.

H265 in theory provides more storage because it compresses differently, but part of that compression means it macroblocks big areas of the image that it thinks aren't moving. That can be problematic for digital zooming with H265.

However, it also takes more processing power from the already small CPU in the camera, and that can be problematic if someone is maxing out the camera in other areas like FPS -- and then it stutters.

Further, some cameras can handle H265 better than others, even if the camera "claims" to support it.

In theory it is supposed to need 30% less storage than H264, but most of us have found it isn't that much. My savings amounted to less than a few minutes of footage per day. And to my eye, and to others I showed clips to and simply asked "do you like video 1 or video 2 better", everyone thought H264 provided a better image.

The left image is H264, so all the blocks are the same size, corresponding to the resolution of the camera. H265 takes areas it doesn't think have motion and makes them into bigger blocks, which lessens the resolution within those larger blocks yet increases the camera CPU demand to build them.


[Attached image: H264's uniform block grid vs. H265's larger variable-size blocks]


In theory H265 is supposed to need half the bitrate because of the macroblocking. But if there is a lot of motion in the image, it becomes a pixelated mess. The only way around that is a higher bitrate -- and if you need to run the same bitrate for H265 as for H264, then the storage savings is essentially zero.
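To put rough numbers on that (the 8192 kbps figure is just an example bitrate, not from any particular camera):

```python
# Rough numbers behind the "savings is essentially zero" point. The 50% figure
# is the theoretical H265 claim; the 8192 kbps bitrate is only an example.
def gb_per_day(bitrate_kbps):
    # kbps -> MB/s -> GB over 24 hours of continuous recording
    return bitrate_kbps / 8 / 1024 * 86400 / 1024

h264 = 8192                 # example 4K H264 main-stream bitrate
h265_theory = h264 * 0.5    # the on-paper claim
h265_real = h264            # what you end up running so motion isn't a pixelated mess

print(f"H264 @ {h264} kbps:        {gb_per_day(h264):.0f} GB/day")
print(f"H265 at half the bitrate:  {gb_per_day(h265_theory):.0f} GB/day")
print(f"H265 at the same bitrate:  {gb_per_day(h265_real):.0f} GB/day (no savings)")
```

Half the bitrate is a real saving on paper, but the moment motion forces you back up to the H264 bitrate, the two numbers are identical.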


In my testing I have one camera that sees a parked car in front of my house. H265 sees that the car isn't moving, so it macroblocks the whole car and surrounding area. Then the car owner walked up to the car and got in, and the motion was missed because the macroblock was so large. Or, if it does catch it, the low bitrate makes it a pixelated mess at the critical capture point, and by the time H265 adjusts to the fact that there is now motion, the ideal capture is missed.

In my case, the car is clear and defined in H264, but blurry with soft edges in H265.

Digital zooming is never really good and not something we recommend, but you stand a better chance of some digital zoom with H264 than with a large macroblocked H265. I can digitally zoom on my overview camera and kinda make out the address number of the house across the street with H264, but not a chance with H265, as it macroblocked his whole house.

H265 is one of those things that sounds good in theory, but real-world use is much different.

Some people have a field of view or goals that allow H265 to be sufficient for their needs.

As always, YMMV.
 

CCTVCam

Known around here
Joined
Sep 25, 2017
Messages
2,675
Reaction score
3,505
For a hierarchy chart:


Choose the High End chart, as it covers processors right down to early i5s and even the odd i3.

Remember, performance may differ from one piece of software to another.
 

Pogo

Getting the hang of it
Joined
Apr 26, 2022
Messages
148
Reaction score
49
Location
Reportedly in the Area
Thanks, but been there, read all that in several threads..., a couple of times. And as mentioned previously, I have a general handle on the Core CPU/GPU evolution..., at least to the extent that it relates to my use of Blue Iris. I also know how the i7-4790 performs both with and without QuickSync in my situation, which is exactly how one would expect it to perform -- much more efficient CPU utilization with QuickSync enabled, as in most situations, especially when more substreams at any reasonable resolution and frame rate are involved. I also realize H.265 has no HA support until Gen6 and falls back on CPU resources in earlier implementations. Pushing H.265 at the 4790 works, but not well. That's why I started this thread.

And I'm hardly a fan of H.265, but it's part of the equation in my particular case for five 4K cameras, unless I want to reduce them to 2K just for the sake of not having to deal with H.265, or disable QuickSync and annihilate the CPU with all the additional processing load (not less) just to run it. Plus, there is the 'challenge and problem solving' aspect of trying to get it all sorted out to actually work ---- with Blue Iris ---- which is clearly the elephant in the room in the first place. It doesn't handle the H.265 process flow worth shit, with or without QuickSync, in my limited experience with a Gen6 GPU -- and only one camera. My Dahua NVR handles it smoothly. VLC handles it smoothly. Freaking tinyCam running on a ChromeTV streaming dongle even handles it, albeit at 4fps (much like Blue Iris coughing at every I-frame, freezing and skewing...), but it handles it.

BTW and for the record, H.265 does not use macroblocks. It uses Coding Tree Units (CTUs), which comprise Coding Tree Blocks (CTBs). While conceptually similar, they're technically quite different and the terms aren't interchangeable.

The purpose of revisiting the thread was to ask whether I should expect a significant boost in Blue Iris process flow from adding a mid-range M.2 NVMe SSD, and whether onboard DRAM is of any consequence either way.

Still is.

I appreciate the input nonetheless. Thanks.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,028
Reaction score
48,788
Location
USA
According to the benchmarks, your i7-4790 is better than your i5-6500, so that probably explains some of it.

Again, just because a camera offers H265 doesn't mean it handles it efficiently. The fact that you have to drop the resolution to use H264 is telling. Working through an NVR or VLC doesn't necessarily mean BI can handle it.

An SSD will be faster and more RAM may help, but it could also be a function of crappy camera encoding that can't be fixed as far as BI is concerned. If either camera is applying some nonstandard tweak to its H265 that you can't change, then BI will choke on it regardless of the horsepower you throw at it.
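One way to see what the cameras are actually sending is to probe the stream directly. A minimal sketch, assuming ffprobe is on the PATH; the RTSP URL is a placeholder you'd swap for your camera's real main-stream path:

```python
# Sketch for checking what a camera is actually sending: codec, profile, and
# pixel format straight off the RTSP stream. Assumes ffprobe is on PATH; the
# URL below is a placeholder, not a real camera path.
import json
import subprocess

def probe_stream(url):
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries",
        "stream=codec_name,profile,pix_fmt,width,height,avg_frame_rate",
        "-of", "json",
        url,
    ])
    return json.loads(out)["streams"][0]

if __name__ == "__main__":
    info = probe_stream("rtsp://user:pass@192.168.1.108:554/")
    for key, value in info.items():
        print(f"{key:16} {value}")
```

If the Reolink reports a 10-bit pixel format (e.g. yuv420p10le) or an unusual profile, that alone could explain it, since 6th gen QuickSync only fully handles 8-bit HEVC in fixed function, as you noted yourself.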

As I played and learned, I had all my cams on H265 at one time on my 4th gen and it performed just fine. So that again leads me towards your cameras doing something to the H265 that BI or QuickSync struggles with.

Regardless of the technical term -- macroblocks versus CTUs -- the end result to the casual user is evident when digitally zooming: areas that are not getting a lot of motion are chunked together. Most of us haven't seen any real benefit of H265 over H264, so we use what is still the "standard".

 

Pogo

Getting the hang of it
Joined
Apr 26, 2022
Messages
148
Reaction score
49
Location
Reportedly in the Area
I know all that.

The intent is to give Blue Iris some marginal assistance in processing what's getting thrown at it in my particular use case. The normal company line has been that it can basically run great on junk given an 8-gig RAM stick, substreams, and direct-to-disk recording. 30 cameras on a 486? No problem. "YMMV".

Times have changed as have the requirements for an effective BI hardware platform, even for just a residential surveillance setup that isn't intended to catch bad guys or record every license plate that goes through an intersection.

Any SSD recommendations? DRAM? DRAM-less? Is BI actually creating enough simultaneous, random read/write activity for it to matter? Do faster write speeds actually help mitigate or eliminate video stutter, frame drops, and compression artifacts? Is there a known point where Blue Iris simply stops responding to (or caring about) hardware performance increases?
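For the "is BI actually creating that many read/write tasks" part, I figure I can measure that myself. A crude sketch, assuming psutil is installed and the process is named BlueIris.exe (the default):

```python
# Crude answer to "is BI actually hammering the disk?": watch Blue Iris's own
# I/O counters for a minute. Assumes psutil is installed and the process is
# named BlueIris.exe (the default).
import time
import psutil

proc = next(p for p in psutil.process_iter(["name"])
            if p.info["name"] == "BlueIris.exe")
start = proc.io_counters()
time.sleep(60)
end = proc.io_counters()
print(f"Writes: {(end.write_bytes - start.write_bytes) / 1e6:.1f} MB/min")
print(f"Reads:  {(end.read_bytes - start.read_bytes) / 1e6:.1f} MB/min")
```

Comparing that against a drive's rated sustained throughput should tell me whether the disk is even breaking a sweat.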
 