Blue Iris video suddenly has "Ghost" images and chop

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
This only started recently, around the time of the last BI upgrade, when the new "timeline" was added to UI3 remote viewing.
However, it also happens on the main BI PC.

The only thing changed recently (by me) was to enable the onboard graphics (which were disabled because I have a video card).

Windows 10 PC, i7-6700, 32 GB RAM, 1 TB NVMe (New) and 8 TB WD Purple (Stored).
I am only running TWO cameras, 4 MP UNV IPC2124SR3-ADPF28M-F.
My server is 1 Gb PoE, and I have 1 Gb fiber internet.
Video card is a GTX 1650 (Nvidia), 4 GB.

I have verified ALL the settings are still the same (from the Wiki recommendations!), so nothing I can find has changed as far as settings go.

Direct viewing of the cameras through the Uniview software (EZView) or the iPhone app is also flawless... so something in BI isn't right?
 

Attachments

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,907
Reaction score
21,293
Revert your changes one at a time and find the culprit.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
Revert your changes one at a time and find the culprit.
The first thing I did was ask "what changed?" I saw on a tech site that you could utilize the Nvidia graphics card AND the onboard (Intel) graphics for video decode, so it would "help, offload, or share" the video processing.
So I enabled the onboard graphics via the BIOS... and Task Manager showed GPU 0 and GPU 1, processing at 14% and 5% respectively. The CPU % did not change much, perhaps 3%, since I am only a two-camera system.
Google searching also said it was not an issue to have the graphics card and the onboard video active simultaneously.

Apparently WRONG, at least for BI video processing. As soon as I deactivated the onboard video, the ghosting and chop vanished.

I am still wondering why, as from what I read this should not have detracted from quality; it was supposed to "help" with video processing.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,681
Reaction score
14,043
Location
USA
What were your hardware acceleration options set to in Blue Iris? If you had enabled Intel acceleration before, but didn't have the device active, it would have been using software decoding originally. Then it would have begun using Intel hardware decoding after you enabled the integrated graphics.
 
Last edited:

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,907
Reaction score
21,293
The first thing I did was ask "what changed?" I saw on a tech site that you could utilize the Nvidia graphics card AND the onboard (Intel) graphics for video decode, so it would "help, offload, or share" the video processing.
So I enabled the onboard graphics via the BIOS... and Task Manager showed GPU 0 and GPU 1, processing at 14% and 5% respectively. The CPU % did not change much, perhaps 3%, since I am only a two-camera system.
Google searching also said it was not an issue to have the graphics card and the onboard video active simultaneously.

Apparently WRONG, at least for BI video processing. As soon as I deactivated the onboard video, the ghosting and chop vanished.

I am still wondering why, as from what I read this should not have detracted from quality; it was supposed to "help" with video processing.
You stated two changes: an update and the video processing. Simple: revert one at a time.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
What were your hardware acceleration options set to in Blue Iris? If you had enabled Intel acceleration before, but didn't have the device active, it would have been using software decoding originally, then switched to Intel hardware decoding after you enabled the integrated graphics.
I had Intel+VPP set, not Nvidia, so activating Intel would not require any settings change for decoding.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
Then this:


Can I use Intel graphics and Nvidia at the same time?

Both cards can remain active at the same time, but not for the same application. The integrated graphics chipset is the one responsible for switching on the dedicated graphics card when an application, such as a game, requires it, and then switching it off again when you exit that application.

So how would one utilize the onboard CPU graphics for BI and let the Nvidia GPU handle the other PC graphics? My BI was set to use Intel+VPP for decoding. Apparently it did not like something.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
PS: The Nvidia runs the PC monitor... I use this computer for some other things, but it is the main BI PC.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,681
Reaction score
14,043
Location
USA
I updated my post above for slightly better clarity. My point is, it is a very common configuration error for people to set Blue Iris's "Hardware accelerated decode" option to "Intel" (or "Intel+VPP" or "Intel Beta") on a system where the Intel integrated graphics are disabled or do not exist in the first place. In such a situation, Blue Iris will try to load the Intel decoder, fail, and silently fall back to software decoding without notifying the user. So when you enabled your Intel graphics, your Intel hardware acceleration would have suddenly started working for the first time, without you changing any of Blue Iris's configuration.
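The silent-fallback behavior described above can be sketched in pseudocode form. This is an illustration only, not Blue Iris's actual code (which is closed-source); the function and device names are hypothetical.

```python
# Hypothetical sketch of the silent-fallback behavior: if the configured
# hardware decoder's device is not present, the app quietly falls back to
# software decoding without surfacing any error to the user.
def pick_decoder(configured, available_devices):
    """Return the decoder actually used for a given config setting."""
    hw_requirements = {
        "Intel": "intel_igpu",
        "Intel+VPP": "intel_igpu",
        "Intel Beta": "intel_igpu",
        "Nvidia NVDEC": "nvidia_gpu",
    }
    needed = hw_requirements.get(configured)
    if needed is None:           # "No" or unrecognized -> software decode
        return "software"
    if needed in available_devices:
        return configured        # hardware decode actually works
    return "software"            # silent fallback: no warning shown

# iGPU disabled in BIOS: "Intel+VPP" silently means software decoding.
print(pick_decoder("Intel+VPP", {"nvidia_gpu"}))                 # software
# After enabling the iGPU, the same setting suddenly takes effect.
print(pick_decoder("Intel+VPP", {"nvidia_gpu", "intel_igpu"}))   # Intel+VPP
```

This is why nothing in the Blue Iris settings had to change for the behavior to change: only the set of available devices changed.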

Intel hardware acceleration is a bit buggy, hence it is the most likely culprit for your "ghost images and chop". You may be able to fix it with an Intel graphics driver update. But if not, then just set "Hardware accelerated decode" back to "No" in Blue Iris Settings > Cameras tab. And make sure it is also set to "No" or "Default" in each Camera properties > Video tab. Then restart Blue Iris.

You can determine whether hardware acceleration is configured by looking in Blue Iris Status > Cameras at the "HA" column. It will be a "-" character if hardware acceleration is disabled, otherwise an abbreviation like "I" for Intel, "I+" for Intel+VPP, etc. But this doesn't mean it is actually being used. To know whether it is in use, you need to look in Task Manager at the various GPU usage graphs; there's a graph specifically for video decoding.
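For reference, Task Manager's per-GPU "Video Decode" graph is driven by Windows "GPU Engine" performance counters whose instance names contain "engtype_VideoDecode". The sketch below (an assumption about the counter-name format, based on typical typeperf output, not a Blue Iris feature) shows how decode utilization could be tallied per physical GPU from such samples:

```python
from collections import defaultdict

def decode_usage_by_gpu(samples):
    """Sum video-decode utilization per physical GPU from
    (counter_instance_name, percent) pairs."""
    usage = defaultdict(float)
    for name, value in samples:
        if "engtype_VideoDecode" not in name:
            continue  # skip 3D, Copy, VideoEncode, ... engines
        # Instance names look roughly like:
        #   pid_1234_luid_0x0_0x1_phys_0_engtype_VideoDecode
        phys = name.split("_phys_")[1].split("_")[0]
        usage["GPU " + phys] += value
    return dict(usage)

# Hypothetical samples matching the 14% / 5% readings described earlier:
samples = [
    ("pid_4242_luid_0x0_0x1_phys_0_engtype_VideoDecode", 14.0),  # iGPU
    ("pid_4242_luid_0x0_0x2_phys_1_engtype_VideoDecode", 5.0),   # GTX 1650
    ("pid_4242_luid_0x0_0x2_phys_1_engtype_3D", 22.0),           # not decode
]
print(decode_usage_by_gpu(samples))  # {'GPU 0': 14.0, 'GPU 1': 5.0}
```

Seeing nonzero values only under the decode engines (not just overall GPU load) is what confirms hardware decoding is actually in use.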
 
Last edited:

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,681
Reaction score
14,043
Location
USA
If you end up not using the Intel hardware acceleration or any of the onboard video outputs, then there is really very little point in having it turned on, and you might as well turn it back off in the BIOS.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
Ok, to clarify: I am using the Nvidia GPU video output. Nothing is using the onboard video connections. My original setting was "Intel+VPP" and I did not have any issues; all worked fine. Only when I turned on the Intel CPU graphics in the motherboard BIOS did things begin to cause issues.
I had seen some videos and read that you can use both outputs, as BI will use the Intel decode while the other PC functions go through the Nvidia GPU.

So... if I am using the Nvidia, I should set decode to NVIDIA or OFF. Also, setting it to "Intel+VPP" with the BIOS graphics disabled is pointless, since the hardware isn't turned on for decoding.
 

Andykev

Getting the hang of it
Joined
Jun 21, 2021
Messages
52
Reaction score
28
Location
SF Bay Area
I set BI decode to Nvidia NVDEC and the status shows "N" for video decode. Previously it showed "I+" for Intel, but you said that did not mean it was actually being used.
However, with BOTH enabled (Intel and Nvidia), Task Manager showed GPU 0 and GPU 1 both active, each with a different % of utilization. I have since deactivated the Intel onboard graphics.
 

Attachments

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,681
Reaction score
14,043
Location
USA
It sounds like you have a better understanding of what is going on now.

Just be aware that Nvidia's hardware acceleration is not energy-efficient. Some people found that their power consumption actually went up a little with Nvidia decoding versus no hardware acceleration.
 