Hello all,
I'm helping a friend set up a surveillance system. For regulatory purposes, he is required to make continuous recordings at a minimum frame rate (only 10 fps, so I wouldn't expect it to be an issue). I have configured each camera directly for the required minimum and attempted to match that setting for each camera in Blue Iris. Unfortunately, there is a grayed-out 'adjust automatically' box that is checked and that I cannot change. I've found that the frame rate drops on some cameras after they run fine for several hours, and I want to stop that from happening. Does anyone have experience forcing a specific frame rate and getting rid of that automatic adjustment?
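In case it helps with diagnosis, here's a rough Python sketch of how one could spot-check what a camera actually delivers over time, independent of what Blue Iris reports. The RTSP URL and credentials are placeholders, and the /videoMain path is just the usual Foscam default, so adjust for your models:

```python
# Measure the frame rate an RTSP camera actually delivers by counting
# decoded frames over a fixed window. Requires OpenCV
# (pip install opencv-python). URL/credentials below are placeholders.
import time

import cv2

RTSP_URL = "rtsp://user:password@192.168.1.50:554/videoMain"  # hypothetical camera


def measure_fps(url: str, seconds: float = 30.0) -> float:
    """Count frames read from the stream and return the average fps."""
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise RuntimeError(f"Could not open stream: {url}")
    frames = 0
    start = time.monotonic()
    while time.monotonic() - start < seconds:
        ok, _ = cap.read()
        if not ok:
            break  # stream stalled or connection dropped
        frames += 1
    cap.release()
    return frames / (time.monotonic() - start)


if __name__ == "__main__":
    print(f"Measured: {measure_fps(RTSP_URL):.1f} fps")
```

Logging this every few minutes would show whether it's the camera or Blue Iris that backs off after those first few hours.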
FYI, the system is designed to handle a full 50 cameras, with only 14 on board right now (another 11 are planned over the next two weeks). He is using a variety of indoor and outdoor Foscams, some wired and some wireless. The cameras have been set to a lower 640x480 resolution and are spread across three dedicated gigabit Ethernet networks (all with separate routers and Ethernet cards, growing to four networks when complete), so there is not a bandwidth problem. The CPU is an 8-core 4 GHz chip with 16 GB of RAM, so no problem there either. When I initially stress-tested the system, I could run 720p or higher at 30 fps with the CPU at 95%; at the lower resolution and frame rate, it sits at only 19%.
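To put rough numbers behind the bandwidth claim (the ~1 Mbps per-stream figure is my own ballpark for H.264 at 640x480 / 10 fps, not a measured value, so substitute real bitrates from your cameras):

```python
# Back-of-envelope load per network at full build-out.
PER_CAMERA_MBPS = 1.0      # assumed 640x480 @ 10 fps H.264 stream (ballpark)
CAMERAS_PER_NET = 50 / 4   # 50 cameras split across 4 dedicated networks

load_mbps = PER_CAMERA_MBPS * CAMERAS_PER_NET
print(f"~{load_mbps:.0f} Mbps per gigabit network "
      f"({load_mbps / 1000:.1%} utilization)")
```

Even if the real bitrate were several times that assumption, each gigabit segment would still be loaded in the single-digit percent range.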
Thank you all for your help!