Perhaps it's because I wasn't paying attention before, but ever since Ken implemented the new RTSP time coding, cameras that were previously rock solid at 15 fps (as set in the Hikvision camera GUI) now drop as low as 8 fps during heavy scene activity.
It's made the video noticeably jerkier. I've seen this phenomenon before, but only when trying to run 1080p on all cameras with the CPU absolutely maxed at 100%.
Now the CPU barely reaches 80% even when there's a lot of camera activity/recording (e.g. rain, lightning, wind), and my FPS is still dropping.
Aside from the obligatory direct-to-disk recording and the usual CPU-load-lightening tips, is there a reason the recorded frame rates are taking a hit despite the CPU being under-utilized?
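In case it helps anyone reproduce this, here's a minimal sketch I'd use to check whether the camera itself is delivering fewer frames under heavy motion, or whether frames are being dropped downstream in the recorder. This is just Python with OpenCV (my assumption, not anything the recorder uses); the RTSP URL, credentials, and channel path are placeholders for a typical Hikvision main stream:

[code]
import time
import cv2  # pip install opencv-python

# Placeholder URL -- substitute your own camera's address and credentials.
URL = "rtsp://user:pass@192.168.1.64:554/Streaming/Channels/101"

cap = cv2.VideoCapture(URL)
frames = 0
start = time.time()
window = 30.0  # sample for 30 seconds

while time.time() - start < window:
    ok, _ = cap.read()  # pull the next decoded frame from the stream
    if not ok:
        break
    frames += 1

cap.release()
print(f"measured ~{frames / window:.1f} fps")  # compare against the 15 fps camera setting
[/code]

If this reads ~15 fps during a storm while the recorder logs 8, the drop is happening on the recording side; if it reads ~8 at the source, the camera is starving the stream (e.g. bitrate cap forcing frame drops under high motion).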