I have a four-camera Lorex system, and when I set it up, each of the four cameras showed a different time. As I watch the live display, I see the main system timer, and the individual timecodes on each camera differ from it by 1 or even 2 seconds. I exported .mp4 video from this system and am trying to resolve, as accurately as possible, something that occurred between two cameras. If the cameras' timecodes were perfectly in sync, that would be ideal for my purposes. However, they seem to differ by varying amounts, and strangely, in one video the timecode sometimes ticks at a rate different from the video capture/playback rate (i.e., the time changes after 14 frames of a 15 fps video). I would love to hear what people know about:
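To put a number on that last observation, here's a quick back-of-the-envelope check (the 15 fps and 14-frames-per-tick figures are from my clip; the 10-minute duration is just an example):

```python
# If the on-screen clock advances one second after only 14 frames
# of nominally 15 fps video, the overlay clock runs fast relative
# to the container's frame timing.
nominal_fps = 15.0
frames_per_clock_tick = 14

# Real (container) seconds elapsed per one on-screen second:
container_sec_per_overlay_sec = frames_per_clock_tick / nominal_fps
print(round(container_sec_per_overlay_sec, 3))  # ~0.933

# Drift accumulated over a hypothetical 10-minute (600 s) clip:
drift = 600 * (1 - container_sec_per_overlay_sec)
print(round(drift, 1), "seconds")  # ~40 seconds over 10 minutes
```

If that ratio held constantly, the overlay clock would gain roughly 40 seconds over 10 minutes, which is far more drift than I actually see, so the tick-rate mismatch must be intermittent rather than constant.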
1. Why would cameras on the same system have different timecodes for the same moment in [actual] time?
2. Are these timecodes typically "burned into" the footage at the camera before it is streamed to the NVR?
3. What could cause the on-screen timecode to go out of sync with the video capture rate?
4. If the timecode is "burned into" the .mp4 file during export from the NVR, could encoding/bitrate settings etc. potentially cause variations in the timecode?
thanks!