Ideal bitrate <> framerate <> resolution computation?

joshwah

Pulling my weight
Joined
Apr 25, 2019
Messages
298
Reaction score
146
Location
Australia
Is there a guideline for the ideal resolution <> framerate <> bitrate?

i.e., what bitrate if you run 20 FPS at 2688x1520? And what happens if you change the resolution to something higher at the same FPS?

Is there any harm in running variable bitrate at 20,000 to allow it to jump up if required?

What do you look out for to determine if the bitrate is too low?

Do most people run constant bit rate or variable? What are the pros/cons of each?
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
I run variable bit rate.
You are not shooting a Hollywood movie; for security and surveillance cameras, 15 FPS is good enough.
The I-frame interval is a major contributor to the actual bit rate. Set it equal to the frame rate, so there is one complete frame per second.
If the bit rate is too low, quality suffers and video with motion will look bad and blurry.

Calculating the actual bit rate is nearly impossible; it depends on the frame rate, I-frame interval, frame size, amount of color transitions, complexity of the image, amount of motion, and compression type.
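That said, if you just want a ballpark starting point, a common rule of thumb multiplies pixels x frames x an assumed bits-per-pixel factor. The factor itself is a guess that varies with motion and scene complexity (the values below are illustrative assumptions, not camera specs):

```python
# Rough H.264 bitrate ballpark using a bits-per-pixel (BPP) heuristic.
# BPP values here are illustrative assumptions: ~0.05 for calm scenes,
# ~0.1 or more for busy, high-motion scenes. Real encoders vary widely.

def estimate_bitrate_kbps(width, height, fps, bpp=0.07):
    """Return an approximate bitrate in kbps for the given settings."""
    return width * height * fps * bpp / 1000

# Example from the question: 2688x1520 at 20 FPS
print(round(estimate_bitrate_kbps(2688, 1520, 20)))  # about 5720 kbps
```

Treat the result only as a starting point for tuning: set it, watch scenes with motion, and raise the bitrate if you see blur or blocking.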
 