Quote:
Originally Posted by A-Rod's Cousin
Dumb question: Isn't most stuff filmed in 30 frames/second? How does anything over 30 Hertz resolve this?
You're correct about 30 fps for US TV (technically 29.97 fps for NTSC; movies are traditionally shot at 24 fps).
First, the reason we need 60 Hz instead of 30 Hz is that a lot of television is broadcast interlaced, meaning only half of the lines of resolution are drawn on each refresh. Lines 1, 3, 5, 7, 9, etc. (the odd field) are shown for 1/60 of a second, then lines 2, 4, 6, 8, etc. (the even field) are shown for the next 1/60 of a second; the two fields together make one full frame.
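If it helps to see it concretely, here's a toy sketch (my own illustration, not anything TVs literally run) of splitting a frame into its two interlaced fields and weaving them back together:

```python
# A 10-line "frame" represented as a list of scanlines.
frame = [f"line {n}" for n in range(1, 11)]

odd_field = frame[0::2]    # lines 1, 3, 5, 7, 9  -> drawn for 1/60 s
even_field = frame[1::2]   # lines 2, 4, 6, 8, 10 -> drawn the next 1/60 s

# Weaving the two fields back together reconstructs the full frame,
# which is the simplest form of deinterlacing.
woven = [None] * len(frame)
woven[0::2] = odd_field
woven[1::2] = even_field

assert woven == frame   # two fields = one complete frame
```

So 30 full frames per second arrive as 60 fields per second, which is why the baseline refresh rate is 60 Hz rather than 30 Hz.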
For TV refresh rates above 60 Hz, a process called motion interpolation is used: the TV guesses what an in-between frame would look like. For example, at 120 Hz:
First 1/120th of a second: Frame 1 (actual), odd lines
Second 1/120th of a second: Frame 1 (actual), even lines
Third 1/120th of a second: Frame 1.5 (guess), odd lines
Fourth 1/120th of a second: Frame 1.5 (guess), even lines
Fifth 1/120th of a second: Frame 2 (actual), odd lines
Sixth 1/120th of a second: Frame 2 (actual), even lines
and so on
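The "guess" step can be sketched very roughly like this. Note this is a naive blend purely for illustration; real sets estimate motion vectors and shift pixels along them, because a plain average just produces ghosting:

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames of pixel brightness values.

    t=0.5 gives the halfway "Frame 1.5" guess between two actual frames.
    (Toy example only; actual TVs use motion estimation, not a blend.)
    """
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame1 = [0, 100, 200]   # actual frame 1 (three pixels)
frame2 = [100, 100, 0]   # actual frame 2

frame1_5 = interpolate(frame1, frame2)   # the guessed in-between frame
# frame1_5 == [50.0, 100.0, 100.0]
```

The TV inserts that guessed frame between the two real ones, doubling the effective frame rate and making motion look smoother (the "soap opera effect" people often notice).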