Understanding Frame Rates, OR "24 Into 60 Won't Go!"

[Image: Hand crank projector]

Movies and Television work because the brain WANTS to see continuous motion.  If you look at a sequence of still photographs, each changing just a bit from the photos before and after, and all flashed fast enough in front of your eye, your brain will blend them together into moving imagery -- an effect called Persistence of Vision.  You will no longer see the separate photographs.  You will see "A Movie"!  This trick, which goes back to the invention of moviemaking, works only if the images really are flashed past the eye fast enough. The question, then, from the very beginning was, "How Fast is Fast Enough?"

Early experimenters discovered they could get by with as few as 16 of these "Frames Per Second" (FPS).  Faster was better, of course, but was harder on the equipment and also cost more, as it used up film faster.  The first movie cameras and projectors were hand cranked as well, and were typically cranked at a speed resulting in 17-20 FPS.  Of course the speed wasn't uniform, despite the skill of the operators, and viewers would notice this variation in speed.  But Movies were so darned astounding in the first place, nobody put up a fuss.

However once sound films came along, standardization of speed became vital.  The recorded audio had to be kept in synchronization with the imagery, of course, and although people might overlook variations in the speed of the ACTION, the variations in SOUND -- rising and falling pitch due to variations in the speed of recording or playback -- were just too annoying to ignore!  The movie industry settled on 24 FPS as the standard speed for sound films.  By this point, both cameras and projectors were motorized, and could be designed to maintain constant speed.  And 24 FPS matched the speed of the most modern, motorized cameras and projectors in use at the end of the Silent Film era.  (It also worked for the fidelity of the newfangled audio they were going to try recording.)

And so it has remained.  For decades.  DESPITE the rapid changes in all other aspects of filmmaking technology.  Only very recently have filmmakers started experimenting with higher frame rates.

But there was one massive problem:  Television!  Why?  Because the natural frame rate for TV is NOT 24 FPS.  So how are you going to show your Movies on TV?


Before we dive into THAT problem, I want to digress to discuss another problem with 24 FPS movies:  Flicker!  When you show a movie, the transition between successive frames of film is not instantaneous.  Projectors are mechanical devices, and need time to advance the film from one frame to the next.  And of course you don't want the actual motion of the film through the projector to show on the screen.  So a shutter is closed inside the projector to black out the current image.  Then the mechanism moves the film to the next image, and then the shutter is opened to project that next image.  In between, however, the eye sees Black.

And although a frame rate even as low as 16 FPS is good enough to let Persistence of Vision do its thing, it is NOT good enough to keep the eye from seeing that constant transition between light and dark -- to see the Movie "flicker".

However, projector designers had a clever trick for fixing this:  They ran the shutter twice as fast as the film advance!  That is, they showed each frame of the movie TWICE -- closing the shutter the first time without advancing the film.  So even though the imagery itself advanced at only 24 FPS, the flickering happened at twice that rate -- 48 FPS -- and that was enough to trick the brain into "not seeing" the flicker.  Keep this trick in the back of your mind, because we will return to it.


A major concern in the early days of TV was how to keep TV sets cheap enough so enough people would buy them!  The TV industry could spend all the money it needed on producing TV shows and arranging to broadcast them, but the TV sets, themselves, HAD to be affordably priced.

And that meant keeping the electronics inside each TV as simple as possible.

One of the most complex things a TV has to do is paint each image in turn on the TV screen at just the right rate.  Quality timing circuits are expensive, but fortunately there was a ready-made source of timing no further away than the nearest power plug!  Wall power in the US arrives at your power outlets at a constant 60 Hz (60 cycles per second).  Without getting too far off topic, suffice it to say there are important, technical reasons the power companies want to maintain that constant speed, with pretty high accuracy.

So, umm, why not design your TV to do EVERYTHING it needs to do for showing any given picture in 1/60th of a second?  This would be the so-called "Refresh Rate" -- the speed at which the picture changes on the TV screen.

Another constraint on early broadcast TV was that each TV station would be licensed for only a limited range of frequencies in which to broadcast its TV "channel".  This meant you could pack a useful number of channels into the frequency range reserved for ALL broadcast TV.

The range of frequencies per channel -- the channel "bandwidth" -- was not big enough to carry all the information for a full, new picture 60 times a second.  But it was just peachy for carrying HALF a full picture 60 times a second!  And so broadcast TV was set up to send "interlaced" images.  That is, first all the odd numbered lines of a picture would be sent, and then all the even numbered lines.  The collections of odd and even numbered lines were called "fields", and two successive fields made up a single frame -- a single, complete picture.

If you send the fields 60 times per second, that means you get a "full picture" frame rate of 30 FPS.  And experience from the world of movies had already shown this would work!  30 FPS was sufficiently fast for Persistence of Vision to do its thing.

In the TV, the arriving image would be turned into a beam of electrons aimed at the inside of the TV screen.  The beam would be positioned electronically to paint a line across the screen and then move down to the next line.  First all the odd numbered lines were painted -- the first "field" of the image -- and then the beam would reposition back up to the top and paint all the even numbered lines -- the second field of the image.  By the time all that was done, 1/30th of a second would have passed (two fields), and now it was time to start over with the first field of the next frame.  In addition, every portion of the image from top to bottom would get some change with each passing field being painted.  So the eye would see every portion of the image (at half resolution) changing 60 times a second!
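
For those who like to see these things in code, here is a minimal Python sketch of the field-splitting idea.  The 1-based, odd-lines-first numbering is just for illustration; real broadcast standards differ in their details.

  # Split one toy "frame" (a list of numbered scan lines) into its two fields.
  def split_into_fields(frame_lines):
      """Return (odd_field, even_field) for one full picture."""
      odd_field = frame_lines[0::2]    # lines 1, 3, 5, ... (first field)
      even_field = frame_lines[1::2]   # lines 2, 4, 6, ... (second field)
      return odd_field, even_field

  frame = ["line %d" % n for n in range(1, 11)]   # a toy 10-line picture
  odd, even = split_into_fields(frame)
  print(odd)     # ['line 1', 'line 3', 'line 5', 'line 7', 'line 9']
  print(even)    # ['line 2', 'line 4', 'line 6', 'line 8', 'line 10']

  # 60 fields per second, two fields per frame -> 30 full frames per second.
  print(60 / 2)  # 30.0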

(This frame rate was fine for Persistence of Vision, but was not as good as what the movie projectors were doing to reduce Flicker; again because Interlaced painting meant the eye was only seeing half the image resolution changing with each painted field.  As I said, we'll get back to that later.)

TV cameras worked the same way:  Capturing a frame in two interlaced passes.  If you think about that, you'll see that TV and film work differently.  A frame of movie film is captured all in an instant (within the limits of the shutter speed).  But a frame of TV is captured over 1/30th of a second.  That is, the two fields combining to make any single frame are NOT a single exposure.  They are a DOUBLE exposure -- slightly blurred by whatever motion might have happened during that passing time.  This difference has important consequences for the process of "de-interlacing" video for presentation on "progressive" displays -- a topic I'll reserve for some future post.


So suppose you want to broadcast a movie on TV.  Well, you are going to have to prep it first.  For example, you have to arrange to have each frame of movie film broken up -- scanned -- into two successive, interlaced "fields" for broadcast.  Because of the difference just mentioned in how movie and TV cameras work, movies won't look exactly like TV, but we'll gloss over that.  It should be good enough.

But the frame rate is NOT going to be good enough!  You can't just show your 24 FPS movie sped up to 30 FPS on TV.  It would look silly!

So you have to "invent" additional frames to raise the movie frame rate to the TV frame rate.  Now this was all happening before the advent of Digital Video Processing, of course.  You couldn't just hand a computer the technically intricate task of interpolating additional frames of motion between the existing movie frames.  Whatever you did here had to be doable with the technology of the time.  And on top of that, the two frame rates were incompatible!  That is, 24 does not divide evenly into 30.  In fact it does not even divide evenly into 60!

So any simple way you might come up with to add additional, "invented", movie frames would either result in too MANY frames or too FEW.  OR, you could say, what if we let the speed of the movie vary a little bit while we are doing this?  I.e., let's have some portions of the movie imagery remain on screen for slightly longer than other portions!  THAT could work!  And if we made the variation in the image durations subtle enough, and presented them in a regular way so the brain got used to it, viewers might not even see it!

And that was the answer.  After breaking up the movie frames into interlaced fields, the fields would be recombined into TV frames.  But some movie fields would be used more than ONCE!

To raise a 24 FPS movie frame rate to a 30 FPS TV frame rate (without altering the actual speed of the action), what you need to do is construct 5 frames of TV out of every 4 frames of movie.  The first step is to interlace the 4 frames of movie into 8 interlaced fields.

The first frame of TV is made by combining the two fields of the FIRST frame of movie.  So far so good -- we've not done anything tricky.

The second frame of TV is made by combining the two fields of the SECOND frame of movie.  Piece of cake -- this is easy.

The third frame of TV is made by combining the first field of the SECOND frame of the movie (say what?) with the second field of the THIRD frame of the movie (HUH?).  This produces a TV frame that does not exist in the movie -- made up by combining PARTS of the 2nd and 3rd movie frames.

The fourth frame of the TV is made by combining the first field of the THIRD frame of the movie with the second field of the FOURTH frame of the movie (good grief!).  This, too, is an invented frame.

The fifth frame of the TV is made by combining the two fields of the FOURTH frame of the movie.

So the two fields of the first frame of the movie are used to produce TWO fields of TV -- both in the same TV frame.  The two fields of the second frame of the movie are used to produce THREE fields of TV, split across two adjacent FRAMES of TV.  The two fields of the third frame of the movie are, again, used to produce just TWO fields of TV -- but now split across two adjacent FRAMES of TV.  And the two fields of the fourth frame of the movie are, again, used to produce THREE fields of TV -- and, again, split across two FRAMES of TV.

Summarizing:  The 4 frames (8 fields) of the movie are shown on the TV respectively in 2 TV fields, 3 TV fields, 2 TV fields, and 3 TV fields -- altogether producing 5 TV frames.  That means portions of the 2nd and 4th movie frame are on the TV screen slightly longer than the 1st and 3rd movie frame.

This 2-3-2-3 pattern is called a "Cadence".  And the whole process of prepping movies for TV broadcast, including this Cadence application, is called "Telecine".
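
If it helps to see the bookkeeping spelled out, here is a small Python sketch of that 2-3-2-3 construction.  The field names ("A1" for the first field of movie frame A, and so on) are just labels I made up for illustration:

  from collections import Counter

  movie_frames = ["A", "B", "C", "D"]                      # 4 frames of film
  fields = {f: (f + "1", f + "2") for f in movie_frames}   # 8 fields total

  tv_frames = [
      (fields["A"][0], fields["A"][1]),   # TV frame 1: both fields of A
      (fields["B"][0], fields["B"][1]),   # TV frame 2: both fields of B
      (fields["B"][0], fields["C"][1]),   # TV frame 3: mix of B and C (invented)
      (fields["C"][0], fields["D"][1]),   # TV frame 4: mix of C and D (invented)
      (fields["D"][0], fields["D"][1]),   # TV frame 5: both fields of D
  ]

  for i, (top, bottom) in enumerate(tv_frames, start=1):
      print("TV frame %d: %s + %s" % (i, top, bottom))

  # How many TV fields does each movie frame contribute?  The 2-3-2-3 cadence.
  usage = Counter(field[0] for pair in tv_frames for field in pair)
  print([usage[f] for f in movie_frames])    # [2, 3, 2, 3]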

TECHNICAL NOTE:  As I mentioned in my prior post on Digital Video, broadcast TV is actually broadcast a tiny bit SLOWER than 60 fields per second due to technical issues in the way video and audio were combined together in the original, Analog, TV signals.  This slowdown is a factor of 1000/1001 (very nearly 0.1%), meaning the actual broadcast rate is 59.94 fields per second (29.97 full images per second).  Thus, to maintain the relationship of 5 frames of video per 4 frames of movie, the Telecine process must also slow down the MOVIE by that same factor.  So the actual speed of movies shown on broadcast TV is 23.976 FPS.
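
If you want to see where those numbers come from, the arithmetic is just this (a quick sketch; values rounded for display):

  factor = 1000 / 1001           # the NTSC-era slowdown factor

  print(round(60 * factor, 3))   # 59.94  -> fields per second
  print(round(30 * factor, 3))   # 29.97  -> full images per second
  print(round(24 * factor, 3))   # 23.976 -> Telecined film rate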

2-3-2-3 is the classic Telecine cadence.  You could envision other "cadences" which would accomplish the same thing so long as the relationship is maintained between the number of TV frames and movie frames.  Nowadays, when electronics allows more than one frame of movie to be buffered during the processing, the preferred cadence is actually 2-3-3-2.


Whichever cadence you use, the upshot is some portions of the action in the movie will appear on screen for slightly longer than other, adjacent portions.  This tiny reduction in smoothness of motion is termed "Cadence Judder".  Indeed, the 2-3-3-2 cadence commonly used today actually has slightly MORE Cadence Judder than the classic 2-3-2-3 cadence.  (But the newer cadence has other technical benefits -- which I won't go into here -- which outweigh that difference.)

Much like with Persistence of Vision, the brain itself comes to the rescue!  It turns out the human brain is exceedingly good at "not seeing" Cadence Judder.  Indeed, you've been living with Cadence Judder your entire life -- any time you saw a movie on broadcast TV -- and odds are you've never noticed it.

But modern TVs are much more sophisticated than the original TVs.  In fact it is quite common for TVs today to be able to present movies at their ORIGINAL frame rate!  What this means is the TV shifts gears to change its refresh rate -- the speed at which the picture changes -- to something compatible with 24 FPS.

Typically that is *NOT* 24 FPS.  Why?  Because of that Flicker I mentioned up top.  Instead the TVs will typically use a multiple of 24 FPS -- 48 FPS or 96 FPS or even 120 FPS.  Modern TVs are also capable of buffering entire images, so Interlaced display of the imagery is no longer necessary.  So a TV that uses a 48 FPS refresh rate for movies will simply display each frame of the movie twice before moving on to the next frame.  Sound familiar?  Yes, it's the same, anti-flicker trick used in movie projectors.

(Interlaced TV programs, broadcast at 60 fields -- 30 full images -- per second, get the same treatment.  Each image is displayed twice to produce a 60 FPS refresh rate.  In addition, the two interlaced "fields" are combined together into a full image in the TV's memory before appearing on screen.  So the eye sees the full image appearing all at once instead of seeing two interlaced half images getting painted one after the other.  This is now a "progressive" TV image instead of the original, "interlaced" image.)

A 120 FPS refresh rate is particularly interesting because 120 is evenly divisible by BOTH 24 and 60!  So if your TV is using a 120 FPS refresh rate it can display 30 FPS, 60 FPS and 24 FPS content all without having to change its refresh rate.  It simply changes how many times it re-displays each frame before moving on to the next.
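
Here is a trivial Python sketch of that "how many repeats" arithmetic -- just the division, not anything a real TV exposes:

  def repeats_per_frame(content_fps, refresh_hz):
      """How many refreshes each frame is shown for, if the rates divide evenly."""
      if refresh_hz % content_fps != 0:
          return None    # not an even multiple; a cadence would be needed instead
      return refresh_hz // content_fps

  for fps in (24, 30, 60):
      print(fps, "FPS on a 120 Hz panel ->", repeats_per_frame(fps, 120), "repeats")
  # 24 FPS -> 5 repeats, 30 FPS -> 4 repeats, 60 FPS -> 2 repeats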

So it is possible, in modern TVs, to display your movies "Cadence Free" -- something of a Holy Grail for movie fans.  But again, the reality is MOST people will *NOT SEE* Cadence Judder -- that is, their brains won't acknowledge its presence -- unless they are shown the same movie content playing side by side:  One with Cadence Judder and one without.

If you are curious about whether YOU can see Cadence Judder, a good place to check is in the closing Credits scroll of most movies.  If that scrolling text appears to be ratcheting upwards slightly, instead of moving smoothly, then you are seeing Cadence Judder.  (This is not a perfect test, as it depends on how the Credit scroll is authored for a given movie.)


If you have a TV which CAN display 24 FPS movie content without Cadence Judder, the question then is, how do you get it to DO that?

If you view movies on discs, you are in luck.  Movie content can be stored on both Blu-ray and UHD (4K) discs at 24 FPS -- often shown as "/24".  So you might see a note that a movie is recorded on a given disc at 1080p/24.  (From my note above, that more likely means 1080p/23.976, but the nominal /24 is what will typically be printed.)

Other movie content may only be available as /60.  That is, Telecine has already been applied to it.  This is true for movie content found on SD-DVD discs (in those portions of the world using the NTSC TV standard -- which includes the US).  And this is also frequently true for movies streamed from Internet movie services.

But this raises the question:  Can the original /24 movie content be EXTRACTED from that /60 so you can still watch the movie "Cadence Free"?

This processing -- called, quite reasonably, "Reverse Telecine" -- CAN be done IF AND ONLY IF the content is authored with an unchanging Cadence!  Consider, for example, that editing can screw up the Cadence.  And some TV versions of movies were processed for Telecine way back when that work was done in a mechanical system -- basically a film projector pointing at a TV camera.  Such systems produce highly variable Cadence.  In essence, what Reverse Telecine tries to do is remove the duplicated fields in the /60 video -- the results of the Telecine Cadence -- leaving only the /24 original.
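
To make that concrete, here is a toy Python sketch of the idea, assuming a perfect, unbroken 2-3-2-3 cadence.  Real Reverse Telecine has to DETECT the cadence by comparing field contents; here the duplicated fields are simply known in advance:

  # The /60 field stream from the 2-3-2-3 example earlier, for one group of
  # five TV frames (A..D are the four original movie frames).
  tv_fields = ["A1", "A2", "B1", "B2", "B1", "C2", "C1", "D2", "D1", "D2"]

  # Drop the repeated fields...
  unique_fields = []
  for f in tv_fields:
      if f not in unique_fields:
          unique_fields.append(f)

  # ...and re-pair what remains into the original movie frames.
  recovered = {}
  for f in unique_fields:
      recovered.setdefault(f[0], []).append(f)

  for name in sorted(recovered):
      print(name, "->", sorted(recovered[name]))
  # A -> ['A1', 'A2'], B -> ['B1', 'B2'], C -> ['C1', 'C2'], D -> ['D1', 'D2']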

If the content does NOT have a uniform Cadence, this Reverse Telecine process will glitch from time to time.  Such glitching -- called "Cadence Stutter" -- is much harder to ignore than the Cadence JUDDER you were trying to eliminate in the first place.  If the movie you are trying to watch exhibits too much Cadence Stutter, you would probably be better off just watching it at /60, and living with the much MUCH less annoying Cadence Judder!

Meanwhile, suppose you try to watch a program which is *NOT* a movie -- a program originally authored at /60, as from a TV camera -- while still forcing it to /24 for your TV?  For example, some disc players allow you to play SD-DVD discs at /24 output -- i.e., by doing Reverse Telecine.  But many SD-DVD discs contain TV shows which were never /24 in the first place!  THAT is not going to look good at all!  The problem is, there's no way for the processing to know what parts of the real, /60 content it can safely discard to reduce the frame rate to a mere /24, as if it were a movie.

The unfortunate result is what's called Frame Drop Stutter, and it will be blatantly obvious whenever much of the screen is in motion -- as during camera pans.  It's basically the same thing as the Cadence Stutter I just mentioned, except it is happening ALL the time.  And both types of Stutter are much MUCH worse than the Cadence JUDDER inherent in raising /24 movie content to /60 TV rates (i.e., Telecine).
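
Here is a toy illustration of why this can't be smooth, assuming the simplest possible approach (just keeping the nearest /60 source frame for each /24 output frame -- not what any particular player actually does):

  # Which /60 source frame lands in each /24 output frame, and how big the
  # jump is between the frames that survive.
  kept = [int(n * 60 / 24) for n in range(6)]
  steps = [b - a for a, b in zip(kept, kept[1:])]

  print(kept)    # [0, 2, 5, 7, 10, 12]
  print(steps)   # [2, 3, 2, 3, 2] -- the surviving motion advances unevenly

Out of every five /60 frames, three get thrown away, and not at even intervals -- real motion samples are simply gone, which is what you see as stutter.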


So to summarize, you can raise /24 movie content to /60 TV frame rates safely -- even easily -- and the only downside is the introduction of Cadence Judder (a relatively minor issue).

But you can only drop /60 content to /24 movie rates if the content was ORIGINALLY authored at /24 -- i.e., a movie -- and if the Telecine cadence is preserved well enough to be detected accurately during Reverse Telecine.  You may still have occasional instances of Cadence Stutter when the cadence detection loses lock.

However, if you try to lower TV programs -- originally authored at /60 -- to /24 frame rate, the result will be continuous Frame Drop Stutter.  And you will know it right away!  As mentioned, examples of /60 content would be TV shows on SD-DVD discs (shows not originally shot on film) and things like live concert events released on Blu-ray discs at 1080i/60.

Well this doesn't sound too bad!  You can play your /24 movie discs (Blu-ray and UHD) at a refresh rate which is a multiple of /24 in the TV and enjoy the Cadence Free goodness!  And for movie content you happen to have only in /60 (as from SD-DVD discs) you MIGHT also be able to extract the original /24 and enjoy that, too, Cadence Free!

So are there any gotchas in this?

Yep.

It comes down to the fact that 24 FPS is not really all that fast a frame rate for recording motion.  MANY things we see in daily life move too fast to be recorded well at only 24 frames per second.  If you film stuff like that, the motion will not look smooth when you watch the movie.  This defect is called "Motion Judder".  It results because the frame rate in use is too slow to capture the motion accurately.  Note carefully, this is a defect in the FILMMAKING as opposed to how you are watching the film.

Now, professional filmmakers have known about Motion Judder since the very earliest days of movies, and have come up with all sorts of tricks to minimize it.  For example, if you are going to pan the camera, it is wise to put the image somewhat out of focus.  The Motion Judder is still there, but the eye won't pick it up.

And as it turns out, the recombining of movie fields into TV frames which happens as part and parcel of Telecine ALSO reduces Motion Judder!

So there ARE some scenes, in some movies, which actually look BETTER when viewed at /60 instead of their original /24!  Film buffs collect instances like this, and you might keep an eye out for discussions saying such and such movie is better to watch at /60.  For example, the gambling table scene in "Casino Royale" (2006) is one such instance.

Other folks really dote on Cadence Free viewing, and will want to watch ALL of their movies at a refresh multiple of /24 -- filmmaking glitches like this notwithstanding.


Well if THAT'S the problem, why not start making movies at a higher frame rate?

Indeed there ARE recent examples of filmmakers who have done just that!  In 2012, Peter Jackson started releasing his "Hobbit" movies at 48 FPS (both for filming and projection).  And in 2016, Ang Lee released "Billy Lynn's Long Halftime Walk" at 120 FPS!

Both instances got panned for results that did not look sufficiently "film like".  But you have to take such quibbles with a grain of salt.  For example, the home media releases of the "Hobbit" movies are just /24.  "Billy Lynn" is nearly unique in that it was released on UHD (4K) disc at a true /60 -- i.e., reduced from the 120 FPS production rate rather than raised up to /60 via Telecine.  That retail release *ALSO* included a version of the film on regular Blu-ray disc -- at /24.

Now "Billy Lynn" most DEFINITELY has an extremely, non-filmic look.  But the /60 version of the movie on the UHD, and the /24 version of the movie on the regular Blu-ray, look essentially IDENTICAL in this respect.  I.e., it is just not a particularly attractive movie to look at -- totally independent of the filming frame rate!

The main limitation on higher frame rates is not that they screw up the look of the film.  It's that they cost more money to produce and edit, and require special projection equipment for Theatrical release.  I think the odds are good we'll see more examples of high frame rate movie making in the future.


The distinction between /24 movie content and /60 TV content is the main point of confusion for most folks thinking about this topic, but there are additional items I should include for completeness.

First, there are many, MANY other Cadences out there besides the classic and modern Telecine cadences I mentioned above.  These typically arise because the content was not authored as a traditional film.  So for example, animations authored at a slow frame rate will likely use a different cadence.  Anime animations from Japan are an example.  These traditionally put a lot of effort into the artwork and reduce the number of frames per second of character animation.  If you are viewing these on /60 media -- such as SD-DVD -- odds are you will get better results displaying them at /60 on your TV.

The second item goes way back to the origins of TV.  I mentioned "NTSC" TVs were built to take advantage of the 60 Hz wall power in the US.  Well, there are large portions of the world which do not use 60 Hz wall power!  Indeed, much of Europe uses 50 Hz wall power, and their TVs are based on the "PAL" system, which uses a 50 field per second interlaced refresh rate -- 25 full frames per second.

TECHNICAL NOTE:  The number of lines, and the number of pixels across each line, in a PAL TV broadcast are higher than for an NTSC broadcast.  So even though the PAL version updates the image more slowly, the actual amount of information per second is about the same for both TV formats, because the PAL image has higher resolution per frame.
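
A quick back-of-envelope check of that, using the analog line counts (525 total lines per NTSC frame, 625 per PAL frame) -- just a rough comparison, not a full bandwidth calculation:

  ntsc_lines_per_second = 525 * (30 * 1000 / 1001)   # ~15,734 lines per second
  pal_lines_per_second = 625 * 25                     # 15,625 lines per second

  print(round(ntsc_lines_per_second))   # 15734
  print(round(pal_lines_per_second))    # 15625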

And when Telecine Cadences were being devised for NTSC TV in the US, it was immediately noted they might not even be NECESSARY for PAL TV!  Why?  Because 25 FPS -- half of the refresh rate of the PAL TVs -- is sufficiently close to 24 FPS that you can just SPEED UP the movie a little bit and broadcast it directly!

And indeed that's what's done.  I.e., the movie is simply sped up to 25 FPS.  So the PAL version of a movie plays in 96% of the time the same movie plays in NTSC.

Now this speed difference is not enough to mess up the action.  I.e., it still strikes the viewer as "correct" speed.  But the audio is a different story, entirely!  When you speed up the movie, that raises the pitch of the audio, and that DOES sound unnatural.  So the PAL version of a movie gets its audio "pitch adjusted" -- think of it basically as Telecine for PAL, but without the Cadences.  (Technically, this is called a 2-2 Cadence.)
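
The numbers behind the PAL speed-up, in case you are curious (the pitch figure is the UNCORRECTED shift -- the "pitch adjusted" step exists precisely to undo it):

  import math

  speedup = 25 / 24
  print(round(speedup - 1, 4))          # 0.0417 -> roughly a 4% speed-up
  print(round(24 / 25, 2))              # 0.96   -> runtime shrinks to 96%

  semitones = 12 * math.log2(speedup)   # how far the audio pitch rises, uncorrected
  print(round(semitones, 2))            # 0.71   -> about 7/10 of a semitone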

Just as with the rest of the world, modern TVs sold in Europe are perfectly capable of handling multiple display refresh rates.  So they can play a /24 movie as /24, as well as playing a /25 movie (i.e., a PAL version) as /25.

And some content PRODUCED for PAL TVs gets marketed on disc SLOWED DOWN to /24 (and pitch adjusted).  This last has caused some consternation about the "correct" way to do this.

Should the movie be put on disc with a 24.000 FPS frame rate?  Or should it be put on disc with the 23.976 FPS frame rate that's traditional due to the historical tie-in with broadcast frame rates and Telecine?

The bottom line is your gear is likely to encounter (and SHOULD be able to handle) a whole set of different frame rates:

  1. 59.940 FPS -- NTSC broadcast rate, and NTSC SD-DVD "fields" per second rate
  2. 50.000 FPS -- PAL broadcast rate, and PAL SD-DVD "fields" per second rate
  3. 29.970 FPS -- NTSC SD-DVD "frames" per second rate
  4. 24.000 FPS -- Film rate, also found on some discs
  5. 23.976 FPS -- Film rate adjusted to match (1) and (3)
  6. 25.000 FPS -- PAL pseudo-film rate
  7. 60.000 FPS -- Rate for some content on UHD (4K) discs

At ANY rate (ahem!), I hope this is all a good deal less mysterious to you now!

--Bob