Rob’s Tech School Part 1 – The Great Framerate Debate

Posted on January 31, 2013 at 4:00 pm

Must…hit…5 million…FPS…

30 FPS, 60 FPS, 120 FPS, Vsync, tearing, flicker…

I have seen this topic come up an untold number of times over the years on gaming and hardware forums and message boards, and the mods always end up locking it because the argument is old, played out and stupid. People have gotten so pissy over the framerate issue that even bringing it up now seems to be grounds for locking a thread. So I’m going to take a moment to clarify the situation on framerates and how our eyes perceive moving images.

This will be the first in a series of articles in which I will cover a number of “technical” topics and do my best to pull back the shroud of mystery covering things like frequency response, signal-to-noise ratio, video resolutions and the other subjects that tend to confuse the hell out of people when they are looking to buy new gadgets for their home entertainment and gaming systems.

There are two schools of thought on this topic. One group thinks that a game (or the hardware it runs on) MUST produce 60+ frames per second or it’s a huge pile of shit. The other group thinks that as long as the frame rate meets or exceeds what is necessary to provide the illusion of buttery-smooth motion, it’s good enough.

I’m here to clarify a few things, because I believe the first of those two groups feels the way it does because it is misinformed and unaware of the way the framerate situation works.

Without getting into a long discussion about flicker fusion frequencies, after-images and so on, the framerate issue breaks down like this.

All humans perceive light differently, and all humans have a slightly different threshold at which an animated image appears to move smoothly. On average, however, this threshold tends to hover between 15 and 17 images (frames) per second. (Please note that I am NOT talking about light flicker – I am talking, VERY SPECIFICALLY, about continuous movement in animation. Flicker is a TOTALLY different subject and has more to do with refresh rates.)

If the number of images per second drops below an individual person’s threshold, they will notice “chop” or stuttering in the animation. If the number of images per second is above an individual person’s threshold, the animation will appear smooth and continuous.
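
To put that idea in concrete terms, here is a minimal sketch in Python (the timestamps and the 16 FPS cutoff are made-up illustration values, not data from any real game) that looks at the gap between consecutive frames and flags any stretch where the instantaneous frame rate dips below a viewer’s threshold:

```python
# Minimal sketch: flag frames arriving slower than a smoothness threshold.
# The timestamps and the 16 fps cutoff are made-up illustration values.

def flag_choppy_frames(timestamps, threshold_fps=16.0):
    """Given frame timestamps in seconds, report gaps slower than the threshold."""
    choppy = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        gap = curr - prev                  # seconds between consecutive frames
        instantaneous_fps = 1.0 / gap      # how fast frames are arriving right here
        if instantaneous_fps < threshold_fps:
            choppy.append((curr, instantaneous_fps))
    return choppy

# Example: mostly ~30 fps, with one stall of a tenth of a second.
times = [0.000, 0.033, 0.066, 0.100, 0.200, 0.233]
for when, fps in flag_choppy_frames(times):
    print(f"at {when:.3f}s the rate dropped to {fps:.1f} fps -- that is the 'chop' you notice")
```

Everything above the threshold looks identical to the eye; only the dips below it register as stutter.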

In NTSC television (America / Canada), images on the screen are produced by scanning every other line, 60 times per second. Lines 1, 3, 5, 7, 9 and so on are drawn on one pass, then lines 2, 4, 6, 8, 10 and so on on the next. This is called “interlaced scanning” and is the “i” in 480i and 1080i. Since it takes two passes, at 60 passes per second (a 60Hz refresh rate), to draw each complete image, we get 30 images (frames) per second. (It’s actually 29.97 FPS, but that can only be explained via a long technical dissertation that I haven’t the desire to present right now.) Standard televisions are not able to exceed this limitation.

In PAL television (Western Europe and most of the rest of the world, not including Eastern Europe, France and some countries in Africa, which use “SECAM”), images are also produced in an interlaced scan but at 50Hz, so they are presented at 25 images (frames) per second. Here, too, most televisions are not able to exceed that limitation.
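
If you want to see that arithmetic laid out, here is a quick sketch in Python (purely illustrative, using nothing beyond the numbers in the two paragraphs above) showing how a field rate becomes a frame rate once you account for the two interlaced passes per image, including the 1000/1001 adjustment that produces NTSC’s odd 29.97 figure:

```python
# Illustrative sketch of the interlaced-scan arithmetic described above.
# Each full image takes two passes (odd lines, then even lines), so the
# frame rate is simply the field (refresh) rate divided by two.

def interlaced_frame_rate(field_rate_hz, passes_per_frame=2):
    return field_rate_hz / passes_per_frame

# PAL: 50 fields per second -> 25 frames per second.
print(f"PAL:            {interlaced_frame_rate(50):.2f} fps")

# NTSC: nominally 60 fields per second -> 30 frames per second, but the
# actual field rate is 60 * 1000/1001 (about 59.94 Hz), which is where
# the familiar 29.97 comes from.
print(f"NTSC (nominal): {interlaced_frame_rate(60):.2f} fps")
print(f"NTSC (actual):  {interlaced_frame_rate(60 * 1000 / 1001):.2f} fps")
```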

In motion pictures (like you see in the theater), all movies run at a standard 24 images (frames) per second. There is NO overcoming this at this time: standard movie cameras run at this speed, so that’s what we see on the screen. Yes, we just had a movie which was filmed and shown (in some theaters with the right equipment) at 48 FPS, but many people complained that seeing it at that speed made them feel wobbly, dizzy, strange, uneasy and, in some cases, physically ill, so for now the standard is still 24 FPS.

When is the last time you saw a television show or a movie in the theater that was choppy? I’ll answer for you…NEVER.

So why is it that people have these great big fits when a game’s frame rate drops below 60? (And GOD FORBID it gets down to 30…it’s a friggin’ federal case at that point!)

It’s simply because they don’t understand human physiology. Yes, our eyes can see hundreds of frames (images) per second…but when we’re looking at something that falls below the number of frames our eyes can take in, our brain interpolates the images, makes assumptions and fills in the gaps. In essence, we hallucinate the bits that we can’t see. It’s the same reason that every human being has a blind spot, yet you’ll never notice it unless you go looking for it. Your brain takes all of the available information and then fills in the blank spots with what it believes should be there…Tricky, isn’t it?

The only effect differing framerates can have on gaming relates to twitch-style gaming, where a person running at 120 FPS MAY have a slight advantage over someone at 30 FPS because the machine is calculating hitboxes, timing shots and so on more often…but this has nothing to do with the appearance of the game.
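
To put rough numbers on that advantage, here is a small sketch in Python (worst-case, back-of-the-envelope figures only, not measurements from any particular game) showing how long a single frame lasts at each rate; a higher rate means the game samples your input and updates its hitboxes more often, even though both rates can look equally smooth:

```python
# Rough illustration of why a higher frame rate can matter for twitch play
# even when every rate shown here looks perfectly smooth: each frame is a
# chance for the game to sample input and update hitboxes, so a shorter
# frame means a shorter worst-case wait between your click and the game
# noticing it.

for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> each frame lasts {frame_time_ms:.1f} ms "
          f"(worst-case input wait of roughly one frame)")
```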

Now, the next time you see that a game has a frame rate of 30 FPS, think back on this and ask yourself, “When is the last time I saw a choppy movie on TV or in the theater?” If you can’t come up with an honest answer to that question, then don’t give it a second thought.
