Why are TV resolutions the numbers that they are?

You know I’m going to get comments about this one. OK, here goes.

What we all refer to as “4K” video comes in at a resolution of 3840×2160. High definition video is 1920×1080 or 1280×720. When you get to standard definition, the numbers get a little hazier because it’s all analog, but the most commonly quoted resolutions are 720×480 and 640×480.

But why?

Here’s the part where the comments start. I don’t exactly know. I will tell you that the “480” part of standard definition is pretty easy to suss out. The original analog standard for US broadcasting specified 525 lines of vertical resolution. (The horizontal resolution was variable, because it was an analog signal.) That count included the “vertical blanking interval,” where no picture could be sent, the lines used for closed captioning, and a few more lines lost because televisions of the day always overscanned. Subtract all of that and you’re left with roughly 480 visible lines.

Standard definition televisions were said to have an aspect ratio of 4:3. You might remember that TV screens didn’t really have straight edges back in the day, but they were close enough. So if you apply that 4:3 ratio to the 480 lines, you get 640 across. The 720×480 number came later, as the standard for DVD was developed.
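If you want to check the arithmetic, here’s a quick Python sketch. (The exact split between blanking, captions, and overscan varied; the standard itself only fixed the 525-line total.)

```python
# Sanity-checking the standard-definition numbers (a sketch; the exact
# number of lines lost to blanking, captions, and overscan varied).
TOTAL_LINES = 525     # NTSC scan lines per frame
VISIBLE_LINES = 480   # roughly what was left for the visible picture

# Apply the 4:3 aspect ratio to the visible lines to get the width.
width = VISIBLE_LINES * 4 // 3
print(f"{width}x{VISIBLE_LINES}")  # -> 640x480
```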

How this translated to high definition

Here I’m going to make liberal use of guessing and myths. And again, you’re going to slay me in the comments.

Back when the HD standards were first developed, there were two camps. One said it was most important that the entire picture be shown 60 times a second. The other favored an interlaced approach similar to standard definition sets, where half the picture was shown every 1/60 of a second but the overall image could be higher quality.

Those who favored the first approach must have looked at the way a widescreen image was shown on a standard definition TV in the US. That would have equated to 640×360. Double both dimensions and you get the HD resolution of 1280×720.

Getting to 1920×1080 takes a little more imagination and a global mindset. The original standard definition signals in Europe were 625 lines, not 525. This netted out to an effective resolution of 540 visible lines, not 480. By the early 2000s, TVs were already going widescreen in Europe thanks to the DVB standard. Those TVs gave an image resolution of 960×540, considerably better than what US TVs could do at the time. Double that, and you get the now-familiar 1920×1080 resolution that is universally used for HD.
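Both HD shapes fall out of the same doubling, which you can check with a couple of lines of Python (just illustrative arithmetic, not anything from a standards document):

```python
# Doubling each widescreen SD shape yields one of the two HD standards.
def double(width, height):
    """Double both dimensions of a resolution."""
    return width * 2, height * 2

print(double(640, 360))  # (1280, 720)  - US widescreen SD, doubled
print(double(960, 540))  # (1920, 1080) - European widescreen SD, doubled
```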

From there it’s easy to get to 4K

To get to the 4K resolution of 3840×2160, you either double the 1920×1080 resolution or triple the 1280×720 resolution. Both work. So a 4K television is going to give the best possible reproduction of any resolution that’s been used in the US or around the world, because each of those older images scales up by a whole number of pixels; it’s also exactly four times the widescreen 960×540 resolution. It’s not an even multiple of 640×480, but once you’re making that big of a jump, I’m not sure it really matters.
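You can verify which older resolutions divide evenly into 4K with a few lines of Python (again, purely illustrative arithmetic):

```python
# Which earlier resolutions scale evenly into 4K UHD (3840x2160)?
UHD_W, UHD_H = 3840, 2160

for w, h in [(1920, 1080), (1280, 720), (960, 540), (640, 480)]:
    if UHD_W % w == 0 and UHD_H % h == 0 and UHD_W // w == UHD_H // h:
        print(f"{w}x{h}: fits evenly, {UHD_W // w}x in each direction")
    else:
        print(f"{w}x{h}: not an even multiple")  # 640x480: 2160/480 = 4.5
```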

If it seems almost by accident…

well, I am confident that it is. Television systems were developed independently in the US and Europe in the 1930s, and no one at that time could have imagined that you could walk into any store and buy a 40″ widescreen HD television for two days’ pay. Television evolved slowly, and along the way every TV ever made had to adapt. This meant hauling along old standards every time a change was made. Up until the early 21st century, any television ever made could still pick up current broadcasts with no additional hardware. When TV went digital in the 2000s, old TVs finally became obsolete, but as we can see from the math, some of those old standards still follow us.


About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 10,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.