What exactly is “high-definition” television?

I know it when I see it. That phrase has been used to explain everything from art to pornography, and it applies equally well to high-definition TV. HDTV is a widely misunderstood term, and in a lot of ways it’s just something we all sort of agree on. There are standards, of course, but does that mean something that doesn’t meet those standards isn’t HDTV? Not at all. Let’s take a deeper look at this issue and at why you can pretty much say that almost anything is HDTV today.

Standard definition
In order to understand HDTV, you first need to understand what HDTV definitely is not, and that’s SD. Standard definition is the term for the analog TV that existed prior to the mid-2000s, the old-school TV you grew up with. In the US, that’s a standard for broadcasting 525 scan lines, of which 480 are actually used for the picture. (The others carry things like closed captioning.) In other parts of the world the standard is 625 lines, of which 576 are used for the picture.

So technically anything with 481 or more lines of vertical resolution is high definition, at least in the US. But obviously 481 lines isn’t going to feel very high.

There are technical standards for over-the-air HD broadcasting that were supposed to make things a little easier to understand, but most HD today isn’t actually pulled from an antenna, and the broadcast standards make very little sense when applied to streaming.

HDTV doesn’t have to be digital. The first HDTV systems, demonstrated in the 1980s, were analog (these were prototype systems, not available to the public). The problem with analog HDTV is that it requires a LOT of bandwidth; one HD broadcast channel would take the space of 6-7 standard-definition broadcast channels. That really didn’t work for a lot of people, but technically you could have a system that is HD and totally analog, even recording onto videotape.

Modern HDTV systems are digital so that they can be compressed. A digital HD signal fits in the same space as an old-school analog SD one because it’s been compressed to about 10% of its original size. Sophisticated math is required on both ends, compressing at the broadcaster and decompressing in your TV, to make this happen.
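To see why compression is essential, here’s a back-of-the-envelope sketch in Python. The frame rate, bit depth, and chroma subsampling values are assumptions chosen for illustration (they’re typical, but not the only possibilities); only the roughly-10% compression figure comes from the text above.

```python
# Rough bandwidth math for an uncompressed 1080-line HD signal.
# Assumes 30 frames per second and 12 bits per pixel (8-bit 4:2:0),
# which are common values but not part of any claim in the article.
width, height = 1920, 1080
fps = 30
bits_per_pixel = 12

raw_bps = width * height * fps * bits_per_pixel
compressed_bps = int(raw_bps * 0.10)  # the ~10% figure mentioned above

print(f"Uncompressed: {raw_bps / 1e6:.0f} Mbps")
print(f"At ~10% of original size: {compressed_bps / 1e6:.0f} Mbps")
```

Real broadcast encoders squeeze even harder than this; the point is simply that raw HD video is far too big to transmit without compression.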

The only real measure of HDTV is resolution. There are two basic standards for HDTV, one at 1280×720 pixels and one at 1920×1080 pixels. Two standards were adopted initially because some people worried that 1920×1080 equipment would be too expensive. There are other technical details like frame rate and interlacing, but that’s not really the point here. The point is that if you’re looking for HD, you’re really looking for a picture with between 720 and 1080 lines of vertical resolution.
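If you want to see how much picture information those two standards actually carry, the pixel math is simple:

```python
# Total pixel counts for the two HD standards.
hd_720 = 1280 * 720    # "720p"
hd_1080 = 1920 * 1080  # "1080i"/"1080p"

print(f"1280x720:  {hd_720:,} pixels")
print(f"1920x1080: {hd_1080:,} pixels")
print(f"1080-line HD has {hd_1080 / hd_720:.2f}x the pixels of 720-line HD")
```

That 2.25× gap in raw pixel count helps explain why 1920×1080 gear was expected to cost more in the early days.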

Why does some HD look worse than others?
Netflix HD, especially in its early days, was really poor. The digital download versions of Star Wars pale in comparison to the Blu-ray discs. Yet both are HD. That’s because there’s a lot more to HD than just vertical lines, even though vertical lines alone are enough to call a picture HD.

An HD picture can be compressed really well or really poorly; it can be set up to fit in a data stream of whatever size is available. If the picture is overcompressed, it can look blurry, weak, or choppy, and there can be all sorts of extra noise around images. This is less of a concern now than in the past because people have a lot more streaming speed than they used to, yet it still ends up being a problem. I watched Avengers: Age of Ultron on DIRECTV On Demand recently and the whole thing looked like it was being shown through a gauzy curtain. Yet it was, at least by the numbers, HD.

What can be done about all of this?
The real truth is that no one is going to go back and reclassify different kinds of HD, but at least in the 4K world we’re starting to see some new standards that make sense. There is only one 4K/UHD resolution for TV, 3840×2160, and in addition to resolution, you’re starting to see standards for color gamut (the total range of colors the picture can show) and dynamic range (the difference between the whitest whites and the darkest blacks). Color and contrast were actually a big part of what drew you to HD in the first place and made the picture seem really bright and sharp; it wasn’t just the detail level.
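As a quick check of how 4K/UHD relates to the HD numbers above, the pixel math works out to an exact multiple:

```python
# UHD ("4K") compared to 1080-line HD: exactly four times the pixels,
# since both dimensions double (3840 = 2 x 1920, 2160 = 2 x 1080).
uhd = 3840 * 2160
full_hd = 1920 * 1080

print(f"UHD:  {uhd:,} pixels")
print(f"1080: {full_hd:,} pixels")
print(f"Ratio: {uhd // full_hd}")  # -> 4
```

That factor of four is where the “1/4 the resolution” comparison with Blu-ray discs comes from.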

This kind of standardization can’t come soon enough. DIRECTV’s 4K, for example, looks great except for “Undeniable with Joe Buck,” which looks grayish-green because the color gamut is too compressed (probably done that way at the studio). Netflix’s 4K is great for demonstration purposes, but arguably doesn’t look as good from a reasonable distance as Blu-ray discs with 1/4 the resolution.

So, all HD is not the same, and at least today all 4K isn’t the same either. But in the future, it should get a lot better.

About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 8,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.