What is bitstarving?

When is HD not really HD? When is 4K not really 4K? If you're a numbers guy, you think of it this way: standard definition signals are 640×480 or smaller, high definition is 1280×720 or higher, and 4K is 3840×2160, period. That's really only a small part of the story. Because all digital signals are compressed (they have to be, or only 4 or 5 channels would fit on DIRECTV's largest satellite), there is almost always some loss of quality.
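To see why compression is mandatory, a little back-of-the-envelope arithmetic helps. Here's a minimal sketch in Python; the 24-bit color depth and 30 frames per second are illustrative assumptions, not a broadcast spec:

```python
# Rough math: the raw, uncompressed bitrate of a 1080p HD signal.
# 24 bits per pixel and 30 frames per second are assumptions for
# illustration; real broadcasts vary.
width, height = 1920, 1080
bits_per_pixel = 24
frames_per_second = 30

raw_bps = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed 1080p: about {raw_bps / 1e6:,.0f} Mbps")
# -> Uncompressed 1080p: about 1,493 Mbps
```

At roughly 1.5 Gbps for a single uncompressed HD channel, it's easy to see why every broadcast, satellite, and streaming signal gets compressed by a factor of a hundred or more.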

About those compression artifacts

When a signal is compressed too much and then decompressed, compression artifacts appear. We've told you about a few of them: macroblocking shows up as large, irregularly sized blocks or a noticeable grid pattern; pixelation describes an image with overall low (but consistent) resolution inside an HD signal; and mosquito noise happens when compression is so aggressive that shimmering, contrasting dots appear around areas of high detail. These artifacts are almost always found together because they share the same cause: bitstarving.

Bitstarving is the general term for compressing a digital signal so heavily that quality suffers. The same HD image can be compressed by 90% or even 99% using controls applied when the signal is processed; too much compression and you lose image quality.

Take a look at the image at the top of this article. Believe it or not, it's one big HD image. The grey cat is compressed by about 60% and the white cat is compressed by about 98%. You can still see plenty of detail on the grey cat, but the white cat is a blurry, blocky mess. Even though the picture is blocky, it would still be considered HD.
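You can reproduce this effect yourself with a few lines of Python and the Pillow library. This sketch uses JPEG quality settings as a stand-in for video compression (a still-image codec, but the bitstarving principle is identical); cat.jpg is a placeholder for any image you have on hand:

```python
import os
from PIL import Image  # pip install Pillow

# Open any source image; "cat.jpg" is just a placeholder filename.
src = Image.open("cat.jpg").convert("RGB")

# Save the identical picture twice: lightly compressed vs. bitstarved.
# Both files keep the exact same pixel resolution.
src.save("cat_light.jpg", quality=90)    # mild compression
src.save("cat_starved.jpg", quality=5)   # heavy compression

for name in ("cat_light.jpg", "cat_starved.jpg"):
    kb = os.path.getsize(name) / 1024
    print(f"{name}: {kb:,.0f} KB")
```

Open the two files side by side and you'll see the same thing the white cat shows: identical resolution, wildly different quality.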

When you look at it this way, bitrate, not resolution, becomes a very important measure of an HD signal. There are other factors (like the compression technology used) that skew things somewhat, but bitrate is a good indication of what you'll get. Bitrate is measured in megabits per second (Mbps); there's a quick back-of-the-envelope comparison after the list below.

For example:

  • Hulu’s HD signal is about 3-5Mbps. That is lower than some standard definition programs, but Hulu uses advanced compression that gives them passable quality at that level.
  • Netflix’s HD signal used to top out at 6Mbps, but they now have 4K streams that seem to top out at 17-20Mbps.
  • DIRECTV uses a maximum of 10-12Mbps for HD and up to 35Mbps for 4K, and you can really see the quality difference.
  • Blu-ray disc, the highest-bitrate HD source available, tops out near 40Mbps. That's roughly 10 times more image information than Hulu.
  • UltraHD Blu-ray, the best possible 4K signal, can top 100Mbps, but thanks to extremely efficient compression it rarely needs to get that high.
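To put those numbers in perspective, here's the promised comparison: each service's approximate HD bitrate against the roughly 1.5 Gbps an uncompressed 1080p signal would need (same illustrative assumptions as the earlier sketch):

```python
# Compression ratio implied by each service's approximate HD bitrate,
# measured against raw 1080p (~1,493 Mbps, assuming 24-bit color at
# 30 fps). The service figures are this article's approximations.
RAW_1080P_MBPS = 1920 * 1080 * 24 * 30 / 1e6

services_mbps = {
    "Hulu HD": 4,            # midpoint of 3-5 Mbps
    "Netflix HD (older)": 6,
    "DIRECTV HD": 11,        # midpoint of 10-12 Mbps
    "Blu-ray": 40,
}

for name, mbps in services_mbps.items():
    ratio = RAW_1080P_MBPS / mbps
    print(f"{name}: ~{mbps} Mbps, roughly {ratio:.0f}:1 compression")
```

Even Blu-ray, the best HD source on the list, discards about 97% of the raw image data; a good codec's job is making sure it's the 97% you'll never miss.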

In many cases it's hard to tell the difference between 10Mbps and 40Mbps because the compression technology is just that good. Get down to about 3Mbps, though, and everything can become a mess pretty quickly.

So how can Hulu call their signal "HD"? Because, technically, it is: the resolution stays at 1920×1080, and by the book, that's high definition. So what can you do? In most cases you won't have sophisticated equipment to detect bitrates, and unless bragging rights are more important than plot, the most important thing is that the picture looks good to you. If it doesn't, find another provider.
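That said, if you're curious about a video file you've downloaded, you don't need lab gear to get a ballpark figure. A minimal sketch, assuming a local file whose running time you know (the filename and two-hour duration below are placeholders):

```python
import os

# Average bitrate of a local video file: total bits / duration.
# "movie.mp4" and the 2-hour running time are placeholder assumptions.
path = "movie.mp4"
duration_seconds = 2 * 60 * 60

size_bits = os.path.getsize(path) * 8
print(f"Average bitrate: {size_bits / duration_seconds / 1e6:.1f} Mbps")
```

This lumps audio in with video, so it slightly overstates the video bitrate; a tool like ffprobe can break it down per stream if you want more precision. Streaming services are another matter, since they switch bitrates on the fly.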

About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 10,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.