The sordid history of HDMI (revised and updated)

HDMI is today's standard for single-cable HD audio and video. It's reliable, cables are inexpensive, and it's usually the right choice for connecting video sources like cable and satellite receivers, Blu-ray disc players, and IPTV boxes to TVs.

HDMI has not always been the best choice, and if you have older equipment, it may not be the right choice for you. Let’s take a look at how HDMI evolved.

Before HDMI

Before HDMI, there were DVI and Component. DVI was created by computer programmers… the people who brought you floppy disks and Windows 95. Component was created by home theatre geeks, the same people who brought you Betamax and LaserDiscs. It's no surprise that neither cable really represented what the customer wanted.

DVI cables were potentially the worst idea for HDTV ever. The connectors were like fishhooks, and it was impossible to remove one without snarling every other cable you had. The little pins broke easily, and the cables were delicate and expensive. At least they carried digital information, so the picture quality was great… when they worked.

Component cables broke the HD signal down into three parts and put it over a triplet of cables. The information was analog, meaning it was subject to RF interference and line loss. The cables were traditionally red, green, and blue, which wasn't confusing at all… except that most people thought the signal was broken into components of red, green, and blue, and it wasn't. It was actually split into a brightness signal and two color-difference signals.

On top of all this, neither cable carried sound, which meant another pair of cables for audio and the potential for a delay between sound and picture. None of this was consumer-friendly.


Enter HDMI

HDMI started with a great idea: one connector, easy to attach and remove, that carried digital audio and video. The HDMI connector was designed in 2002 and went into production in 2003. Almost immediately, problems ensued.

An HD digital audio and video signal can be copied with no loss of quality, so content providers lobbied successfully to get HDCP (High-bandwidth Digital Content Protection) included in the HDMI standard. HDCP does more than encrypt a signal. It's an incredibly complex set of shared secret codes, timing parameters, and automated decision processes. In short, it was a mess from the beginning.

HDCP works by scrambling the digital signal in a way that doesn't cost any quality. It then creates secret codes that are shared at the beginning of every "event," such as when you turn something on or off, play content, change channels, wipe your nose, whatever. Every few seconds — yes, EVERY FEW SECONDS — the two devices have to pass these codes back and forth or everything stops.
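To give a rough feel for that constant back-and-forth, here is a loose sketch in Python. This is not the real HDCP algorithm (the actual specification involves device keys, key selection vectors, and much hairier math), and every name, value, and interval below is made up purely for illustration. The idea is simply that both ends keep recomputing a short code from a shared secret, and the moment they disagree, the picture stops.

```python
# Illustrative sketch only -- NOT the real HDCP algorithm.
# All names, values, and timing here are invented for the example.

import hashlib
import hmac
import time

# Stand-in for whatever keys the two devices agree on during the handshake.
SHARED_SECRET = b"negotiated-during-handshake"


def link_code(secret: bytes, counter: int) -> bytes:
    """Both ends compute a short code from the shared secret and a running counter."""
    return hmac.new(secret, str(counter).encode(), hashlib.sha256).digest()[:2]


def play(seconds: float = 6.0, recheck_every: float = 2.0) -> None:
    """Source and sink must agree on the code every couple of seconds, or video stops."""
    counter = 0
    start = time.time()
    while time.time() - start < seconds:
        source_code = link_code(SHARED_SECRET, counter)  # what the player computes
        sink_code = link_code(SHARED_SECRET, counter)    # what the TV computes
        if not hmac.compare_digest(source_code, sink_code):
            print("Codes don't match: blank the screen (the dreaded HDCP error).")
            return
        print(f"Check {counter}: codes match, keep showing video.")
        counter += 1
        time.sleep(recheck_every)


if __name__ == "__main__":
    play()
```

In the sketch everything works because both sides use the same secret; in the real world, a slow chip, a flaky cable, or a sloppy implementation could make one of those checks fail, and the screen would go dark.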

Devices at the time just couldn't handle HDCP. The good news was that you could easily use an adapter to change the HDMI connector to a DVI connector, so at least you could do… something. It wasn't a great option, but it was something.


But wait, there’$ more…

Until 2011, you had to pay money to a consortium in order to make or use HDMI connectors. Obviously this made everything better for everyone. So yes, HDMI’s future looked pretty dim for a while.

In 2004 and 2005, a lot of people didn't think HDMI even had a future. It was expensive, and a lot of the time it just didn't work. And then the digital TV transition came.

All of a sudden, millions of people bought brand-new flat TVs and HDMI was the way to go. HDMI really took off in 2006 and 2007 because it was an easy, one-cable solution. Unfortunately, there were still a lot of bugs to work out.

Versions 1.0 through 1.2 of the HDMI specification were complicated, and a lot of TVs from those days didn’t implement HDCP very well.

If you bought an HDTV in those early flat-panel days, it might have had HDMI problems. They were pretty common at the time.


Finally… HDMI 1.3

The HDMI 1.3 specification, released in 2006, did a lot to standardize HDMI. At the same time, TV manufacturers were getting smarter about implementing it. The good news is that by 2008, most HDMI incompatibility issues were in the past… although that wasn't very comforting to people who had bought HDTVs in the two years prior.


HDMI 1.4, 2.0 and 2.0a

With the hard work behind it, HDMI had some time to evolve. HDMI 1.4 brought some support for 4K televisions, as well as advanced features like an audio return channel (ARC) that let people hook everything up to the TV and feed audio back out to a device like an A/V receiver. This essentially killed coaxial and optical digital audio connections; while they continue on to this day, they're no longer the standard.

HDMI 2.0 and 2.0a represent the pinnacle of HDMI evolution so far and really should be the only standards we need for a while. They fully support 4K and the increases in dynamic range and color gamut that people want today, plus they add support for the latest (and supposedly uncrackable) content protection.

HDMI 2.0a is interesting because it's possible to upgrade most HDMI 2.0 devices to 2.0a with a software update. This is the first time we've seen HDMI be upgradeable in this way; in the past, if you wanted more advanced HDMI, you had to change out your TV. Let's hope that manufacturers jump on board with this sort of upgrade strategy. There was a time you could reliably expect a TV to be current and capable for 10 years or more. Wouldn't it be wonderful to see that again?

About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 10,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.