What the H?
In the transition from High Definition Television to Ultra High Definition TV, we’ve seen the acronym dictionary go from bad to worse. On the bright side, where HDTV encompassed multiple video resolutions and display formats – 480p, 720p, 1080i and 1080p – UHD is essentially just one. Some call it 4K, some call it UHD, some call it 2160p, but for the TVs we’ll buy as consumers it all boils down to the same thing: four times the resolution of 1080p.
HDTV to UHD
There is a difference between what the professional video industry considers 4K – a resolution of 4,096 by 2,160 – and what the rest of us get when we buy a 4K TV, or an Ultra High Definition television set, which is typically 3,840 by 2,160. The two are quite close, and while some TVs support the slightly higher resolution, for the most part we’re dealing with the single quad-full-HD format that defines UHD.
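The pixel math behind those numbers is worth seeing spelled out. Here’s a quick sketch (the resolutions are the standard figures; the labels are just ours) showing why consumer UHD is exactly four times 1080p:

```python
# Pixel counts for the formats discussed above.
formats = {
    "1080p (Full HD)": (1920, 1080),
    "UHD (consumer 4K)": (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

pixels = {name: w * h for name, (w, h) in formats.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# UHD doubles both the width and the height of 1080p,
# so the total pixel count goes up by exactly 4x.
ratio = pixels["UHD (consumer 4K)"] // pixels["1080p (Full HD)"]
print(f"UHD / 1080p = {ratio}x")  # → 4x
```

Cinema 4K adds only about 7% more pixels than consumer UHD, which is why the two labels get used interchangeably.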
In some ways, this makes the transition from HDTV to UHD very simple. In early HDTV days, there were the EDTVs: plasma TV sets that could display HDTV content but scaled it down to a native resolution of 480p. Then there were two dominant resolution formats, 720p and 1080i. 720p was better for fast-moving action while 1080i had better resolution and produced sharper images. Eventually we got 1080p sets, the best of both worlds, and the debate was settled. With UHD, we don’t have to worry about any of that. We get 2160p televisions. That’s it. Nice and simple.
But that’s not the whole story. The migration from HDTV to UHD isn’t just a resolution change. There are many more changes under the covers, built into the transition, that are intended to improve our lives and make the entire viewing experience better and more advanced. We’ve talked about many of them before, but sometimes it’s easy to get them confused or to gloss over the relationships between them. They form a somewhat twisted web of interconnected relationships, and it’s easy to get turned around. It happens to us all the time.
The High-Definition Multimedia Interface 2.0 specification is typically considered part of the UHD or 4K transition. HDMI cables have been heaven-sent. One cable that carries high definition audio and video in the same connection makes wiring up your home theater so much easier – so much simpler than the days of old with a coax or SPDIF audio cable and three component video cables, or one DVI cable if you were lucky enough to have digital video support on both ends.
As the demands of what you can watch on your HDTV evolve, the HDMI spec has had to evolve as well to support the better video. HDMI 1.4 actually supports 4K resolution, but only at 24 or 30 frames per second. If you want full 4K resolution at 60 fps, you have to get a system that supports HDMI 2.0. In addition to the higher frame rates, the higher bandwidth supported by HDMI 2.0 also allows more audio and video information to travel across the cable. For example, where HDMI 1.4 is limited to 8-bit color at 4K, HDMI 2.0 can go to 12-bit. That higher bandwidth paves the way for something called HDR, or High Dynamic Range.
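A rough bandwidth estimate shows why 4K at 60 fps pushed past HDMI 1.4. The link rates below (roughly 10.2 Gbps for HDMI 1.4, 18 Gbps for HDMI 2.0) and the 8b/10b TMDS encoding overhead are the commonly quoted spec figures; this ignores blanking intervals, so treat it as a back-of-the-envelope sketch, not a spec citation:

```python
# Estimate the HDMI link rate needed for uncompressed video.
def video_gbps(width, height, fps, bits_per_component, components=3):
    """Approximate link bandwidth in Gbps for uncompressed RGB video."""
    bits_per_frame = width * height * bits_per_component * components
    payload = bits_per_frame * fps       # raw video bits per second
    return payload * 10 / 8 / 1e9        # add TMDS 8b/10b encoding overhead

# 4K at 30 fps, 8-bit: ~7.5 Gbps -- fits within HDMI 1.4's ~10.2 Gbps.
print(f"4K @ 30 fps: {video_gbps(3840, 2160, 30, 8):.1f} Gbps")

# 4K at 60 fps, 8-bit: ~14.9 Gbps -- beyond HDMI 1.4, within HDMI 2.0's 18 Gbps.
print(f"4K @ 60 fps: {video_gbps(3840, 2160, 60, 8):.1f} Gbps")
```

Doubling the frame rate doubles the payload, which is exactly what pushes 4K/60 out of HDMI 1.4’s reach.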
But before we get to HDR, let’s take a brief detour to discuss HDCP 2.2, the next rev of the High-bandwidth Digital Content Protection spec, also commonly associated with Ultra High Def TV. HDCP has been around since the beginning of HDMI. It is the copy protection part of the spec, aimed at keeping pirates from getting their hands on pristine, high quality digital formats that they could turn right around and post on the Internet for anyone to download. It is designed to protect content owners from pirates who want to post movies and TV shows on BitTorrent and other file-sharing sites.
However, what it typically does is just make all of our lives harder. Many of the HDMI communication issues we’ve all experienced between set top boxes, receivers, and other home theater devices are due to the copy protection part of the spec – a part that, in most cases, probably isn’t even enabled for the content we’re viewing. But HDCP 2.2 is the next evolution, so if you want to make sure you’ll be able to watch copy protected 4K content, you’ll need gear that supports HDCP 2.2.
Odds are they’ll never turn on the content protection for most of what we watch, because it would create so many issues with people trying to view it that it wouldn’t be worth it. But if they do decide to enable it, all the devices in the chain – set top box, Blu-ray player, receiver, television, etc. – will need to support it for you to see the content. The biggest bummer is that we’ll probably have a whole new batch of HDMI incompatibility issues as some devices begin to roll out with HDCP 2.2 and try to talk with legacy devices that don’t support it. HDMI, for all its benefits, hasn’t been without its issues, and HDCP will most likely compound them, not make them any better.
If you can get past the copy protection and get your devices all talking with HDMI 2.0, you might very well be able to enjoy HDR content, or High Dynamic Range video. High dynamic range video is, in a nutshell, a better luminance range than typical video, providing whiter whites and blacker blacks. That gives you better contrast, better color response and better shadow detail in the videos you watch on TV. You don’t get better resolution, but you get more realistic, more lifelike images because the contrast more closely resembles what we see in the world around us.
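This is where the bit depths mentioned earlier come in. More bits per color channel means far more brightness steps between black and white, which is what lets HDR stretch the luminance range without visible banding. A tiny sketch of the arithmetic (10-bit is the common HDR baseline; 12-bit also exists):

```python
# Levels per color channel at each bit depth: 2 to the power of the bit count.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:,} levels per channel")
# 8-bit  ->   256 levels
# 10-bit -> 1,024 levels
# 12-bit -> 4,096 levels
```

Going from 8-bit to 12-bit multiplies the number of gradations by 16, which is why the wider contrast range of HDR needs the extra bandwidth HDMI 2.0 provides.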
HDR isn’t an essential part of UHD or 4K TV. You don’t even need 4K resolution to enjoy the better color and contrast you can get from HDR video, but in most cases you’ll need to upgrade to a 4K set if you want a TV that will display High Dynamic Range content – not because the two are required or connected, but simply because the latest and greatest TVs, the ones that support HDR, happen to be 4K sets. There may be 1080p OLED TVs in the future that support HDR, but why would you upgrade to that?
The last piece in the puzzle is our last ‘H’ acronym: HEVC or High Efficiency Video Coding. It is the successor to the standard H.264/MPEG4 AVC codec used predominantly for our current HDTV content and is the codec used most often to encode or transmit UHD content. It has twice the compression capabilities without sacrificing video quality, or it can be used to transmit much higher quality video, up to 8K resolution, in the same bandwidth currently used for 1080p HDTV content.
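Putting the two claims above into numbers: at comparable quality, HEVC needs roughly half the bits of H.264, so the same bitrate can carry either better quality or more pixels. The 8 Mbps starting point below is just an assumed, typical 1080p H.264 streaming bitrate for illustration, and the 2× factor is the rough advantage cited above:

```python
# Illustrative bitrate math for HEVC vs. H.264 (figures are assumptions).
h264_1080p_mbps = 8      # assumed typical 1080p H.264 streaming bitrate
hevc_advantage = 2       # ~2x better compression at comparable quality

# Option 1: same 1080p quality at roughly half the bitrate.
hevc_1080p_mbps = h264_1080p_mbps / hevc_advantage

# Option 2: UHD has 4x the pixels; with 2x compression, the bitrate
# lands at roughly 2x what 1080p H.264 uses today.
hevc_uhd_mbps = h264_1080p_mbps * 4 / hevc_advantage

print(f"1080p HEVC: ~{hevc_1080p_mbps:.0f} Mbps (vs ~{h264_1080p_mbps} Mbps H.264)")
print(f"UHD HEVC:   ~{hevc_uhd_mbps:.0f} Mbps")
```

Real-world savings vary with content, but the shape of the trade-off – half the bits, or more pixels for the same bits – is the point.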
One important note about HEVC is that it is currently the only mainstream codec that supports HDR content. So while it is possible to get HDR in your 1080p HDTV movies, you’d need those movies to be encoded with HEVC, not the old-school H.264 codec you have now. You’d need a TV and a player that both support HDR and HEVC to get the benefit of higher dynamic range. Since HEVC is typically associated with UHD, it isn’t likely that many manufacturers will introduce support for it in non-UHD devices. So while it might be possible to watch 1080p content with HDR, you’d probably need to do that on a 4K set anyway.
The move from tons of resolution options in the HDTV spec to essentially one in the UHD world should have made our lives easier, but content providers and manufacturers wouldn’t stand for it, so they gave us a bunch of new ‘H’ acronyms to worry about and keep us on our toes. The good news is that in a couple of years, when UHD is commonplace and reaches mass adoption, everything will support all the new acronyms and it won’t really matter anymore. But for those of us on the early adopter curve, it can be tricky. For now, read the specs on everything you buy to make sure it’ll support what you want now and in the near future. And if you have any questions, give us a shout.