Buying a 4K TV: Getting to know HDCP 2.2, HDMI 2.0, HEVC & UHD
Deciphering the 4K TV alphabet soup
Is buying a 4K TV as simple as ensuring that it has a 4K resolution? Not quite, especially if you want your multi-thousand-ringgit investment to last as long as possible. But as standards and technologies are gradually agreed upon and adopted by TV makers and content producers, there's a higher degree of certainty that a 4K TV you buy now won't become obsolete anytime soon.
However, the sad fact remains that if you've bought a 4K TV or receiver sometime between 2013 and early 2014, there's a chance that it might not be able to play or pass through today's 4K content.
Whether you're looking to buy a 4K TV or just wondering if your existing 4K gear can keep up with the latest standards, understanding jargon like HDCP 2.2, HDMI 2.0, HEVC, and UHD is invaluable. So here's the skinny.
1.) HDCP 2.2
Most 4K TVs support something called HDCP, which stands for High-bandwidth Digital Content Protection, but only the more recent ones come with HDCP 2.2, the latest generation of this content protection mechanism. Simply put, HDCP secures the connection between the source and the display (and anything in between, such as receivers and game consoles); in 2.2, the protection system is much more advanced and stronger than before. HDCP is usually implemented on HDMI ports, but it can also be used on other types of connections, like DisplayPort and DVI.
Now, HDCP 2.2 is really designed for 4K content, so you’re fine if you think you’re going to stick with 1080p all the way. Put another way, as long as it’s 1080p content, even if there’s a mix of HDCP 2.2 and non-2.2 devices in the chain, you’re safe.
But there will be a problem if you have a 4K source device (e.g., a Blu-ray player) or service that's HDCP 2.2-compliant and is trying to send protected 4K signals to a non-HDCP-2.2 4K TV. The TV would most likely show a blank screen. To make matters worse, non-HDCP-2.2 devices can't simply be upgraded to support HDCP 2.2, because specific hardware is required.
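The chain rule described above can be sketched in a few lines. This is purely illustrative logic (there's no real HDCP API here, and the device records are hypothetical), but it captures how one non-compliant link blanks a protected 4K picture while 1080p keeps working:

```python
# Illustrative sketch (not a real HDCP API): whether protected content
# displays depends on every device in the signal chain.
def chain_plays(content_is_4k: bool, devices: list[bool]) -> bool:
    """devices: HDCP 2.2 support flag for each device, source to display."""
    if not content_is_4k:
        return True          # 1080p content: older HDCP versions suffice
    return all(devices)      # protected 4K: every link must be HDCP 2.2

# A non-2.2 TV at the end of an otherwise compliant 4K chain fails:
print(chain_plays(True, [True, True, False]))   # False -> blank screen
print(chain_plays(False, [True, False]))        # True  -> 1080p still works
```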
In short, if you're buying a new 4K TV (or home theater projector or receiver) today, it's in your best interest to make sure that it supports HDCP 2.2. While most name brands' recent offerings have it, it's still good to verify, either by checking the spec sheet or with the salesperson. Many consumers like to buy from second- or third-tier brands because they're usually more affordable; and while there's nothing wrong with that, be mindful that such TVs often lack the latest features like HDCP 2.2.
For early adopters who've bought a non-HDCP-2.2 4K TV, well, just keep all this in mind, and don't be too surprised if some 4K content doesn't display properly on your TV at some point down the road. Once the problem crops up too frequently, you'll know that it's time to get a new TV.
2.) HDMI 2.0
Another feature or technology that you’ll inevitably come across when buying a 4K TV is HDMI 2.0. Early and cheap 4K TVs typically come with HDMI 1.4, and while it supports 4K, it has a framerate limit. To be more precise: 30 frames per second for 3,840 x 2,160 and 24 fps for 4,096 x 2,160.
HDMI 2.0 is designed to support higher bandwidth (up to 18Gbps) than 1.4, and as a result, it’s able to do 4K at framerates up to 60fps. Other features include options for the Rec. 2020 color space, 4:2:0 chroma sub-sampling, up to 32 channels of audio, 21:9 aspect ratio, dual video streams, and improved 3D and CEC functions. HDMI 2.0 is also backwards compatible with HDMI 1.x.
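That 18Gbps figure isn't arbitrary. A back-of-the-envelope calculation shows why 4K at 60fps needs it: the link carries not just the 3,840 x 2,160 active pixels but also blanking intervals (a 4,400 x 2,250 total frame in standard timing), and each TMDS channel carries 10 bits for every 8 bits of data:

```python
# Back-of-the-envelope check of why 4K60 needs HDMI 2.0's 18 Gbps.
# Total transmitted pixels per frame include horizontal/vertical blanking
# (standard timing: 4,400 x 2,250 for a 3,840 x 2,160 active frame).
total_pixels = 4400 * 2250
fps = 60
bits_per_pixel = 24            # 8-bit RGB / 4:4:4
tmds_overhead = 10 / 8         # 8b/10b encoding on each TMDS channel

gbps = total_pixels * fps * bits_per_pixel * tmds_overhead / 1e9
print(f"{gbps:.2f} Gbps")      # ~17.82 Gbps, just under the 18 Gbps cap
```

HDMI 1.4's roughly 10.2Gbps link simply can't carry that, which is why it tops out at 4K/30fps.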
In a nutshell, HDMI 2.0 can only get more important as more 2160/60p content arrives. For futureproofing’s sake, there’s no reason to omit it if you’re buying a 4K TV (or receiver) today. Some early 4K TVs are able to upgrade from HDMI 1.4 to 2.0 through a firmware update, but that is not a guarantee and really depends on the manufacturer and the design of the TV.
Also, while HDMI 2.0 and HDCP are often mentioned in the same breath, know that one doesn't imply the other. As mentioned earlier, though it's possible to upgrade from HDMI 1.4 to 2.0 through a software update, the same doesn't apply to HDCP 2.2. In addition, just because a 4K TV supports HDMI 2.0 doesn't mean it also supports HDCP 2.2.
For those interested, the latest HDMI version now stands at 2.0a, with high dynamic range (HDR) video support the major new feature added. Currently, only a handful of TVs do HDR, such as Samsung’s 2015 SUHD models and a couple of high-end models in Sony’s current BRAVIA lineup. That said, none of them supports HDMI 2.0a, though it’s plausible that this can be achieved through a firmware update in the future.
And since we’re at it, be aware that there’s no such thing as an HDMI 2.0/2.0a cable. As long as you’ve a ‘High Speed’ HDMI cable (you probably do if you already own a 1080p TV), you’re all set.
3.) HEVC
HEVC/H.265, or High Efficiency Video Coding, is a compression standard that succeeds the MPEG-4/H.264 AVC standard. Expectedly, the newer HEVC is much more efficient than its predecessor, delivering comparable quality at roughly half the bitrate, and this becomes important when dealing with today's high-res, high-bandwidth 4K content. As such, HEVC has become the go-to compression scheme for 4K streaming services, like those from Netflix, Amazon, and M-Go.
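To see why that efficiency matters for streaming, consider some rough arithmetic. The figures below are assumptions for illustration (an 8Mbps AVC 1080p stream as a baseline, and the commonly cited ballpark that HEVC halves the bitrate), not measured numbers:

```python
# Rough bitrate arithmetic behind HEVC's appeal for 4K streaming.
# Assumptions: ~8 Mbps for an AVC 1080p stream (illustrative baseline),
# and HEVC needing about half the bitrate of AVC for similar quality.
avc_1080p_mbps = 8
resolution_factor = 4       # 4K has 4x the pixels of 1080p
hevc_efficiency = 0.5       # commonly cited ballpark, not an exact figure

naive_avc_4k = avc_1080p_mbps * resolution_factor   # 32 Mbps with AVC
hevc_4k = naive_avc_4k * hevc_efficiency            # ~16 Mbps with HEVC
print(naive_avc_4k, hevc_4k)
```

A 16Mbps stream is within reach of many home broadband connections; a 32Mbps one is much harder to deliver reliably, which is why 4K streaming services standardized on HEVC.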
The good news is that, unlike 4K TVs sold in 2013 and early 2014, recent 4K TVs from name-brand makers mostly come with a built-in HEVC decoder, so they shouldn't encounter any hiccups when playing HEVC-compressed 4K content. That said, it's still possible for an HEVC decoder-equipped 4K TV to fail to play HEVC 4K content, something we saw on the 2014 Panasonic AX800/802 series. In that particular case (which was resolved in October 2014 through a firmware update), the TVs weren't able to play Netflix's 4K streams because their processing chip didn't meet Netflix's certification requirements.
At the moment, we aren't aware of any HEVC problems in any of the latest 4K TVs made by name-brand TV manufacturers. If anything, check that HEVC is supported if you're buying a model in their lower-end series. Again, be more careful with off-brand 4K sets, because their makers are less likely to be as diligent about such certifications.
For those who bought a 4K TV in 2013, you could be out of luck, unless you have one of LG's earliest 4K TVs that came with a built-in HEVC decoder, or a Samsung 4K TV like the S9 or F9000. While these Samsung TVs didn't support HEVC initially, they can be upgraded with an external One Connect box that adds support for it. Samsung's One Connect box idea, which the company implements for all its 4K TVs, is actually very good, since it enables users to upgrade their TV without throwing out the whole set. For the S9 and F9000, the One Connect box upgrade also brought along support for HDCP 2.2 and MHL 3.0.
4.) UHD
Did you know that 4K can mean different things depending on whom you ask?
Strictly speaking, a 4K TV that has a resolution of 3,840 x 2,160 pixels (or 2,160p) should really be called a UHD (ultra-high-definition) TV or 4K UHD TV to avoid confusion with the DCI (Digital Cinema Initiatives) standards, the latter of which are specs defined for digital cinema production and projection systems.
Under DCI, 4K is defined as having a resolution of 4,096 x 2,160 pixels. Many high-end home theater projectors, like the Sony VPL-GTZ1 and VPL-VW1100ES, support this DCI 4K resolution. In fact, there are other 4K resolutions under the DCI spec, such as 4,096 x 1,716 (scope) and 3,996 x 2,160 (flat). Luckily for most consumers, there’s no need to know all these; because when it comes to 4K TVs (ahem, 4K UHD TVs), there’s only one resolution that matters: 3,840 x 2,160.
Standards and trade bodies like the Consumer Electronics Association (CEA) in the U.S. and Digital Europe in Europe have also gotten into the act of defining 4K for consumer products and coming out with their own standardization logos. Both organizations agree that 4K UHD has a pixel count of 3,840 (horizontally) by 2,160 (vertically), and that there must be 8 million addressable pixels (3,840 x 2,160) that use a red-green-blue (RGB) sub-pixel layout.
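Those definitions reduce to simple arithmetic, which is worth spelling out since the "8 million pixels" figure gets quoted loosely:

```python
# The CEA/Digital Europe 4K UHD definition, in numbers: 3,840 x 2,160
# addressable pixels, each made up of R, G and B sub-pixels.
h, v = 3840, 2160
pixels = h * v
subpixels = pixels * 3
print(f"{pixels:,} pixels, {subpixels:,} RGB sub-pixels")
# 8,294,400 pixels (~8.3 million), 24,883,200 sub-pixels
```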
And here's where it gets interesting (and technical)
Which brings us to the brouhaha that started last year with regards to whether a 4K panel using a red-green-blue-white (RGBW) sub-pixel arrangement can be considered a true 4K UHD TV.
Such RGBW-based LED-LCD 4K TVs are commonly found in China; they use panels made by Chinese panel makers and, reportedly, South Korean heavyweights LG Display and Samsung Display as well. In LG Display's case, it's called the Green Plus or G+ panel; for Samsung Display, the Green panel. Long story short, by adding a fourth, transparent white sub-pixel, the energy consumption of such 'green' 4K panels can be reduced (vs. a traditional RGB panel) without sacrificing brightness. Component costs drop too, which means these panels cost less to make - this is why TVs using them tend to be more affordable than their RGB-based counterparts.
That said, the biggest knock against such panels is that while resolution is increased compared to 1080p panels, the pixel layout and the fact that each pixel isn't made up of three colored sub-pixels mean image quality may take a hit: their effective resolution and color accuracy aren't as good as 'true' 4K panels that use an RGB matrix.
For those interested, we've reached out to both LG and Samsung for comment. According to a Samsung spokesperson, Samsung Display's Green panels are made for third parties and are not used in Samsung's own 4K UHD TVs. In fact, all of Samsung's UHD TVs are certified by Digital Europe, which means they follow the UHD definitions set out by the European organization.
For LG, the company says its RGBW UHD TVs satisfy standards set by international testing and certification bodies such as Intertek (U.K.), TUV (Germany), UL (U.S.), CESI (China), and JEITA (Japan). Also, it says its RGBW UHD implementation is able to achieve RGB UHD resolution without any loss of UHD picture quality because of its panel algorithm, in which nearby pixels can share a sub-pixel among themselves. This allows the RGBW UHD panel to have the same number of pixels as an RGB UHD panel. In addition to the energy consumption benefit we've mentioned above, LG also says its RGBW structure allows for the same color reproducibility as an RGB structure, as well as the same picture quality as RGB UHD when upscaling Full HD to 4K. Lastly, LG believes the RGBW structure is an appropriate method for implementing future standards such as 8K, which it says cannot be done with the current RGB method.
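To make the RGB-vs-RGBW dispute concrete, here is the sub-pixel accounting under one commonly described layout. This is an assumption for illustration only: we take an RGBW panel that keeps three sub-pixels per pixel, drawn from a repeating W-R-G-B sequence, so the total sub-pixel count matches an RGB panel but one in four sub-pixels is white rather than colored. Actual patterns vary by panel maker and are not confirmed here:

```python
# Illustrative sub-pixel accounting for RGB vs RGBW 4K panels.
# Assumption: the RGBW panel keeps 3 sub-pixels per pixel, drawn from a
# repeating W-R-G-B sequence (one common description of the layout; the
# exact pattern differs between panel makers).
pixels = 3840 * 2160
rgb_subpixels = pixels * 3                 # 24,883,200 colored sub-pixels

rgbw_total = pixels * 3                    # same total sub-pixel count...
rgbw_white = rgbw_total // 4               # ...but 1 in 4 is white
rgbw_colored = rgbw_total - rgbw_white     # 18,662,400 colored sub-pixels

print(rgb_subpixels, rgbw_colored, rgbw_white)
```

Under this assumption, both panels address the same 8.3 million pixels, but the RGBW panel renders color with about a quarter fewer colored sub-pixels - which is exactly why critics question its color accuracy and why LG points to its sub-pixel-sharing algorithm in response.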
Be it differences in definitions or marketing methods, at the end of the day, it boils down to the question of whether you can see the difference. The problem with RGBW 4K LCD panels isn't so much the technology (cheaper 4K TVs with no perceptible drop in image quality - that's a good thing, right?), but rather who makes them. For major panel makers, it's safe to assume that they'd ensure a certain level of quality, either through software or hardware means. The danger is more with RGBW panels sourced from unknown suppliers, because for these panels, you'd never know the ratio between the RGB and white sub-pixels. That is, unscrupulous makers could jolly well use far fewer RGB sub-pixels and pad the count with white sub-pixels to hit the 4K resolution requirement. These are the real fake 4K TVs that we should be wary of.
Now you know.