
How HDR works

High Dynamic Range (HDR) TVs and players, as well as movies and TV shows, are the latest advancement in picture quality. They potentially offer a big step up over older, standard dynamic range TVs. Here's how it works.

Geoffrey Morrison, Contributor

High Dynamic Range TVs and devices are here, along with HDR TV shows and movies. You can buy a new 4K HDR TV for as little as $450 and stream HDR Netflix and Amazon, or add a $300 Xbox and spin HDR 4K Blu-ray discs. HDR games are coming soon, and even phones and PC screens are attaching those three buzzy letters.

But how do they work? Is HDR just another marketing gimmick to sell TVs and other gear, with no technology behind it? What does it claim to do, and why is that better?

Well folks, if those are your questions, you've come to the right place.

What's HDR?

For the basics about what HDR is, check out What is HDR for TVs, and why should you care? The short version: an HDR TV, when showing special HDR content, has a wider dynamic range (i.e., contrast ratio), along with more steps in brightness (for smoother transitions and more detail in bright and shadowy areas). HDR is also usually paired with Wide Color Gamut (WCG), which offers a greater range and depth of color.

Unlike 4K resolution, curved screens or 3D, HDR is a TV-related technology we're actually excited about. In the best cases it actually improves the image beyond what you're used to with non-HDR video (standard dynamic range, or SDR), including conventional high-def, Blu-ray or even 4K. How much of an improvement, if any, depends first and foremost on the capabilities of the TV itself, but also on the content.

Of course you'll need to be watching actual HDR content on a new HDR TV to see the benefits. Fake HDR "upconversion" is available on some products, but it's not the same.

Though it shares its name, HDR for TVs is not the same as what your camera or smartphone can do. Check out the appropriately named HDR for photography vs. HDR for TVs: What's the difference? for more on that.

How HDR works is pretty fascinating, which is why I presume you've read this far. So let's dive right in with the heavy stuff.

Electro-Optical Transfer Function

Electro-Optical Transfer Function (EOTF) is one of the core technologies behind how HDR works. It's actually a lot less complicated than it sounds. Essentially, it's a fixed mapping that ties each electronic value in the content to a specific real-world brightness on your TV, measured in nits (a unit of light output). A given code value from an HDR Blu-ray always translates to the same amount of light on screen, whether that's a dim shadow of a few nits or a searing highlight of 1,000 nits or more.
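To make that concrete, here's a minimal sketch (in Python, purely for illustration) of the PQ curve, standardized as SMPTE ST 2084, which is the EOTF that HDR10 and Dolby Vision both use. It takes a 10-bit value from the content and returns the absolute brightness, in nits, that value is supposed to represent; the function name and the sample values printed are my own choices, not anything from a particular player or TV.

```python
# A sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10 and Dolby Vision:
# it maps a 10-bit code value from the content to absolute brightness in nits.

def pq_eotf_nits(code_value, bit_depth=10):
    """Convert a code value (0 to 2**bit_depth - 1) into luminance in nits."""
    # Constants defined by the ST 2084 standard
    m1 = 2610 / 16384          # ~0.1593
    m2 = 2523 / 4096 * 128     # ~78.84
    c1 = 3424 / 4096           # ~0.8359
    c2 = 2413 / 4096 * 32      # ~18.85
    c3 = 2392 / 4096 * 32      # ~18.69

    e = code_value / (2 ** bit_depth - 1)        # normalize to 0..1
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000 * y                             # PQ tops out at 10,000 nits

for code in (0, 256, 512, 768, 1023):
    print(f"code {code:4d} -> {pq_eotf_nits(code):8.1f} nits")
```

Run that and you'll see the curve is gentle at the bottom (lots of values devoted to shadows) and steep at the top: the maximum 10-bit value of 1,023 lands at 10,000 nits, the ceiling the PQ curve was designed around, even though no consumer TV today gets anywhere near that bright.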

This may seem obvious. Don't all TVs do that anyway? Actually, and oddly, they don't. Until HDR, TVs were far wackier about it. For example, let's say a signal from a Blu-ray player told the TV to display a 100 percent white image, essentially telling the TV "show me what you got!" The TV would go all out and show an image that was as bright as that TV could go (or as bright as its settings allowed). But that amount of brightness was completely unspecified. A top-of-the-line Samsung LED LCD would be significantly brighter than a bottom-of-the-line Magnetbox CRT, for example.

EOTF changes that, which is important since HDR TVs have the potential to be significantly brighter. It also gives content producers even more control over how an image will appear on the TV in your house (a good thing).

A less-jargony way of describing EOTF would be something like "specified real-world brightness," or maybe "Brightness-the-way-most-people-probably-figured-TVs-already-worked."

Dynamic range

The most important aspect of picture quality is contrast ratio, or the difference between the brightest and the darkest an image a TV can produce. Another way to describe this would be the "dynamic range" of the television.
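A quick bit of arithmetic shows why that ratio matters. The numbers below are hypothetical, picked just to show the math, not measurements of any particular set:

```python
# Contrast ratio is simply peak brightness divided by black level (both in nits).
# These figures are hypothetical, for illustration only.

def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

sdr_lcd = contrast_ratio(peak_nits=350, black_nits=0.07)    # a modest SDR LCD
hdr_lcd = contrast_ratio(peak_nits=1000, black_nits=0.05)   # an HDR LCD with local dimming
print(f"SDR LCD: {sdr_lcd:,.0f}:1   HDR LCD: {hdr_lcd:,.0f}:1")
```

The brighter the highlights and the darker the blacks, the bigger that number gets, and the more "dynamic" the image looks.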


A representation of the dynamic range of the real world versus various display technologies.

Dolby

As the name suggests, High Dynamic Range means a greater dynamic range than you'd find on a traditional television. For both LCD and OLED, this means brighter highlights. A lot brighter. It varies per TV of course, but an HDR TV's highlights (the bright parts of the image) could be five times as bright as on a non-HDR TV. That additional brightness is key to making HDR pop compared to non-HDR TVs.

Why just highlights? Why doesn't HDR address black levels too? Well, OLED can't produce any darker blacks (its blacks are already perfect). The blacks on LCDs are lighter, though local dimming helps. So the dark parts of the image are already about as dark as they can be, which is why many TV makers today are touting higher brightness, where there's still room for improvement, instead.

Keep in mind -- and this is important -- this isn't just about making the whole image brighter. The trick is getting only the parts of the image that need to be brighter, known as "specular highlights," to be brighter. On the TV side, that means LCD TVs with local dimming, and OLED TVs, look best with HDR. However, the TV is only half the battle. The other half is the content.

HDR content is key to making HDR as a whole work. Let's take, for example, a simple scene of a sunset. Here's one:


On your screen, you can see the sun is a lighter color than the rest, but on an HDR TV, it would actually be noticeably brighter, while the shadows and clouds remain dark. More like the real world. That's what HDR can do, more than even non-HDR LCD and OLED. This is from my Instagram (thanks for asking).

Geoffrey Morrison/CNET

With a normal TV, you have 8 bits to tell the TV what each level of brightness should be. If you'll allow me to oversimplify this a bit, imagine you only have eight steps to choose from in the image above. The edges, in shadow, are levels 1 to 3 of brightness. The sea reflections might be levels 4 to 6. The sky is 7, and the sun itself, 8. You can see how quickly you run out of steps.

HDR is 10-bit, allowing for (in our example) two more steps. So with our demo scene, that now means you can have the sun at 10, the sun's rays at 9, the sky at 7-8, and so on.

In reality, 8-bit means 256 shades (for each of the three primary colors), and 10-bit means 1,024 shades, but it's the same idea. More shades mean more headroom to push what really should be bright (the sun, glare off a windshield, an explosion) and not have to blow out the whole image to make it shine. Further, it means more detail in the bright (and dark) parts of the image.
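If you want to see the step counts themselves, the math is simple. This is just the bit-depth arithmetic, not how any actual TV processes video:

```python
# More bits per channel = more, finer brightness steps between black and peak white.
for bits in (8, 10):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps:,} shades, smallest step = 1/{steps - 1} of the full range")
```

Those extra steps are what let HDR stretch the top of its range way up for highlights without creating visible bands in, say, a clear blue sky.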

Could the same effect be done with 8-bit? Sort of. Content creators would have to choose between detail in the highlights, and detail in the shadows. There just aren't enough bits to do both.

So with HDR, the brightness range of the TVs is greater, and the content is becoming available to take advantage of that. It also means more colors.

Color


The smallest triangle (circles at corners) is what your current HDTV can do. The next largest (squares) is P3 color, used in most HDR today. The largest (triangle edges) is Rec 2020, which isn't available yet.

Geoffrey Morrison/CNET (triangles); Sakurambo (base chart)

In addition to a wider dynamic range, more and better color is possible as well. Generally, this is called Wide Color Gamut, as mentioned earlier. WCG has two main benefits. The more obvious one is deeper colors. Deeper reds, greens, blues, and everything in between. Your TV hasn't been able to accurately reproduce many colors you see every day, like "fire-truck red," the deep purple of an eggplant/aubergine, and so on.
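You can put rough numbers on those triangles using the published chromaticity coordinates of each standard's red, green and blue primaries. This is just a back-of-the-envelope area comparison on the CIE chart, not a perceptual measurement, but it matches what the chart above shows:

```python
# Compare the size of each color gamut's triangle on the CIE 1931 xy chart.
# The (x, y) pairs are the red, green and blue primaries each standard defines.

GAMUTS = {
    "Rec 709 (HDTV)": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec 2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(GAMUTS["Rec 709 (HDTV)"])
for name, primaries in GAMUTS.items():
    print(f"{name}: {triangle_area(primaries) / base:.2f}x the area of Rec 709")
```

P3 comes out noticeably larger than the HDTV (Rec 709) triangle, and Rec 2020 larger still.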

For years TVs have been able to produce more colors than the HDTV specification calls for, but they were making those colors up on their own, pushing them to be wilder than what the content creators intended.

How do they create better colors? By creating better light. LCDs create color by using color filters, essentially colored pieces of plastic that, well, filter the white-ish light created by the backlight into the red, green, and blue you see.

The problem is, the deeper you make the colors, the dimmer the TV is. That's kinda the opposite of what HDR is going for, right? So the trick is to start with colored light.

Though a few TVs used red, green, and blue LEDs to do this (quite rare, actually), the way many TVs are achieving deep colors and brightness is by using quantum dots. These microscopic materials are exceptionally efficient at creating different colors of light, and are (relatively) easy to tune to create the exact frequency (color) of light you want. So they can create a deep red, for example, and be bright doing it.

For more info, check out Quantum dots: How nanocrystals can make LCD TVs better.


Two vials of photoluminescent quantum dots next to a prototype blue electroluminescent QD.

Nanosys - Amanda Carpenter and Oleg Grachev

OLED is different. Current OLED TVs use blue and yellow OLED materials, plus color filters, to create red, green and blue. By making the OLED more efficient (i.e., brighter for the same amount of energy), LG can keep making the colors deeper and closer to the P3 color standard. Last year's OLEDs covered about 85 percent of P3. This year's cover about 95 percent (both far more than non-HDR TVs are capable of).

The other half of Wide Color Gamut ties in with what we talked about in the last section: 10 bits. Your TV creates all the colors you see by mixing different amounts of red, green, and blue. How many steps are available for each amount is the bit depth, which used to be 8-bit and, with HDR/WCG, is now 10-bit. More shades equal more colors. Before, you got 256 shades each of red, green and blue, for a total of roughly 16.78 million color options (256 times 256 times 256). That may seem like a lot, but it's a fraction of what your eye can see. With 10-bit, there are now over 1 billion color possibilities (1,024 times 1,024 times 1,024).
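The multiplication, for anyone who wants to check it:

```python
# Total mixable colors = shades per channel, cubed (red x green x blue).
for bits in (8, 10):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades}^3 = {shades ** 3:,} possible colors")
# 8-bit:  256^3  = 16,777,216 possible colors
# 10-bit: 1024^3 = 1,073,741,824 possible colors
```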

So, in short: deeper colors, and more shades of colors. Like the dynamic range side, the content has to be created to take advantage of this. Which brings us to the next part...

HDR TV shows and movies

The key aspect to getting HDR to work is the content. TVs with more range and more color are great in theory, but without movies or TV shows to take advantage of them, they'd just be making up how the colors and range should look (not something most people want their TVs doing; TVs are pretty dumb).

Turns out, all film-based content (pretty much everything from about 10 years ago and before), and the vast majority of digitally recorded content (pretty much everything now, and a mix in the last 10 years), has greater dynamic range than TVs were capable of showing. Or more specifically, than the medium was capable of transmitting. In other words, to send a non-HDR signal over the air, via satellite, or even on a Blu-ray disc, it has to get dumbed down compared to the original image captured by the camera.


HDR, on the right, is not "brighter" just for the sake of being brighter. Highlights and colors can be brighter thanks to a greater dynamic range. The addition of wide color gamut makes the colors richer and more realistic, too. (Keep in mind this is a single image at a single shutter speed and aperture, shown in compressed form on a non-HDR display, so don't take it as a literal example of what HDR looks like in person.)

Geoffrey Morrison

Technicolor and other companies have developed ways to go back to this older content and let directors remaster it for HDR, without necessarily "creating" anything artificial: just adding in what's there in the original, so you can see it in your home.

And because nothing is ever easy, there are currently two "flavors" of HDR: HDR10 and Dolby Vision. To the end user, there isn't much of a difference between the two; both have the same EOTF and many of the same technical underpinnings. What really matters is whether your TV can play it. Pretty much all HDR TVs will play back HDR10 content; some will also play Dolby Vision. You'll need to match the content's HDR format with what your TV can display. As in, if your TV is only HDR10, Dolby Vision content won't work.

For more info, check out HDR is TV's next big format war.

Bottom line

Because of its potential for improved picture quality and enthusiastic Hollywood support, we consider HDR a welcome advancement in home video. Right now it's only available on midrange to high-end 4K TVs from 2015 and 2016, but as with every other new technology, we expect prices to fall.

Will all HDR TVs perform equally well? No. There will always be good TVs and bad TVs. Will an HDR TV always offer better picture quality than a non-HDR TV? Not always, but for the most part, when running HDR content, they will.


Got a question for Geoff? First, check out all the other articles he's written on topics such as why all HDMI cables are the same, LED LCD vs. OLED, why 4K TVs aren't worth it and more. Still have a question? Tweet at him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his sci-fi novel and its sequel.