
What is refresh rate?

What do 120Hz, 240Hz, 600Hz really mean for the picture quality of your new HDTV?

Geoffrey Morrison

Original video frames (1 and 2) at 60 frames per second aren't enough to fill 120Hz and 240Hz LCDs. Duplicating the original frames is one method. Alternately, frames can be interpolated to fill the gaps. In this example, the TV's processor creates frame 1a from the difference between 1 and 2. This (along with 2a, 3a, etc.) makes up the difference between 60Hz video and 120Hz TVs. Geoffrey Morrison/CNET

With 120Hz, 240Hz, and even 600Hz, refresh rate gets a lot of attention in the marketing of new HDTVs.

What it is and how it works is interesting, but why it exists is even more so. And it can have a profound effect on the picture quality of your HDTV.

Curious?

Let's start with the basics. Television is a series of images, shown rapidly enough that your brain sees it as motion.

In the U.S., our electricity runs at 60Hz, so it's only natural that our TVs run at the same rate (elsewhere, 50Hz is common). This is largely a holdover from the CRT days, but our entire system is based on it, so there's no use changing it.

What this means is that modern HDTVs show 60 images per second (60Hz). For a refresher on progressive scan (720p, 1080p) and interlaced (1080i), check out "1080i and 1080p are the same resolution."

Upping the frame rate
A few years ago, LCDs hit the market with higher refresh rates. These started at 120Hz, though now you'll see 240Hz and beyond. In this case, higher is indeed better, but to understand why it's better, we have to discuss why it exists in the first place.

All LCDs have a problem with motion resolution: when there's an object in motion onscreen (or the whole image is moving), the image blurs compared with when the object/scenery is stationary. In the early days of LCDs this was predominantly because of the "response time," or how fast the pixels could change from light to dark. Response times on modern LCDs are quite good, so this isn't the big issue anymore.

 
The top is full motion resolution. The bottom half is a representation of what motion blur looks like. Notice how the dolphin on the right is blurred compared with the other three and the rest of the image. Geoffrey Morrison/CNET

The issue is how your brain interprets motion. Because it's your brain, everyone is going to see motion resolution somewhat differently. Some people don't notice motion blur. Some people aren't bothered by it. Some (like me) notice it quite often and are bothered by it. Others, like our David Katzmaier, know it exists but don't notice it enough in normal program material to consider it a major factor in picture quality.

There are two primary ways to fool the brain into seeing better detail with LCDs: backlight flashing (also called backlight scanning) and frame insertion.

Backlight flashing is what it sounds like. The most basic version of backlight flashing is the backlight going dark in between video frames. This moment of darkness is much like how a film projector works: an image, then darkness, an image, then darkness, and so on. Done slowly, this can result in flicker. Done fast enough, you don't notice it. A more advanced version, called backlight scanning, dims sections of the backlight in sequence with the video. In either case, the side effect is a loss of light output (sometimes significantly), because there are sections of time where the backlight is literally off (or close to it). There is another way of doing this called black-frame insertion, which shows a black image in between the real frames, but that doesn't actually manipulate the backlight.
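
The timing is easy to picture in code. Here's a minimal Python sketch of the black-frame-insertion idea, assuming a hypothetical display_frame() function standing in for the panel electronics; a real TV does this in hardware, not software.

```python
import time

def display_frame(frame):
    """Hypothetical stand-in for whatever actually drives the panel."""

def show_with_bfi(frames, panel_hz=120):
    slot = 1.0 / panel_hz        # each 120Hz slot lasts about 8.3ms
    for frame in frames:         # 60fps source: two slots per frame
        display_frame(frame)     # the real frame in the first slot
        time.sleep(slot)
        display_frame(None)      # black in the second slot, like a film
        time.sleep(slot)         # projector's shutter; note the panel is
                                 # dark half the time, so light output drops
```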

 
Panasonic's "1920bls" backlight scanning technology. It's like a scrolling dimming of the LED edge lighting to darken rows of LEDs in sequence. It does this very quickly. Panasonic

With 120 and 240Hz displays, there's another option: frame insertion. This method, also called frame interpolation, actually creates entirely new frames of video to insert in between the "real" frames of video. With video sources, like live TV, sports, and video games, there's very little downside to this method. You get excellent motion resolution, and you maintain the light output of the display. The image at the top of this article is an example of frame interpolation.
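
To make the idea concrete, here's a rough Python sketch of frame insertion. Real TV processors estimate per-pixel motion to build frame 1a; this simplified version just cross-fades neighboring frames, which shows the structure (one synthetic frame between each real pair) without the motion search.

```python
import numpy as np

def interpolate_frame(frame1, frame2, t=0.5):
    # Synthesize an in-between frame. A real processor uses motion
    # estimation; a plain cross-fade is the simplest stand-in.
    a = frame1.astype(np.float32)
    b = frame2.astype(np.float32)
    return ((1 - t) * a + t * b).astype(frame1.dtype)

def sixty_to_120hz(frames):
    # Insert one synthetic frame (1a, 2a, ...) between each real pair.
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)                          # real frame: 1, 2, 3...
        out.append(interpolate_frame(cur, nxt))  # synthetic: 1a, 2a...
    out.append(frames[-1])
    return out
```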

However, with film/24fps content (movies, most scripted TV shows), there's an issue. The interpolated frames smooth out the inherent juddery motion of 24fps content. On the surface this may seem like a good thing, but the resulting ultrasmooth motion makes movies look like soap operas. Fittingly, this is called the Soap Opera Effect. We and many TV companies call it "dejudder." Personally, I find motion-interpolated video annoying to watch. In some viewers, it even causes nausea. Some people don't mind it, which I find rather shocking. Check out "What is the Soap Opera effect?" for more on this "feature."
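
Dejudder is the same machinery aimed at film. A 120Hz panel divides evenly into 24fps (24 x 5 = 120), so the processor creates four synthetic frames between each pair of film frames. A sketch, reusing interpolate_frame() from above:

```python
def dejudder_24_to_120(frames):
    # Four evenly spaced synthetic frames between each pair of 24fps film
    # frames yield perfectly smooth 120fps motion: the Soap Opera Effect.
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        for t in (0.2, 0.4, 0.6, 0.8):
            out.append(interpolate_frame(cur, nxt, t))
    out.append(frames[-1])
    return out
```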

Most modern 120/240Hz TVs have one or both versions of this technology, and you can select which to use (if any). In some bizarre cases, like the Cinema mode in Panasonic's WT50 series of LED LCDs, you're locked into the motion interpolation. The trade-off, usually, is that if you don't use motion interpolation or backlight scanning, you don't get full motion resolution.

Personally, I find the step from 60Hz to 120Hz to be noticeable, and worth the additional money. The step from 120Hz to 240Hz is a far more modest improvement.

Nah, let's make it more confusing
Marketing being what it is, companies are now obfuscating their TVs' actual refresh rates. Samsung, LG, Sony, Vizio, and Sharp have stopped being honest about their refresh rates entirely, instead adopting bespoke motion-resolution ratings called "Clear Motion Rate," "TruMotion," "Motionflow XR," and so on.

In all cases, these companies use backlight scanning and/or extra processing to imply that their TVs have higher refresh rates than they actually do. So, for example, a Clear Motion Rate of 120 could be a 60Hz TV with a scanning backlight, or it could be a 120Hz LCD without a scanning backlight. The spec sheets for these televisions rarely, if ever, list the actual panel refresh rate.

Gary Merson did an excellent article on this at HDGuru called "Beware of phony LCD HDTV refresh rates." CNET's 2013 TV reviews always specify the true panel refresh rate, not the phony one.

Plasma's '600Hz'
This part is complex enough that it has its own article. For the full story, check out "What is 600 Hz?"

The abridged version is this: Because plasmas don't suffer from motion blur like LCDs, they don't need higher refresh rates. The problem is, all plasma TV manufacturers also make LCDs. So you're not going to see a big marketing push from any of them saying "No, no, buy our cheaper plasmas because they don't suffer from motion blur (or poor off-axis response, or poor contrast ratios)." With the intense marketing of 120Hz and 240Hz, many consumers assumed plasma was lagging behind, fitting into their erroneous preconceptions that plasma is somehow an "older" technology.

Instead, all three plasma manufacturers (LG, Panasonic, and Samsung) have adopted the "600Hz" claim. As Obi-Wan Kenobi said in "Star Wars Episode VI: Return of the Jedi," "...what I told you was true, from a certain point of view."

"From a certain point of view?" As an engineer once eloquently explained to me: plasmas create light with time. Each pixel in a plasma has only two states: on or off. (In that way, they're a completely digital device, unlike LCD, which can still be analog, but that's fodder for an entirely different article.)

Because plasma pixels only have two states, they create different levels of brightness by flashing more or less often. This is where the 600Hz comes in. In the most basic explanation, plasmas break up each frame of video into 10 subfields (60Hz x 10 = 600). If the pixel is supposed to be bright white, it flashes once for each of those subfields. If it's supposed to be 50 percent bright (50 IRE, or medium gray), it flashes for half of those 10 subfields. When it's supposed to be dark, it doesn't flash at all.
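
That basic explanation maps directly to code. This Python sketch uses 10 equal subfields, matching the simplified description above; real plasmas actually weight their subfields unevenly to get finer gradations.

```python
def subfield_pattern(brightness, subfields=10):
    # Which of a frame's 10 subfields a pixel fires in. 60 frames per
    # second x 10 subfields = 600 flash opportunities a second: "600Hz."
    lit = round(brightness * subfields)
    return [i < lit for i in range(subfields)]

print(subfield_pattern(1.0))  # bright white: fires in all 10 subfields
print(subfield_pattern(0.5))  # medium gray: fires in 5 of the 10
print(subfield_pattern(0.0))  # black: never fires
```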

DLP works by a similar principle: each mirror is either on (facing the lens) or off (facing away).

In reality, it's a bit more complicated than this, but this is the general idea. There are other pros and cons to this method that are beyond the scope of this article, but if anyone really wants me to dive into it, let me know.

So "600Hz" is more or less a marketing thing, but it's not untrue. The fact is, plasmas don't suffer from motion blur like LCDs do, so they don't need higher refresh rates.

Source
If you're looking for motion blur in your own TV, keep in mind that in some cases, there's going to be blur in the source. This is most common with movies shot on film. Fast motion will blur on film because of its low frame rate.

Personally, I notice motion blur most in close-ups. When an actor's face fills the screen, for a moment he or she will be stationary, and you'll see every bit of facial detail. Then he or she will move slightly, and the image will blur. I see this across all different types of source material.

Bottom line
Refresh rate is how often the TV shows a new image. Anything above 60Hz is entirely the invention of the TV itself. All modern video is either 24 frames per second (movies and most TV shows), 60 fields per second (1080i video), or 60 frames per second (720p video). Higher refresh rates are used to increase apparent motion resolution with LCDs. The "600Hz" of plasmas is largely marketing, but it does describe how they actually work.

If you're annoyed by motion blur, you're better off getting the highest-refresh-rate LCD you can find, or sticking with plasma (or OLED). Though it's worth mentioning that the processing that allows high-refresh-rate TVs to work can sometimes cause input lag.

However, not everyone notices, or is bothered by motion blur. I do/am, and it's one of the main reasons I prefer plasma over LCD (the other being contrast ratio). As mentioned earlier, David doesn't/isn't. We both have highly critical eyes when it comes to TVs, but because the perception of motion blur is so subjective, we're both right. Are you bothered by motion blur? Comment below; I'm curious.

And most important, because your source is 24 or 60fps, you do not need special HDMI cables with a 120 or 240Hz TV. If a salesperson tells you that you do, he or she is either clueless or lying. For more on that, check out "Why all HDMI cables are the same."


Got a question for Geoff? First, check out all the other articles he's written on topics like HDMI cables, LED LCD vs. plasma, Active vs Passive 3D, and more. Still have a question? Send him an e-mail! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.