HDR. These three letters represent one of the most important developments in TV and monitor technology in recent years, and for good reason. They stand for High Dynamic Range, a technology that offers far greater control over luminosity than SDR, or Standard Dynamic Range. Where SDR confines every pixel to a narrow brightness range, HDR pairs each pixel's color value with a much wider range of brightness, allowing for far deeper blacks and much brighter highlights, which makes it an important factor in figuring out the best monitor for you.
A night scene on a proper HDR display will display almost true, deep blacks, but then a flashlight shining at the camera can be eye-searingly bright — an excellent feature for improved immersion and realism, no matter what you’re watching or playing.
But there are multiple standards for HDR performance and a variety of display technologies that can vastly alter the experience, so if you're shopping for a new computer monitor and you care about HDR, it helps to understand what's what before pulling the trigger.
DisplayHDR, HDR10, Dolby Vision … an ocean of standards
VESA, an industry group that sets a number of display standards in the PC industry (including the ubiquitous VESA mount and DisplayPort technologies), also defines standards for rating the HDR performance of PC monitors. These range from DisplayHDR 400 all the way up to DisplayHDR 1400, and they communicate to the consumer the peak brightness a particular display is able to achieve.
For example, DisplayHDR 400 is the lowest tier, requiring a peak brightness of just 400 nits, whereas the top tier, DisplayHDR 1400, requires a display to output at least 1,400 nits when a small but very bright area of the image demands it. From our experience testing, the lowest tier, DisplayHDR 400, isn't very useful, but that has a lot to do with the display technologies used to achieve it. More on that below.
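To make the tiers concrete, here's a minimal Python sketch. The tier names and minimum peak-brightness figures follow VESA's published DisplayHDR certification levels; the `meets_tier` helper is purely illustrative, not part of any VESA tooling.

```python
# Minimum peak luminance (in nits) a monitor must hit to carry each
# VESA DisplayHDR badge. The helper below is an illustrative check,
# not an official certification procedure.
DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def meets_tier(measured_peak_nits: float, tier: str) -> bool:
    """Check whether a measured peak brightness clears a tier's floor."""
    return measured_peak_nits >= DISPLAYHDR_PEAK_NITS[tier]

print(meets_tier(650, "DisplayHDR 600"))   # a 650-nit panel clears the 600 tier
print(meets_tier(650, "DisplayHDR 1000"))  # but falls short of the 1000 tier
```

Note that peak brightness is only one of the criteria; the real certification also tests contrast, color gamut, and dimming behavior.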
There are other techy terms like HDR10, HDR10+, and Dolby Vision that you may also run into. These are the protocols used for encoding and decoding HDR signals. On televisions, you'll often find support for several of them, but on PC monitors it's all about HDR10, so it's not something most monitor buyers need to worry about.
That said, a selection of television and cinema content comes encoded in Dolby Vision, so for professional video editors, there are a small handful of PC monitors that support this standard — but they’re prohibitively expensive.
How well a display is capable of rendering HDR images essentially boils down to one thing: How the display is illuminated. The vast majority of quality PC monitors nowadays use IPS displays, though VA panels are quickly rising in popularity thanks to their better contrast ratio and lower costs. But both of these are essentially the same thing when it comes to illumination: A variable color filter with a lamp behind it.
Nowadays, this lamp generally comprises a series of LEDs at the edge of a panel, hence the term “edge-lit,” and can be wired into multiple zones. That’s where we get into the difference between global dimming and local dimming.
An important piece of knowledge here is a panel's contrast ratio: its ability to block light when showing a black image. No LCD panel can block all of the light; IPS panels, despite offering excellent colors, "only" achieve static contrast ratios of around 1,000:1, meaning a white image is 1,000 times brighter than a black one. VA panels might not offer the same pixel response or color performance, but they do offer a much higher contrast ratio, often in the realm of 3,000:1, making for deeper, inkier blacks.
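The arithmetic here is simple enough to sketch: on an LCD, the darkest "black" the panel can show is just the white level divided by the contrast ratio. A toy calculation (the 400-nit figure is a hypothetical panel brightness, not a spec):

```python
def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Luminance of 'black' on an LCD: some backlight always leaks
    through, so black = white / contrast ratio."""
    return white_nits / contrast_ratio

# Hypothetical 400-nit panels: IPS at 1000:1 vs. VA at 3000:1.
print(black_level(400, 1000))  # 0.4 nits of glow on "black"
print(black_level(400, 3000))  # ~0.13 nits: visibly deeper blacks
```

This is why a VA panel's blacks look inky next to an IPS panel's at the same brightness: the leaked light is a third as strong.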
To create true blacks, the lighting itself needs to be dimmed.
With global dimming, a display has no individual zones it can control; instead, it changes the brightness of the entire backlight to meet the needs of the image. This is generally an ineffective way of rendering HDR, as it limits peak brightness and leaves the whole display illuminated in a dark scene that contains a single, small bright element. Fortunately, these displays are only allowed to carry the DisplayHDR 400 label and nothing higher. We don't consider these to be HDR-capable displays for that reason: although they have the controller technology to interpret HDR signals, they aren't capable of effectively reproducing them.
Make the jump to local dimming, and suddenly the experience vastly improves. For displays with local dimming, the edge lighting is split into multiple zones, often 8, 16, or 32 zones, almost always as vertical columns. Of course, this requires a more complicated design, so these monitors do cost a bit more.
Because the backlighting is split into multiple zones, the peak brightness of individual zones goes up. This comes down to power draw: if the entire display is bright, the maximum brightness is generally limited, but if most of the display is dark, an individual zone can light up far beyond the display's typical brightness and hit the DisplayHDR rated maximum. Is that night scene with the flashlight coming to mind?
But as you can imagine, this form of lighting is still imperfect. In that scene, most of the display will be dark except for the column containing the flashlight: the entire area above and below it will also be illuminated, emitting a bluish sheen while the rest of the display is dimmed to true black. This effect is less prominent on VA panels than on IPS thanks to their deeper black levels, but it's still visible.
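A toy model makes the column effect easy to see. This is an illustrative sketch, not any manufacturer's actual dimming algorithm: assume each vertical backlight zone is simply driven to the brightest pixel it has to show.

```python
# Toy model of edge-lit local dimming (illustrative only): the backlight
# is split into vertical column zones, and each zone is driven to the
# brightest pixel within it. A single bright spot therefore lights its
# entire column - the "illuminated column" halo described above.

def zone_backlight(frame, num_zones):
    """frame: 2D list of pixel luminance (rows x cols), values 0.0-1.0.
    Returns the backlight level of each vertical column zone."""
    cols = len(frame[0])
    zone_width = cols // num_zones
    levels = []
    for z in range(num_zones):
        zone_cols = range(z * zone_width, (z + 1) * zone_width)
        # Drive the zone to its brightest required pixel.
        levels.append(max(row[c] for row in frame for c in zone_cols))
    return levels

# A dark 4x8 scene with one bright "flashlight" pixel in column 5.
frame = [[0.0] * 8 for _ in range(4)]
frame[1][5] = 1.0

print(zone_backlight(frame, 8))  # one pixel -> one fully lit column zone
print(zone_backlight(frame, 2))  # coarser zones: the whole right half glows
```

The second call shows why fewer zones make the problem worse: the same single pixel now drags half the backlight up to full power.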
Full-Array Local Dimming (FALD)
That’s where Full-Array Local Dimming comes in. But let me warn you, displays with this technology are expensive.
FALD displays don’t use an LED strip at the edge of their panel to illuminate the image, but rather feature a full array (hence the name) of LEDs behind the panel for illumination. An array like this can feature over 1,000 zones, giving much more precise control over which area of the display is illuminated and eliminating the “illuminated column” effect. And because the bright zones are much smaller, they can often reach ludicrously high peak brightness.
But there are still drawbacks to this technology: power draw, cooling, and cost (a FALD array can easily double a monitor’s price), and it still isn’t perfect. Over 1,000 zones offers far finer control than a small handful of columns, but it still produces a slight halo effect, even if the overall HDR experience is far superior.
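A rough back-of-the-envelope comparison shows why more zones shrink the halo. In the worst case, one bright pixel forces an entire zone on, so the lit fraction of the screen scales with one over the zone count (both zone counts below are illustrative examples, not specs of any particular monitor):

```python
def halo_fraction(num_zones: int) -> float:
    """Worst-case fraction of the screen lit up by a single bright
    pixel, assuming equally sized zones (illustrative model)."""
    return 1 / num_zones

print(f"{halo_fraction(16):.2%}")    # 16 edge-lit columns -> 6.25% of the screen
print(f"{halo_fraction(1152):.3%}")  # a 1,152-zone FALD grid -> under 0.1%
```

The halo never fully disappears, but going from a handful of columns to a grid of over a thousand zones shrinks it by roughly two orders of magnitude.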
What about OLED?
If you want no halo effect, your best bet is an OLED TV. We say TV, because really, there aren’t any gaming OLED monitors out on the market. This is of course for a good reason: Burn-in. Unlike TV shows, movies, and games, computing subjects a panel to a lot of static images — and because each pixel is its own light source on an OLED panel, each pixel can wear out at its own rate. There are tricks to reduce burn-in, but even so, it makes sense that manufacturers are hesitant to put OLED monitors in consumers’ hands.
But if you’re willing to take that risk, OLED TVs make great HDR gaming monitors thanks to their grand size and incredible black levels without any haloing at all — there’s also no backlight bleed, or the glowing that you’ll see on traditional LCD panels. OLED TVs give you the best picture available today.
As far as the image goes, the main catch with OLED is peak brightness, which can be limited to about 600 nits, something manufacturers do to protect the lifespan of the panels. That said, for most potential buyers, 600 nits is plenty, especially for those gaming in darker rooms where peak brightness isn't as important.
Find what’s perfect for you
Having come this far, you may have noticed something: There is no such thing as the perfect monitor. Even if you throw unlimited money at the problem, you still have to make a compromise somewhere, and that’s probably not going to change for quite some time.
While there is no perfect HDR monitor, though, there may be one that's perfect for you. For the majority of gamers, the best recommendation is an edge-lit monitor with local dimming and a DisplayHDR 600 rating or above, based on a VA panel to combat the halo effect as well as possible without breaking the bank.
Although you'll still spot the halo effect in especially dark scenes and during desktop use, it fades into the background in most dynamic content and is hardly noticeable at all in daylight scenes.
If you're a content creator, you may need to drop a few extra pennies on a display with an IPS panel, and if you work at a professional level, one of the FALD types, but that isn't necessary for the vast majority of consumers.