Not too long ago, a technical manager at Samsung reached out to me about issues with something I wrote about HDR10+, one of the leading HDR standards in the home theater space. Normally I would just correct the existing article and be done with it, but this person pointed out that I was hardly alone in having these misconceptions — precisely because until recently, Samsung wasn’t making an effort to set the record straight. Indeed, it’s not like I was turning to shady sources for my info. These were details reputable sites have circulated for years.
Because of this, I decided it would be worth devoting an entire article to HDR10+ myths. Some of these may be things you already knew, but I’ll bet at least one or two facts will surprise you. They may even sway a future TV purchase, although you should mostly be keeping an eye on Dolby Vision 2 and HDR10+ Advanced.
Brightness is capped at 4,000 nits
Once, maybe, but not forever
One of the more important specs on a modern TV is peak brightness. This doesn't refer to the levels in the picture settings menu; rather, it describes what a TV is capable of in short bursts, usually in order to display HDR highlights. The higher this figure (measured in nits), the more brilliant those highlights can be. In fact, this is one of the advantages mini-LED has over OLED, although newer OLED sets are catching up somewhat.
HDR10+ is capable of hitting 10,000 nits, just like Dolby Vision. The real limit is your TV.
To exploit peak brightness, an HDR standard has to be engineered to match. Traditionally this has been claimed as one of the advantages of Dolby Vision, which is rated for up to 10,000 nits, a figure the most expensive TVs are only now starting to approach. Samsung tells me, however, that HDR10+ is equally capable in this regard. The myth of a 4,000-nit limit likely stems from the fact that when the standard emerged in 2017, the brightest available reference displays were in that ballpark.
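If you're wondering where that 10,000-nit ceiling comes from, it's baked into the PQ transfer function (SMPTE ST 2084) that HDR10, HDR10+, and Dolby Vision are all built on. Here's a rough Python sketch of the math, purely for illustration; the constants come straight from the ST 2084 spec:

```python
# Rough sketch of the SMPTE ST 2084 (PQ) EOTF shared by HDR10, HDR10+,
# and Dolby Vision. The 10,000-nit ceiling isn't a vendor limit; it's
# baked into the transfer function itself.

m1 = 2610 / 16384        # ST 2084 constants
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Map a normalized PQ signal value (0.0 to 1.0) to luminance in nits."""
    e = signal ** (1 / m2)
    return 10_000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {s:.2f} -> {pq_to_nits(s):,.1f} nits")
# The maximum signal value (1.0) decodes to exactly 10,000 nits.
```

In other words, any ceiling you run into at the top end comes from your TV and its tone mapping, not from the format carrying the signal.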
Does the difference between 4,000 and 10,000 nits matter? Not so much. While it’s good to have that range for the sake of accuracy and futureproofing, most current TVs are unlikely to top 5,000 nits, and even that’s absurdly bright. Consider that the 3,000-nit display on my Apple Watch Ultra 2 is perfectly readable in the midday sun — shining that light in a darkened room could hurt your eyes. There’s really no need to max out brightness under normal viewing conditions. You might hypothetically appreciate higher peaks if you’re trying to watch HDR video in a sunlit room.
Color accuracy tops out at 10-bit
Beyond the infinite
When sites talk about HDR, the focus is usually on the highlight and shadow detail it provides, but another benefit is color accuracy. All the major HDR standards support at least 10-bit color depth, as opposed to SDR's 8 bits, which lets a TV display over a billion shades rather than roughly 16.7 million. Many of these are indistinguishable to the human eye, it's true, but HDR does tend to look more colorful in general, and if nothing else, it's nice to know you're getting the most accurate reproduction of a movie or TV show.
It’s actually possible to generate 16-bit HDR10+ metadata — it’s just a question of TV hardware and the content itself catching up.
In the past, I've treated 12-bit color depth, which allows for over 68 billion colors, as another Dolby Vision advantage. In reality, HDR10+ can easily keep up, and any form of 12-bit color is going to offer marginal improvements regardless. That's because consumer TVs don't yet support native 12-bit pipelines; what you're getting today is converted down to 10 bits, at best.
HDR10+’s color support is well beyond what any TV needs. It’s actually possible to generate metadata for 16-bit color — it’s just a question of TV hardware and the content itself catching up, since a lot of what you’re streaming is 10-bit. If there’s a difference between Vision and HDR10+, it’s more likely to come from the metadata involved, and possibly features like Dolby Vision IQ. Even then, IQ has an equivalent in the form of HDR10+ Adaptive.
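If you want to sanity-check the color counts mentioned above, the arithmetic is simple: shades per channel are 2 raised to the bit depth, and total colors are that figure cubed for red, green, and blue. A quick Python illustration, not tied to any particular standard's signal path:

```python
# Back-of-the-envelope math behind the bit-depth figures in this section:
# shades per channel = 2 ** bits, and total colors = that number cubed
# (one channel each for red, green, and blue).

for bits in (8, 10, 12, 16):
    shades = 2 ** bits
    colors = shades ** 3
    print(f"{bits:>2}-bit: {shades:>6,} shades per channel, {colors:,} colors")

# 8-bit SDR works out to about 16.7 million colors, 10-bit to about
# 1.07 billion, 12-bit to the "over 68 billion" figure, and 16-bit to
# roughly 281 trillion.
```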
Content availability is scattershot
Watch out on Blu-ray
Widespread adoption of HDR10+ is a relatively recent phenomenon. While the standard dates back to 2017, streaming services didn’t rush to support it — its biggest initial platform was Amazon Prime Video. For a while, the only HDR formats you could count on (outside of broadcast TV) remained Dolby Vision and the vanilla version of HDR10.
As of this writing, though, HDR10+ has become extremely common, at least in the US. Samsung claims that it’s present for every HDR title not just on Prime Video, but on Hulu, Netflix, Apple TV, and Paramount+. It’s also an option for YouTube creators, though it may be easier for some to turn to HLG, which doesn’t depend on custom metadata.
Where things are truly scattershot is the Blu-ray scene. For three years, Dolby Vision was the only choice for dynamic HDR. Even some more recent Blu-rays omit HDR10+, a couple of random examples being 2022’s The Northman and 2024’s Alien: Romulus. There are plenty of discs that do offer support, such as 2022’s Elvis and the recent re-releases of Jaws and Back to the Future — but if you have a TV that lacks Vision compatibility, you may end up watching important movies in HDR10 despite your best efforts.
Gaming support is lacking
Still not what it should be
Admittedly, I’m cheating a bit with this one, since HDR10+ support has been genuinely weak for a while, and could still be described that way. It’s not built into Windows 11 or any console, and when I last checked, the format’s official website listed just 13 compatible games, among them Battlefield 6, Cyberpunk 2077, and Hell is Us. On top of this, you need an Nvidia or Intel GPU paired with a compatible monitor. You’re out of luck if you’re using an AMD Radeon card.
The turning point may be the upcoming PlayStation 6 and next-generation Xbox (a.k.a. Project Helix). If those don’t have HDR10+, the format could continue being relegated to movies and shows.
The situation seems poised to change, however. The format is already being integrated into more games, like Crimson Desert, and Samsung has motivation to push the industry further via partnerships, given that its monitors and TVs don’t support Dolby Vision. From one perspective, it’s a bit ridiculous that HDR10+ isn’t everywhere already — it’s royalty-free, unlike Vision. It may be that a lot of developers (and AMD) feel the benefits aren’t really worth it yet, since it’s possible to approximate it well enough without integrating it into rendering pipelines. I’ve seen surprisingly decent HDR effects in games using Windows 11’s Auto HDR feature.
The turning point may be the upcoming PlayStation 6 and next-generation Xbox (a.k.a. Project Helix). If those don’t have HDR10+, the format could continue being relegated to movies and shows. If they do incorporate it, support in games still isn’t guaranteed, but there should be more incentive to work on it.