We run through the differences between HDR10 and Dolby Vision, two HDR formats commonly found in modern TVs
Directly competing technologies have a habit of cropping up simultaneously, especially when it comes to the realm of home entertainment. Most of us will remember the consumer technology wars waged by VHS against Betamax, LCD against plasma and HD DVD against Blu-ray.
Today, in a world filled with High Dynamic Range (HDR) televisions, new battle lines have been formed between the various HDR formats. The list is large and growing: HDR10, HDR10+, Hybrid Log-Gamma and Dolby Vision are the buzzwords of the moment, and Technicolor is trying to gain traction for its own format too. With so many types of HDR out there, it’s no wonder consumers are getting confused.
In this article, we’ll be looking specifically at Dolby Vision, the licensed HDR format from Dolby Laboratories, and the open-source HDR10 standard that’s supported by most HDR-capable televisions today. If you’re not sure why HDR even matters, check out our ‘What is HDR TV?’ primer to find out. Read it already? Well, let’s carry on then.
READ NEXT: The best 4K HDR TVs you can buy right now
Dolby Vision vs HDR10: What are the key differences?
HDR10 and Dolby Vision are both forms of HDR but they have several key differences, none of them very simple. Thankfully, you don’t necessarily need to make a decision about which one to invest in, because many manufacturers and streaming services support them together. HDR10 remains the most common form of HDR but Dolby Vision has gained a firm foothold in recent years. With the right home cinema setup, you can reap the benefits of both.
In the comparison table below you can see how HDR10 and Dolby Vision differ in technical terms; there’s a lot to take in, from metadata technology to colour depth (aka bit depth) and peak luminance. These terms may seem bewildering at first, but don’t worry – we’ll get into all the gritty details soon enough.
| HDR10 | Dolby Vision |
|---|---|
| Open-source | Licensed by Dolby |
| Static metadata | Dynamic metadata |
| Up to 10-bit colour depth (1 billion colours) | Up to 12-bit colour depth (68 billion colours) |
| Up to 1,000 nits max brightness | Up to 10,000 nits max brightness |
| Widespread hardware/streaming support | Hardware/streaming support growing fast |
On paper, Dolby Vision appears to have the upper hand over HDR10, offering far higher peak brightness and a massively wider spectrum of colours. HDR10 comes out on top when it comes to availability, however, because it is supported by more Blu-ray players and 4K TVs.
None of Samsung’s current HDR TVs can play Dolby Vision content, for example, and it wasn’t until June 2017 that the first Dolby Vision Blu-rays were released. Then again, AV companies like Sony, LG and Panasonic include Dolby Vision across much of their home cinema ranges, even extending down to base-spec 4K TVs in some cases.
What about streaming? Dolby Vision content can be found on Netflix, Amazon Prime Video and Apple TV, among other places. It is in scarce supply compared to HDR10 content on these same services, but that’s changing with time, as more and more films and shows – especially Netflix Originals – are being mastered for Dolby Vision.
READ NEXT: The top 4K and HDR Blu-ray players
Dolby Vision vs HDR10: Colour bit depth and brightness
Most of the differences between the competing standards arise around colour bit depth and brightness. Dolby Vision films are mastered in up to 12-bit colour, whereas HDR10 is mastered for 10-bit colour, hence the name. Films are colour graded specifically for Dolby Vision and studios have to provide artistic approval of Dolby’s mastering.
The difference that 12-bit makes is that Dolby Vision has 4,096 possible values per RGB channel versus HDR10's 1,024, meaning finer gradations of colour. A 10-bit colour depth amounts to over 1 billion colours, whereas 12-bit opens that up to over 68 billion. Needless to say, both offer vastly more colours than today's non-HDR sets, which make do with just 256 values per channel for around 16 million colours. All Dolby Vision-branded TVs will support the 12-bit colour depth required.
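The arithmetic behind those figures is straightforward: an n-bit channel gives 2^n shades per primary, and the total palette is that number cubed across red, green and blue. A quick sketch:

```python
# Shades per channel and total colours for 8-, 10- and 12-bit depths.
for bits in (8, 10, 12):
    shades = 2 ** bits      # values per red/green/blue channel
    colours = shades ** 3   # total RGB combinations
    print(f"{bits}-bit: {shades:,} shades per channel, {colours:,} colours")
```

Running this gives 256 shades (roughly 16.8 million colours) at 8-bit, 1,024 shades (about 1.07 billion colours) at 10-bit and 4,096 shades (about 68.7 billion colours) at 12-bit, matching the figures above.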
^ Image from Dolby Vision white paper
Dolby Vision also allows films to be mastered at up to a theoretical 10,000 nits of brightness. This is vastly higher than the 100 nits that SDR Blu-ray discs and broadcast television programmes are mastered to. HDR10 content, on the other hand, is typically mastered at around 1,000 nits, depending on the content. This extra brightness is part of how HDR televisions achieve their higher dynamic range, through heightened contrast.
At present, there aren’t any displays capable of 10,000 nits, even in Hollywood’s mastering suites, so most Dolby Vision content is typically mastered at around 4,000 nits. Meanwhile, the most high-end consumer sets top out at around 2,000 nits. Dolby Vision is future-proofed in the sense that the metadata included can take advantage of improved displays down the line, however.
READ NEXT: HDR TVs explained
Dolby Vision vs HDR10: Static and dynamic metadata
Whereas HDR10 is a widely supported open standard, it's quite restricted in how content is produced and played back compared to Dolby Vision. That's down to HDR10's use of static metadata, which limits HDR performance, especially on cheaper LCD HDR TVs with lower brightness levels. Dolby Vision has the advantage of dynamic metadata: information provided by content creators that dictates how the content is played on your TV at home.
Dedicated chips inside Dolby Vision players and televisions communicate their capabilities, such as colour space and peak brightness, so the signal can be optimised for the particular screen on a frame-by-frame basis. Dolby says this ensures hues are better preserved, leading to improved skin tones among other benefits.
It's this dedicated hardware that means Dolby Vision can't simply be added to a player or television through a firmware update. What the chip ensures is a more precise, consistent output that plays to the device's strengths. HDR10, by contrast, leaves it to the television to decide how to output the image, using a single set of static metadata for the entire film. That means Dolby Vision can actually benefit cheaper 4K sets and poorly calibrated televisions to a greater degree than the top-end sets owned by obsessive home cinema types.
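To make the static-versus-dynamic distinction concrete, here's a deliberately simplified sketch; it is not the real HDR10 or Dolby Vision tone-mapping maths, and the scene values and 600-nit display are invented for illustration. With static metadata, every scene is squeezed by the single mastering peak; with per-scene (dynamic) metadata, a dim scene can keep its original luminance:

```python
# Toy tone mapping: squeeze mastered luminance (nits) onto a display's peak.
# Illustrative only; not the actual HDR10/Dolby Vision algorithms.

DISPLAY_PEAK = 600.0  # hypothetical mid-range TV, in nits

def tone_map(scene_nits, reference_peak):
    """Linearly scale a scene so reference_peak lands on the display's peak."""
    scale = min(1.0, DISPLAY_PEAK / reference_peak)
    return [min(DISPLAY_PEAK, n * scale) for n in scene_nits]

scenes = [
    [50, 120, 300],    # dim interior scene, peaks at 300 nits
    [200, 900, 4000],  # bright highlight scene, peaks at 4,000 nits
]

# Static metadata: one mastering peak (say, 4,000 nits) applied to every scene.
static = [tone_map(s, 4000) for s in scenes]

# Dynamic metadata: each scene carries its own peak, so dim scenes keep detail.
dynamic = [tone_map(s, max(s)) for s in scenes]

print(static[0])   # dim scene darkened far below its mastered levels
print(dynamic[0])  # dim scene passes through unchanged
```

The dim scene is the telling case: statically it gets scaled down along with everything else, while the dynamic path recognises it never exceeds the display's capability and leaves it alone.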
Both Dolby Vision and HDR10 use the SMPTE ST 2084 Electro-Optical Transfer Function (EOTF), which lets content creators master once for both formats, adding the optional dynamic metadata that gets Dolby Vision displays performing optimally. It's worth noting that there is now an extension of HDR10, called HDR10+, that uses dynamic rather than static metadata. It also happens to be royalty-free, despite being the result of a collaboration between Samsung, Panasonic and 20th Century Fox.
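For the curious, the ST 2084 "PQ" curve is a published formula that maps a normalised 0–1 code value to absolute luminance up to 10,000 nits. A minimal implementation using the constants from the standard looks like this:

```python
# SMPTE ST 2084 (PQ) EOTF: normalised code value (0..1) -> luminance in nits.
# Constants as defined in the standard.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code):
    """Decode a PQ-encoded signal value to absolute luminance (cd/m2, i.e. nits)."""
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0.0 nits: black
print(pq_eotf(1.0))  # 10000.0 nits: the format's ceiling
```

The steep shape of this curve is why PQ-based formats can describe both near-black shadow detail and 10,000-nit highlights in the same signal, even though no current display reaches the top of that range.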
READ NEXT: What size TV should you buy for your home?
Dolby Vision vs HDR10: So which is best?
So should you opt for a Dolby Vision-capable television rather than one that only supports the standard HDR10 format? As mentioned earlier, there are benefits and drawbacks to both formats, but today Dolby Vision is generally considered to be the pinnacle of HDR technology due to its potential for larger bit depth and higher brightness. And, like HDR10+, its encoded dynamic metadata allows the content to communicate with the playback display to optimise HDR performance for each individual frame.
In general, cheaper HDR-capable TVs will only offer HDR10 and, perhaps, HDR10+. The good news is that any HDR TV which supports Dolby Vision is certain to support HDR10 playback as well (Samsung's sets, as noted, skip Dolby Vision entirely). This means that, even if you can't find something to watch in Dolby Vision, you'll still be able to fall back on HDR10 content. In 2020 we're beginning to see companies like Panasonic rolling Dolby Vision out to their cheapest 4K TVs, so it's likely to become more and more accessible from here on out.
HDR is not going away anytime soon, so purchasing a TV that is HDR-capable will help you future-proof your home entertainment setup for years to come. The more formats the TV supports, the better – but some is better than none. Not sure which TV you should buy? Check out our roundup of the best 4K HDR TVs on the market.