HDR10 Vs HDR10+ Vs Dolby Vision – Do the Differences Matter?
When we watch content, we naturally want it at the highest possible quality. For that reason, we buy TVs that are larger and boast a higher, though not necessarily better, resolution.
One of the parts of that HD experience is the dynamic range. When we watch videos and play games in normal circumstances, we view them in standard dynamic range or SDR. If we want to see the content in its best light, quite literally, we should watch it in HDR or high dynamic range.
With there being multiple standards and iterations of HDR, terms like HDR10, HDR10+ and Dolby Vision bring a bit of confusion to the market.
To clear everything up, let us look closely at HDR in general and then its most common standards and technologies.
What Is HDR?
High dynamic range is a way of capturing and displaying images and video with enhanced highlights, shadows and overall color quality. HDR typically uses a higher bit depth, often 10 bits per channel or more, as opposed to the 8 bits per channel most displays use.
HDR content is detailed in all areas: you can make out the details of a bright sky as well as the dark shadows of a nearby tunnel. Colors also appear richer and smoother, and that is because of the higher bit depth per channel.
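The jump from 8 to 10 bits per channel is bigger than it sounds. A short Python sketch makes the arithmetic concrete (the helper names are just for illustration):

```python
def channel_values(bit_depth):
    """Number of distinct levels one color channel can encode."""
    return 2 ** bit_depth

def total_colors(bit_depth):
    """Total colors an RGB pixel can show at the given per-channel bit depth."""
    return channel_values(bit_depth) ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {channel_values(bits)} levels/channel, "
          f"{total_colors(bits):,} total colors")
# 8-bit:  256 levels/channel,  16,777,216 total colors
# 10-bit: 1024 levels/channel, 1,073,741,824 total colors
# 12-bit: 4096 levels/channel, 68,719,476,736 total colors
```

Going from 8 to 10 bits multiplies the number of representable colors by 64, which is what smooths out the banding in gradients like skies.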
In video and game reproduction, HDR makes everything more akin to how our eyes would perceive it, but that also depends on the format used, as well as the HDR standardization of the display.
Do the Displays Matter?
The display you view HDR content on matters a lot, mostly because it needs to support HDR properly. There are various VESA DisplayHDR certification tiers, and the most common are 400, 600 and 1000, with some tiers in between and above.
DisplayHDR 400 displays are considered entry-level and are often poor at reproducing HDR, while DisplayHDR 1000 displays can reproduce HDR content well.
The number in each tier refers to the peak brightness of the monitor, measured in candelas per square meter (cd/m², also called nits). However, the certification does not specify how long the display must sustain that peak brightness. The tiers also cover other properties, such as the maximum black luminance, which should be as low as possible and is measured in cd/m² as well.
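The relationship between the tier name and peak brightness can be sketched with a tiny lookup. The brightness floors come straight from the tier names; `meets_tier` is a hypothetical helper, not part of any VESA tooling:

```python
# The number in each VESA DisplayHDR badge is the minimum peak brightness,
# in cd/m² (nits), a certified display must reach.
DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
}

def meets_tier(measured_peak_nits, tier):
    """Check whether a measured peak brightness clears a given tier's floor."""
    return measured_peak_nits >= DISPLAYHDR_PEAK_NITS[tier]

print(meets_tier(650, "DisplayHDR 600"))   # True: a 650-nit panel clears 600
print(meets_tier(650, "DisplayHDR 1000"))  # False: it falls short of 1000
```

Keep in mind that the real certification also tests black levels, color gamut and more; peak brightness is just the headline number.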
Enabling HDR adds processing overhead and can increase input lag on some displays, which is one reason HDR is more common in TVs than in gaming monitors.
What is HDR10?
Standardization is important if you want to make progress and have multiple companies create displays and content which can be viewed on those displays. In 2015, the Consumer Technology Association released the HDR10 standard, the most widespread and easiest to implement HDR format.
HDR10 is an open, royalty-free standard and covers the essentials, such as 10 bits per channel and a wider color gamut, but it falls short when it comes to metadata. HDR10 uses static metadata: the tone-mapping information is set once for the entire piece of content and does not adjust to the display's capabilities, beyond overall brightness.
HDR10+ – Better In Every Way
On the other hand, HDR10+, released in 2017, updates the HDR10 standard with dynamic metadata. Still an open, royalty-free standard, it is more widely supported by programs, displays, and devices such as consoles and even smartphones.
HDR10+ can address each scene, or even each frame, of a video or game and adjust the dynamic range accordingly. This means the content adapts to the display and presents itself to the best of the display's abilities, the way the content creator intended.
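The static-vs-dynamic difference can be illustrated with a toy tone-mapping sketch in Python. This is a conceptual simplification, not the actual HDR10+ (SMPTE ST 2094-40) curves, and the scene luminance values are made up:

```python
def tonemap_scale(scene_peak_nits, display_peak_nits):
    """Scale factor needed to fit a scene's brightest highlight on the display.
    A scene dimmer than the display's peak needs no compression (scale 1.0)."""
    return min(1.0, display_peak_nits / scene_peak_nits)

scenes = [120, 900, 4000]   # per-scene peak luminance in nits (illustrative)
display_peak = 600          # a mid-range HDR display

# Static metadata: one scale derived from the title's single brightest scene,
# applied everywhere -- dim scenes get compressed far more than they need.
static_scale = tonemap_scale(max(scenes), display_peak)

# Dynamic metadata: each scene gets its own scale.
dynamic_scales = [tonemap_scale(p, display_peak) for p in scenes]

print(static_scale)    # 0.15 applied to every scene
print(dynamic_scales)  # [1.0, 0.666..., 0.15]
```

With static metadata, the 120-nit scene is squeezed as if it were a 4000-nit one; with dynamic metadata it passes through untouched, which is the practical benefit HDR10+ and Dolby Vision deliver.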
Dolby Vision – Better or First?
Dolby Vision is an alternative HDR format, released in 2014. It uses 12 bits per channel and can therefore display more colors. Given that it came first, and that it used dynamic metadata from the start, Dolby Vision gained a lot of traction, at least until the release of HDR10+.
Dolby Laboratories asks a rather steep price for its technology to be used in content creation, and for a time only larger studios used it. Royalties also have to be paid, so success comes at a rising price.
Which of The Formats is Better?
When comparing the formats, the specifications alone eliminate some contenders. Dolby Vision vs HDR10 is a losing battle for HDR10, as is HDR10 vs HDR10+.
The formats with dynamic metadata win over the format with static metadata, hands down. This is also why most companies moved to HDR10+ once it was announced and proven to work reliably.
HDR10+ Vs Dolby Vision – What About the More Detailed Formats?
This is where the lines get blurred in terms of what the better choice would be. Dolby Vision has a higher bit depth and was first to the party, but it costs money to implement and incurs royalties afterwards.
HDR10+ is open, free and widely supported across the ecosystem, including over HDMI 2.1. Its 10-bit depth is technically inferior, but the difference is rarely visible unless you were to view the same content in Dolby Vision and HDR10+ side by side, which is highly unlikely to happen.
Conclusion and Summary – HDR10+ and Dolby Vision Are Both Great
Watching either Dolby Vision or HDR10+ content on a display that can properly reproduce HDR makes for a better experience, no matter the format. The quality and panel type of the display matter for gaming and HDR content alike.
Most HDR monitors use IPS or OLED panels, some use VA, and none use TN. In the end, the format doesn't matter much, as long as it supports dynamic metadata and your display can reproduce the content with good peak and sustained brightness and little to no blooming.