HDR For Monitors: What is It? And is it Worth It?
It’s 2019 and HDR is still tangled in competing standards, but you can already take advantage of this new display technology.
High Dynamic Range (HDR) is a display technology aimed at lifelike, realistic imagery. It is easier to understand in the TV world, where the landscape is less chaotic than on PC. Either way, you can count on improved brightness, better contrast, and a wider, more accurate color range.
The boost from HDR is often more noticeable than a jump in resolution, say from 2K to 4K: you are more likely to spot HDR on a monitor than the extra pixels.
What is HDR?
It’s all about the way we perceive the world around us. HDR is as close as a display can get to the natural world: blacker blacks, brighter whites, and balanced hues of blue, red, orange, and everything in between. You will see every shade of every color on an HDR display. No more greys passing for white, or blacks that look grey.
SDR vs. HDR
HDR supersedes SDR (Standard Dynamic Range). SDR is limited to a fraction of the dynamic range an HDR monitor can deliver: it does not preserve the contrast ratio, so scenes are not rendered as they should be. SDR also struggles to preserve light in optical phenomena, such as transparency in glass or the glint of jewelry and stars.
On a cloudy day, you can clearly see different shades of color, including the sun hiding behind the clouds. The same scene rendered in SDR looks quite different: the subtle colors are lost, and the sun may not even appear as a shiny object.
HDR Color Depth
Color depth, or bit depth, refers to the number of colors a pixel can display. The greater the color depth, the more colors can be displayed. Before HDR, displays maxed out at 8-bit color depth, which is around 16.77 million different colors.
With HDR displays came 10-bit and 12-bit color depths: roughly 1.07 billion colors for a 10-bit display and 68.7 billion colors for a 12-bit display.
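Those color counts follow directly from the bit depth: each pixel has three channels (red, green, and blue), and an n-bit panel gives each channel 2^n levels, so the total is (2^n)³. A quick sketch of the arithmetic:

```python
# Total colors a display can show: each of the 3 RGB channels
# has 2**bits levels, so total colors = (2**bits) ** 3.
def total_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

print(total_colors(8))   # 16,777,216 (~16.77 million, SDR)
print(total_colors(10))  # 1,073,741,824 (~1.07 billion)
print(total_colors(12))  # 68,719,476,736 (~68.7 billion)
```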
HDR Luminance
Perhaps the most important feature of HDR is the increased luminance.
Display brightness is the amount of light a screen emits. It determines how dark the darkest pixel can be and how bright the brightest pixel can be, and HDR widens the gap between the two.
The increased contrast ratio lets you see something creeping up in the shadows, or experience nature documentaries as if you were there. An HDR monitor can respond accurately to what’s onscreen: it brightens the parts that need to be bright and further darkens the parts that need to be dark. Stars and car headlights stay bright and visible without the screen losing contrast elsewhere.
HDR10, Dolby Vision, and HDR10+ are the three main HDR formats.
HDR10
This is a free, open standard: content creators and TV manufacturers don’t pay royalties to use it. HDR10 reproduces 10-bit color depth, which means over a billion colors per pixel, supports mastering at up to 4,000 nits of peak brightness (most content targets 1,000 nits), and uses static metadata.
In other words, HDR10 gives you HDR content at the most basic level.
Dolby Vision
While HDR10 uses static metadata, Dolby Vision implements dynamic metadata. It is proprietary to Dolby, which means manufacturers pay royalties to use the standard. And where HDR10 is limited to 10-bit, Dolby Vision can carry 12-bit color depth.
Because there are currently few 12-bit displays, Dolby Vision is downsampled to 10-bit on most panels. Even so, it delivers a noticeable improvement over HDR10.
This brings us to the other half of the Dolby Vision story: dynamic HDR. HDR10 implements static HDR, meaning the metadata does not change from scene to scene. Dolby Vision adjusts brightness and color for each scene; hence dynamic HDR.
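The static-versus-dynamic difference can be illustrated with a toy tone-mapping sketch. To be clear, this is not the real HDR10 or Dolby Vision math (both use a perceptual quantizer curve, not a linear scale), and the scene names and nit values are made up; it only shows why one per-title peak can crush a dark scene that a per-scene peak leaves alone:

```python
# Toy illustration of static vs. dynamic HDR metadata.
# NOT the real HDR10/Dolby Vision algorithms: a simple linear
# scale stands in for tone mapping, and all numbers are invented.
DISPLAY_PEAK = 600.0  # nits the panel can actually show

scenes = {"cave": 120.0, "sunset": 3800.0}  # per-scene peak luminance

def tone_map(pixel_nits, content_peak):
    # Scale down only when the declared content peak exceeds
    # what the display can reproduce.
    scale = min(1.0, DISPLAY_PEAK / content_peak)
    return pixel_nits * scale

# Static metadata: one peak (the brightest scene) for the whole title,
# so even the dark cave scene gets dimmed.
static_peak = max(scenes.values())           # 3800 nits
cave_static = tone_map(100.0, static_peak)

# Dynamic metadata: each scene declares its own peak,
# so the cave scene is left intact.
cave_dynamic = tone_map(100.0, scenes["cave"])

print(round(cave_static, 1))   # ~15.8 nits: shadows crushed
print(round(cave_dynamic, 1))  # 100.0 nits: preserved
```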
As you can see, Dolby Vision is superior to HDR10.
HDR10+
This is an HDR format designed by Samsung. It builds on HDR10 by adding dynamic metadata, much like Dolby Vision, but without licensing fees, and it targets 1,000 nits of peak brightness in Samsung’s monitors and TVs. Samsung also pairs it with its ultra-black technology to reduce screen glare.
Other HDR formats include HLG (Hybrid Log-Gamma) and Technicolor Advanced HDR. HLG is backward compatible with SDR TVs and a clear step up from SDR, though it does not quite match the dedicated HDR formats.
Technicolor Advanced HDR comes in multiple flavors: SL-HDR1, SL-HDR2, and SL-HDR3. SL-HDR1, like HLG, is backward compatible with SDR TVs; SL-HDR2 uses dynamic metadata like Dolby Vision; and SL-HDR3 is essentially HLG with dynamic metadata.
The bottom line is that HDR10 is the de facto format and Dolby Vision is the step-up format.
HDR Display Requirements
Displays have to meet certain requirements to qualify as true HDR displays.
- 4K Ultra HD resolution
- 10-bit color support covering at least 90% of the DCI-P3 color space (often quoted as roughly 117% Adobe RGB or 125% sRGB)
- A 0.05-nit black level and 1,000 nits of peak brightness for LCD displays
- 540 nits of peak brightness and a 0.0005-nit black level for OLED displays
- At least HDMI 2.0 and DisplayPort 1.4
- A VA or IPS panel, since TN panels generally lack the contrast and viewing angles for HDR
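The contrast implied by those brightness and black-level figures is easy to compute: contrast ratio is simply peak brightness divided by black level. Using the LCD and OLED numbers from the list above:

```python
# Contrast ratio = peak brightness (nits) / black level (nits),
# using the LCD and OLED figures from the requirements above.
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

lcd = contrast_ratio(1000, 0.05)     # 20,000:1
oled = contrast_ratio(540, 0.0005)   # 1,080,000:1

print(f"LCD:  {lcd:,.0f}:1")
print(f"OLED: {oled:,.0f}:1")
```

This is why OLED qualifies with a lower peak brightness: its near-zero black level yields a far higher contrast ratio than LCD can manage.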
As for gaming, HDR itself does not demand a beefy GPU. Anything from an Nvidia GTX 950 or AMD R9 380 onward can output HDR.
To enable HDR on your monitor, simply go to the display settings menu, turn on the HDR option, and you are set. It is much the same on consoles, so no worries there either.
DisplayHDR Standards by VESA
VESA (the Video Electronics Standards Association) defined the industry’s first open HDR quality standard, DisplayHDR, in December 2017. The aim was to cut through consumer confusion over competing standards so you can know what a manufacturer’s specifications actually mean.
You can now download the DisplayHDR software and run performance tests for color gamut, luminance, bit depth, and more. The initial specification defined three DisplayHDR tiers: DisplayHDR 400, 600, and 1000, named after their required peak brightness in nits.
HDR vs QHD vs UHD
The question you are probably asking is whether you should forego a high-resolution monitor and go for HDR instead. The two are independent: HDR improves the brightness and contrast of an image, while higher resolution improves sharpness and the sense of depth.
The good news is that high-resolution monitors can support HDR, so you don’t have to choose one or the other. Just remember that a high-resolution monitor, say 4K, demands a much beefier GPU than lower-resolution HD or FHD monitors.
HDR is definitely worth it in a monitor. From gaming to editing, it brings you closer to the real world. More and more games are jumping on the HDR bandwagon; releases such as Rise of the Tomb Raider and Battlefield already support it. Expect improved detail in dark scenes and brighter whites in nature.
For photo and video editors, HDR takes color fidelity to a whole new level. All you need is a monitor that supports HDR content, and you are good to go.