HDR (and why you shouldn't hate it)
#1
(This was written for a different forum with a less technical audience than this one, but given the negativity about HDR, and the misunderstandings contained therein, I thought it worthwhile to copy the post here.)

"It's important to remember that HDR isn't a color space nor a bit depth; HDR can be used with even REC709 if one wanted, or REC2020 space could be used with SDR. It's better to think of HDR as a new bit value -> optical function intended to replace the gamma function, and as a new brightness range that video is mastered in.


The idea behind the gamma function (REC1886, to be specific) is that you can make MUCH more efficient use of the available bits per channel by taking advantage of the fact that human vision notices the difference between two close dark shades far more easily than between two close bright shades; more bits are allocated to the darker parts of the image. This efficient allocation of bits helps avoid the perception of banding or stair-stepping on smooth gradients. The problem is that the gamma function was modeled after CRT behavior, which, while decent, isn't necessarily the most efficient use of the available bits; even with 10 bits, a REC1886 video signal can still show visible banding.
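To make that concrete, here's a quick Python sketch (assuming the simplified BT.1886 curve L = 100 * V^2.4 with black at zero, and 10-bit narrow-range code values, 64-940) showing how gamma spends its code values:

def bt1886_eotf(code, l_white=100.0):
    """Map a 10-bit narrow-range (64-940) code value to luminance in nits."""
    v = min(max((code - 64) / 876, 0.0), 1.0)  # normalize the code to 0..1
    return l_white * v ** 2.4                  # simplified BT.1886, black at 0

# Luminance jump between adjacent code values, near black vs near white:
print(bt1886_eotf(101) - bt1886_eotf(100))  # ~0.003 nits per step in the shadows
print(bt1886_eotf(901) - bt1886_eotf(900))  # ~0.26 nits per step near white

The steps in the shadows come out roughly 80x finer than the ones near white, which is exactly the allocation described above.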

Before describing HDR, it's important to remember that a white pixel value corresponded to about 80-100 nits back in the CRT days, so displays were *generally* calibrated with that in mind, and video transfers were *generally* mastered within the range of 0-100 nits. Given that most Blu-rays, which *generally* use the REC1886 gamma function, look pretty good under this system, it's fair to say that most of the information the colorist would wish to present resides within that 0-100 nit range. Unfortunately, one of the problems with this limited range is that highlights can easily be hard-clipped, which destroys the detail within them.

HDR is centered around two new developments: a new bit value -> optical function by Dolby called the Perceptual Quantizer, and a new brightness range of 0-10,000 nits (10k nits is about the brightness of a fluorescent tube). Dolby's PQ function is so efficient with its use of bits that it can cover the range of 0-10,000 nits using 12 bits without any visible banding. Ideally, the average brightness of a typical frame mastered in HDR is not much higher than that of a typical frame mastered in SDR, in that the majority of the material still resides within 0-100 nits (with 100 being the white value we are familiar with), but the colorist has that extra 100-10,000 nit range for elements of the frame that need the extra brightness (such as specular highlights), while preserving detail in those areas.
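The PQ curve itself is public (SMPTE ST 2084), so here's a small Python sketch of it, with the constants taken straight from the spec; code values are kept normalized to 0..1 for simplicity:

M1 = 2610 / 16384          # ST 2084 constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code):
    """Normalized PQ code value (0..1) -> luminance in nits (0..10000)."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Luminance in nits -> normalized PQ code value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(pq_inverse_eotf(100))  # ~0.508: about half the code range covers 0-100 nits
print(pq_eotf(0.75))         # ~983 nits: a typical highlight level

Note how roughly half of the code range is spent on the familiar 0-100 nit region, which is why an average HDR frame doesn't have to look any brighter than an SDR one.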

Basically, if the colorist uses the system as its developers intended, the average brightness within the frame will be similar to Blu-ray, just with the added benefits that highlight detail is preserved and that brighter elements can be made to stand out if necessary. As you can see, if a colorist deems it appropriate, they could grade no frame element above 100 nits, keeping the grade similar to SDR. HDR doesn't automatically imply that content need be brighter; it simply provides a framework that accommodates many more artistic intentions than the older SDR system did."

To sum up:
-HDR PQ content makes far better use of the 10 bits per channel than SDR does, and if the disc has Dolby Vision, its 12 bits per channel are more than enough to completely prevent visible banding (see the sketch below this list). As such, it is very much superior to SDR.
-HDR content is graded in such a way that it's harder to clip highlights, which means that if a member of this community wants to do color correction or an SDR grade, we have more to work with than we do with SDR content.

-Not mentioned in the article, but Dolby Vision discs and streams would be the easiest to convert to SDR, because part of making a Dolby Vision grade is making a complementary SDR grade that is bundled in with the Dolby Vision data. You just need a decoder...
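On the bit-depth point above: reusing the pq_eotf / pq_inverse_eotf helpers from the earlier sketch, here's a rough way to see why the extra 2 bits matter. It compares the jump between adjacent code values, as a percentage of the luminance itself (keeping the steps well under roughly 1% on a smooth gradient is what keeps banding invisible); the 12-bit narrow range of 256-3760 is assumed:

def weber_step_pct(nits, bits=10):
    """Size of one quantization step at a given luminance, as % of that luminance."""
    lo, hi = (64, 940) if bits == 10 else (256, 3760)  # narrow range scales with depth
    code = round(lo + pq_inverse_eotf(nits) * (hi - lo))
    l0 = pq_eotf((code - lo) / (hi - lo))
    l1 = pq_eotf((code + 1 - lo) / (hi - lo))
    return 100.0 * (l1 - l0) / l0

for nits in (1, 100, 1000):
    print(nits, round(weber_step_pct(nits, 10), 2), round(weber_step_pct(nits, 12), 2))

The 12-bit steps come out four times finer than the 10-bit ones (around 0.3% instead of ~1.2% at 100 nits), which is why 12 bits is considered safely below the banding threshold while 10-bit PQ cuts it much closer.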

#2
Learned to work with SDR (more or less) after so many years, and now I must start from scratch with HDR... ;)

I think the problem is not HDR per se, but all those standards (HDR10, HDR10+, HLG, DV, etc.) and the fact that each display manufacturer applies its own technique...

I guess when (if) we have 12-bit, 10,000-nit-capable displays, and no need for metadata, there will be no HDR problems anymore... of course, by that time, manufacturers will invent new things to keep us involved! :D
#3
HDR10, 10+, and DV all involve the same grade, so there's not really a difference between them as far as the material is concerned (though DV grades come out as 12 bit). HLG is its own animal, BUT it was designed to be backwards compatible with SDR systems.

Worth noting that, with a DV-capable display and a DV disc, the tone mapping is specified by the colorist in the form of metadata, and the display will apply that tone mapping in a way consistent with the colorist's instructions and with other DV hardware.
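For a sense of what "tone mapping" actually does: Dolby's real curves are proprietary, so the Python sketch below is only a generic illustration of the idea, with made-up numbers. Everything below a knee point is left alone, and the rest of the mastered range is rolled off into what the display can reproduce:

def tone_map(nits, knee=100.0, master_peak=4000.0, display_peak=600.0):
    """Squeeze a 0-master_peak grade onto a panel that tops out at display_peak."""
    if nits <= knee:
        return nits                                 # shadows/midtones pass through 1:1
    x = (nits - knee) / (master_peak - knee)        # position above the knee, 0..1
    return knee + (display_peak - knee) * x ** 0.6  # power roll-off into the headroom

print(tone_map(50.0))    # 50.0: reproduced as-is
print(tone_map(1000.0))  # ~307: highlight compressed to fit a 600-nit panel

The DV metadata essentially lets the colorist dictate how this curve is shaped per scene, instead of leaving each display manufacturer to guess.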

Also, given that some HDR grades of older films barely use the expanded range that HDR offers, a simple clipping of values could result in an SDR conversion that looks identical to the HDR version.
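Here's what that "simple clipping" looks like per pixel, again as a Python sketch reusing pq_eotf from the sketch in the first post: decode PQ to nits, clip anything over 100, and re-encode with a plain 2.4 gamma. (A real conversion would also need the REC2020 -> REC709 gamut mapping, which is left out here.)

def pq_to_sdr(code):
    """Normalized PQ code (0..1) -> normalized SDR gamma code (0..1)."""
    nits = min(pq_eotf(code), 100.0)    # hard-clip the HDR highlight range
    return (nits / 100.0) ** (1 / 2.4)  # inverse of L = 100 * V^2.4

print(pq_to_sdr(0.508))  # ~1.0: 100 nits lands on SDR white
print(pq_to_sdr(0.75))   # 1.0: a ~1000-nit highlight clips to white

If the grade never goes above 100 nits, the clip does nothing and the result matches the HDR version, which is the point made above.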

