(This was written for a different forum with a less technical audience than this one, but given the negativity about HDR, and the misunderstandings contained therein, I thought it worthwhile to copy the post here.)
"It's important to remember that HDR isn't a color space nor a bit depth; HDR can be used with even REC709 if one wanted, or REC2020 space could be used with SDR. It's better to think of HDR as a new bit value -> optical function intended to replace the gamma function, and as a new brightness range that video is mastered in.
The idea behind the gamma function (REC1886 to be specific) is that you can make MUCH more efficient use of the available bits per channel by taking advantage of the fact that human vision notices differences between two close dark shades more easily than between two close bright shades, so more bits are allocated to the darker parts of the image. This efficient allocation of bits helps to avoid the perception of banding or stair-stepping on smooth gradients. The problem is that the gamma function was modeled after CRT behavior which, while decent, isn't necessarily the most efficient use of the bits available; even with 10 bits, a REC1886 video signal would still show visible banding.
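To make that concrete, here is a minimal sketch of the REC1886 transfer function in Python. The 100 nit white and 0 nit black levels are assumptions chosen to match the SDR mastering range discussed below; the standard itself leaves them display-dependent:

```python
# Minimal sketch of the REC1886 (BT.1886) EOTF: normalized code value -> nits.
# lw (white) and lb (black) are assumed as 100 and 0 nits to match the
# traditional SDR mastering range; the standard leaves them display-dependent.
def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# The curve is shallow at the bottom and steep at the top, so code values
# are spent disproportionately on dark shades: the bottom half of the code
# range only reaches about 19% of peak brightness.
print(bt1886_eotf(0.5) / bt1886_eotf(1.0))  # ~0.19
```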
Before describing HDR, it's important to remember that a white pixel value corresponded to about 80-100 nits back in the CRT days, so displays were *generally* calibrated with that in mind, and video transfers were *generally* mastered within the range of 0-100 nits. Given that most Blu-rays, which *generally* use the REC1886 gamma function, look pretty good under this system, it's fair to say that most of the information the colorist would wish to present resides within that 0-100 nit range. Unfortunately, one of the problems with this limited range is that highlights can easily be hard clipped, which destroys the detail within them.
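As a toy illustration of that hard clipping (the scene values here are invented purely for demonstration), note how every highlight above the 100 nit ceiling collapses to the same value, so any gradient within the highlight is lost:

```python
# Toy illustration of hard clipping against a 100 nit ceiling.
# The scene luminances are made up for the example; the point is that
# distinct highlight levels all collapse to 100 and become indistinguishable.
scene = [60, 90, 120, 250, 800]  # hypothetical scene luminances in nits
clipped = [min(v, 100) for v in scene]
print(clipped)  # [60, 90, 100, 100, 100] -- highlight detail is destroyed
```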
HDR is centered around two new developments: a new bit value -> optical transfer function by Dolby called the Perceptual Quantizer (PQ), and a new brightness range of 0-10,000 nits (10k nits is about the brightness of a fluorescent tube). Dolby's PQ function is so efficient with its usage of bits that it can cover the range of 0-10,000 nits using 12 bits without any visible banding. The average brightness of a typical frame mastered in HDR is, ideally, not much more than that of a typical frame mastered in SDR, in that the majority of the material still resides within 0-100 nits (with 100 being the white value we are familiar with), but the colorist has that extra 100-10,000 nit range for elements of the frame that need the extra brightness (such as specular highlights) whilst preserving detail in those areas.
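For reference, PQ is a fixed, published formula (standardized as SMPTE ST 2084). A minimal sketch of its inverse direction (nits -> normalized code value) shows how the familiar 0-100 nit range still receives roughly half of the available code values:

```python
# Minimal sketch of the PQ (SMPTE ST 2084) inverse EOTF: nits -> code value.
# The constants below are the published ST 2084 values.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_inverse_eotf(nits):
    """Map absolute luminance in [0, 10000] nits to a normalized code value."""
    y = nits / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

print(pq_inverse_eotf(100))    # ~0.508: SDR white lands near mid-code
print(pq_inverse_eotf(10000))  # 1.0: the full code range tops out at 10k nits
```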
Basically, if the colorist is using the system as its developers intended, the average brightness within the frame will be similar to Blu-ray, just with the added benefit that highlight detail is preserved and brighter elements can be made to stand out if necessary. As you can see, if a colorist deems it appropriate, they could keep every element of the frame below 100 nits, leaving the grade similar to SDR. HDR doesn't automatically imply that content needs to be brighter; it simply provides a framework that accommodates many more artistic intentions vs the older SDR system."
To sum up:
-HDR PQ content makes far better use of the 10 bits per channel than SDR does, and if the disc has Dolby Vision, then 12 bits per channel is more than enough to completely prevent visible banding. As such, it is very much superior to SDR (see the sketch after this list for rough numbers on the step sizes involved).
-HDR content is graded in such a way that it's harder to clip highlights, which means that if a member of this community wants to do color correction or an SDR grade, there is more to work with vs SDR content.
-Not mentioned in the article, but Dolby Vision discs and streams would be the easiest to convert to SDR, because part of making a Dolby Vision grade is making a complementary SDR grade that is bundled in with the Dolby Vision data. You just need a decoder...
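To put rough numbers on the first point above, here is a minimal, self-contained sketch (reusing the published ST 2084 constants) that estimates the brightness jump between adjacent 10-bit PQ codes at a few luminance levels. The steps hover near 1% of the current brightness across a huge range, which is roughly the spacing needed to stay close to the eye's contrast detection threshold; each extra bit halves the step, so 12-bit steps are about 4x finer:

```python
# Minimal sketch: relative luminance step between adjacent 10-bit PQ codes.
# Constants are the published SMPTE ST 2084 values.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(v):
    """Map a normalized code value in [0, 1] to luminance in nits."""
    p = v ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for code in (205, 450, 818):  # sample points from shadows to highlights
    lo = pq_eotf(code / 1023)
    hi = pq_eotf((code + 1) / 1023)
    print(f"{lo:8.1f} nits: step ~{100 * (hi - lo) / lo:.1f}%")
# Prints roughly 2.4 nits / ~1.6%, 50 nits / ~1.0%, 1550 nits / ~0.9%:
# near-uniform perceptual spacing from deep shadows to bright highlights.
```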
"It's important to remember that HDR isn't a color space nor a bit depth; HDR can be used with even REC709 if one wanted, or REC2020 space could be used with SDR. It's better to think of HDR as a new bit value -> optical function intended to replace the gamma function, and as a new brightness range that video is mastered in.
The idea behind the gamma function (REC1886 to be specific) is you can make MUCH more efficient use of the available bits-per-channel by taking advantage of the fact that human vision has an easier time noticing differences in in two close dark shades vs two close bright shades; more bits are allocated to the darker parts of the image. This efficient allocation of bits helps to avoid the perception of banding or stair stepping on smooth gradients. Problem is the gamma function was modeled after CRT behavior which, while decent, isn't necessarily the most efficient use of the bits available; even with 10 bits a REC1886 video signal would still have visible banding.
Before describing HDR, it's important to remember that a white pixel value correlated to about 80-100 nits back in the CRT days, so displays were *generally* calibrated with that in mind, and video transfers were *generally* mastered within the range of 0-100 nits. Given that most Blu-Rays, which *generally* use the REC1886 gamma function, look pretty good using this system, it's fair to say that most of the information the colorist would wish to present resides within that 0-100nit range. Unfortunately, one of the problems with this limited range is that highlights can easily be hard clipped which destroys the detail within those highlights.
HDR is centered around two new developments: a new bit value -> optical function by Dolby called the Perceptual Quantizer, and a new brightness range of 0-10,000 nits (10k nits is about the brightness of a fluorescent tube). Dolby's PQ function is so efficient with it's usage of bits that it can cover the range of 0-10,000 nits using 12 bits without any visible banding. The averaged brightness of an average frame mastered in HDR is, ideally, not much more than the averaged brightness of an average frame mastered in SDR in that the majority of material resides within 0-100 nits (with 100 being the white value we are familiar with), but the colorist has that extra 100-10,000 nit range for elements of the frame that need that extra brightness (such as specular highlights) whilst preserving detail in those areas.
Basically, if the colorist is using the system as intended by the developers of this system, your average brightness within the frame will be similar to Blu-ray, just with the added benefit that highlight detail is preserved and the capability to make brighter elements stand out if necessary. As you can see, if a colorist deems it necessary, they could grade no frame elements above 100nits, keeping the grade similar to SDR. HDR doesn't automatically imply that content need be brighter, it simply provides a framework that accommodates many more artistic intentions vs the older SDR system."
To sum up:
-HDR PQ content makes far better use of the 10 bits per channel than SDR does, and if the disc has Dolby Vision, then the 12 bits per channel is more than enough to prevent visible banding from occurring completely. As such, it is very much superior to SDR.
-HDR content is graded in such a way that it's harder to clip highlights, which means that if color correction or an SDR grade wants to be done by a member of this community, we have more to work with vs SDR content.
-Not mentioned in the article but Dolby Vision discs and streams would be easiest to convert to SDR because part of making a Dolby Vision grade is making a complimentary SDR grade that is bundled in with the Dolby Vision data. You just need a decoder...