2018-05-26, 08:49 AM
Just checked The Mummy screenshots, and the HDR-to-SDR conversion seems done right!
If I am not mistaken (and I probably am not), HDR material mastered at 1000 nits (and viewed on a 1000+ nit display) is not dynamically tone-mapped; it is just plain 10-bit, with the full 0-1023 range reduced to the 64-940 (limited/video) range
So, in those cases there should be a single, unambiguous (correct) way to convert HDR to SDR.
Quote: "if a TV can deliver 100% of DCI-P3 and 1,000 nits of peak brightness, then for content graded at 1,000 nits the TV will take a one-to-one approach and simply show the content as it was created." (https://www.avforums.com/article/what-is...ping.13883)
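If the mapping really is just a plain linear reduction from full range (0-1023) to limited/video range (64-940), it can be sketched like this (the function name is my own, and this is only an illustration of that assumed linear scaling, not a full HDR-to-SDR conversion):

```python
def full_to_limited_10bit(code: int) -> int:
    """Map a full-range 10-bit code value (0-1023) to the
    limited/video range (64-940) with a plain linear scale."""
    if not 0 <= code <= 1023:
        raise ValueError("10-bit code value must be in 0-1023")
    return round(64 + code * (940 - 64) / 1023)

# The endpoints map exactly: 0 -> 64 and 1023 -> 940.
print(full_to_limited_10bit(0))     # 64
print(full_to_limited_10bit(1023))  # 940
print(full_to_limited_10bit(512))   # 502
```

Of course this only covers the code-value range; the actual brightness you see still depends on the PQ transfer function and the display's peak-nits capability.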