(2018-04-19, 08:26 AM)spoRv Wrote: (2018-04-19, 07:28 AM)Doctor M Wrote: Movies are shot for DCI-P3, an even wider colorspace than Rec.2020.
Actually, DCI-P3 is in between Rec.2020 and Rec.709
Yes, you are right. I was looking at it cross-eyed last night.
And Zoidberg is completely right about his take on HDR.
It's important to understand that modern '4k' TVs have 4 important new features over older HDTV.
4k: A misleading name, since it counts columns of pixels (the horizontal resolution) rather than following the normal home-electronics scheme of counting rows/scan lines like 720p and 1080p do. By that convention it is more correctly called 2160p.
Also, I guarantee there will be a future lawsuit about this, since the TVs actually have only 3840 columns, NOT the roughly 4000 the name implies (true DCI 4k is 4096 wide), although digital cinema cameras and CG animation can work in true 4k.
Why it's useless: Most older films were shot on filmstock lacking 4k worth of detail (like 35mm). Many that were shot on higher resolution film were finished in 2k (or a bit better). There are SOME instances of older movies that are 4k end to end, and of course recent films are now shot in 4k.
Additionally, humans cannot resolve detail this fine from more than an arm's length away, or on a display smaller than about 100 inches. Demonstrations claim that people DO see that 2160p TVs are better than 1080p, but no television can change the anatomy of the eye.
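Here's a rough sanity check on that as a minimal Python sketch. It assumes the usual ~1 arcminute of acuity for 20/20 vision and a 16:9 panel; the exact cutoff shifts with whatever acuity figure you plug in, but the ballpark holds:

import math

def max_resolvable_distance(diagonal_in, horizontal_pixels, acuity_arcmin=1.0):
    """Distance (inches) beyond which one pixel subtends less than the given
    visual acuity, assuming a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from diagonal
    pixel_in = width_in / horizontal_pixels          # width of a single pixel
    angle_rad = math.radians(acuity_arcmin / 60)     # acuity angle in radians
    return pixel_in / math.tan(angle_rad)

for pixels, name in [(1920, "1080p"), (3840, "2160p")]:
    feet = max_resolvable_distance(65, pixels) / 12
    print(f"65-inch {name}: individual pixels blur together beyond ~{feet:.1f} ft")

For a 65-inch panel that works out to roughly 8.5 ft for 1080p and 4.2 ft for 2160p, i.e. you have to sit closer than most living rooms allow before the extra resolution is even theoretically visible.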
Mostly, studios degrade the picture a bit when they put it on 1080p Blu-ray just to upsell you to UHD, or because they've been technically stupid about how to make a good-looking BD.
(Btw, there is a really cool thread on HDBits where they are taking films that don't have true 4k detail and making 1080p versions of them with 10-bit color and HDR.)
Wide Color Gamut: A larger range of displayable colors, allowing truer-to-life images.
Why it's useless: Mostly it isn't, assuming people can actually SEE that many colors. As spoRv pointed out, Rec.2020 is STILL a wider gamut than the DCI-P3 these films were graded for. Odds are good, though, that studios will create new colors in older films to expand them to Rec.2020 rather than leave a gap between the DCI-P3 and Rec.2020 ranges.
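For a rough sense of how the three gamuts compare, here's a small Python sketch using the published CIE 1931 xy primaries for each standard; triangle area in xy space is a crude stand-in for gamut size rather than a perceptual measure, but it shows the ordering spoRv described:

# CIE 1931 xy chromaticities of the red, green and blue primaries
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(pts):
    """Shoelace formula for the area of a gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = gamut_area(PRIMARIES["Rec.709"])
for name, pts in PRIMARIES.items():
    area = gamut_area(pts)
    print(f"{name:8s}  xy area = {area:.4f}  ({area / base:.2f}x Rec.709)")

Run it and Rec.709, DCI-P3 and Rec.2020 come out at roughly 1.0x, 1.4x and 1.9x respectively, so there really is a band of Rec.2020 colors that no film graded for DCI-P3 ever touched.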
10/12-bit Color: The "range of possible color values per RGB subpixel in each pixel". Simple answer: while a wider color gamut lets each pixel hit more extreme colors at the limits, the STEPS between accessible colors are determined by the bit depth. 10-bit allows finer tuning of exactly which color it hits.
Why it's useless: It's not. Banding in gradients sucks, and this reduces it.
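As a back-of-the-envelope illustration (just counting code values, not modelling any particular encoder), this little Python sketch shows how much of the signal range a single step has to cover at 8, 10 and 12 bits; the coarser the step, the more visible the banding in a smooth gradient:

# Code values per channel at each bit depth, and how much of the full
# signal range a single step covers; big steps show up as banding.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step_pct = 100 / (levels - 1)
    print(f"{bits:>2}-bit: {levels:5d} levels/channel, "
          f"{levels ** 3:>14,} total colors, one step = {step_pct:.3f}% of range")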
HDR Color: Completely misleading name. Wide Color Gamut + 10-bit Color = HDR Color.
Why it's useless: It's just a shorthand for saying the HDR specification gives better colors, but WCG and 10-bit would do the same thing even without what we really think of as HDR, which is HDR Contrast. It's just a buzzword.
HDR Contrast: Enhanced black and white peaks that show more detail in shadows and in the brightest areas, with eye-blinding levels used for highlights/specular lighting. And that's assuming the person creating it uses restraint and doesn't just make it look like someone is shining a Maglite in your eyes.
Why it's useless: It's completely artificial on anything more than a few years old. Older films were shot on film and projected on a screen with a single bulb of a set brightness. If you start shining different light levels through a film, sure, you can extract more detail, but it is NOT detail anyone ever expected to be on the screen.
And like Zoidberg said, there is also no one standard, so it looks different on every device you watch it on.
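Part of the reason is that the PQ transfer function used by HDR10 (SMPTE ST 2084) encodes absolute luminance all the way up to 10,000 nits, which no consumer display can actually reach, so every TV ends up tone-mapping the top end its own way. A minimal Python sketch of the PQ curve (signal in, nits out), using the published ST 2084 constants, shows how much of the range sits above what real panels can do:

# SMPTE ST 2084 (PQ) EOTF: normalized signal value -> absolute luminance in nits
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a PQ-encoded signal (0.0-1.0) to display luminance in cd/m2 (nits)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for s in (0.5, 0.75, 1.0):
    print(f"PQ signal {s:.2f} -> {pq_eotf(s):8.1f} nits")

Signal 0.5 is only about 92 nits and 0.75 about 1,000 nits, while 1.0 is the full 10,000; a set that peaks at 400 or 1,000 nits has to squeeze everything above its limit down however its manufacturer sees fit, which is exactly why the same disc looks different from one TV to the next.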
Like I said, this is the 3D post-conversion and colorization of black and white all over again.