

[Idea] SDR -> HDR "upconversion"
#21
CS, not sure who you are responding to there, but it seems like you are just looking at HDR from a movie perspective, when its origins are in film stock replacement, i.e. photography. Basically, Fujifilm had to try to get its sensors to produce results as close as possible to its major professional film stocks of the time: Velvia, Provia 100FN, NPS, NPH, Neopan, etc. It was the only way they could win professional photographers over, as they were used to certain kinds of results, and with the exception of Neopan (a B&W stock), Fuji's film stock was always strongest in green gradients, whereas Kodak's best, Kodachrome 64, was stronger in reds.
What makes it complicated is not matching one color in isolation; it's matching the same colors/gradients within the same "exposure", i.e. the same final overall image, because that is where the controlled aspect comes into play.
#22
(2018-04-19, 12:58 PM)OogieBoogie Wrote: CS, not sure who you are responding to there, but it seems like you are just looking at HDR from a movie perspective, when its origins are in film stock replacement [...]

Sorry buds. I'm trying to reply while looking after a 2-year-old, lol.
I understand all of this, but I'm not having a good time communicating.
I'll come back to the discussion when I'm a little more 'free' :)
#23
An interesting article about HDR: https://hometheaterhifi.com/technical/te...libration/
#24
Assuming you had, for example, a 12-bit master (not the encoded Blu-ray picture), you can technically get the most out of it by color timing it in the BT.2020 space, even if you switch back to Rec.709 at the end.
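To make that concrete, here is a minimal sketch of just the final colorspace conversion step from a BT.2020 grade down to Rec.709. It is plain numpy using the published BT.709/BT.2020 primaries and a D65 white point; the 12-bit master, the grading, and any tone mapping are outside its scope.
Code:
import numpy as np

# CIE xy chromaticities of the primaries plus the D65 white point
# (published values for ITU-R BT.709 and BT.2020).
BT709_PRIMARIES  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020_PRIMARIES = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65_WHITE = (0.3127, 0.3290)

def xy_to_xyz(xy):
    """Turn an xy chromaticity into an XYZ tristimulus value with Y = 1."""
    x, y = xy
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white):
    """Build the linear RGB -> XYZ matrix for a given set of primaries and white point."""
    p = np.column_stack([xy_to_xyz(c) for c in primaries])
    scale = np.linalg.solve(p, xy_to_xyz(white))  # so RGB = (1,1,1) lands exactly on the white point
    return p * scale

# Linear-light BT.2020 -> BT.709; apply before re-encoding with the Rec.709 transfer function.
M_2020_TO_709 = np.linalg.inv(rgb_to_xyz_matrix(BT709_PRIMARIES, D65_WHITE)) @ \
                rgb_to_xyz_matrix(BT2020_PRIMARIES, D65_WHITE)

# A fully saturated BT.2020 green has no Rec.709 equivalent: the negative
# components printed below are exactly what ends up clipped or gamut-mapped.
print(M_2020_TO_709 @ np.array([0.0, 1.0, 0.0]))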
#25
DCPs use the P3 gamut with 12 bits per colour in an XYZ colourspace. A lot of smaller independent features will grade within Rec.709 and let the DCP software upconvert to P3 during the encode to DCP, because display technology that can accurately portray P3 is prohibitively expensive. The SMPTE specification for light output is 16 foot-lamberts, which is approximately 50 nits. DCPs are mastered at around 50 nits, HD Rec.709 at 100 nits. Film was 16 foot-lamberts 'through the gate', i.e. with no film in the light path.
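The unit conversion behind those figures is just a fixed factor (1 foot-lambert is about 3.426 cd/m², and a nit is simply a cd/m²); a quick sanity check:
Code:
FOOTLAMBERT_TO_NITS = 3.426   # 1 fL ≈ 3.426 cd/m²

print(f"16 fL ≈ {16 * FOOTLAMBERT_TO_NITS:.0f} nits")   # ~55 nits, the figure rounded to ~50 above
print(f"Rec.709 reference white at 100 nits ≈ {100 / FOOTLAMBERT_TO_NITS:.0f} fL")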

The HDR grade is a second pass, completely independent of the DCI-P3/Rec.709 passes. That's when the mad thousands-of-nits highlights are added. Unless you're fortunate enough to live near one of the few Dolby Vision laser projection screens, HDR is strictly a home video experience. Anything beyond an attempt to match the latitude of film (when it comes to grading older titles shot on film) will be essentially revisionist. Attempts to accurately match the latitude of film will be met with wild rage on the Blu-ray forum.

Here's where it gets funky. Every display manufacturer has its own way of implementing HDR; there is no single standard. Some displays can handle isolated peaks of 1000+ nits, but most cannot. The display's algorithms determine how much black/white clipping must occur to handle the HDR material; it's the only video system I know of that introduces clipping by design. Also, very few HDR displays can even display the P3 colourspace, let alone BT.2020; most are delivering something nearer to (or slightly beyond) Rec.709. Basically, you need the big-bucks 4K displays to get the genuine WCG HDR experience.
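Since there is no mandated curve, the following is purely a generic illustration (not any particular manufacturer's algorithm) of the choice a display has to make when, say, highlights graded to 4000 nits meet a 600-nit panel: hard clip, or roll the top end off. The 600-nit peak and the 75% knee are just example numbers.
Code:
import numpy as np

def hard_clip(nits, display_peak):
    """Throw away everything above the panel's peak (the 'clipping by design' case)."""
    return np.minimum(nits, display_peak)

def soft_rolloff(nits, display_peak, knee=0.75):
    """Leave the image untouched up to a knee point, then compress the rest of the
    mastered range into the remaining headroom (one of many possible tone-mapping choices)."""
    knee_nits = knee * display_peak
    headroom = display_peak - knee_nits
    return np.where(
        nits <= knee_nits,
        nits,
        knee_nits + headroom * np.tanh((nits - knee_nits) / headroom),
    )

mastered = np.array([100.0, 600.0, 1000.0, 4000.0])   # example scene highlights in nits
print(hard_clip(mastered, 600.0))      # [100. 600. 600. 600.] -- detail above 600 is gone
print(soft_rolloff(mastered, 600.0))   # highlights squeezed up towards 600 instead of flattened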

It's a far cry from the early days of flat panels/solid-state displays, when finally (after the CRT years) people all over the world could, by way of calibration (either professional or personal), see exactly what was intended and know that others were seeing the same.
#26
Well, you can still calibrate an HDR display, but you will get some amount of clipping if it's a lower-grade one. The same is kinda true for normal LCDs, at least from my experience with calibrating in Windows: if the display you use cannot fully cover the colorspace, then it clips after calibration.

Calibration also can't magically improve the black point, nor can it account for the fact that some displays shift colors depending on viewing angle.
#27
AFAIK no UHD display matches Rec.2020; only the best come near P3, and I guess only the worst are near Rec.709 - the norm is probably somewhere around the halfway point between 709 and P3.
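To put rough numbers on that, here is a comparison of the three gamut triangles' areas in the CIE xy diagram (xy area is a crude yardstick, a u'v' comparison would be fairer, but it shows where P3 sits relative to the other two):
Code:
def xy_triangle_area(primaries):
    """Shoelace formula for the area of a gamut triangle in the CIE xy diagram."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

GAMUTS = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

reference = xy_triangle_area(GAMUTS["Rec.2020"])
for name, primaries in GAMUTS.items():
    print(f"{name:9s} ~{xy_triangle_area(primaries) / reference:.0%} of the Rec.2020 xy area")
# Prints roughly 53% for Rec.709 and 72% for DCI-P3, so "halfway between 709 and P3"
# works out to the low 60s as a share of Rec.2020's xy area.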
#28
(2018-04-19, 08:26 AM)spoRv Wrote:
(2018-04-19, 07:28 AM)Doctor M Wrote: Movies are shot for DCI-P3, an even wider colorspace than Rec.2020.
Actually, DCI-P3 is in between Rec.2020 and Rec.709

Yes, you are right.  I was looking at it cross-eyed last night.

And Zoidberg is completely right about his take on HDR.

It's important to understand that modern '4k' TVs have four important new features over older HDTVs.

4k: A completely misleading name, since it counts columns of pixels (horizontal resolution) rather than following the normal home-electronics naming scheme of counting scan lines. It is more correctly 2160p.

Also, I guarantee there will be a future lawsuit about this, since the TVs actually only have 3840 columns, NOT 4000, although digital cameras and CG animation are created in true 4k.

Why it's useless: Most older films were shot on film stock lacking 4k worth of detail (like 35mm). Many that were shot on higher-resolution film were finished in 2k (or a bit better). There are SOME instances of older movies that are 4k end to end, and of course recent films are now shot in 4k.

Additionally, humans cannot resolve detail this fine from more than an arm's length away, or on a display smaller than 100 inches. Demonstrations claim that people DO see that 2160p TVs are better than 1080p, but no television can change the anatomy of the eye.
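Assuming the commonly quoted resolving power of 20/20 vision (about one arcminute), you can put rough numbers on that; a sketch, where the 65-inch panel size is just an example:
Code:
import math

ONE_ARCMIN = math.radians(1.0 / 60.0)   # roughly the limit of 20/20 acuity

def max_useful_distance_m(diagonal_inches, rows, aspect=16 / 9):
    """Distance at which one pixel row subtends about one arcminute;
    much beyond this, neighbouring rows can no longer be told apart."""
    height_m = diagonal_inches * 0.0254 / math.sqrt(1.0 + aspect ** 2)
    pixel_m = height_m / rows
    return pixel_m / (2.0 * math.tan(ONE_ARCMIN / 2.0))

for rows in (1080, 2160):
    print(f"{rows}p on a 65-inch panel stops gaining past ~{max_useful_distance_m(65, rows):.1f} m")
# Roughly 2.6 m for 1080p and 1.3 m for 2160p on a 65-inch panel -- i.e. you have to sit
# unusually close, or buy an unusually large screen, before the extra pixels are resolvable.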

Mostly, studios degrade the picture a bit when they put it on 1080p Blu-ray just to upsell you on UHD, or because they've been technically stupid about how to make a good-looking BD.

(Btw, there is a really cool thread on HDBits where they are taking films that don't have true 4k detail and making 1080p encodes out of them with 10-bit color and HDR.)

Wide Color Gamut: A larger range of colors capable of being displayed, allowing truer-to-life images.

Why it's useless: Mostly it isn't. Assuming people can SEE that many colors, as spoRv pointed out, Rec.2020 is STILL more colors than were originally used. Odds are good, though, that studios will create new colors in older films to expand to Rec.2020 rather than leaving a gap between the DCI-P3 and Rec.2020 ranges.

10/12-bit Color: The "range of possible color values per RGB subpixel in each pixel". Simple answer: while a wider color gamut lets each pixel hit more extreme colors at the limits, the STEPS between accessible colors are determined by the bit depth. 10-bit allows finer tuning of which color it hits.

Why it's useless: It's not. Banding in gradients sucks, and this reduces it.
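A quick way to see what the extra bits buy: quantise the same ideal ramp at different bit depths and count the surviving steps (a toy illustration, not a grading workflow):
Code:
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)   # an ideal smooth gradient

for bits in (8, 10, 12):
    levels = 2 ** bits - 1
    quantised = np.round(ramp * levels) / levels
    print(f"{bits:2d}-bit: {len(np.unique(quantised)):5d} distinct steps across the ramp")
# 8-bit gives 256 steps, 10-bit 1024, 12-bit 4096 -- the finer steps are what hide banding.
# And the wider the gamut, the further apart neighbouring code values land, so WCG without
# the extra bits would actually make banding worse, not better.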

HDR Color: A completely misleading name. Wide Color Gamut + 10-bit Color = HDR Color.

Why it's useless: It's just a simple way of saying that HDR as a specification gives better colors, but WCG and 10-bit would do the same thing even without what we usually think of as HDR, which is actually HDR Contrast. It's just a buzzword.

HDR Contrast: Enhanced black and white peaks to show more detail in shadows and in the brightest areas, with eye-blinding levels used for highlights/specular lighting. And that's assuming the person creating it uses restraint and doesn't just make it look like someone is shining a Maglite in your eyes.

Why it's useless: It's completely artificial on anything more than a few years old. Films have always been shot on film and projected on a screen with one bulb of a set brightness. If you start shining different light levels through a film, sure, you can extract more detail, but it is NOT detail anyone ever expected to be on the screen.
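For anyone wondering where those eye-blinding levels live in the signal: HDR10 and Dolby Vision use the PQ (SMPTE ST 2084) transfer function, which maps code values to absolute luminance up to 10,000 nits. A small decode sketch using the constants from the spec:
Code:
# SMPTE ST 2084 (PQ) EOTF: decode a normalised code value (0..1) to absolute nits.
M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code: float) -> float:
    """Map a PQ code value in [0, 1] to display luminance in cd/m² (nits)."""
    p = code ** (1.0 / M2)
    return 10_000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for code in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ code {code:.2f} -> {pq_to_nits(code):8.1f} nits")
# Roughly 5, 92, 983 and 10000 nits: half the code range sits at or below old-school
# Rec.709 reference white, and everything above ~0.75 is the specular-highlight
# territory that most panels have to clip or roll off.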

And like Zoidberg said, there is also no one standard, so it looks different on every device you watch it on.
Like I said, this is the 3D post-conversion and colorization of black and white all over again.
#29
You don't need an expensive monitor to get HDR; any ten-year-old HD monitor with a DVI input will render it. You will not get the fine tuning out of it, but it will display nonetheless.
#30
Wow. Great thread.
I have been thinking about how to word my next post since joining the discussion earlier.
It was hard to get across everything I wanted to cover and ask in small bursts, due to the time constraints of having to entertain my 2-year-old, and the conversation moves fast :)

Both Zoidberg and Doctor M have covered everything I wished to bring to the discussion in the posts above, and more, so I'm quite glad I don't have to write all that out. I don't think I could have put it nearly as eloquently as you guys have, lol.

