

[versions] Why SO many versions? (aspect ratios, colors, HDR, sound...)
#1
You'd think that when a studio or producer or director makes a movie, they'd want it done right the first time. This is the way we made it, this is how we're releasing it, and this is the way it should be seen. As everyone on this website knows: that's not the case. I'm wondering: why?

Of course, I understand how directors and producers can clash (Brazil, Blade Runner, the Alien franchise, Apocalypse Now), or how a director might just want the extended cut released (any James Cameron film), but that's not what I'm talking about here. I'm talking about why there are so many versions of movies with different crops and colors.

I'm fairly new to this site, I must say, but I got into all this when Endgame wasn't released in its full IMAX open matte glory. The film was shot entirely with IMAX cameras, so why not release it that way? (Answer: pretty sure it has to do with how IMAX distributes their films, and they said no to Disney.) But the Harry Potter films and True Lies and Top Gun... those are Super 35. Those films can be viewed in 2.35 or 1.78 (not 1.85, right?).

SO WHY NOT just release them in 1.78? Why not release your film with MORE of the image? Is it because 2.35 looks more "cinematic"? Or is it just because these films are eventually going to be shown on TV, and instead of doing a pan and scan (as would be done with an anamorphic print) they release them open matte... which brings me back to my original question: WHY NOT just release it open matte, utilizing more of the screen?


AND WHAT ABOUT COLOR?
(Sorry, all caps there just to mark the subject change)
Look at Minority Report - there are differences in color between the DVD and the Blu-ray. I didn't get to see the film in theaters (ahem, I was a child), but I'm going to guess that the DVD matches the theatrical version. So why change it?

Terminator 2's 4K HDR release... why would you do that to a film? What if we were to go back and make all those old film noirs HDR (or whatever the black-and-white equivalent is), getting rid of all the contrast between the lights and shadows? That wouldn't go over well. So why do the same thing to movies today? If it wasn't shot for HDR, then why remaster and release it in HDR? Just for the name? Couldn't distributors just add a little checkbox in the special features which activates or deactivates all the LUTs and color work done to the new version of the film? Would be nice...

And then there's DISNEY... editing their classics so much that they've even switched the order of two shots (Little Mermaid, I'm looking at you), all the while practically remaking whole films in the computer. Dumbo didn't look like that. Those aren't the colors they chose. Fantasia didn't look that way. I could understand adding more colors and upping the resolution (albeit it looks weird), but why CHANGE the colors?


SOUND is a whole other topic I don't know all too much about. I do know that there seems to be a newly remastered soundtrack with just about every new release of a movie. I can understand making a variety of mixes based on users' home setups, but I don't know how many changes are made to soundtracks and would love to hear more in this thread.

What are your thoughts on these changes? Why are there so many versions of films (aspect ratios, not just cuts!)? Does releasing a film in 2.35 really make it more "cinematic", and if not, then why not release it open matte? I look forward to this discussion.
#2
About HDR... most older movies were shot on negative, which easily captures a bigger dynamic range than SDR, so a mild HDR grade is actually a nice way to have them look correct. Even a theatrical release print slightly exceeds the dynamic range of SDR in my opinion, at least sometimes. It's certainly not "full-fledged" HDR utilizing the entire contrast range HDR offers, but it's a little upgrade nonetheless, avoiding clipped highlights. And about getting rid of the contrast between lights and shadows - that's the opposite of what HDR does. HDR gives you greater contrast. But in order to store all of that in a file, the transfer curve for that file is rather flat, so a raw HDR Blu-ray stream looks very flat; properly decoded, though, it has a much greater dynamic range (contrast range) than SDR.
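
If you're curious why the undecoded stream looks that flat, here's a rough Python sketch of the PQ curve (SMPTE ST 2084) that HDR10 uses - just an illustration of the math, not code from any real decoder:

Code:
# Sketch of the SMPTE ST 2084 (PQ) EOTF: it packs 0-10,000 nits into a
# 0..1 signal, which is why the raw, undecoded stream looks washed out.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for e in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ code {e:.2f} -> {pq_to_nits(e):7.1f} nits")
# Code value 0.5 decodes to only ~92 nits - half the signal range stays in
# "SDR territory", so viewed without decoding the picture looks very flat.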
#3
Aspect ratios

In the beginning, films were released in the Academy ratio (1.33, then 1.37:1); in the early '50s, to push people to watch movies in theaters, the studios began to release movies in "cinematographic" ratios wider than 1.33:1, and theaters then had to adapt their screens to display such wide images. Apart from the anamorphic titles - "tied" to their widescreen format - the others (talking about films here) were shot full frame and then soft matted (apart from a few instances); hence, for these titles, a full frame original negative exists.

For home media releases, fullscreen 1.33:1 was the easiest (read: least expensive) way to transfer a (non-anamorphic) film. Later, with the introduction of digital cameras - since Episode II - aspect ratios settled more or less around 16:9; and as many viewers hate black bars (most of them don't even understand why they're there), broadcasters started to transmit movies in 1.78:1 (16:9) - not a very big change from 1.85:1 in the director's eyes, but quite a big one from 2.35:1.

There is IMHO a very simple solution that would please everyone: release movies in open matte, but provide a way to add black bars so as to retain the OAR; it would be possible if directors filmed with this in mind - full frame for home viewers, a centered widescreen frame for cinema viewers. Or, even better, filming and releasing a movie in a single aspect ratio, whichever the director/studio decides, would avoid any "aspect ratio war"!
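
Just to show how trivial the bar math would be for a player, a quick Python sketch (my own toy example - hypothetical player logic, nothing that exists on any disc):

Code:
# Toy version of the "open matte + optional black bars" idea: given a 16:9
# open matte frame, compute the matte needed to present a wider OAR.
def letterbox(frame_w, frame_h, oar):
    """Return active picture height and per-side bar height for a target OAR."""
    active_h = round(frame_w / oar)   # assumes the OAR is wider than the frame
    bar = (frame_h - active_h) // 2
    return active_h, bar

for oar in (1.85, 2.39):
    active, bar = letterbox(1920, 1080, oar)
    print(f"{oar}:1 on 1920x1080 -> {active} px picture, {bar} px bars top/bottom")
# 1.85:1 keeps 1038 px (21 px bars); 2.39:1 keeps 803 px (138 px bars).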

Color grading

This would take pages of discussion, so I'll be brief: once upon a time there were only CRT displays, and movies released for home viewing were color corrected to look similar to the projected film when viewed at home on a CRT; that's why many movies mastered in the old days look "wrong" on modern LCD/OLED displays (plasma, I'm not so sure). So there are (usually) two main gradings of the same movie: the old one made in the CRT era, and the new one made in the flat-panel era. There are a few with three, four or more gradings, sometimes with the DOP saying "this version has THE right color" about two different releases...
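
Display gamma alone explains part of it (phosphors and viewing environment matter too); a quick sketch - my own rough illustration - of how much the shadows drift between a ~2.4-gamma CRT and a display assuming gamma 2.2:

Code:
# Same code value, different light output: a master graded on a ~2.4-gamma
# CRT plays back lighter in the shadows on a display assuming gamma 2.2.
def to_light(code, gamma):
    """Relative luminance (0..1) a display produces for a normalized code value."""
    return code ** gamma

for code in (0.1, 0.3, 0.5):
    crt, panel = to_light(code, 2.4), to_light(code, 2.2)
    print(f"code {code:.1f}: CRT {crt:.4f} vs flat panel {panel:.4f} "
          f"({panel / crt:.2f}x brighter)")
# The drift is biggest near black - exactly where old masters tend to look
# milky or washed out on modern displays.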

Sound mixes

In the days of VHS/laserdisc, it was also simpler (again, read: less expensive) to transfer audio tracks straight to home formats; those "should" be the closest way to hear the original mix - apart from getting film scans or Cinema DTS discs, but I digress.

In the end, without all these different versions, and the quest to find which one is the (most) right one - usually one version has the right video, another the right color, yet another the right mix - where would the fun be for us? ;)
#4
(2020-08-01, 09:23 PM)TomArrow Wrote: About HDR... most older movies were shot on negative, which easily captures a bigger dynamic range than SDR [...]

Perhaps I'm using the wrong terminology then, if HDR increases the contrast range. Take a look at this Terminator 2 Skynet vs. 4K comparison, specifically at 1:57 and 2:37. The highlights are wayyyy toned down in the HDR version. Does the HDR effect only work if you have a "true" HDR display?

https://www.youtube.com/watch?v=IFC2aiHxiG4


I'll take a solid look at my TV settings and play T2 again and get back to you.
#5
Quote:SO WHY NOT just release them in 1.78?
Often the director and DOP are thinking in terms of 2.35 to begin with, because ideally they'd be using anamorphic lenses. That's what they want the movie to look like, even if they have to compromise and crop from Super 35 instead. Shooting Super 35, however, is lower-maintenance than native anamorphic, and back when home video releases were prepared in the days before everyone owned an HDTV, it gave the producers better options for pan-and-scan. When that frame is cropped to theatrical 2.35 they aren't sacrificing information; it's stuff that they never necessarily cared to have in the frame to begin with.
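
Some back-of-envelope numbers (approximate negative dimensions, my own illustration) to show how much of the Super 35 frame each extraction actually uses:

Code:
# Full camera aperture on Super 35 is roughly 24.89 x 18.67 mm (~1.33:1).
# Every extraction shares the full width and takes a horizontal band of it.
NEG_W, NEG_H = 24.89, 18.67

for name, oar in (("2.39 scope", 2.39), ("1.85 flat", 1.85), ("1.78 video", 16 / 9)):
    used_h = NEG_W / oar
    print(f"{name}: {used_h:.2f} of {NEG_H} mm -> {100 * used_h / NEG_H:.0f}% of the height")
# Scope uses only ~56% of the negative height; the rest is protected area
# the crew kept clean for TV, not composition anyone framed for.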

Quote:AND WHAT ABOUT COLOR?
Revisionism. Sometimes there are middlemen who change things around; sometimes someone with authority gets involved. A director can do a project in 1995, come back in 2020 for a new release, decide that the movie doesn't fit his current sensibilities, and want to tinker with the colors. It's that adage that says an artist never finishes his work; he only abandons it. For the audience it's different; to you or me a movie is an objective product but to the artist it's a collection of brushstrokes and scribbles that still have the potential to be moved around. More often, though, it's because studios are paranoid that their catalog will become irrelevant if they don't adjust the look to match current trends. You mentioned Disney; you might have seen, for instance, how they scrubbed all the grain from "The Sword in the Stone" so that it would resemble Flash cartoons.

Quote:SOUND is a whole other topic I don't know all too much about

Even before home video it was common for the sound editors to tinker with a movie after the theatrical release, so that a more polished mix could be heard on TV airings. Often the casualties are foley sounds, which they might trade for something more direct or more realistic.
Sometimes for home video they worry that high-end hiss will irritate people, so they'll do a roll-off that cuts all that stuff out, and if done lazily a lot of detail and crispness will be lopped off as well. But that's a bad practice, not necessarily changing the content.
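
For illustration, here's roughly what such a lazy roll-off amounts to - the cutoff frequency and filter order are made up, just to show the principle:

Code:
# A crude low-pass "roll-off" like the ones described above: it tames hiss,
# but everything else living up there (air, crispness) gets shaved off too.
import numpy as np
from scipy.signal import butter, sosfilt

fs = 48000                                   # sample rate in Hz
sos = butter(2, 8000, btype="lowpass", fs=fs, output="sos")  # gentle 8 kHz cutoff

t = np.arange(fs) / fs                       # one second of audio
mix = np.sin(2 * np.pi * 1000 * t) + 0.3 * np.sin(2 * np.pi * 12000 * t)
filtered = sosfilt(sos, mix)                 # the 12 kHz "detail" is attenuated along with any hiss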
#6
(2020-08-01, 09:46 PM)crumpled666 Wrote: Perhaps I'm using the wrong terminology then, if HDR increases the contrast range. [...] Does the HDR effect only work if you have a "true" HDR display?

That YouTube video is tonemapped. Google "tonemapping" for a bit if you don't know what it is. That's not how the image actually looks in HDR; it's just an automated downconversion to SDR, which reduces contrast in order to fit the dynamic range into SDR, resulting in a low-contrast image.

To judge HDR, you need to watch it on an actual HDR display, or at least map the image values linearly (which will cut off/clip highlights) to get an idea of the actual contrast.

The only thing you can tell from that YouTube clip is that the highlights are protected/not clipped in the HDR version.
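
If you want to see the difference in numbers, here's a toy comparison of the two approaches - a hard linear clip versus a simple Reinhard-style curve (my own sketch, certainly not what that channel actually does):

Code:
# Linear map with clipping vs. a Reinhard-style global tonemap, both
# squeezing HDR luminance into an SDR-ish 0..1 range.
def linear_clip(nits, peak=100.0):
    """Scale 0..peak nits linearly to 0..1 and clip everything brighter."""
    return min(nits / peak, 1.0)

def reinhard(nits, peak=100.0):
    """Compress highlights smoothly instead of clipping them."""
    x = nits / peak
    return x / (1.0 + x)

for nits in (50, 100, 400, 1000):
    print(f"{nits:5d} nits -> clip {linear_clip(nits):.2f} / reinhard {reinhard(nits):.2f}")
# The clip slams everything past 100 nits to 1.0; Reinhard keeps the
# highlight separation but lowers overall contrast - hence the flat look.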
#7
(2020-08-01, 09:46 PM)crumpled666 Wrote: Perhaps I'm using the wrong terminology then, if HDR increases the contrast range. [...] Does the HDR effect only work if you have a "true" HDR display?
Never EVER use that channel's videos as an example. Their HDR->SDR tonemappings are absolutely awful and in no way represent the actual HDR image.
#8
(2020-08-01, 10:56 PM)Setzer Wrote: Never EVER use that channel's videos as an example. Their HDR->SDR tonemappings are absolutely awful and in no way represent the actual HDR image.

OK. But if the viewer doesn't have a true HDR monitor, wouldn't the HDR footage look the way it does in that video? After all, T2's 4K HDR digital release appears to be exactly the same as what this guy's showing (I don't have the 4K disc myself). Again, I'll get back to you both about what happens when I adjust my TV settings.

But my question remains: if viewers are watching THAT version (the 4K HDR one displayed on the channel, which also appears to be exactly the same as the iTunes digital version) WITHOUT a proper HDR display, then that's what they will see, correct?

How can the viewer ever know if they are watching the film with its actual intended colors?
#9
There is no fixed or standardized way (to my knowledge) to do tone mapping, so every player will have its own implementation, and each implementation usually has a few parameters to tune. In short, every tonemapped version will look slightly different.

Well, technically even HDR displays tonemap when the dynamic range of the content exceeds that of the display. I'm not sure what the perfect solution is, but certainly an HDR display will serve you better than an SDR tonemap. The better the display (higher max nits), the less tonemapping will be done. Other than that, you probably need a reference display?
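
A toy sketch of that idea - an invented soft-knee curve, not any vendor's actual algorithm - showing why peak nits matter:

Code:
# Hypothetical display tonemap: pass levels through up to a knee, then roll
# off the excess into whatever headroom the panel has left.
def display_map(nits, display_peak, knee=0.75):
    """Map scene nits to panel nits for a display with the given peak."""
    start = knee * display_peak
    if nits <= start:
        return nits                          # within range: no tonemapping
    excess = nits - start
    headroom = display_peak - start
    return start + headroom * excess / (excess + headroom)

for peak in (600, 1000, 4000):
    print(f"2000-nit highlight on a {peak}-nit panel -> {display_map(2000, peak):.0f} nits")
# 600-nit panel: ~587; 1000-nit: ~958; 4000-nit: passes through untouched.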
#10
Quote:It's stuff that they never necessarily cared to have in the frame to begin with.

Whoa..... If they didn't want something in the frame, then why would they have framed the shot that way? Even if they present the film theatrically in 2.35, they're planning on making a television release in standard widescreen. My question still remains: why do these films continue to be presented in scope if they were shot on a larger frame? Why wouldn't we want more image? Is it just me? Does 2.35 really make something look more "cinematic", and if so, what is it about it?

Quote:because ideally they'd be using anamorphic lenses

Why do you say anamorphic lenses are ideal compared to cropped Super 35 or, say, Techniscope? Anamorphic widescreen was invented to combat television, getting audiences into the theater to see that wide screen. If audiences really want 2.35, does it matter whether it's anamorphic? And again, what's up with movies STILL being in scope? Is it just because they're movies and that's how they stay distinct from TV?

I remember there was some online resistance to Joss Whedon shooting The Avengers in 1.85. Why, though? You get more picture.


Quote:It's that adage that says an artist never finishes his work; he only abandons it. For the audience it's different; to you or me a movie is an objective product but to the artist it's a collection of brushstrokes and scribbles that still have the potential to be moved around.

Stated beautifully, @Lio!

Quote:But that's a bad practice, not necessarily changing the content.

So what's with all the debate around sound and whether it's 4 channels or 6 channels or 5.1...?

Thank you for taking the time to read all of that.

