

4K 10bit Rec2020 HDR to Lossless format for editing?
#11
Based on the standards of the time, I would think not. Blu-ray is defined as HD/Rec709, which is SDR only. When HD was defined (the spec Blu-ray is built on), the de facto TVs were still CRTs, so the standard had to be built for those devices, meaning the SDR range of 0 to 100 IRE. In fact, I'm pretty sure MPEG-2 and H.264 (as originally defined) could only do SDR.

Can you graft HDR onto that? Sure, you can technically add HDR to the 709 space or to an H.264 stream or really anything. It's getting it to play back automatically that is the problem. It's probable that the display won't know to turn on HDR with an MPEG-2/H.264 stream, so you will be seeing that gray, washed-out look.
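By the way, you can check what a stream actually signals with ffprobe (the input path here is just an example):

Code:
ffprobe -v error -select_streams v:0 -show_entries stream=color_space,color_transfer,color_primaries "F:\Input.mkv"

An HDR10 stream will report color_transfer=smpte2084, while a standard Blu-ray stream reports bt709. If those flags aren't there, the display has no reason to switch modes.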

But you are right, HDR and color are often confused, due to bad marketing. HDR does not create new colors, but it will make your colors pop.
#12
Oh yeah, I just meant that it might be technically possible to create some kind of HDR Rec709 file. Applying the PQ curve to Rec709 RGB values shouldn't be too difficult. However, yeah, most if not all existing encoders/decoders likely wouldn't know what to do with such "contradictory" information, and there would be no real practical use for such a concept either.
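For reference, the PQ curve (the SMPTE ST 2084 inverse EOTF) that would have to be applied is:

N = ((c1 + c2*Y^m1) / (1 + c3*Y^m1))^m2, with Y = L / 10000

where L is the luminance in nits, N is the encoded signal value, and the constants are m1 = 0.1593, m2 = 78.84, c1 = 0.8359, c2 = 18.8516, c3 = 18.6875. Apply that to linearized Rec709 values and you'd have the "contradictory" file I mean.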

A bit off topic, but I remember a time when all HDR meant to anyone was the exposure stacking of photos and the subsequent hyper-ugly tonemapping that sought to bring out every last pixel of detail from any photo, creating intolerable halos and artifacts throughout the image, but that was seen as the "HDR look". Big Grin I once saw a guy use the HDR photo stacking technique in a way that actually still looked good (with good-looking tone mapping), and people criticized him and said it was not "true HDR" because, basically, it didn't look ugly. Big Grin
#13
(2021-01-21, 07:17 PM)TomArrow Wrote: A bit off topic, but I remember a time when all HDR meant to anyone was the exposure stacking of photos and the subsequent hyper-ugly tonemapping [...]

I remember the same confusion, as I was experimenting with HDR still photography at the time and was fearful of movies getting that heightened "HDR look".

The irony to me, if you think about it, is that's exactly how we get HDR from 35mm film: three exposures, one high, one mid, one low, combined together.

I guess HDR really is the outcome and not the process to get there?

One of the unfounded complaints is that any 35mm film (or 16, 70, etc.) that is given an HDR grade is a lie, or basically untrue to the original OCN or IP; that HDR is a fake add-on. That's not true. You might be able to argue about the peak brightness of projected film, but pretty much every film ever shot has more latitude in its grayscale than SDR has. HDR gives you more of a chance to duplicate that latitude. It is basically the same old complaint that 35mm can't do HD or, later, UHD. The vast, vast majority of films shot had more than enough resolution and color gamut to accommodate HD and most of UHD. Again, talking OCNs and maybe IPs. The problem lies when people become revisionist about a movie. Just like in the HD era, you can remake the film into whatever colors you want. That has nothing to do with the system and more to do with senile directors or modernist colorists.

The irony is that old 35mm films are in a better position to do HDR than any movies made in the 2000s and most of the 2010s. Those 35mm DI and pure digital movies were designed to meet the DCI/P3 (DCP) standard of movie theaters, and that standard did not have an HDR component at the time. So basically any HDR grade on a movie of that era is a lie. Plus there's the obvious upscale of movies locked to 2K. That era of moviemaking was very short-sighted. A movie from the 80s or 90s can, in theory, do better HDR than a movie shot in the 2000s.
#14
(2021-01-21, 09:12 PM)PDB Wrote: The irony to me, if you think about it, is that's exactly how we get HDR from 35mm film: three exposures, one high, one mid, one low, combined together. [...]

Yep, it's a similar process, if not identical. Except film's dynamic range is not as exorbitantly high as that of real-world scenes, so two shots are enough to get a good picture.

I suppose the demarcation between HDR and SDR is a bit arbitrary. I think what's typically seen as HDR is more than 12 stops of dynamic range. And yeah, I totally agree that HDR makes sense for 35mm scans. It's not a lot of extra dynamic range, but many films do have a little more than SDR lets you comfortably display. I'd say most prints could profit from just leaving everything linear and pushing 2 stops, so that the peak highlights sit around 400 nits or so. Most people would likely call this "fake HDR", but I think that's the same obsession as people back then demanding that every picture take the HDR process to the absolute maximum. I think it's much healthier to see HDR simply as a tool with a very high maximum, but no requirement or obligation whatsoever to actually reach that maximum. The way I see it, it simply allows you to preserve the extreme highlights, which would otherwise have to be flattened or clipped, the way they are on the print, and that's good enough for me.
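(The math on the 400 nits: pushing n stops multiplies luminance by 2^n, so starting from SDR reference white at 100 nits, 100 * 2^2 = 400 nits peak.)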

A bit ironic, perhaps, is that SDR "technically" does have roughly 12 stops of dynamic range (or more if you use a higher bit depth), and that's kind of enough for many prints out there, I'd say. A single-exposure (non-HDR) scan can theoretically capture just about that, in 12 bit. However, that doesn't factor in noise, which tends to make the darker stops unusable and, well, noisy. So the second exposure doesn't necessarily increase the dynamic range so much as clean up the noise in the shadows, but I'd say it's still very much worthwhile, as the improvement in image quality is very noticeable.
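(That follows from the bit depth: a linear 12-bit scan spans code values 1 to 4096, and log2(4096) = 12 stops between the smallest step and full scale; in practice, noise eats the bottom few of those stops.)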

I imagine that a bleach bypass print might have a notably higher dynamic range than a normal one, due to retaining silver (bypassing the bleach process that removes it), and the silver has a higher and more uniform spectral density than the 3 dye layers combined, so black areas, I imagine, would have the potential to be much denser. I would love to see an HDR/non-HDR scan comparison.

Now, when talking about a negative, that's a whole different beast. The dynamic range of a negative might not be higher in the sense of requiring a higher-dynamic-range scanner, but higher in the sense that a negative records a rather flat image and the contrast basically has to be added back in. So if the film characteristics are known, you can probably comfortably unfold a negative to a rather high dynamic range and thus recover the original scene's dynamic range. The question, of course, is whether that is a sane approach, especially with old movies where this was never intended, and it might very well ruin the atmosphere. But technologically it's fascinating. I imagine that especially for old documentaries and such, which aim to preserve the way the scene originally looked, this might be a godsend.

I think even the films done with a DI actually have slight potential for HDR. The reason is that they were graded on CRTs and likely in 10 bit, plus DCI-P3 has a gamma of 2.6 afaik, so there's actually a lot more dynamic range potential there than in an 8-bit Rec709 video with a gamma of around 1.9-2.4 or so. So just preserving the original DCI-P3 master could probably already yield nice HDR results. I imagine that for the SDR Blu-rays back then, they had to make compromises when mapping the DCI-P3 into the Rec709 space: either add gamma or other curves to squeeze the entire dynamic range into the SDR Blu-ray (at the expense of image contrast), or preserve the contrast, but at the expense of either the shadows or the highlights.

Actually, I theorize that a lot of the Disney Blu-rays etc. that look so flat chose the first option: to sacrifice contrast and instead preserve the entirety of the dynamic range.
#15
Wow, this is really a wealth of information.
Thank you for putting it as best you can into layman's terms.
I was / am / have been a professional editor for 20 years (self-taught), but I was more of a 'cutter', and for most of my time in the field I relied heavily on picking up technical information like this, from forums like this, once a problem presented itself.
I'm going to read and re-read the thread a few times to digest.

My editing work has become much more of a hobby over the past 5 years, due mainly to a bad set of circumstances and the introduction of my little one. I had to throw in the towel professionally to put food on the table, so upgrading my media kit became a low priority, and in turn so did my knowledge of how things have evolved.
Fanres has become my little connection to that life, and I thank you all for taking the time, sharing your knowledge and keeping that part of me alive.

Now... I'll most probably be back with more questions soon Hahaha
#16
(2021-01-20, 04:28 PM)PDB Wrote: My current method of conversion is to use FFmpeg to convert UHD to HD. I found bits and pieces of the correct command all over the place, but Chew helped me pull together the best version. Here is an example:

Code:
"C:\ffmpeg64.exe" -i "F:\Input.mkv" -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable,zscale=t=bt709:m=bt709:r=tv,format=yuv422p10le,scale=1920:1080" -sws_flags sinc -f mov -c:v prores -pix_fmt "yuv422p10le" -profile:v 3 -an -y "F:\Output.mov"

This converts a 4K-2020-HDR MKV to 1080p-709-SDR-4:2:2 10-bit ProRes so I can edit or color it. Obviously, different settings give different results, and you can substitute what parts you need in or out. The most logical alternative would be converting to a simple 4K-2020-SDR 10-bit ProRes, if needed. You just need to leave in the HDR-to-SDR tonemapping.

What about 709 HDR to SDR?
[Image: sNn6jyF.png] [Image: 0sPZMBH.png]
#17
(2021-01-23, 12:49 PM)Valeyard Wrote: What about 709 HDR to SDR?

I dunno how exactly ffmpeg does it, but the typical AviSynth tonemapping approach actually does the tonemapping in the Rec709 color space already, afaik, once the HDR PQ Rec2020 data has been unfolded into 32-bit floating-point linear light information.
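Breaking down the chain PDB posted, ffmpeg appears to do the same thing. The annotations are mine; the steps run top to bottom:

Code:
zscale=t=linear:npl=100      # undo the PQ curve -> linear light (100-nit nominal peak)
format=gbrpf32le             # 32-bit float planar RGB
zscale=p=bt709               # convert primaries 2020 -> 709, still linear
tonemap=tonemap=hable        # compress the HDR range into SDR (already in 709 primaries)
zscale=t=bt709:m=bt709:r=tv  # re-apply 709 gamma, 709 matrix, TV range
format=yuv422p10le           # back to 10-bit 4:2:2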
#18
(2021-01-21, 09:35 PM)TomArrow Wrote: I suppose the demarcation between HDR and SDR is a bit arbitrary. I think what's typically seen as HDR is more than 12 stops of dynamic range. [...]

All you have to do to get to HDR is get to 101 nits. Smile

I'm joking, but not by much. For example, Roger Deakins has been vocal about disliking HDR, so a movie like Blade Runner 2049 is basically SDR, with a few shots getting close to 200 nits at best. Most of it fits comfortably in the SDR range. Contrast (no pun intended) that with Pan (2015), which is often used as an HDR demo, since its HDR goes beyond 4000 nits in many spots. Both are technically HDR, but the HDR experience is wildly different.

And that is one of the legit complaints about HDR(10): it is ill-defined. You can master a movie to whatever peak brightness you want. I can do one movie at 400 nits peak and another at 10,000. That is great for artistic options but offers little in consistency. I think the current groupthink about grading to approximately 400 nits is both grading to the lowest common denominator and a product of the technical limitations of mastering monitors.

Most TVs can't do HDR over that 400 level. Hell, OLEDs, the kings of black, can nominally only do about 450 nits consistently, while topping out at 700-ish for a limited time (although LG promises much brighter OLEDs this year). Most mid- to high-level FALDs can easily do over 1000, some peaking at 2000. But most LCD TVs that consumers have also range in that 200-400 area. So it may be extremely nearsighted, but colorists are giving consumers the best product for the TVs they have.

Also consider that most professional grading monitors still use OLEDs, which just can't get that bright. If you go over the OLED's peak brightness, the panel has to "guess" by remapping the HDR to within its reach. Any colorist worth his or her salt will just grade to what their panel can do. It's why Sony and others are moving back to LCDs for their mastering monitors: they are sacrificing absolute black for more brightness to get better, brighter HDR grades. Sony's newest is actually a dual-layer LCD, trying to get the maximum black out of LCD tech while using the raw light output of LEDs.

But to your point about the DCP's superior gamma: you are right, but it's like saying these movies (DIs and pure digital) were graded at 2K, which is greater than 1080p, therefore the upscaled UHD should have more picture information than, say, a 1080p BD. In practice, I have rarely seen enough extra real detail in the upscale to make the UHD worth it. And I have rarely seen 2K-based HDR UHDs that are superior to SDR BDs. In fact, I think they are probably upscaling the 1080p masters, not the DCP masters, and putting a "fake" HDR grade on top of that.

All those factors are why, in general, I stick to SDR BDs for the 2000-2010 era of filmmaking. It's only in, say, the last 5 to 6 years that digital cameras can do 14+ stops and in-camera HDR. Combine that with color grading having moved from a 2K to a 4K pipeline, and those movies can take advantage of the UHD format the same way older 35mm films can. Actually better than 35mm, if you do it right. Finally, after 20 years, digital technology has caught up to and started to exceed the 35mm film standard.
-------------------------
It's not just Disney; flat seems to be the order of the day. I'd say 90% of the BD/UHD transfers I watch are graded for maximum range, with only a few focusing more on a contrasty look. That's why I like looking at 35mm release print scans. It's almost impossible to get the dynamic range of a negative from a release print, so you might as well lean into that contrasty look. Besides, the more I see, the more I truly believe that filmmakers were counting on the contrast to hide effects or create a mood that the negative doesn't have.

Not only is the processing of the print going to be a factor in the dynamic range, but so is the stock itself. Early color film had terrible dynamic range; it's why 50s-era films seem to lack contrast. Current Kodak Vision stocks are very neutral and meant to have as much dynamic range as possible, so you have "options" in the DI.

But if you are looking for the king of dynamic range, you don't want bleach bypass; you want to look at 30s and 40s films shot on B&W nitrate stocks. The grayscale of a film scanned from a nitrate negative should have over 12 stops, no problem, and that's with higher levels of contrast. That's another of those misconceptions: that B&W films can't benefit from HDR, when in fact they benefit from it the most. Again, it comes from that confusion about HDR being all about color.
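(To put a number on that: stops and film density are related by stops = D / log10(2), so 12 stops corresponds to a density range of roughly 3.6D.)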

Sadly, most of the nitrate negatives are gone, long ago transferred to safety stock, because nitrate is dangerous and studios were short-sighted. I've seen a few nitrate release prints shown at 35mm presentations, and they blew the socks off physical media. And those were release prints, not the negatives.

(2021-01-23, 12:05 PM)CSchmidlapp Wrote: Wow, this is really a wealth of information. [...] My editing work has become much more of a hobby over the past 5 years, due mainly to a bad set of circumstances and the introduction of my little one. [...]

Kids change everything, don't they?

Our problem here on Fanres is in the conversion of HDR to SDR. It's part art and part math problem: the art of squeezing 6 pounds of cr*p (HDR) into a 5-pound bag (SDR). Absolute black is absolute black in both systems, but beyond that nothing is the same, and different AviSynth or FFmpeg plugins handle the conversion differently. I tested a bunch of different ways and came to that FFmpeg solution as my go-to. Not because it handles everything perfectly, it doesn't, but it seems to produce the best overall conversion for the most possible UHDs. It's a good middle ground.
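For reference, the 4K-2020-SDR alternative I mentioned would look something like this. An untested sketch: the same chain minus the 709 conversion and the downscale, with the same placeholder paths:

Code:
"C:\ffmpeg64.exe" -i "F:\Input.mkv" -vf "zscale=t=linear:npl=100,format=gbrpf32le,tonemap=tonemap=hable,zscale=t=bt709:m=2020_ncl:p=2020:r=tv,format=yuv422p10le" -f mov -c:v prores -pix_fmt yuv422p10le -profile:v 3 -an -y "F:\Output.mov"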
#19
@PDB Totally agree, old B&W HDR might be amazing. I believe the reason those films were so contrasty wasn't the nitrate itself (well, I believe it had a clearer base, so that likely contributed too), but that they were using higher amounts of silver at the time and reduced that later. But I think another factor is motion clarity, which is superior on 35mm or on a CRT by default compared to some LCD-type display, and makes everything look "crisper", of sorts.

About the DCI-P3 thing... the reason I said it is because I've seen quite a few ProRes trailers. ProRes is 10 bit, and boy, does it ever make a difference in shadow detail. You can get right down to the camera sensor noise by blowing up the exposure! A Blu-ray would crap out at that point, not only because of the 8 bits but also because of blocking in dark areas. But I suppose you're right in saying that the camera technology back then, or even the scanning technology, wasn't necessarily up to it. So even if the digital data could hold that info, it might just not have been there.
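Easy to try at home, e.g. lifting the shadows hard while previewing (the filename is just a placeholder, and eq=gamma is a crude lift, not a proper linearization):

Code:
ffplay -vf "eq=gamma=3.0" "trailer.mov"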

On the other hand, a negative does have its dynamic range compressed to be flat, so it doesn't necessarily require that high-dynamic-range a sensor to scan, and when you linearize that scan back to the dynamic range of the original scene the film was capturing, I think even those older sensors could comfortably encompass HDR, since it's not really HDR on the film. As for whether older color film has a lot of dynamic range... I honestly don't know, but I would imagine it's not too bad. Maybe I'm wrong. I think it might also come down to the people working on HDR remasters not necessarily having access to the original scan files, and maybe not having the expertise to properly unfold the dynamic range anyway, since it would require a very precise curve and controlled, calibrated data, and even that might be simplifying it, given how the different dye layers interact. So the best they can do is just apply some curves by eye and hope it looks OK.

The approach I currently take with HDR grading is to apply a calibration matrix in linear space to get the colors correct, and then to increase brightness to a degree that feels right for any given movie. It's not quite "tried and tested" yet but I think it should logically work.

On a side note, I've recently been confronted with my assumptions about absolute black and realized that the concept is much more confusing and unintuitive than I initially assumed. The reason is that compared to absolute nothing, even the tiniest amount of light is literally an infinite step up in terms of multiplication. So thinking of it purely in terms of stops gets really weird really fast, because when absolute black comes into the game, you are looking at infinite stops, which is obviously a nonsensical, or at least not very useful, number lol. I still haven't quite found a way out of this conundrum...
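(In stop terms: the distance between two luminances is log2(L1 / L2), and as L2 goes to 0 that expression diverges to infinity, which is exactly the conundrum.)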
#20
I do hate the fact that we have no standard in place for colorists and TV manufacturers to work to.
Standardizing at, say, 400 nits would help us all be on the same page, both creatively and at the consumer level.
This is the part of the 'HDR' experience I just don't get; it's always going to look different on different gear.
Not good for people who want to calibrate and experience something as close as possible to what the filmmaker intended.

I like the 10-bit and extended color space of Rec2020.
I would have much preferred a standard peak nit level and higher chroma sampling, like 4:4:4 or native 4:2:2.

I'm still digesting the information and have had no time to play with transcoding etc.
If downconverting from 4K HDR Rec2020 4:2:0 to 1080p SDR, is there a set of commands that takes advantage of the extra information in the resolution and color space to give a correct downsample to 4:2:2 or 4:4:4? (hope that makes sense)
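What I'm imagining is something like the following; an untested sketch built on PDB's command (placeholder paths, and the spline36 resize is just my guess at a sane default). The idea being that the chroma of a 4K 4:2:0 source is already 1920x1080, so after a downscale to 1080p it survives at full resolution, i.e. 4:4:4:

Code:
"C:\ffmpeg64.exe" -i "F:\Input.mkv" -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable,zscale=w=1920:h=1080:f=spline36,zscale=t=bt709:m=bt709:r=tv,format=yuv444p10le" -f mov -c:v prores_ks -profile:v 4444 -pix_fmt yuv444p10le -an -y "F:\Output.mov"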

