2018-02-20, 08:02 AM
(This post was last modified: 2018-02-20, 09:50 AM by ChainsawAsh.)
So I'm trying to encode a 5.1 DTS track using Surcode from six mono AIFF files. The resulting DTS file is perfectly fine - the channels are mapped correctly, it syncs when muxed with video, etc.
But when I pop it into MediaInfo, it says the bit depth is 24-bit, whereas my source material is all 16-bit. I've tried everything I can with Surcode and eac3to to get a 16-bit output file, but no matter what I do, it always reads as 24-bit.
While Googling for advice, I came across a few people who insist that bit depth is irrelevant for lossy codecs, so whether MediaInfo reports 16- or 24-bit shouldn't matter. But others say that standard 48 kHz DTS flagged as 24-bit is incompatible with various programs and players they've used.
So does anyone have any advice on this? How can I force a 16-bit output DTS file? Can I patch the 24-bit DTS file to read as 16-bit without reencoding again and degrading quality (under the assumption that bit depth *is* irrelevant for lossy codecs)? Does it even matter?
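For what it's worth, here's the kind of patch I have in mind: a rough Python sketch that walks the DTS core frames, reads the 3-bit PCMR ("source PCM resolution") field in each frame header, and rewrites the 24-bit codes to their 16-bit equivalents. The bit offsets are my reading of the ETSI TS 102 114 core header layout, so treat this as something to verify against the spec (or a known-good parser), not a tested tool, and only run it on a copy of the file.

[code]
#!/usr/bin/env python3
# Hypothetical sketch: inspect (and optionally rewrite) the PCMR
# "source PCM resolution" field in each DTS core frame header.
# Field layout is my reading of ETSI TS 102 114 -- verify before trusting.
import sys

SYNC = b"\x7f\xfe\x80\x01"   # DTS core sync word (big-endian byte stream)

def get_bits(data, bit_pos, nbits):
    """Read nbits starting at absolute bit offset bit_pos (MSB first)."""
    val = 0
    for i in range(nbits):
        byte = data[(bit_pos + i) // 8]
        val = (val << 1) | ((byte >> (7 - (bit_pos + i) % 8)) & 1)
    return val

def set_bits(data, bit_pos, nbits, value):
    """Write nbits of value into a bytearray at absolute bit offset bit_pos."""
    for i in range(nbits):
        bit = (value >> (nbits - 1 - i)) & 1
        idx = (bit_pos + i) // 8
        shift = 7 - (bit_pos + i) % 8
        data[idx] = (data[idx] & ~(1 << shift)) | (bit << shift)

def pcmr_offset(data, frame_start_bit):
    """Absolute bit offset of the 3-bit PCMR field for one core frame."""
    # Fields after the 32-bit sync: FTYPE(1) SHORT(5) CPF(1) NBLKS(7)
    # FSIZE(14) AMODE(6) SFREQ(4) RATE(5) reserved(1) DYNF(1) TIMEF(1)
    # AUXF(1) HDCD(1) EXT_AUDIO_ID(3) EXT_AUDIO(1) ASPF(1) LFF(2) HFLAG(1)
    # [HCRC(16) if CPF=1] FILTS(1) VERNUM(4) CHIST(2) PCMR(3)
    cpf = get_bits(data, frame_start_bit + 32 + 6, 1)
    return frame_start_bit + 32 + 56 + (16 if cpf else 0) + 7

def main(path, fix=False):
    data = bytearray(open(path, "rb").read())
    pos, frames, pcmr = 0, 0, None
    while True:
        # Naive sync search; a real parser would step by FSIZE to avoid
        # false positives inside the audio payload.
        pos = data.find(SYNC, pos)
        if pos < 0:
            break
        off = pcmr_offset(data, pos * 8)
        pcmr = get_bits(data, off, 3)         # 0/1 = 16-bit, 2/3 = 20-bit, 5/6 = 24-bit
        if fix and pcmr in (5, 6):
            set_bits(data, off, 3, pcmr - 5)  # 5 -> 0 (16-bit), 6 -> 1 (16-bit ES)
        frames += 1
        pos += 4
    print(f"{frames} core frames scanned, last PCMR value seen: {pcmr}")
    if fix:
        open(path + ".patched", "wb").write(data)

if __name__ == "__main__":
    main(sys.argv[1], fix="--fix" in sys.argv)
[/code]

One caveat I'm not sure about: if the header CRC is present (CPF=1), flipping PCMR might invalidate it, and this sketch doesn't recompute it, so I don't know whether strict decoders would complain even if the bit depth flag itself is harmless.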