2022-02-06, 10:11 PM
FWIW I think the codec Dolby uses on 35mm prints is the same as the commercial home format, i.e. AC-3. As I understand it the decoder chipset is (or at least initially was) the same chipset used in DVD players (filmtech discussion here). Knowing how tech firms operate, if it really were a different codec there would have been an alternate designation for it, plus the usual marketing spiel about how it's actually better and more efficient than the theatrical one, blah blah; in reality the domestic flavours were superior only in their higher bitrate (the film version runs at 320 kbps, while DVDs commonly carry AC-3 at 384–448 kbps, up to a maximum of 640 kbps). The difference, methinks, lies in its implementation.
Although the SMPTE standard for film projection is 24fps, according to this filmtech thread the playback speed can fluctuate quite a bit and the digital audio formats will still track, i.e. play back in sync without dropping out and defaulting to the analogue track. My guess is that whereas purely digital playback (from a DVD, a digital file, etc.) is clocked precisely to a fixed speed, the theatrical decoders have to slave their playback to the running speed of the projector at any given moment by parsing the data they receive from the sound reader.

In the case of Dolby SRD the data is printed in the spaces between the perforations, so every frame has 4 stretches of 'blank' data where the sprocket hole itself passes under the CCD; perhaps those gaps in the signal are what the processor uses to determine projection speed? In any case the incoming CCD signal has to be turned into real-time audio by decoding the data and then adjusting the playback rate to match the projector, which is probably why the pure bitstream cannot be tapped at any point inside the processor (it does not conform to a fixed clock), leaving capture of the analogue outputs as the only option. Something like the rough Python sketch below.
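Purely to make that speculation concrete, here's a back-of-the-envelope Python sketch of the idea: estimate the projector's actual speed from the timing of the sprocket-hole gaps in the CCD stream, then scale the audio output clock to match. Every number and name in it is an assumption for illustration (4 perfs per frame at a nominal 24fps, hence 96 gaps per second, and a 48 kHz output clock); it's not how any real Dolby processor is documented to work.

```python
import numpy as np

# Hypothetical sketch only: derive projector speed from the periodic
# 'blank' gaps a sprocket hole would leave in the CCD data stream.
# Assumptions: 4 perforations per 35mm frame, nominal 24 fps, so a
# projector running exactly to spec produces 96 gaps per second.

PERFS_PER_FRAME = 4
NOMINAL_FPS = 24.0
NOMINAL_GAPS_PER_SEC = PERFS_PER_FRAME * NOMINAL_FPS  # 96 Hz

def estimate_speed_ratio(gap_times):
    """Return actual/nominal projector speed from the timestamps
    (in seconds) of successive sprocket-hole gaps in the CCD stream."""
    intervals = np.diff(gap_times)
    nominal_interval = 1.0 / NOMINAL_GAPS_PER_SEC
    return nominal_interval / intervals.mean()

# A decoder slaved to the projector would then scale its audio output
# clock by this ratio (e.g. 48 kHz * ratio), keeping the decoded audio
# in sync even while the projector drifts off 24 fps.
gap_times = np.arange(0, 1.0, 1.0 / 94.0)  # projector running slow: 94 gaps/s
ratio = estimate_speed_ratio(gap_times)
print(f"speed ratio: {ratio:.4f}, output clock: {48000 * ratio:.0f} Hz")
```

Because the output clock wanders with the projector like this, there's never a fixed-rate bitstream inside the box to tap, which would square with analogue capture being the only practical way out.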