(2019-04-23, 05:52 AM)meantux Wrote: I'm the guy that Tom invited here, I transfer 35mm films with my Gugusse Roller..
I cherish the naïve, magical-thinking idea (or should I say wish) that the job of translating the Dolby frames into audio data will only be to serialize the bits into bytes and then pass the result to ffmpeg as an AC3 file. Maybe the AC3 library will recognize this data as its own and even handle the error correction itself.
I hope.
Yeah, it may be a bit naïve, but I have some reasoning that keeps the thought simple:
1.: It was invented at a time when not every household had a computer, let alone one with the power to work with scanned/photographed AC3 stream pictures.
2.: AC3 on LaserDiscs came afterwards, but there was still no real need to change the encoding or decoding technology. It was still safe to require a Dolby AC3 decoder for playback, because home computer technology wasn't yet advanced enough for the data to really be in danger of misuse.
So, I would guess that they just used the same encoding technology, and such a black-and-white coding would be the easy way to transfer a plain stream of bits.
You NEED some controlling parts, like how big the single dots are, and you need the definition of what's a 1 and what's a 0.
My guess would be:
White = 1 and Black = 0.
Why? There needs to be a sensor reading it: with a white part, more light reaches the sensor, that sensor dot is triggered more strongly, and you get a peak of power, while black dots stay powerless.
I can imagine the Dolby sensor was like a sensor in a DSLR, and a small processor translated the image into a bitstream.
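The white=1/black=0 guess above can be sketched as a simple brightness threshold. This is just an illustration of the idea, not the actual Dolby reader logic; the threshold value and the function are my own assumptions.

```python
# Hypothetical sketch: turn grayscale dot samples into bits by thresholding,
# assuming white = 1 and black = 0 (a guess, not a confirmed Dolby spec).
def dots_to_bits(pixel_values, threshold=128):
    """pixel_values: grayscale samples (0-255), one per dot position."""
    return [1 if v >= threshold else 0 for v in pixel_values]

# A bright dot, a dark dot, a bright dot:
print(dots_to_bits([220, 30, 200]))  # [1, 0, 1]
```

A real reader would also have to find the dot grid and its size first, which is exactly the "controlling parts" problem mentioned above.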
It may be that there is an error correction system, but I imagine it would be kind of simple, too.
I can imagine the inventors were hyped about that method, but did not really think of "protecting" the data, just transferring and processing it...
I hope that it might be that simple, but it is worth some trial and error, like calling every third dot an error correction dot. Or maybe every third "code".
BTW: The corners have a size of 8*8 dots, so 4 * 64 = 256 dots are lost to the corners in each code segment...
8 dots make one byte, so if each dot is a bit, 32 bytes are lost in each square...
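The corner arithmetic above checks out; here it is spelled out, assuming four 8x8 corner markers per block and one bit per dot (my reading of the scans, not an official figure):

```python
# Check the corner-loss numbers: four 8x8 fiducial corners per data block,
# one bit per dot.
corner_dots = 4 * 8 * 8          # 256 dots lost to corners per block
bytes_lost = corner_dots // 8    # 32 bytes, if each dot carries one bit
print(corner_dots, bytes_lost)   # 256 32
```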
Edit 2:
If I am not wrong, each second has 516096 bits... after removing the corners and the DD logo...
Edit 3:
Just a thinking:
Couldn't that patent from 1991 be used by its holders to sue the "inventors" of QR codes? Because technically a QR code is not that much different...
So, if Dolby in theatres is 320 kbps, that's 320000 bits per second, if I am not wrong. If we assume a ratio of 2 data bits to 1 error correction bit, that's exactly 480000 bits you need per second.
So, if I calculated correctly, the coded bitrate in the squares is indeed 516096 bits per second.
(After subtracting the corners and the Dolby logo we have 5376 (data) bits per square; that's 21504 bits per movie frame and thus 516096 bits per second.)
So, there are 36096 bits per second that are just "too much" for a 320000 bps stream with an additional 160000 bps of error correction.
After recalculation, there are 376 bits in each square that are "too much" in the error-corrected bitstream.
What could be done with these 376 dots in each image?
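The arithmetic above can be re-run in a few lines. The 5376 bits/block count, the 4 blocks per frame, and the 2:1 data-to-parity ratio are assumptions from counting dots in the scans, not official Dolby numbers:

```python
# Re-run the bitrate arithmetic from the post (assumed figures, not official).
bits_per_block = 5376    # data bits per square, after corners and logo
blocks_per_frame = 4     # one block between each pair of perforations
fps = 24                 # film frames per second

raw_rate = bits_per_block * blocks_per_frame * fps      # 516096 bits/s
audio_rate = 320000                                     # 320 kbps cinema AC3
needed = audio_rate + audio_rate // 2                   # 480000 bits/s at 2:1 ECC
excess = raw_rate - needed                              # 36096 bits/s "too much"
excess_per_block = excess // (blocks_per_frame * fps)   # 376 bits per square
print(raw_rate, excess, excess_per_block)               # 516096 36096 376
```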
It still might be worth a try to create one big bitstream out of the single frames, reduced by the corners and the center.
It really might be so easy that a normal AC3 decoder recognizes these patterns and the error correction. I am sure a bitstream from LaserDisc etc. needs this error correction and side information, too.