2017-05-01, 11:19 AM
Bitrate is a difficult "animal" to tame... in a perfect world, we simply wouldn't need compression for video and audio, and everything would be uncompressed (or, wisely, losslessly compressed); maybe one day that will happen, but until then we have to rely on lossy compression to get our audio and video content.
So, which is the "right" bitrate to use? The best answer is "it depends"! Really, there are many variables that influence it: resolution, codec, codec settings, media size (physical storage, or internet bandwidth), and the end user's quality expectations... I mean, I'm "expecting" higher quality from a BD than from Netflix, right?
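To get a feel for one of those variables, here is a quick back-of-the-envelope sketch (plain Python, nothing encoder-specific; the function name is just for illustration) that turns an average bitrate and a duration into an approximate file size:

```python
def estimate_size_gb(bitrate_bps, duration_s):
    """Rough file size in gigabytes for a given average video bitrate.

    Ignores container overhead and audio tracks, so treat it as a
    lower bound on the real file size.
    """
    return bitrate_bps * duration_s / 8 / 1_000_000_000

# A 2-hour movie at a 20 Mbps average comes out around 18 GB:
print(f"{estimate_size_gb(20_000_000, 2 * 3600):.1f} GB")  # 18.0 GB
```

That's why "media size" sits on the list above: the same two hours at Netflix-style 5 Mbps would be roughly a quarter of that.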
While it's true that higher bitrate = higher quality, some people still can't believe that, given sufficient bitrate, different encoders get similar results...
Just for fun, I made three test clips using three different encoders; I took a difficult scene, full of movement, fog and fire, to stress them; a fair figure of 20 Mbps average bitrate, 50 Mbps max; 1920x1080, 23.976 fps; the encoders used were:
h264 - Simple X264 Launcher
Mpeg-2 - HCenc
Mpeg-1 - TMPGenc
The clips are here: https://mega.nz/#F!kjo3wKJC!jxoZnkzveaP2f2zZilaXdA
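Those two figures (20 Mbps average, 50 Mbps max) are the usual two-cap model: the encoder may spend up to the max on a hard scene, as long as the overall average stays on target. A toy sketch of that check (plain Python, hypothetical names, not tied to any actual encoder's rate control):

```python
def respects_caps(per_second_mbps, avg_cap=20.0, max_cap=50.0):
    """True if a series of per-second bitrates stays within both
    the average cap and the instantaneous max cap."""
    avg = sum(per_second_mbps) / len(per_second_mbps)
    return avg <= avg_cap and max(per_second_mbps) <= max_cap

# A fire/fog burst at 45 Mbps is fine if quieter seconds balance it:
print(respects_caps([45, 30, 10, 5, 5]))  # True (average is 19)
# But a sustained 30 Mbps blows the average cap:
print(respects_caps([30, 30, 30]))        # False
```

Real rate control (VBV in x264, for instance) works on a buffer model rather than a simple average, but the trade-off it manages is the same one.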
Well, in motion, at the right distance (read: not too far, but not too close either), they look really similar and do their job; of course, frame by frame, magnified, the differences are evident, in particular with Mpeg-1; but if you consider how old that codec is, the result is outstanding!
Of course, nobody would use Mpeg-1 anymore - there are much better and more efficient codecs out there - but this little test is useful at least to understand that HDTV broadcasts and streaming services, albeit far from perfect, can have really high quality if well encoded, and could even be used as the source for a project.
So, do not discard a video only because of its bitrate and/or the codec used...