About five years ago, I recall noticing the difference in quality between the over-the-air analog TV signals of a decade earlier and today's digital cable signals. The main difference is that digital TV trades the ghosting and snow artifacts of analog for digital pixelization. Even in the early days, when both signals were at their best, the local cable provider applied enough compression that various scenes would clearly show the artifacts.

The compression used on digital signals is mostly MPEG, which uses a discrete cosine transform (DCT) based method to compress luminance and, more aggressively, chrominance information in order to reduce the bandwidth the signal requires for transmission. However, DCT-based compression is subject to quantization errors and artifacts tied to the quantization step size chosen for the algorithm. For scene data that changes faster than the encoder can keep up with the chrominance data, there is marked pixelization of the image. There is also noticeable loss when contrast between objects on the screen is low (it shows particularly well on light-to-dark transitions). Finally, when there is a high level of variation in a particular frame, a fixed compression sampling rate will vary the appearance of the pixelization from frame to frame, making for a horrible effect. If you've watched badly compressed web video on YouTube, you know exactly what I am referring to.

Now, the cable signals aren't "that" bad, but the difference between them and what I was used to seeing from an analog signal, or a pure DVD signal from a set-top box, was enough to tell me the cable signal wasn't as good as it could be. I recently upgraded my cable service so that my receiver can access the "HD" signals that many channels now provide alongside their standard-definition channels.
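To make the quantization point concrete, here is a minimal sketch of the DCT-plus-quantization step described above, in pure Python. It is illustrative only: real MPEG encoders use per-frequency quantization matrices and motion compensation, not the single step size assumed here, and the sample values are made up. The point it shows is that a sharp light-to-dark edge concentrates energy in a few coefficients, and coarse quantization discards most of the rest, which is where blocky reconstruction comes from.

```python
import math

N = 8  # MPEG works on 8x8 pixel blocks

def dct2(block):
    """2-D DCT-II of an NxN block (the transform MPEG applies per block)."""
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = cu * cv * s
    return out

def quantize(coeffs, step):
    """Divide each coefficient by a step and round. Coarse steps zero out
    the small high-frequency terms, which is where detail is lost."""
    return [[round(c / step) for c in row] for row in coeffs]

# A block with a sharp light-to-dark edge, the kind of transition the post
# says pixelizes badly (values are illustrative 8-bit luma samples).
edge_block = [[235] * 4 + [16] * 4 for _ in range(N)]

coarse = quantize(dct2(edge_block), step=64)
nonzero = sum(1 for row in coarse for c in row if c != 0)
print(nonzero)  # only a handful of the 64 coefficients survive quantization
```

With a coarse step the decoder must rebuild the edge from those few surviving coefficients, so the reconstruction rings and blocks; a gentler step (more bandwidth) keeps more coefficients and the edge stays clean.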
I have a standard-definition flat-screen television, the Toshiba 27AF43, which I purchased five years ago mostly for its convenient inputs (component) and its perfectly flat screen. It provides a clean, sharp, noise-free display for my DVD player (also a Toshiba). I've used that DVD signal as a reference for just how good the screen is compared to the cable signals it displays, and when I'm watching CNN or the Science Channel the difference is clear.

The experience suggested that the HD channels might approach DVD quality, and sure enough, upon upgrading to the new receiver and tuning to an HD channel, I was surprised at how much better the signal was. Gone were the obvious pixelization squares in low-contrast transitions, fast-motion scenes, and high-detail scenes. Simply reducing the compression on the digital signal improved it markedly, even on my standard-definition TV.

It makes you wonder: as the electronics companies prod us to purchase new HD sets, many of us have existing standard-definition screens that aren't being pushed to their resolution limits, because the cable companies have so severely compressed the digital signals they send. I have seen HD content both on a computer and on an HD monitor, and the difference in quality between 1080i/p and standard definition is again obvious, but I wouldn't say it's bigger than what I observed going from normal digital cable to HD digital cable on the same standard-definition monitor. It seems a few of the cable providers are getting away with providing "HD" quality that only barely exceeds the resolution capability of a standard-definition monitor!
Just an observation I thought was worth sharing...
video compression