A chronicle of the things I find interesting or deeply important, exploring four pillars of intense research: Dynamic Cognition (what everyone else calls AI), Self-Healing Infrastructures (how to build a technological Utopia), Autonomous Work Routing and Action-Oriented Workflow (sending work to the worker), and Supermortality (how to live to arbitrarily long life spans by ending the disease of aging to death).
HDTV without the HDTV
About 5 years ago, I recall noticing the difference in quality between the over-the-air TV signals I received, say, 10 years ago and today's digital cable signals. The main difference lies in the fact that digital TV replaces ghost and snow artifacts with digital pixelization. In the early days I noticed a marked difference in quality: even when both signals were at their best, the local cable provider applied enough compression that various scenes would clearly show the artifacts.

The compression used on the digital signals is mostly MPEG, which uses a discrete-cosine-transform method to compress luminance and, more aggressively, chrominance information to reduce the bandwidth the signal requires for transmission. However, cosine-based compression is subject to quantization errors and to artifacts tied to the choice of quantization kernel size: for scene data that moves faster than the algorithm can encode the chrominance, there is a marked pixelization of the image. There is also a noticeable loss where contrast between on-screen objects is low (it shows particularly well on light-to-dark transitions), and when a frame contains a high level of variation, a fixed compression sampling rate will vary the appearance of the pixelization from frame to frame, making for a horrible effect. If you've watched badly compressed web video on YouTube, you know exactly what I am referring to.

Now, the cable signals aren't "that" bad, but I could see the difference between them and what I was used to seeing with an analog signal or a pure DVD signal from a set-top box; the cable signal wasn't as good as it could be. I recently upgraded my cable service so that my cable receiver can access the "HD" signals that many channels now provide alongside their standard-definition channels. I have a standard-definition flat-screen television, the Toshiba 27AF43, which I purchased 5 years ago mostly for its convenient inputs (component video) and its perfectly flat screen. It provides a clean, sharp, noise-free display for my DVD player (also a Toshiba), and I've used that signal as a reference for just how good the screen is compared to the cable signals it displays when I am watching CNN or the Science Channel; the difference is clear.

The experience suggested that the HD channels might provide quality approaching the DVD signal, and sure enough, upon upgrading to the new receiver and tuning to an HD channel, I was surprised at how much better the signal was. Gone were the obvious pixelization squares in low-contrast transitions, fast-moving scenes, and high-detail scenes. Simply reducing the compression on the digital signal improved it markedly on my standard-def TV. It makes you wonder: as the electronics companies prod us to purchase new HD sets, many of us have existing standard-definition screens that aren't being pushed to their limits of resolution, because the cable companies have so severely compressed the digital signals they send. I have seen an HD screen both on a computer and on an HD monitor, and the difference in quality between 1080i/p and standard def is again obvious, but I wouldn't say the difference is bigger than what I observed going from normal digital cable on a standard-def monitor to HD digital cable on that same monitor.
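To make the quantization point concrete, here is a minimal Python sketch of the effect; the 8x8 block size is the MPEG convention, but the gradient test content and the quantization steps are illustrative values, not real encoder tables. A coarser step discards exactly the low-contrast detail that later reappears as blocking:

```python
# Minimal sketch: coarse quantization of DCT coefficients on an 8x8
# MPEG-style block. The gradient content and step sizes are illustrative.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    # 2D type-II DCT, the transform at the heart of MPEG intra coding
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(coeffs):
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

# A smooth, low-contrast gradient: the kind of content where pixelization shows
block = np.linspace(100, 120, 64).reshape(8, 8)

for step in (4, 32):  # mild vs. aggressive quantization
    coeffs = dct2(block)
    quantized = np.round(coeffs / step) * step  # coarse step discards detail
    restored = idct2(quantized)
    print(f"step={step:3d}  max reconstruction error={np.abs(block - restored).max():.2f}")
```

The aggressive step reconstructs the block with visibly larger error; tile that error over a whole frame, with the step changing as scene complexity changes, and you get the frame-to-frame shimmer of pixelization described above.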
A few of the cable providers are getting away with providing HD quality that only barely exceeds the resolution capability of a standard definition monitor it seems!
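For a sense of scale, here's the raw pixel arithmetic behind that claim. The frame sizes are the standard DVD/ATSC numbers; what any given provider actually allocates in bitrate is the unknown:

```python
# Back-of-the-envelope: raw pixel counts for the formats discussed.
sd = 720 * 480     # standard-definition frame
hd = 1920 * 1080   # 1080-line HD frame
print(sd, hd, hd / sd)  # 345600 2073600 6.0
```

An HD frame carries six times the pixels, so a provider has to spend real bandwidth on it before the extra resolution survives compression.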
Just an observation I thought was worth sharing...
This will surely go down as a seminal advance in cancer therapy. It reads like magic:
So this new approach looks for the specific proteins associated with a given tumor's resistance to attack by the body's T cells, then adjusts those T cells to be hypersensitive to the specific oncogenic proteins targeted. These cells essentially become Terminator T cells for the specific tumor AND have the multiplied effect of traveling along the same immune pathways by which the cancer may have metastasized. This is huge squared, because it means you can essentially use targeting one tumor to identify and eliminate distal tumors that you may not even realize exist.
This allows cancer therapy to, for the first time, end the "whack-a-mole" problem that has frustrated traditional shotgun methods of treatment involving radiation and chemotherapy, which by their nature unfortunately damage parts of the body that are not cancer-laden but …
I have found that as more people without formal training enter the coding space, the quality of the code that results varies in an interesting way.
The formality of learning to code in a structured course at university often involves a strong focus on "correctness" and on efficiency in the form of big-O characterizations of the algorithms created.
Much less focus tends to be placed on what I'll call practical programming, which is the type of code that engineers (note I didn't use "programmers" on purpose) must learn to write.
Programmers are what universities create: students who can take a defined development environment and, within it, write an algorithm for computing some sequence, traversing a tree, or encoding and decoding a string. Efficiency and invariant rules are the guiding development missions. Execution time for creating the solution is often a week or more, depending on the professor and their style of teaching code and handing out problems. This type of coding is devo…
Permissions: fine-grained versus management headache
The usual method for determining which users can perform a given function on a given object in a managed system is to grant those Users specific access rights via permissions. Often these permissions can also be granted to collections called Groups, to which Users are added. The combination of Permissions and Groups allows rights to be disseminated across the User space as atomically as possible. However, this granularity comes at the price of reduced efficiency in managing the created permissions and, more importantly, the Groups that collect the Users designated to perform sets of actions. Essentially, the Groups serve as access control lists in many systems, which, in the variable and often changing environment of business applications, means a need to constantly update the ACLs (groups) in order to add or remove individuals based on their ability to perform certain actions. Also, the…
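As a sketch of the Users/Groups/Permissions model described above (the class names and permission string are hypothetical, not from any particular system), the management headache is visible directly in code: every right flows through group membership, so every staffing change is a group edit:

```python
# Minimal sketch of group-based access control as described in the post.
class Group:
    def __init__(self, name):
        self.name = name
        self.permissions = set()  # e.g. {"invoice:approve"}
        self.members = set()

class User:
    def __init__(self, name):
        self.name = name
        self.groups = set()

def grant(group, permission):
    group.permissions.add(permission)

def add_member(group, user):
    group.members.add(user)
    user.groups.add(group)

def can(user, permission):
    # A user holds a right if any group they belong to grants it
    return any(permission in g.permissions for g in user.groups)

approvers = Group("invoice-approvers")
grant(approvers, "invoice:approve")
alice = User("alice")
add_member(approvers, alice)
print(can(alice, "invoice:approve"))  # True

# The headache: when alice changes roles, someone must remember to call
# add_member / remove her from every affected group, i.e. the constant
# ACL updates the post complains about.
```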