
less fiber, more lambdas should mean lower costs...

The following article was written as a response to an email regarding the linked Wired article, back on 2/27/2006.


Leave it to the greedy corporations, in search of endless revenue streams to satisfy their investors, to come up with this idea.
I am amazed at the audacity of these telecom heads; as if the fact that people (consumers and businesses) already pay for internet access isn't enough, now they want to chop up service on the internet to extract greater profit. It shows yet again why public corporations eventually succumb to evil by nature, IMO; they can't help it. Investor-driven businesses simply cannot afford to have a moral compass.
God, this stuff makes me mad. Though I am glad that, unlike other business-driven initiatives the consumer has no hope of fighting against (anything the big Oil or Tobacco companies do or say), this one has other big businesses in opposition to the plans, namely big-name sites like Yahoo, Ebay and Google. Hopefully the draconian ideas the telecoms are planning will die on the vine, just as they deserve.
The true kicker is that anyone familiar with the technologies the telecoms use to allocate additional bandwidth for internet traffic on their backbones knows they are blowing hot air about having to pay for the increased bandwidth that customers are using. Just prior to the dot-com burst, most telecoms upgraded their infrastructures to include the latest WDM-based switching technology, and they also laid tons of fiber. These two actions have dramatically future-proofed the entire bandwidth acquisition process. Literally, for a telecom to allocate more bandwidth involves no more than enabling a new lambda at both ends of the infrastructure using add/drop WDM switches. So although it is true that the original cost outlay for upgrading their infrastructure was higher, the long-run costs will be next to nothing. The fiber in the ground now will last far longer than even the best copper cable could, and thanks to WDM it never needs to be dug up to increase bandwidth. This means they will be able to allocate the needed bandwidth very easily, and routing traffic regardless of type will be easier and less likely to incur latency as a result, making the entire foundation of their argument for tiered service a huge red herring.

Of course it doesn't strike me as a coincidence that they are coming out with this idea now; they have their backs to the wall. With their traditional business being encroached upon by cable companies and independents like Vonage and Skype, the boards of these telecoms are scrambling for ways to grab a bigger piece of all the services flying around on the net. Also, most of them put themselves in a huge hole during the telecom bust of 2001. No one anticipated that WDM would become so cheap so soon, making all the fiber they were laying irrelevant overnight. Add to this the fact that the need for bandwidth simply did not scale as fast as everyone was predicting.
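To put rough numbers on the lambda argument, here is a toy back-of-envelope model comparing the incremental cost of lighting an additional DWDM wavelength on existing fiber against trenching a new route for the same capacity. Every figure in it (channel counts, per-lambda rate, dollar costs) is an illustrative assumption of mine, not real telecom data:

```python
# Toy back-of-envelope model of DWDM capacity scaling.
# All dollar figures and channel counts are illustrative assumptions,
# not real telecom cost data.

CHANNELS_PER_FIBER = 40   # dense WDM systems of the era supported 40+ lambdas per fiber
GBPS_PER_LAMBDA = 10      # e.g. one OC-192 circuit per wavelength

def fiber_capacity_gbps(lit_lambdas):
    """Usable capacity of one fiber pair given how many wavelengths are lit."""
    assert 0 <= lit_lambdas <= CHANNELS_PER_FIBER
    return lit_lambdas * GBPS_PER_LAMBDA

# Adding capacity on existing fiber: enable a lambda at the add/drop
# multiplexers on both ends (transponders + provisioning).
COST_PER_NEW_LAMBDA = 50_000        # hypothetical figure

# Adding capacity the old way: dig and lay a new fiber route.
COST_NEW_FIBER_ROUTE = 20_000_000   # hypothetical: trenching, rights-of-way, splicing

def cost_to_add_gbps(extra_gbps, spare_lambdas):
    """Incremental cost, assuming dark wavelengths may remain on the fiber."""
    lambdas_needed = -(-extra_gbps // GBPS_PER_LAMBDA)  # ceiling division
    if lambdas_needed <= spare_lambdas:
        return lambdas_needed * COST_PER_NEW_LAMBDA
    return COST_NEW_FIBER_ROUTE  # otherwise a new build is required

# A backbone link with 8 of 40 lambdas lit wants 40 Gbps more capacity:
print(cost_to_add_gbps(40, spare_lambdas=32))   # 4 lambdas * $50k = 200000
```

The point of the sketch is the asymmetry: as long as dark wavelengths remain, extra capacity is an electronics-at-the-endpoints cost, orders of magnitude below the cost of the civil works that were already paid for before the bust.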
Even now, tons of fiber lies in the ground unused because, well, there is no need to use it... but the telecoms paid for it, and now they want the consumer to pay for their provisioning mistake. I am all for them charging for access service as they already do, or even increasing the charge as their costs naturally rise with inflation, but it is the height of greed for them to create "tiers" when none are necessary. This is a classic example of a corporation attempting to diversify a service (in this case, access to the internet) in order to increase profits by selling consumers what they don't really need.

Now IF they can provide the tiered service as an option, and not a requirement based on how much bandwidth a particular business or customer might be using, fine: existing customers can choose to upgrade to the supposedly more reliable service if they wish. Additionally, if the telecoms can provide demand statistics indicating that their customers actually want this tiered service, I would be inclined to say it is fine. But if the customer is NOT demanding it, and introducing it will automatically hike prices for existing high-bandwidth customers (like Google and Ebay, who already pay huge access fees to the telecoms), then there is only one purpose in introducing it: to squeeze more profit out of a cost/service arrangement that is already at equilibrium, which to me is wrong and immoral.
What is your take on this issue?
(The reason for the telecom bust: over-provisioning due to the emergence of cheap WDM and too much fiber.)
