
Social networking is really AOL's innovation...

In a recent post on Facebook, a friend posed a question I have seen come up a few times over the last few years, but which I have usually answered individually. It is:

"I wonder sometimes, how did Zuck think that this site of his would get so popular even if MySpace was waay popular back then?
What made him keep doing things?"

My answer is simple, and it's based on historical knowledge of what was happening (or not happening) in the space circa 2004. It is also based on what happened in the critical five years before 2004 that made the shift to a new type of social networking possible.

My first answer was:
"Novelty, what is scarce has inherent value....doesn't matter what it is. Find a piece of 20,000 year old fossilized human shit (coprolite) and you'll understand what I mean."

To elaborate: there was nothing conceptually new about what Zuckerberg was doing. What was new was the technology he was using to do it. Almost every feature we have here on Facebook existed in some form on AOL. The key difference was the lack of ubiquitous, web-accessible, real-time feed elements, which became possible only with the XMLHttpRequest object, introduced by Microsoft in 1999 for its Outlook Web Access (OWA) product. The object was quickly modeled into versions for non-IE browsers, and using it enabled background updates to the HTML DOM (Document Object Model), the in-memory tree of all the element nodes in a web page. Because XMLHttpRequest calls are asynchronous, a page can make background calls to external resources and, based on what those calls return, dynamically update its own elements. That lets pages update without the user manually refreshing them, which makes feeds (like this one) far more useful; a minimal sketch of the pattern appears below.

Organizing content elements around a set of "friends" is something AOL was doing way back (1995?) on their service, with their then-innovative approach to IM. But AOL's technology was siloed in two important ways that Web 2.0 social networks were not: it was siloed in hardware, because its users were all on dial-up, and it was siloed in web technology, because it predated the XMLHttpRequest object. That put it at a significant *scale* disadvantage to what came later, once broadband penetration seriously picked up as telecoms laid fiber into the ground with abandon (capacity that would soon be in oversupply as ways to multiply bandwidth, such as DWDM multiplexing, became cost effective). Increased scale meant much more efficient access to large swaths of people over any connection medium to the web, and it meant the new players could provide services with a much lighter technological load than older services like AOL, which required compiled software to be installed on every computer that used it.
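
To make that mechanism concrete, here is a minimal sketch of the background-update pattern described above, written in TypeScript for a browser. The endpoint (/feed/updates), the response shape, and the <ul id="feed"> element are hypothetical stand-ins, not anything Facebook or AOL actually used; the point is only the asynchronous request followed by an in-place DOM update.

```typescript
// Minimal sketch of an XMLHttpRequest-driven feed update.
// Assumptions (not from the post): a JSON endpoint at /feed/updates
// returning an array of items, and a <ul id="feed"> element on the page.

interface FeedItem {
  author: string;
  text: string;
}

function pollFeed(): void {
  const xhr = new XMLHttpRequest();
  // The third argument `true` makes the call asynchronous,
  // so the page stays responsive while the request is in flight.
  xhr.open("GET", "/feed/updates", true);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === XMLHttpRequest.DONE && xhr.status === 200) {
      const items: FeedItem[] = JSON.parse(xhr.responseText);
      const feed = document.getElementById("feed");
      if (!feed) return;
      // Update the in-memory DOM tree directly; the browser re-renders
      // only the changed nodes, never the whole page.
      for (const item of items) {
        const li = document.createElement("li");
        li.textContent = `${item.author}: ${item.text}`;
        feed.appendChild(li);
      }
    }
  };
  xhr.send();
}

// Poll every 15 seconds, a crude stand-in for what modern feeds do with
// long polling or push, but enough to illustrate the mechanism.
setInterval(pollFeed, 15_000);
```

Modern sites wrap the same idea in fetch(), long polling, or pushed events, but the underlying trick, updating the DOM from a background response instead of reloading the page, is the one the XMLHttpRequest object made broadly available.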


So after the dot-com 1.0 bubble burst in 2000/2001, new players started looking at the space, now with broadband penetration rapidly increasing, and said: hey, we can do that AOL social stuff, but without any application to install and without any restriction on access beyond a web browser that supports Web 2.0 technologies. Boom! Thus came Friendster, thus came MySpace, thus came Facebook, thus came FriendFeed, and so on. Again, in those early days, the novelty and inherent scalability of the approach made those players inherently valuable, just as it does today. Consolidation took nearly four years, and Facebook ended up winning the game despite MySpace's early lead; MySpace failed to socialize its offering as quickly as Facebook did (the feed being key, in my view).

Well....here we are.

Links:

http://en.wikipedia.org/wiki/Coprolite

http://en.wikipedia.org/wiki/AOL_IM

http://en.wikipedia.org/wiki/XMLHttpRequest

http://en.wikipedia.org/wiki/Dial-up_Internet_access

http://en.wikipedia.org/wiki/Web_2.0

http://en.wikipedia.org/wiki/Scalability

Comments

r0b M. said…
hey David, gr8 piece u wrote re: AOL & the lead-in or flow that Z-schmuckerberg [can i trademark that nick?!] got so much credit-for in the eyes of the massy-masses. So here's an off the wall yet related space-oddity: I did some freelance work in '94/'95 for a digital publisher/company in the valley called "MEDIOR;" Well, awhile bak i decided to look them up, just to connect to the old days, etc: Turns out they partnered and/or merged w/ AOL bak in the latter 90s. Man, it's like AOL's freaking-tentacles reach everywhere. meh: When i use winamp from time to time, i think to myself 'sh!t, AOL owns (or pseudo-owns) winamp, why am i using this!' --but then, i never pay for anything connected to it, so it's not too bad from that perspective. Anyway, i '''originally''' went to ur blog to ask u about "bitcasa" but i'll send that/those questions in another thread-----yeah the caffeine + taurine are enabling my OCD to run amuck, anyway, keep up the gr8 blogging =][=
