
Internet 2.0 service models: continuous versus limited value

The last few months have been quite an interesting ride into the rise of an internet 2.0 darling. Twitter, the nom du jour of the internet 2.0 crop of quickly built but toughly scaled web applications without any revenue, has exploded onto the public scene thanks to its high profile use during the recent American elections and, more recently, its use by various media pundits in live segments as a means of extracting real time opinions from the viewing audience. The growth curve of the company's user base has been extraordinary, easily outpacing that of previous internet darlings like 1.0 stalwart (some would say first 2.0) Google, YouTube and others. Twitter is plotting an amazing hockey stick, particularly in the last few weeks as actor Ashton Kutcher and CNN raced each other to 1,000,000 followers! However, despite the service's short term success in gaining users, it is still without revenue and is operating at an accelerating burn rate. Ultimately, regardless of the fireworks of growth currently occurring, the site will have to make money. To avoid the fate of many web 2.0 companies that came before, which were given much limelight and flash in the media, ran their course for a good year or two and then burned up and died, Twitter has to provide a compelling service to its users, one that keeps them coming back without a novelty effect. A novelty effect is what happens when a service is seen as "cool" and that is all it is seen as; the internet's past is littered with "cool" sites that elicited oohs and aahs only to be relegated to the deadpool a short time later. These are almost always sites that do nothing but provide limited, stagnant entertainment to the user. Whatever the service provides, it needs to be constantly refreshed in some way to remain useful to users and thus bring them back.

YouTube is a perfect example of a site that provides a useful, entertaining service (watching videos) and does so continuously, by ensuring a steady diet of new video for members to return to view. Its service model is a continuous value one, and this is reflected in the high average time that users spend on the service. Facebook has a similar sticky effect on its users, providing a combination of social features that make it a compelling "home base" that ties all their social activities (messaging, photo sharing, blogging, forum posting) into one portal. This again keeps people engaged on the site for long average periods. Twitter provides a much shorter interval of interest: at most, users come to the site to post tweets and then read the tweets of others, but the novelty of this act will wear away because there is no other service to exploit. I predict that the turnover rate for Twitter users is very high; though the number of new users is high, the number of users leaving the service will also be high as the novelty wears off, particularly for casual consumer use. This describes a limited value service model that must extract profit in the short window in which new users join the service, gain followers, experience the novelty of it, become bored with it and then stop using it. Facebook's revamped feed has a much better chance of becoming the continuous value model that Twitter currently lacks and could siphon off the cachet of the service quite easily by making a few cosmetic changes to how it currently operates.
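To make the turnover argument concrete, here is a minimal sketch of how churn can flatten a hockey stick. Every number in it is invented for illustration, not drawn from any Twitter or Facebook data:

```python
# Hypothetical illustration of limited value vs continuous value churn.
# All figures below are made up for the sake of the example.

def project_active_users(starting_users, monthly_signups, monthly_churn_rate, months):
    """Project active users when a fixed number sign up each month
    but a fixed fraction of existing users leave each month."""
    active = starting_users
    for _ in range(months):
        active = active + monthly_signups - active * monthly_churn_rate
    return round(active)

# A "novelty" service: strong signups, but 40% of users drift away each month.
limited_value = project_active_users(1_000_000, 500_000, 0.40, 12)

# A "continuous value" service: same signups, only 5% monthly churn.
continuous_value = project_active_users(1_000_000, 500_000, 0.05, 12)

print("limited value   :", limited_value)     # plateaus near signups / churn rate
print("continuous value:", continuous_value)  # keeps compounding far higher
```

Under these made-up numbers the limited value curve stalls near signups divided by churn no matter how steep the initial growth, while the continuous value curve keeps compounding, which is the gap the post is pointing at.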

All is not doom and gloom for Twitter's limited value service model. The last few months have demonstrated the value of instant feedback from a collection of followers for entertainers, media outlets, journalists and, in particular, businesses. These entities seek to extract large amounts of information about their professional actions from their communities as efficiently as possible, and Twitter provides access to a real time conversation about how the community of followers feels about an actor's latest work, a media outlet's coverage of a news event or a business's plan to create a new product or change a service option. This contact with the real time community serves as a massive, highly motivated and, most importantly, unpaid focus group for anything the entity wishes to probe. Twitter is already starting to see the value of this, as rumors swirl of a business model revolving around selling featured slots for individuals or businesses on the Twitter main feed page. This would allow featured businesses to gain visibility into the views of many Twitter users and potentially gain better insights into how to market pending products or services.

As Twitter's growth continues to rise, it will be interesting to see whether the increasing burn rate will swamp the company's ability to institute revenue generating measures. One approach that could be useful to the site would be an inverse payment model, in which accounts that have accrued followers beyond a given threshold, say 100,000, are charged a fixed price, with the charge increasing incrementally as accounts cross additional milestones such as 200,000 or 300,000 followers. The accounts that achieve such high follower counts are very likely to be entertainers, journalists, media outlets or businesses that see the captive audience as a gold mine for marketing purposes, and they would be willing to pay for those followers even if Twitter springs the pay requirement on them. This would be a great way to extract revenue from the service by demanding payment from those who would be most willing to pay to retain access to their collection of followers. Time will tell what models Twitter will employ, but it is certain that they need to act fast if they are to avoid a burn rate implosion from accelerating costs as the hockey stick growth curve continues in the light of mainstream success.
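A minimal sketch of what this inverse payment model might look like. The follower milestones come from the post; the dollar amounts and the `monthly_fee` helper are assumptions made up purely for illustration:

```python
# Hypothetical tiered "inverse payment" schedule: accounts with large
# follower counts pay, and the fee steps up at each milestone.
# Thresholds follow the post; the dollar figures are invented.
FOLLOWER_TIERS = [
    (300_000, 1500.00),  # 300,000+ followers pay the highest monthly fee
    (200_000, 1000.00),  # 200,000 to 299,999 followers
    (100_000, 500.00),   # 100,000 to 199,999 followers
]

def monthly_fee(follower_count: int) -> float:
    """Return the monthly charge for an account based on its follower count.
    Accounts below the first milestone stay free, preserving casual use."""
    for threshold, fee in FOLLOWER_TIERS:
        if follower_count >= threshold:
            return fee
    return 0.0

# Example: a media outlet with 250,000 followers versus a casual user.
print(monthly_fee(250_000))  # 1000.0
print(monthly_fee(4_200))    # 0.0
```

The design point, in the spirit of the post, is that the fee lands only on accounts whose follower counts mark them as professional users, so the casual consumers who drive the growth curve stay free.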

