Incremental truth and Wikipedia

Update: 3/17/2008

A reader pointed out that this post might be taken as an endorsement of the practice of citing Wikipedia in research work. That is not my point; I am simply highlighting the vector toward greater truthfulness along which all Wikipedia articles tend to be directed as time and the number of contributors increase. In the most mature Wikipedia articles, one will note a plethora of valid technical first-source citations at the bottom of the article that can be used as academic sources. Again, the post below simply illustrates the trend toward "incremental truth" that attends Wikipedia articles.

I originally commented on this article in an email to my brother; the entire comment is transcribed below.


http://www.nytimes.com/2007/02/21/education/21wikipedia.html (may require free registration to access)


This article is a perfect example of why Wikipedia is so cool! On one hand, I agree with the professors who say it shouldn't be used as a citation source, and the reasons for this are simple.

a) An article still in open edit mode can be edited semi-anonymously by anyone on the net. I know I've edited several dozen Wikipedia articles myself. However, not everyone providing edits is pledged to apply the Wikipedia mantras of POV-free input and citations for the source of the knowledge provided. In open-edit articles this leaves the possibility that a crackpot or two can edit an article with bad information, or, in Wikipedia parlance, vandalize it.

b) Even when the participants agree, if they are only a subset of the people who have expertise in a subject, the article quality is still poor, even if it has few "edit wars" in its history. A low number of participants tends to correlate with lower-quality articles.

c) Kids can edit an article to include ideas they feel should be in there instead of what actually happened in history. Thus an article used as a citation could be fluffed up by students looking for material for their papers. This problem, though, is tied to the first two points: such articles are soon viewed by more knowledgeable Wikipedians who sound the alarm and quickly revert the changes, and the quality of such articles goes way up again.

That said, the other side of the coin allows all of these weak points to be mitigated. First, if an article suffers frequent vandalism or edit wars, it can be locked against future editing or restricted by removing open edits (only users with accounts can edit). Many "mature" articles are in fact in this state; I've found these articles to be the most citation-filled and accurate on the site. Thus it seems quality is directly correlated with time. Wikipedia article histories start out with large swings in quality while a low number of participants contribute to them, but over time the quality line approaches not just a consensus of the participants but a correlation with the actual facts of the subject in question, as the number of participants goes up and the article itself is viewed by more users. Also, as the number of participants on an article increases, edit wars tend to go up, but the quality of the article goes up as well; POV references are quickly flagged and removed, and the editors tend to include experts in the field. The overall picture is that as more time goes by, Wikipedia articles *always* tend to get better, and this is the power of the medium.

As soon as I started the article I realized the folly of the writer, as he stated the historical error regarding the Jesuits. I thought, "I bet it isn't wrong now." Sure enough, by the end of the article the author felt confident enough to assert:

"And yes, back at Wikipedia, the Jesuits are still credited as supporting the Shimabara Rebellion."

I thought to myself, "and this is why Wikipedia is so cool," as I typed in the article for Japan. Just as I expected, the disputed passage had been changed (possibly in response to the author's very article!) between the time the NYT published it (two days ago!) and the time I read the article.

http://en.wikipedia.org/wiki/Shimabara_Rebellion

http://en.wikipedia.org/wiki/Kirishitan

http://en.wikipedia.org/wiki/Talk:Shimabara_Rebellion

Any direct statement that the Jesuits aided the rebellion is gone from the first article, and the second goes so far as to state that the Catholic Church didn't consider the rebels who died to be martyrs because they used violence to achieve their aim. Lastly, the "discussion" page of the Shimabara article shows an actual reference to this NYT article by Noam Cohen (only two days ago!), mentions the controversy over the Jesuits (which was deleted), and requests a contribution by the historian mentioned in the article to vet the edited version!!! Awesome!!! This is something that simply can't happen with traditional encyclopedias, where publicly revealed non-factual knowledge remains stale for *years* before it is corrected in the next edition. No amount of "current" debate about the mistakes can force an update of the edition; for this amazing expedience of updates, Wikipedia gets my vote. The magnitude of error in peer-reviewed articles is lower, but the correction time is much longer than for a Wikipedia article, which tends to have a higher magnitude of error but blindingly fast correction times. Over time both achieve similar quality.
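To make the shape of that claim concrete, here is a tiny toy model in Python (my own sketch, not part of the original email). The starting error sizes and correction rates are invented numbers; the only point is that a fast-correcting process with a large initial error can quickly fall below a slow-correcting process with a small initial error.

# Toy model: each correction cycle is assumed to halve the remaining error.
# All numbers are made up purely to illustrate the convergence claim above.

def remaining_error(initial_error, corrections_per_year, years):
    """Error left after `years`, assuming each correction halves what remains."""
    return initial_error * 0.5 ** (corrections_per_year * years)

for years in (0, 1, 2, 5):
    # Peer review: small starting error, roughly one correction every five years.
    peer = remaining_error(initial_error=0.05, corrections_per_year=0.2, years=years)
    # Wikipedia: larger starting error, several corrections per year.
    wiki = remaining_error(initial_error=0.40, corrections_per_year=6.0, years=years)
    print(f"year {years}: peer-reviewed ~{peer:.4f}, wiki ~{wiki:.6f}")

In this toy setup the fast-correcting curve is already below the slow one by year one, which is the intuition behind "over time both achieve similar quality."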

Still, because of the error-magnitude problem mentioned above, I would be cautious about using a Wikipedia article as a citation for a paper if I were still in college; following a couple of rules of thumb can help ensure the veracity of the information extracted. Checking for a long edit history with many citations, many participants, and a discussion page with few "running" controversies or edit wars will ensure the articles chosen are mature. These are the articles I tend to give more credence to; like anything, one shouldn't accept an article at face value, and critical analysis should always be employed!
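As a rough sketch of how one might check those maturity signals programmatically (my addition, not part of the original email), the snippet below pulls recent revision history from the public MediaWiki API and counts revisions and distinct editors. The endpoint and parameter names follow the standard English Wikipedia API as I understand it; treat the specifics, and any thresholds you might apply, as assumptions for illustration rather than a definitive recipe.

# Sketch: estimate an article's "maturity" from its recent revision history.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def revision_stats(title, limit=500):
    """Return (revision_count, distinct_editor_count) for the most recent
    `limit` revisions of the given article title."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp",
        "rvlimit": str(limit),
    }
    url = API + "?" + urllib.parse.urlencode(params)
    # Wikimedia asks API clients to send a descriptive User-Agent header.
    req = urllib.request.Request(url, headers={"User-Agent": "maturity-check-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    page = next(iter(data["query"]["pages"].values()))
    revisions = page.get("revisions", [])
    editors = {r.get("user", "<hidden>") for r in revisions}
    return len(revisions), len(editors)

if __name__ == "__main__":
    revs, editors = revision_stats("Shimabara Rebellion")
    # A long history edited by many distinct people is one signal (alongside
    # citations and a calm talk page) that an article has matured.
    print(f"recent revisions: {revs}, distinct editors: {editors}")

One could extend this to look at the talk page or citation counts, but even the raw revision and editor counts help separate young, thinly edited articles from the "mature" ones described above.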
