
Google apes Wikipedia to introduce veracity ranking to search

In a blog post from 2008, I introduced the idea of incremental truth following, the secret sauce behind why Wikipedia works as a source of reliable information on many subjects despite having a generally unrestricted editing model that lets almost anyone author new articles.

Over the years people have variably misunderstood the power of this mechanism, and as time has gone by, its relative approach to truth has become easy to see in politically or religiously charged topics. For such articles Wikipedia found that "edit wars" would break out, as clever but subjective arguments were used to change articles in subtle or gross ways, muddling the historical or empirical facts into long paragraphs of highly conditional statements with few definite claims of veracity.

Moreover, these highly active articles were continuously subject to comment bombing of various types, as vandals holding opposing positions launched random strikes from anonymous accounts. Wikipedia implemented new features that restricted article editing to named users, closed some articles to all but users vetted by the community as experts in given areas (still contentious), and took action to ban users and IPs that continued to post edits considered vandalism.

Unfortunately, the subjectivity that remains in determining who may edit various articles pushes some articles toward control by small groups of colluding and overzealous editors who may not, in fact, have the full truth of a given subject among their goals.

Still, despite this, Wikipedia articles across many technical areas have become a reasonably high-quality first source of access to academic references, which is extremely useful for mature articles (those with long editing histories).

Yet the reliance on human actors and the variably long path to maturity remain problems. Enter Google, and rumors that it is about to perform the first big update to the functionality of PageRank since its invention by Page and Brin in their paper from the late 1990s.
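For context, the original PageRank computation can be sketched as a simple power iteration over the link graph. This toy version (the graph, function names, and parameters are mine for illustration, not Google's code) captures the popularity-only ranking that the rumored change would augment:

```python
# Toy PageRank via power iteration -- an illustrative sketch, not Google's
# implementation. The damping factor 0.85 is the standard textbook choice.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny hypothetical link graph: "c" is linked to by both "a" and "b".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Note that nothing in this computation looks at what the pages actually say; rank flows purely from who links to whom, which is exactly the gap a veracity signal would address.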

To effect a major positive shift in the quality of information people receive when searching on Google, the algorithm is reportedly being tweaked to sample the veracity of content sources, in addition to link popularity, when generating rankings and content listings. This would allow selecting not just the loudest voices on a given subject from the din of linked sources, but the loudest voice backed by the most verified set of objective facts.

They are apparently doing this by leveraging a database of facts they have been collecting, called the "Knowledge Vault". This is a radical change in their approach that will now bias for the truthfulness of a linked reference over its popularity. It's going to take some clever data science to minimize errors, but, as happened with popularity ranking, the algorithm will get better over time.

Technology applied this way is exactly the type of potentially socially revolutionary application that could improve the state of available general human knowledge, by preventing the entrenchment of pathological ideologies that would otherwise keep reinforcing themselves despite the preponderance of evidence against the veracity of their planks.

I've written several posts on my view that access to information via the web and mobile device platforms is critically important to help reduce ignorance and xenophobia in the world, and to the degree that people have access to truthful sources, that holds. But the ranking of search results has until now biased for popularity over veracity, which only cements ideology, as mentioned before. With this seemingly little act, Google may accelerate a global conversion toward people being informed more by what is true than by what they want to be true.

This is a potentially big deal. I am very curious to see how this change will affect the awareness of true things over, say, the next decade. Ironically, it could easily be the one thing Google does with the greatest impact, more than any of the other "moon shot" initiatives they've put forward, and that would be a really good thing for humanity.


