Given the barrage of new link development techniques and spammy approaches that Google et al. have had to contend with over the past year, it appears obvious to me that Google needs some divine insight (he says jokingly, assuming their myriad PhDs haven't already considered this possibility) in its efforts to define and value inbound links in the ranking algorithm. People are always trying to game the system, and understandably so when the value of high rankings is so outstanding.

That said, how do the search engines really know if a link is a vote or not, or if it should be counted towards rankings? It's quite obvious that current link valuation techniques have their share of problems; hence Dave Naylor's posting today that paid links are being devalued en masse. So we know that Google is trying to solve this problem of link manipulation, but how else can they work to solve it?

Let's look at this from a different perspective. What if the search engines were able to create a 'checks and balances' mechanism that provided a second opinion about the value of a link? To some extent they're already doing this by filtering based on content relevance. So now, let's add a powerful checks and balances mechanism which we'll call 'the click test'. In its simplest variation, the click test is just this: if a link from site "A" to site "B" is not clicked on over a pre-specified time period, then Google would set the value of the link to "0". If it was clicked on, then perhaps Google gives the link a value of "1". The click test value could then be multiplied by the previous value, yielding a score of either "0" or the previous score. Voila ... link values are validated on an ongoing basis, and only quality links are scored. Those scoring a "0" value are completely discounted.
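The simple variation above can be sketched in a few lines. This is purely illustrative of the multiplication idea, not anything the search engines have published; the function name and scores are invented for the example.

```python
# Hypothetical sketch of the simplest 'click test': a link's prior score
# survives only if the link received at least one click in the window.
def click_test(prior_score: float, clicks_in_window: int) -> float:
    """Multiply the prior link score by 1 if clicked, 0 if not."""
    click_value = 1 if clicks_in_window > 0 else 0
    return prior_score * click_value

print(click_test(4.2, 17))  # clicked: link keeps its prior score -> 4.2
print(click_test(4.2, 0))   # never clicked: fully discounted -> 0.0
```

Either the link earned at least one click and keeps whatever value it had, or it earned none and is zeroed out entirely.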

This of course raises a number of other questions, namely:
a) wouldn't the search engines need to make sure the links aren't being gamed? The answer is yes ... but the search engines can use simple technologies to ensure that the same person doesn't click on the same links each month. This would be relatively easy for Google to do, very similar to the algorithms used by Digg and other social media sites.

b) what if the value of a link was a multiple of the number of clicks it receives, so that the value is not merely as simple as assigning it a "1"? What if sites with links that did not receive clicks received negative points? Certainly possible, but far beyond the scope of this posting. Our main contention here is that gaming the system should not be your goal, as the effort is doomed to fail long term. Google can use a number of relatively straightforward approaches to validate link worthiness.
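To make the variant concrete: one way to scale value with click count while penalizing zero-click links might look like the sketch below. The logarithmic curve and the penalty value are entirely my own assumptions for illustration; nothing here reflects any known ranking formula.

```python
# Hedged sketch of the click-scaled variant: value grows with clicks
# (with diminishing returns), and zero-click links take a small penalty.
import math

ZERO_CLICK_PENALTY = -0.5  # invented value, for illustration only

def link_value(clicks: int) -> float:
    if clicks == 0:
        return ZERO_CLICK_PENALTY
    return math.log1p(clicks)  # log(1 + clicks): each extra click adds less

print(link_value(0))    # penalized -> -0.5
print(link_value(1))    # ~0.69
print(link_value(100))  # ~4.62
```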

c) could they gather these statistics? Absolutely, given all the tracking information Google has (see Why Does Google Remember Information About Searches and yesterday's "the SEP guy" posting on SEP, You Have the Right to Remain Silent). To be frank, it wouldn't even require a whole lot more computing power.

So where does this leave us? Apparently, it's going to leave me with an experiment to perform. That said, stay tuned, as I'll set up and report on the experiment. In fact, here's a special offer: subscribe to our feed through Feedburner, and we'll make the research findings available only through the feed. Regular blog readers going direct will not see these results.

Stay tuned!