Weakest Link

by Barry Welford March 25th, 2009 


If you were expecting to find a website devoted to a disciplinarian lady dressed in black, then you have come to the wrong place.

[Image: Anne Robinson]

This is not about that popular television quiz show, Weakest Link, hosted by Anne Robinson.  That was first shown by the BBC in the UK on 14 August 2000, and you can now even play an online version of that game.  It goes without saying that such a topic is not covered by SEO Scoop.

I am sure everyone instantly guesses what we will be talking about.  It clearly must be something to do with Google.  I should quickly clarify that this has only the vaguest connection with an earlier post this week, The Weakest Button – An Open Letter to Matt Cutts, Google.

According to his comment in the Sphinn item on that post, Matt Cutts found the suggested mock-up for the Classic Search buttons interesting.  Whether anyone in Google will find what is written below of interest remains to be seen.

The title was chosen because in the game, Weakest Link, contestants were excluded if they did not measure up to the required standards.  That is the concept that will be developed in this article.


As most people realize, links (short for hyperlinks) are an important factor in the Google algorithms for search.  At one point, Google called them backlinks but now seems to use the more precise term, inbound links.  Yahoo uses the term inlinks to mean the same thing.  One good source of information on Google's thinking is the Google Webmaster Central Blog.  As its tagline says, it offers Official News On Crawling And Indexing Sites For The Google Index.

Here is what they say about inbound links.

Inbound links are links from pages on external sites linking back to your site. Inbound links can bring new users to your site, and when the links are merit-based and freely-volunteered as an editorial choice, they're also one of the positive signals to Google about your site's importance. Other signals include things like our analysis of your site's content, its relevance to a geographic location, etc. As many of you know, relevant, quality inbound links can affect your PageRank (one of many factors in our ranking algorithm). And quality links often come naturally to sites with compelling content or offering a unique service.

They have some useful suggestions on how to increase merit-based inbound links.  In summary, you should create unique and compelling content on your web site by such methods as:

  • Start a blog: make videos, do original research, and post interesting stuff on a regular basis.
  • Teach readers new things, uncover new news, be entertaining or insightful, show your expertise, interview different personalities in your industry and highlight their interesting side.
  • Participate thoughtfully in blogs and user reviews related to your topic of interest.
  • Provide a useful product or service.

You will find much more detail in that blog post and they also encourage the use of the Webmaster Tools website to identify how well you have managed to create merit-based inbound links.

Current Google Algorithm

Although the Webmaster Central Blog gives useful information, it is a little general.  How can we develop more precise guidance on what to do?

The Google search approach relies on PageRank technology:

PageRank reflects our view of the importance of web pages by considering more than 500 million variables and 2 billion terms. Pages that we believe are important pages receive a higher PageRank and are more likely to appear at the top of the search results.

PageRank also considers the importance of each page that casts a vote, as votes from some pages are considered to have greater value, thus giving the linked page greater value. We have always taken a pragmatic approach to help improve search quality and create useful products, and our technology uses the collective intelligence of the web to determine a page's importance.
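The voting arithmetic behind this description follows the simplified PageRank formula that Brin and Page published: PR(p) = (1-d)/N + d × sum over linking pages q of PR(q)/outdegree(q), with damping factor d usually taken as 0.85.  A toy Python sketch on an invented four-page graph (nothing to do with Google's production system, which weighs vastly more variables) might look like this:

```python
# Toy PageRank by power iteration:
#   PR(p) = (1 - d)/N + d * sum(PR(q) / outdegree(q)) over pages q linking to p
links = {            # page -> pages it links out to (invented example graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
d = 0.85             # damping factor
N = len(links)
pr = {page: 1.0 / N for page in links}   # start with a uniform distribution

for _ in range(50):  # power iteration converges quickly on a graph this small
    new_pr = {}
    for page in links:
        inbound = sum(pr[q] / len(links[q]) for q in links if page in links[q])
        new_pr[page] = (1 - d) / N + d * inbound
    pr = new_pr

print(max(pr, key=pr.get))   # page C, which collects the most inbound votes
```

Page C receives links from three of the four pages, so it ends the iteration with the highest PageRank; page D, which nobody links to, is left with only the baseline (1-d)/N.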

Getting Quality Votes – Link Juice

A popular term for the value of the vote that another website provides via an inbound link to your website is Link Juice.  Any given web page has only a certain PageRank value, which is distributed across all the links leaving that page.  Thus a more authoritative web page with few outbound links provides more Link Juice through each of those links than a weaker web page with many outbound links.  The following three articles from 2007 provide more information on Link Juice and are still valid.
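The arithmetic is simple division: the juice passed by each link is the page's PageRank value divided by its number of outbound links.  The figures below are invented purely to illustrate the point:

```python
# Link Juice per outbound link: a page's PageRank value divided by the
# number of links leaving the page. All figures are invented.
def juice_per_link(pagerank, outbound_links):
    return pagerank / outbound_links

strong = juice_per_link(pagerank=8.0, outbound_links=4)    # authoritative page, few links
weak = juice_per_link(pagerank=2.0, outbound_links=50)     # weaker page, many links

print(strong)  # 2.0 passed through each link
print(weak)    # 0.04 passed through each link
```

So a single link from the stronger page here is worth fifty times one from the weaker, link-heavy page.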

The problem with this approach is that once everyone knew that the more inbound links the better, such links were created by whatever methods would work.  The vast majority of such links could never be thought of as merit-based inbound links.  Google's partial solution was to introduce the nofollow attribute.


There seems to be some confusion about the nofollow attribute, but here is the Google explanation:

How does Google handle nofollowed links?

We don't follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it is important to note that other search engines may handle nofollow in slightly different ways.

The nofollow attribute must be assigned by the owner of the web page, and the assignment seems to turn on intent.  If the owner of the website has received some benefit, usually cash, for placing the outbound link and the intent is to influence PageRank, then the link should be nofollowed.  This of course nullifies the PageRank influence.
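For the record, nofollow is declared in the rel attribute of the anchor element itself, as in `<a href="..." rel="nofollow">`.  As a sketch of how the two kinds of link can be told apart, here is the Python standard library's HTML parser separating followed from nofollowed links (the sample markup is invented):

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Separate followed link targets from nofollowed ones."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel holds a space-separated list of values; nofollow may be one of them
        if "nofollow" in (attrs.get("rel") or "").split():
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

parser = LinkClassifier()
parser.feed('<a href="http://example.com/editorial">merit-based link</a> '
            '<a href="http://example.com/paid" rel="nofollow">paid link</a>')
print(parser.followed)    # ['http://example.com/editorial']
print(parser.nofollowed)  # ['http://example.com/paid']
```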

This paradoxical situation leaves a degree of arbitrariness in its application.  Some websites, such as Twitter, apply the nofollow attribute to all outbound links.  This is why you will see cries from the heart that Google and/or Twitter Need to Ditch Nofollow for All Our Sakes!  In consequence, Julie Joyce provides guidance on How To Avoid The Link Vacuum Effect.  In effect it comes back to developing worthwhile content.

The Present State Of The Web

It is interesting to take a big picture view of the web under the influences described above.  A whole industry has developed to create links to influence Google rankings.  Armies of individuals seek agreements between pairs of website owners to exchange reciprocal links.  Links can also arise through multitudes of websites created purely to contain links to other websites.  Although Google has clearly stated that any links created purely to influence rankings are counter to its Quality Guidelines, this seems to have little effect on the flood.  For reference, here is what those guidelines state on links:

Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

A Disclaimer

Google carefully guards the secrets of its search algorithms.  Accordingly the following is based purely on speculation and may be completely in error.

It is based on the assumption that almost all web pages are included in Google databases and have some PageRank value however small.  It should be noted that this is not the PageRank value as displayed by the Toolbar PageRank gauge.  Instead it is the precise mathematical value used within the algorithms.

It is also assumed that only a small proportion of Web pages are excluded from these databases.

Possible Alternative Algorithm

The assumption is made that the score measuring relevance of a given web page for a given keyword query includes, as one factor, the sum of a very large number of PageRank contributions from inbound links from other web pages.  Even though each inbound link provides a minuscule contribution, the sum over thousands and thousands of inbound links can produce a measurable contribution to relevance.  This is why spammers generate thousands of web pages linking to target pages, so that those target pages can rank high in keyword searches.

The alternative that is being suggested here is that most outbound links from most web pages would be assigned a zero PageRank value for algorithmic purposes.  Only those outbound links with PageRank contribution above a certain threshold value would retain this normal PageRank value in the algorithmic calculations.  This would be a very tiny fraction of all outbound links from all web pages.

This would mean that the algorithms are ignoring (setting as 0) potential PageRank contributions from such outbound links as:

  • those from web pages with many links on them, e.g. directories
  • those from reciprocal link arrangements where many web pages are featured on low PageRank pages
  • those from automatically produced spam web pages with very many siblings
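A minimal sketch of the suggested cutoff, with an invented threshold value and invented link contributions (Google's real computation is, of course, unknown):

```python
# Sketch of the alternative algorithm: inbound-link contributions below a
# threshold are set to zero, so only "strong" links count toward the score.
THRESHOLD = 0.5   # invented cutoff value

def page_score(link_contributions):
    """Sum only contributions at or above the threshold; the rest count as 0."""
    return sum(c for c in link_contributions if c >= THRESHOLD)

weak_links = [0.001] * 10_000      # ten thousand spam/directory links
strong_links = [2.0, 1.5, 0.75]    # a handful of merit-based links

print(page_score(weak_links))      # 0 -- the weakest links are out
print(page_score(strong_links))    # 4.25
```

Under the current summation the ten thousand weak links would have been worth 10.0; under the cutoff they contribute nothing, which is exactly the "weakest link is excluded" logic of the game show.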

Benefits Of This Alternative Algorithm

At present the general view is that any link is worth having even if its contribution is incredibly small.  The more links the merrier.

If this alternative algorithm thinking has any merit and is accepted, then its logic, that the weakest links are simply out, can be explained and widely publicized.

Given the cutoff arrangement in this alternative algorithm, it is no longer true that all links have value in the algorithm.  This simple and clear statement should encourage people to go for content that is valuable and stop wasting their time on links of dubious value.

How To Work Your Links At Present

Even if this alternative algorithm is not adopted, it is probably wise to behave as if it were true.  It is much better to put effort into getting worthwhile inbound links than to go after thousands of possibly dubious links with probably minuscule benefit.

With the present algorithm, actions that go against the quality guidelines may or may not damage the keyword ranking of your web pages in any given period.  If you decide to take the risk, you can always use 'throw-away' domains in case your actions are spotted and your website is penalized.  With the alternative algorithm it would be very clear that actions going against the quality guidelines can be shown mathematically to have zero effect.


This suggestion is very speculative.  What are your reactions?  Do you think it would have a beneficial effect?  Please add your comments.  There are no wrong answers.

Barry Welford

Offering practical, effective ways of strengthening Internet marketing strategy and getting bottom-line success, particularly through local SEO.


You May Also Like

9 Responses to “Weakest Link”

  1. Nick says:

    Interesting that a post calling for the negation of all 'low quality' links such as those from directories, 'recommends' (or should that be advertises?) Directory Submission Software in its sidebar. SEO hypocrisy. Wonder if this will get published?

  2. I got a server error commenting.. trying again :P

    I don't think that Google is being 100% straight forward about the value they pass. They claim that they don't recognize No-followed links, either anchor text or page rank…however -If you verify with Google webmaster tools, it tells you what are the "recognized inbound links". I am looking right now at the webmaster tools account, and it is recognizing a comment on a blog that was no followed. The anchor text for that link also appears in the statistics section, under what Googlebot sees as Phrases "In external links to your site".

    If Google REALLY "does not transfer PageRank or anchor text across these links." then why would that link show up in the inbound links, as well as list the anchor text in the "What Googlebot sees" section?

  3. I agree with Jeremy that Google is not disclosing all of the information that they find in the online world. Everyone has their own opinion on the value of a nofollow. Personally, I'll take whatever link I can get as long as it adds value and makes sense from a relevancy perspective.

    I think that the end of this article hits spot on where it suggests to act as though the alternative algorithm was in effect. It goes back to the Eric Ward mentality where you think to yourself "if the search engines didn't exist, would it still make sense for me to get a link from this site?" If any webmaster follows that idea, they should be guarded against any algorithm updates in my opinion.

  4. rankfirst says:

    google will never say the whole story about links, as the algo is link based.

    Things are moving fast, and I think Google will soon judge links differently. I think there is going to be more social impact on rankings – and this is happening already.

  5. I don't think Google will ever come out and say exactly how they acknowledge links because this usually leads to bad apples pouncing on that particular effort. if you are actively growing a business online and using all the tools that are available to you then your natural links will build giving your website great link power.

    • That's exactly right, Nick. It is the reason why in this post I am suggesting that they should clearly state that they do not acknowledge low-value links. This is something where it is not a contest between Google and webmasters, but something where everyone could be agreeing.

  6. [...] Weakest Link. As most people realize, links which is short for hyperlinks are an important factor in the Google algorithms for search. At one point, Google called them backlinks but now seems to use the more precise term, inbound links. …   [...]

  7. [...] have suggested elsewhere that Google should adopt a Weakest Link approach in its search algorithms.  There are far too many people spending far too much time [...]

  8. Say for example that you have a website about gardening with good quality original content, and you have lots of incoming editorial links from "enthusiast" blogs. Links from people who are pointing to your site because they find it interesting or useful.

    Even though those are mostly internet dabblers with low traffic and low pagerank are they not exactly the kind of merit links that Google is looking for? Also a great indication that like minded surfers would find your content useful and interesting.

    It seems to me that you might be suggesting that Google throw out the baby with the bathwater.