
5 Page Quality Indicators That Can Outweigh PageRank


PageRank would be a great indicator of page quality and relevancy if there were no link selling or artificial link building involved. Since that isn't the case, Google has to maintain quality signals that are independent of link popularity.

Let's explore how page quality can be gauged in a more sophisticated way, using these 5 indicators.

Clickthrough Rate

In a patent granted to Google, entitled "Methods and systems for establishing a keyword utilizing path navigation information," clickthrough rate is proposed as a potential signal of content quality.

The patent describes a method for estimating a spam score by examining clickthrough data together with the navigational path that led users to the landing page.

If a page's clickthrough rate is higher than what is normally expected for the same number of impressions, the page is likely to be labeled as manipulative or spammy.

Why? When visitors cannot find relevant information on a page, they tend to click the ad banners in the hope of finding better sites.

Originality

Plagiarism detection is a comparatively easy algorithmic task for search engines.

Web spiders can easily parse a page's HTML body and check whether the content is copied from other sites. Article-spinning software that works through synonym substitution and erratic sentence rearrangement doesn't suffice, especially since Google also relies on community spam reports and human spam fighters.

A lot of junk blogs have vanished from Google's SERPs recently, thanks to the Report Scraper Page campaign and the Personal Blocklist browser plug-in.

Number Of Unique Authors

Google can detect the absence of editorial control.

If a blog relies on unedited guest posts for most of its content, its quality score may deteriorate.

Article directories that accept a lot of unedited submissions have suffered a big drop in web traffic since Google targeted sites that used crowdsourcing at the expense of quality output.

Author's Credentials

AuthorshipRank is a new ranking concept that is harder to manipulate than PageRank.

By supporting HTML5 markup that links blog posts to their true authors, Google can more easily detect scraped blogs while encouraging talented writers to keep producing quality content.

By adding rel="me" or rel="author" to the hyperlinks that point to their Google+ profile page, bloggers can build their credentials in the long run, which would have an impact on the ranking of the articles they've written.
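As a rough sketch of the markup involved (the profile URL and author name below are placeholders), a post's byline could link to the writer's Google+ profile with rel="author", while the blog's about page carries a rel="me" link tying the two identities together:

    <!-- In the post byline: identify the author via his or her Google+ profile -->
    Written by <a href="https://plus.google.com/000000000000000000000" rel="author">Jane Doe</a>

    <!-- On the blog's about page: a link claiming the same profile as the blogger's identity -->
    <a href="https://plus.google.com/000000000000000000000" rel="me">My Google+ profile</a>

For the connection to count, the Google+ profile should also link back to the blog (for example in its "Contributor to" section), so the relationship can be verified in both directions.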

External Sources & Citations

Citing your sources can help improve both the quality and relevancy scores of your site.

Google looks at the external sources a blog cites for indexing and quality-scoring purposes. It is the same principle Wikipedia applies: downgrade posts that lack credible external sources.

Linking to authoritative sites can help convince search engines that your blog is well researched and accurate.
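For example, a citation can be as simple as a descriptive outbound link to the source being referenced (the Wikipedia link here is just an illustration):

    <p>
      PageRank is named after Google co-founder Larry Page; see the
      <a href="https://en.wikipedia.org/wiki/PageRank">Wikipedia entry on PageRank</a>
      for the original formula.
    </p>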

Citing external sources also helps crawlers index your site properly by giving them reference information to use when compiling a so-called hybrid document: a combination of blog feeds, posts and external data that Google uses to evaluate how relevant a blog is to a search query.