On-site SEO strategy

by Alexander Zagoumenov, April 17th, 2013

There are a number of ways to categorize your SEO efforts (on-site and off-site, for instance). In this article I want to discuss on-site SEO elements as opposed to off-site strategy. In other words, I'll provide you with tips and tools to keep your site clean and maintain a great relationship with search engines. This article will be useful both for professional SEOs (as a refresher) and for novices (to get a perspective on on-site SEO strategy).

First of all, let me define the on-site SEO strategy. For further discussion, on-site SEO strategy is a collection of tactics to ensure that

  1. Search engines know about your site;
  2. Search bots can properly index your site;
  3. Your pages are well-formatted for SERPs.

Second, it's important to note that on-site SEO is not a one-time activity that you do at the beginning and then forget about. The following on-site strategy elements need to be monitored over time (weekly or monthly) depending on the type and size of your site; larger news sites need to be audited more often.

It is useful to get hold of a good crawler tool such as SEOmoz, where you can track changes in site errors and warnings on a weekly basis. Alternatively, you can use desktop SEO tools such as Screaming Frog and Xenu.

Do search engines know about my site?

Because of the way Googlebot works (it discovers new pages through links), Google will eventually find your site even if you do nothing (though a link or two from external resources is still needed). However, there are ways to 1) speed up the indexing process, and 2) ensure that all new updates (new pages, categories, etc.) get indexed in a timely manner. Here are a couple of things to keep in mind.

XML sitemaps

Sitemap.xml files are sitemaps in a format that is easy for search engine bots to understand. Such a file is not meant for humans (unlike /sitemap.html/ or /sitemap/). It's located in the root directory of your site for search engines to pick up. Learn more about XML sitemaps. There are a number of ways to generate such a file once the site structure is finalized.
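
For reference, a minimal sitemap.xml looks something like this (the domain, dates and values are placeholders; only the <loc> element is actually required for each URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-04-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2013-03-15</lastmod>
  </url>
</urlset>
```

The optional <changefreq> and <priority> elements are hints to crawlers, not commands; bots may ignore them.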

Webmaster Tools

Webmaster tools such as Google Webmaster Tools (GWT) and Bing Webmaster Tools are the most direct doorway between your site and a search engine. These accounts will help you keep track of your site's health as it relates to search engines. So, ensure that you have such accounts created and that your site's XML sitemap or feed is submitted.

Can search engines see what I want them to see?

To answer this question we need to make sure that there's nothing preventing robots from discovering pages inside the website. You can run a quick scan manually by searching Google for [site:www.domain.com] and taking note of the number of results displayed. If it's about what you expected, you shouldn't have a problem. If it's not, read on. Let's take a look at several important domain-level and page-level elements.

Robots.txt

Robots.txt is a file in the root directory of your website (yourdomain.com/robots.txt) that tells search bots what to index (and what to skip) on your site. Read more about robots.txt, what it is and how to configure it, here.
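
As a sketch, a simple robots.txt that keeps bots out of admin and internal-search pages while pointing them to the sitemap (the paths here are placeholders) might look like:

```
# Applies to all crawlers
User-agent: *
# Keep bots out of non-content sections
Disallow: /admin/
Disallow: /search/
# Tell crawlers where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

Note that Disallow only asks well-behaved bots not to crawl those paths; it is not an access-control mechanism.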

404 errors

404 error pages appear when a page on the site is absent. Keep in mind that 404 errors happen; they are OK and nothing to panic about. But if your site is not prepared for them, such errors can damage your reputation and play a role in reduced rankings. The worst-case scenario is when a user gets something like this:

404 error, bad example

These pages are bad because they degrade the user experience on your site. Of course your visitor can alter the URL in the address bar (yoursite.com/page-that-does-not-exist/) and land on the homepage, BUT most likely they will close the window or press the back button on the browser and never come back.

So, in order to keep visitors happy and on the site longer (Google likes sites that keep users on the site longer), make sure you have a custom 404 error page.

404 error, better example

Above is a good example that evokes a positive feeling. However, here are a few things I would add to this 404 error page:

  • a few (2-3) text links to point people to popular directions on the site
  • a list (3-5) of options pointing to pages / posts related to the search query
  • a search field that would provide additional navigational opportunity
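
On an Apache server, wiring up your custom 404 page is a one-line directive in .htaccess (the file path here is an assumption for illustration; other servers have equivalent settings):

```apache
# Serve a custom, helpful page for missing URLs instead of the server default
ErrorDocument 404 /404.html
```

Make sure the custom page still returns a 404 status code, not a 200, so search engines know the URL doesn't exist.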

Redirects

It happens that you update or change URLs on the site (due to a new site structure or specific page optimization). Once you do, it's important to make sure that your XML sitemap is updated with the new URL (and the old one removed). It's also a great idea to create a redirect from the old page to the new one. The steps will depend on the server you are running, but here's a good place to start.
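
For example, on Apache a permanent (301) redirect from an old URL to a new one can be set up in .htaccess (both paths are placeholders):

```apache
# Permanently redirect the old page so users and link equity pass to the new URL
Redirect 301 /old-page/ http://www.example.com/new-page/
```

A 301 (permanent) redirect, rather than a 302 (temporary) one, is what tells search engines to transfer the old page's value to the new address.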

URLs

Make sure your site's URLs follow a few rules in terms of depth, length, descriptiveness:

  • Fix URLs that are over 3-4 directories deep. Flatter URLs tend to get indexed faster when the page is created. Plus, they're less confusing for bots and users.
  • URLs over 100 characters tend to rank worse. So, avoid stop words (in, a, the, etc.) and keep URLs short: 3-5 words.
  • Make sure your URLs are descriptive of the content on the page. Stay away from keyword stuffing the URL, but try to get a phrase in, if possible.
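
Putting those three rules together, a hypothetical before-and-after might look like:

```
Before: example.com/shop/categories/2013/04/footwear/the-very-best-red-running-shoes-in-our-store
After:  example.com/red-running-shoes/
```

The second URL is flat, short, and still tells both users and bots exactly what the page is about.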

Here's a good read on changing URL structures on your site.

Titles

Your titles need to be short, specific and descriptive. A title should tell the search engine what the page is about in a short form. Avoid stuffing the title tag with keywords, and keep the length to about 70 characters including spaces, since longer titles get cut off in SERPs. For dynamic titles I recommend you go from detailed to broad: Product > Category > Brand name. Read more in my earlier article here.
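
As an illustration of the detailed-to-broad pattern, a dynamic product-page title might be assembled like this (the product, category and store names are made up):

```html
<title>Air Trainer 5 - Running Shoes - Example Store</title>
```

The most specific (and most searched-for) part comes first, so it survives even if the SERP truncates the end of the title.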

Headings

Headings (H1, H2, etc.) are supposed to divide your page content into logical sections, thereby presenting value to search bots trying to understand what your page is about. Make sure your pages use headings, and that those headings include your page's focus terms. The H1 in particular should be descriptive, preferably short, and include your page-level target term. My rule of thumb: one H1, two H2s and three H3s per page. You don't have to follow this exactly, but make sure there's only one H1 per page.
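
As an illustration of that rule of thumb, a page's heading outline (the content here is hypothetical) could look like:

```html
<h1>Red Running Shoes</h1>      <!-- one H1 with the target term -->
  <h2>Men's Running Shoes</h2>
    <h3>Trail</h3>
    <h3>Road</h3>
    <h3>Track</h3>
  <h2>Women's Running Shoes</h2>
```

The indentation is just for readability; what matters is that the heading levels form a logical hierarchy under a single H1.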

Links (interlinking)

I've worked on a number of Yandex SEO projects, and it appears that interlinking of pages is not as important for the Yandex algorithm as it is for Google's. So, if you're optimizing for Google, have a read on internal linking here.

Also, if you run a blog, forum, e-commerce site or news site, you are likely to have pagination issues. Learn more about pagination issues for SEO and find out how to solve them here.

Site speed

Users love faster sites, and Google rewards faster-loading pages with higher rankings. Make sure your site doesn't have slow pages that negatively affect your rankings and user experience. Test your site regularly using Pingdom or Google's page speed testing tools.

Site analytics

Analytics can tell you a lot of things about your site performance (issues and speed) and your visitors. Regardless of what you use, Google Analytics or something else, make sure you keep an eye on content efficiency (how effective your pages are).

Want to create custom reports and save time in Google Analytics? Check this guide from Google, or feel free to import these templates made by other people here and here.

Does my target audience like what it sees in SERPs?

Yes, it's important to rank highly on a search engine results page (SERP), but it's also key to ensure that users click on your results. Studies show that with the proliferation of rich snippets, users tend to click on listings that provide 1) a value proposition in the description and 2) additional information about the page through advanced search snippets. Let's look at both in more detail.

Descriptions

Meta descriptions do not affect rankings, but they do affect your listing's click-through rate. A description that clearly shows the page's USP has a higher chance of being clicked by the user. Here are a few best practices on meta descriptions. Also, make sure to review this post on creating meta descriptions based on your PPC text ads.
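
The meta description lives in the page's <head>; here's a sketch with a clear value proposition (the copy is invented for illustration):

```html
<meta name="description"
      content="Free shipping on all red running shoes. 30-day returns. Order today, run tomorrow.">
```

Keep it to roughly 150-160 characters; anything longer is likely to be truncated in the SERP.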

Rich snippets & microdata

Just like good descriptions, search listings complete with rich snippets tend to get higher click-through rates. Here's a good article on why rich snippets are important. Make sure your most valuable pages and landing pages have appropriate rich snippets implemented; read my recent article on rich snippets, and use this tool from Google to test your pages. Want to see how your listing might look with microdata added? Check out this cool tool.
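
As a sketch, schema.org microdata that can power a product rating snippet (all values are placeholders) might look like:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Red Running Shoes</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

Marking up the content you already display (rather than hidden values) is what makes search engines trust the data enough to show it in your listing.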

In conclusion

Creating great content is primarily what it's about these days. However, if Google can't see your pages, can't understand their relevance for specific terms, or if your listing looks incomplete and confusing, you'll be missing out on a lot, both in terms of rankings and actual search traffic to your site. As practice shows, on-site SEO is still very important to success in organic search.

If you liked this post, you might also enjoy Guest Blogging to Build Your Off-Site Site Map  

Alexander Zagoumenov

Alex is an SEO consultant offering website review services to improve usability and conversions of his clients' websites. He enjoys helping people by educating and sharing his experience through SEO training. Alex is also an internet marketing speaker and educator currently living in Perm, Russia. He consults and works with companies in Canada, U.S. and Russia. Comprehend Russian? Check his Russian site on internet marketing.
