Well, it has in fact been well over a year now. A lot has changed since April 2011, and in some ways, a lot has remained the same. Here is what happened:
11th April 2011 – My main site was heavily affected by the now notorious Panda Update. I wrote about it here in November 2011 – On The Road to Recovery. With a lot of hard work and some excellent advice, I made a full recovery. The recovery started in January 2012, and by May 2012 traffic was better than ever; in fact, last May was my best month yet.
However, the good times did not last, as additional Google changes meant that I suffered some slippage, and by August 2012 I had lost around 35% of traffic (Google referrals) once again. Maybe a bit of Penguin kicked in — I had done some pretty cheap directory submissions in 2008/9 — or maybe a Panda refresh. I really do not know.
I decided to focus solely on the site, though. I have done very little offsite SEO work since Panda struck, as all my time and energy was spent on fixing the site. Where am I today? Well, the last few weeks have once again seen some improvements, and Google referrals are up again. Not record-breaking, but on par with July's figures. So, what did I do?
There were essentially 3 stages to my Panda recovery:
- Elimination of all low quality, duplicate and "fluff" content.
- Improvements to remaining content.
- Removal of legacy parts of the site.
Now, I spoke a lot about stages 1 and 2, as this is generally what most people had been doing. However, the third stage may actually have been the most important. To be honest, though, I do not know for sure, as I broke the number one rule of only changing one thing at a time! In January 2012 I started another 4 weeks of work on the site – it was a last-ditch attempt! So, what did each stage really involve?
Elimination of Low Quality and Fluff
The site started as a blog and in my early days it was a hobby. There was no interest in monetizing it and I honestly was not expecting so many people to read it! I used to write very short posts, often with large blocks of quoted text. On Blogger there was a tool with which you could highlight text on a page, press "blog it", and it would create a blog post, to which I would add my opinion. I also used free articles (not knowing at the time that they were massively duplicated). When I did write, it was often short news posts that would quickly become irrelevant.
So, the first phase was to cut out all this dead wood. How did I determine it?
- Ordered all posts by word count and started by chopping out the smallest ones.
- Searched for articles containing "isnare", "articlebase", "ezinearticles" etc. – i.e. popular free article directories. These were all deleted.
- Searched for URLs of popular news sites (mostly BBC News) to locate the longer articles with large blocks of quoted text.
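The checks above are easy to automate. Here is a minimal sketch of that kind of audit, assuming each post's text is available as a string (the word-count threshold and the exact lists of article directories and news sources are my own illustrative guesses, not the ones I actually used):

```python
MIN_WORDS = 300  # assumed cut-off for "too short"; tune to taste

# Popular free article directories and quoted news sources (illustrative lists)
ARTICLE_DIRECTORIES = ("isnare", "articlebase", "ezinearticles")
NEWS_SOURCES = ("news.bbc.co.uk", "bbc.co.uk/news")

def audit_post(text):
    """Return a list of reasons this post looks like dead wood (empty = keep)."""
    lower = text.lower()
    reasons = []
    word_count = len(lower.split())
    if word_count < MIN_WORDS:
        reasons.append(f"short ({word_count} words)")
    if any(d in lower for d in ARTICLE_DIRECTORIES):
        reasons.append("syndicated free article")
    if any(u in lower for u in NEWS_SOURCES):
        reasons.append("quotes a news site")
    return reasons
```

Running `audit_post` over every exported post and sorting the flagged ones by word count reproduces the "smallest first" chopping order described above.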
Then there was the fluff. Fluff is harder to spot. The articles that I consider to be "fluff" are not copied, duplicated or short, but they are low quality. They are the articles that no longer serve any purpose other than to bolster the content and provide keyword links to other articles. These were mostly deleted.
Improvements to Content
To improve the remaining content I reviewed all the articles and looked for ways to improve them. In some cases several articles were merged; sometimes new photos and images were added, or clarifications and references to strengthen the arguments. In many of these articles the content was good, or at least acceptable, but as stand-alone articles they were pretty poor. The solution was simple – combining articles, editing and republishing. Or, more simply, turning several low quality articles into one large high quality article.
For example, if I had written about a particular topic several times over the last 5 years, I would combine all the articles to create a page that was closer to an encyclopaedic entry than a blog post. Think Wikipedia and you get the idea. Some pages saw 10 articles combined to create a single large post of over 20,000 words.
I did come up with one simple way to determine fluff – or at least, to determine the pages that Google considered to be fluff. I took the approach that any pages no longer receiving visits from Google are not liked, so these pages were deleted. Generally, pages which received little traffic each month were eliminated. Some pages would get no visitors at all; others might get 5-6 a month. Overall, they were not attracting readers and, for the sake of the project, they served no purpose. An archive is only of any use if people sometimes read it.
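The traffic test above can be sketched in a few lines. This assumes you have exported a month of pageview counts from your analytics tool as a URL-to-visits mapping; the threshold of 6 visits a month is a hypothetical figure matching the "5-6 a month" pages mentioned above, not a recommendation:

```python
MONTHLY_VISIT_THRESHOLD = 6  # assumed cut-off, based on "5-6 visits a month"

def pages_to_prune(monthly_visits, threshold=MONTHLY_VISIT_THRESHOLD):
    """Given {url: visits_last_month}, return the URLs at or below the threshold."""
    return sorted(url for url, visits in monthly_visits.items()
                  if visits <= threshold)
```

Naturally, the output is a candidate list for manual review, not an automatic delete list – some low-traffic pages may still be worth keeping or merging.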
Removal of Legacy Parts of the Site
This may have been the most important task. For a while I had run a community section on Drupal. In this section I had also tested some affiliate stuff. One was a Drupal module which automatically generated pages based on Amazon affiliate links. I actually removed this module many years before Panda struck, but removing the module did not delete the pages. I had hundreds of pages on my site which were almost empty except for a title / header. They looked very spammy in light of Panda.
The solution was radical – I deleted the entire section. It was on a subdomain, so deletion was easy. There was some good content there, which I moved to a new domain, but overall about 90% of the pages (URLs) were deleted.
I did suffer a small traffic drop after one of the Penguin updates. As the drop was small, it took me about 3 months to realise – I thought it was just a seasonal change! I have several of my own sites, so I started by reviewing these and changing any keyword links on my sites to domain / brand named links. That is the only "Penguin" work I did. I cannot be sure whether it contributed to a small recovery, but I think it helped.
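Finding those keyword links is mostly a matter of scanning your own pages for anchors that point at your sites with non-brand anchor text. A rough sketch of that check, where the domain and the list of "safe" brand anchors are placeholders for your own (the regex is deliberately simple and would not handle every HTML variant):

```python
import re

MY_DOMAINS = ("example.com",)  # hypothetical: the domains I control
BRAND_ANCHORS = ("example.com", "example", "click here")  # anchors considered safe

# Simple pattern for <a href="...">anchor text</a>; not a full HTML parser
LINK_RE = re.compile(r'<a\s[^>]*href="([^"]+)"[^>]*>(.*?)</a>',
                     re.IGNORECASE | re.DOTALL)

def keyword_anchors(html):
    """Return (href, anchor_text) pairs linking to my domains with non-brand anchors."""
    suspects = []
    for href, anchor in LINK_RE.findall(html):
        if any(domain in href for domain in MY_DOMAINS):
            text = re.sub(r"<[^>]+>", "", anchor).strip().lower()
            if text not in BRAND_ANCHORS:
                suspects.append((href, text))
    return suspects
```

Each suspect link is then a candidate for rewriting the anchor text to the domain or brand name, as described above.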
This year I spent a fair amount of time improving the way the site looked. This included updates to the home page and category pages, navigation and overall appearance. I pushed social more, so added more prominent social buttons. The overall result was that the site looks more professional, and less like a site built to attract Google traffic.
Less Is More
Until April 2011 I was obsessed with creating new content. It was a simple method that worked for me. The more I wrote, the more traffic I got, the more money my business made. Panda certainly taught me to slow down. It is still important to blog, just not to blog everything.
While I have managed to keep my head above water over the last 18 months or so, I do know that I am still on rocky ground. One new change could destroy me again. My main focus remains on diversifying as much as possible. This fortunately fits in well with the "less is more" approach. I spend less time working on my main site and more time trying to promote some others. In a way it is like a business investment – using the capital from one part of the business to try to generate new revenue streams. It is hard work, but I am getting there. Slowly.
I hope this post provides you with some ideas. Even if you were not affected by Panda or Penguin, it is always a good idea to review your own site content, give the most important pages a good overhaul each year, spruce them up with some better images, check that the facts are still relevant and update the articles in light of new evidence and information. In short – there is no such thing as "evergreen content". The content must be reviewed and kept fresh to help it rank well.
If you liked this post, you might also enjoy How to Recover After the Google Panda and Penguin Update?
I believe that content is king and all other SEO comes second. Build an awesome website and write brilliant content first, then your readers will do the SEO work for you.