SME Pals is reader-supported. When you buy through links on our site, we may earn an affiliate commission. Learn more.
How to recover from a Panda penalty. Pic by Nathan Rupert

Step-by-step guide to my Google Panda penalty recovery

Restoring Web traffic volumes after a Google algorithmic penalty (especially one like Panda or Penguin) can be a difficult and often fruitless task, with no guarantees of success.

Huge Drop in Traffic?

A significant drop in Google organic traffic is not necessarily a penalty. Before anything else, ensure that there are no technical issues preventing your site from ranking - like accidentally blocking content with robots.txt.

The best SEO in the world won't necessarily make your site rank well, or recover lost traffic. It can only help give your site a 'chance' of ranking.

Google doesn't want to make sites popular by ranking them. It wants to rank popular sites.

Until you have become popular, Google will happily rank thin (or even entirely blank) pages from bigger websites above your content.

The best way to rank well in Google is to understand your niche market. Know how it works. Know who links to who and why. The more you know the better the chances you have of building valuable relationships that deliver a strong backlink profile from genuine citations and editorial links.

SEO Recovery

Research your niche market and competitors to quickly understand why they're successful and how you can be too.

  • Analyze competitors' backlinks and SEO using SEMRush, and learn the secrets to their success.

    Understand your niche to make better marketing decisions, capture higher page rankings in Google, make valuable new connections and boost your earnings quickly.

    Don't waste time guessing what it takes to win valuable search keywords. Work out who is winning. Find out who links to them. Build your own backlinks.

    Try it out. Research a website right now.

You can use SEMRush for free by signing up. However, the free plan is limited to 10 requests per day; beyond that, a paid plan is required.

In fact, many websites that find themselves penalized by Panda end up never recovering their original traffic levels and page rankings - and may, if their revenue streams are not sufficiently diversified, go out of business as a result.

Efforts at improving your website's content, link profile, performance, SEO and any other search related issues you can think of, may or may not result in a recovery of page rankings and organic search traffic.

And Google itself is often not much help: it releases very little specific information about the cause of any given penalty, because it doesn't want to make it easier for spammers to manipulate search rankings.

SME Pals was penalized by Panda in late November, after enjoying growth in organic search traffic through all prior algorithm updates.

Recently however, months of work to isolate the cause(s) and fix the problem(s) paid dividends as this site experienced a full recovery to Web traffic levels higher than before the original penalty.

This article highlights everything that I discovered and did in order to restore my Web traffic and recover fully.

If you are currently under penalty, or suspect that your site is not performing as well as it should because of a penalty, then hopefully you will find some information of value here.

Common causes of a Panda penalty

It's important to note that I won't be talking about backlink penalties here, because these fall under the remit of the Penguin algorithm, which deals with spammy backlinks, or webspam.

1. Broken internal links

Google doesn't like poorly designed websites. Broken links lead to a poor browsing experience, and this is not desirable from Google's point of view. More often than not, any given blog or website will have a few broken links. Page addresses might change with time, or content is deleted without the corresponding links being removed, and so on.

This is not generally a problem. But, consider a minor error in your footer block that leads to one or more links breaking there. This mistake can be very costly because footer links appear on every page.

A single typo in your footer can lead to broken links that show on every page of your site.

Solution: Monitor Crawl Errors under Health in Google's search console and fix broken links as they arise.
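Beyond watching Crawl Errors, you can catch broken links yourself before Google does. Here's a minimal sketch in Python (standard library only) of that kind of check: collect the internal links from a page and flag any that don't match a known-good list of paths. The `valid_paths` set is an assumption - in practice you would build it from your sitemap or CMS.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href from the anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html, valid_paths):
    """Return internal (root-relative) links that don't match a known page path."""
    collector = LinkCollector()
    collector.feed(html)
    internal = [link for link in collector.links if link.startswith("/")]
    return [link for link in internal if link not in valid_paths]
```

Run something like this over your templates after every edit, and a footer typo shows up immediately instead of months later as a pile of crawl errors.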

2. Followed paid links

Google understands that blogs and sites need to make money from advertising - so you are allowed to sell advertising and sponsorship. However, they expressly forbid the sale of links that pass PageRank.

Any and all links that are not part of the natural flow of your content should include the rel="nofollow" attribute. This doesn't prevent Google from crawling the link, but it does tell them not to pass PageRank to it.

Solution: Check all your outgoing links and ensure that any which aren't part of your natural content are nofollowed.
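Scanning for these by hand is tedious on a large site. As a rough sketch (Python, standard library only), you can parse each page and flag external links that lack rel="nofollow" - the domain passed in is your own, so internal links are ignored:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class FollowedLinkFinder(HTMLParser):
    """Collect external links that are missing rel="nofollow"."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.followed_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr = dict(attrs)
        href = attr.get("href") or ""
        host = urlparse(href).netloc
        # Only external links matter; internal links should stay followed
        if host and host != self.own_domain:
            rel_values = (attr.get("rel") or "").lower().split()
            if "nofollow" not in rel_values:
                self.followed_external.append(href)
```

Anything this flags isn't automatically a paid link, of course - it's a shortlist to review by hand.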

3. Thin affiliate content

From Google's perspective, there is little point in returning results that simply repeat information provided on the affiliate's site.

Adding affiliate links without offering some original content of value is a surefire way to be penalized. Offer an opinion or review about the product in your own words. Give people a reason to come to you instead of going directly to the retailer.

Solution: Add original content to affiliate pages, or create a few affiliate landing pages and link to those from blog content (i.e. funnel people to an affiliate page from non-affiliate content).

4. Over-optimized SEO content

Keyword stuffing used to be a favored tactic amongst SEOs. These days it is evidence of poor SEO and an impending penalty.

Don't sacrifice readability for the sake of keywords. When in doubt, write for human readers and use your target keywords only where appropriate. You can use synonyms to make the content more readable too - this can help drive more long tail organic search traffic.

Solution: Rewrite content where the keyword density is too high. Rewrite unnatural content. Create content aimed at human readers and not search engines.

5. Content syndication & duplication

Syndication refers to sharing content, with permission, on other blogs and sites (i.e. across domains), whereas duplication is the same content copied in various places.

A CMS (Content Management System) will often duplicate content, and Google understands this. For example, you might have a blog post that also exists as a printable version. Both content syndication and content duplication are allowed by Google, so why am I including it here?

The reason lies in Google's wishy-washiness. They say that content syndication is ok, unless they suspect it is done in order to manipulate their search results. So syndication and duplication can appear all over the show without leading to a penalty, unless Google thinks you are doing it maliciously to gain an unfair advantage in the search results.

How would they know? The answer lies in the pattern of syndication and duplication. In a natural pattern, your content might reappear all over the show on a wide variety of domains.

However, someone intentionally trying to game the search results might duplicate their content manually or automatically on the same set of domains over and over. If Google spots a pattern in the syndication and duplication that it thinks is suspicious, you're in trouble.

Solution: Avoid excessive syndication to the same sites.

Do some basic research first

Losing valuable Google Web traffic from organic search, as a result of a drop in page rankings, can be a painful thing for online businesses. For many, Google's organic search traffic is the lifeblood that drives their traffic, business and revenue.

And, as of late, Google's Panda and Penguin updates, the EMD (Exact Match Domain) update, the top-heavy update, and countless more changes, have played havoc with webmasters who've watched their precious sites crash and burn.

The first thing that everyone wants to know as soon as they find a drop in traffic (regardless of whether it is immediate, or gradual), is how to get their rankings in the SERPs (Search Engine Results Pages) back.

0. Be realistic

The most common cause of a drop in organic search traffic, at least in my experience, is a lack of high quality original content.

Most site owners believe that the work they put in to get 100 000 products listed on their store is sufficient to capture rankings in Google. In reality, there is no way that a small business can create 100 000 good content pages - there's simply not the manpower to do it.

If your site has thousands of pages that have little new or original content, and you have started losing traffic, then you already know why.

1. When did page rankings drop?

Depending on how your Web traffic dropped, it may be possible to identify what aspect of Google's ranking algorithms slapped your site with a penalty. Knowing this is important because it can suggest possible causes, and provide clues to how to fix things.

Assuming you have Google Analytics installed, check back over time to see whether any distinct drops in search traffic occurred (Hint: filter analytics to display organic search traffic by viewing Traffic Sources >> Sources >> Search >> Organic).

If there are distinct drops then you can compare these with known major algorithm updates to see if they correspond with any. You can find a full list of Google algorithm updates at Google Algorithm Change History.
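As a rough illustration of that comparison (with made-up numbers), you can script it: flag week-over-week drops beyond some threshold, then check whether each drop lands near a known update date. The 30% threshold and 7-day window below are arbitrary assumptions, not anything Google publishes:

```python
from datetime import date

def find_drops(weekly, threshold=0.3):
    """Return dates where organic traffic fell more than `threshold` vs the prior week."""
    weeks = sorted(weekly)
    return [cur for prev, cur in zip(weeks, weeks[1:])
            if weekly[prev] > 0
            and (weekly[prev] - weekly[cur]) / weekly[prev] > threshold]

def near_update(drop_date, update_dates, window_days=7):
    """True if a traffic drop falls within `window_days` of a known algorithm update."""
    return any(abs((drop_date - update).days) <= window_days for update in update_dates)

# Hypothetical weekly organic sessions around a suspected update
weekly_sessions = {date(2013, 1, 7): 1000, date(2013, 1, 14): 950, date(2013, 1, 21): 500}
```

A drop that lines up with a confirmed update date isn't proof, but it tells you which algorithm's known causes to investigate first.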

2. Search console

Not all penalties are handed down by Google algorithms. It is also possible to receive a manual penalty. In this case, it is important to check your Google search console account because Google sends notifications of penalties, amongst other things.

If Google suspects you of attempting to manipulate search rankings using spammy backlinks, they will also notify you there.

3. Google webmaster forums

The Crawling, Indexing and Ranking sub-forum is a great place to get information about SEO related penalties and page ranking problems.

It is worth spending some time browsing and searching this forum because it is likely that other webmasters have experienced the same or similar problems that have been answered already.

You can also ask for help as there are plenty of people willing to give you a hand.

4. Google search

Of course, one of the best places to look for clues as to what went wrong is in Google's index itself. To check what Google has indexed for your domain, type in:

site:your-domain.com

Replace your-domain.com with your own site's domain (with no space after the colon). This will show you all the pages that are contained in the index.

Spend some time ensuring that what is there is what you expect. Also, make sure you view any omitted results, as these give clues to duplicate content or stub pages (both common CMS issues).

Identify potential SEO problems

Google's Matt Cutts advises that businesses and bloggers focus on making quality content that interests their readers, and everything will turn out ok. The sentiment is correct - in the sense that everyone should compete on the strength of their content and value of their information.

But that's what I do... honest!

Yet still, I found this site inexplicably penalized by Google Panda. Inexplicable because I wasn't doing anything malicious - but you'll be amazed what a bit of digging turns up.

Confirm it's a Panda penalty before taking remedial SEO action.

Prior to March 2013, Panda was updated more or less monthly. If a noticeable drop in traffic coincided with a confirmed Panda update, that is a strong indicator that Panda caused it. Well, that's exactly what happened here.

Also, as if I really wanted a reminder that my income was being completely wiped out, I received this from the Googlers at AdSense:

Google Adsense warning showing drop in earnings

Fun!

So, after a lot of investigation, I came across the following issues with my site (some of them are such rookie mistakes, from an SEO perspective, that I'm finding it hard to put them down on paper):

  • Slow server response time/page load time
  • Some followed paid links (advertising and affiliate links)
  • Negative SEO attack sites
  • Cross-domain duplicate content/syndicated content
  • Incorrect HTTP response codes in the forum
  • Broken internal links
  • Low percentage of content "above the fold"
  • Google site index contained many "imaginary" pages
  • Lack of, or incorrect use of, canonical URLs
  • Overuse of robots.txt
  • Underuse of NOINDEX
  • Sitewide incoming links
  • Content heavy template
  • Potential top heavy design

Ok, so not all of these issues fall under the remit of Panda, but if you find problems you may as well fix them, right? The above list probably sounds worse than it was. Many of the issues identified here were not sitewide - they were confined to a small section of the site.

It's also important to note that I received no warnings in Google search console - in other words, even though I found followed paid links, they were not in sufficient number to warrant a notification.

How I fixed my SEO problems

Let me start by saying that, as you can probably tell from the screenshot above, the site was under successive and compounded penalties for a period of about 5 months.

SEO is a medium to long term investment in time, resources and money. Just because you fix the problem, doesn't mean Google will reinstate your site to its original glory either. However, succeeding in online business requires tenacity - and Google will test your tenacity to the limit (if and when the time comes).

But, here we go...

I started by tackling my site's performance

While my site has always been well designed for SEO (with some pretty obvious exceptions), the server response time was negating pretty much all the performance related work I put in on the front-end.

I fixed broken internal links

It pays to keep an eye on Google Webmaster Tools because really innocuous mistakes can have pretty dramatic effects on the quality of your site - from Google's perspective.

In my case, while editing the footer HTML, I inadvertently left out a leading slash in one of the links. This caused every single page on my website to contain a broken link.

Broken links are a part of the Web. Google understands that. But an excess of broken links may cause Google to drop page rankings because of the compromised user experience (people hate sites that don't work).
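Python's urllib.parse makes it easy to see why that one missing character matters. A link without the leading slash resolves relative to the current page, so it points somewhere different (and usually broken) on every page of the site:

```python
from urllib.parse import urljoin

page = "https://example.com/blog/some-post"

# Correct: the leading slash makes the link root-relative on every page
assert urljoin(page, "/about") == "https://example.com/about"

# The typo: without the slash, the link resolves relative to the current
# page, producing a different (broken) URL on every page of the site
assert urljoin(page, "about") == "https://example.com/blog/about"
```

Browsers resolve relative links the same way, which is why one character in a sitewide footer multiplied into a broken link on every single page.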

I removed all followed paid links

This was a time consuming task because I had affiliate banner ads embedded within content pages. Being an idiot, I had neglected to add the rel="nofollow" attribute to them - plus the banners made the site look cheap.

In addition, I had a few followed text affiliate links that needed to be nofollowed. Spending hours scouring content for offending links is not much fun - but paid links are against Google's guidelines so it's a good idea to sort this issue out.

I should point out that Google does apply double standards because I know some high traffic, well respected sites that use followed, paid, sitewide links and have never even looked like losing out on traffic.

I stopped syndicating content

I wrote an article about how syndicated content was potentially the cause of my Panda penalty - Google Panda penalty from content syndication?.

Whether it was the cause, or simply a contributing factor, it's hard to say. At any rate, I now syndicate very rarely - if ever.

I issued DMCA takedowns against various attack sites

I documented how a negative SEO attack on my site by some lunatic may have contributed towards my Panda penalty (completely duplicating a site, post by post, may be construed as webspam and an attempt to manipulate page rankings).

Oddly enough, no sooner was the offending site removed than another was created - this time one that included malware targeting visitors' computers. Clearly someone out there believes negative SEO is worth the effort.

I fixed HTTP response codes

Parts of this site, namely the forum, use parameters to sort and order results, as well as a page parameter to paginate content. This leads to a huge number of parameterized URL combinations that all show pretty much the same content.

Google should be able to sort out which pages are which, but sometimes it struggles. What made it worse is that the forum had customizations that meant it always returned a page (even when one did not exist).

The combination of parameters and incorrect HTTP response codes meant that Google filled its index with imaginary pages that all looked suspiciously similar. In fact, the site was returning the forum homepage for thousands and thousands of URLs.
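One way to keep these near-duplicates out of the index is to collapse them into a single canonical form before they're served. Here's a sketch: sort/order parameters are stripped, and the remaining parameters are normalized so equivalent URLs map to one URL. Which parameters actually identify distinct content is an assumption you'd tailor to your own site:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumption: only `page` identifies distinct content; sort/order are cosmetic
CONTENT_PARAMS = {"page"}

def canonicalize(url):
    """Collapse near-duplicate parameterized URLs into one canonical form."""
    parts = urlparse(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query) if k in CONTENT_PARAMS)
    return urlunparse(parts._replace(query=urlencode(kept)))
```

With this in place, every sort/order permutation of the same forum page points Google at a single URL instead of dozens of lookalikes.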

I fixed canonical URL behavior

In addition, the automatically generated canonical tags echoed back whatever URL was requested, because the site returned a 200 (OK) response code for everything. Now, parameterized pages set the canonical URL correctly, and non-existent pages return a 404 Not Found response.

That issue is fixed, but Google's index still contains thousands of orphaned, non-existent pages.

404'd pages with no links to them take a long time to clear out of the index.
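The corrected behavior boils down to a simple rule, sketched here with hypothetical paths: if the requested path doesn't correspond to a real page, return a 404 instead of echoing the request back as a 200 with a matching canonical.

```python
# Hypothetical registry of pages that really exist
KNOWN_PAGES = {"/forum", "/forum/topic-1"}

def respond(path):
    """Return (status, canonical): real pages get a 200 and a canonical built
    from the clean path; anything else gets a 404 instead of echoing the
    requested URL back as its own canonical."""
    if path not in KNOWN_PAGES:
        return 404, None
    return 200, "https://example.com" + path
```

It's the "return a page for anything" customization that did the damage; once unknown paths genuinely 404, the imaginary pages stop multiplying.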

I implemented NOINDEX instead of blocking with robots.txt

Blocking Google from parts of your site is a bad idea - especially if it has already indexed a bunch of pages before they were blocked. If a page is blocked in robots.txt, Google can't crawl it, which means it can never see a NOINDEX directive on that page and so will never drop it from the index. Those pages can hang around indefinitely.
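The distinction is easier to see side by side (the paths here are just examples). A robots.txt Disallow stops crawling but leaves already-indexed URLs in place, while a meta robots NOINDEX - which Google can only see if it is allowed to crawl the page - tells it to drop the page from the index:

```
# robots.txt - stops Google crawling these URLs,
# but anything already indexed stays in the index
User-agent: *
Disallow: /forum/search/

<!-- meta tag in the page's <head> - Google must be able to crawl
     the page to see this, and will then remove it from the index -->
<meta name="robots" content="noindex">
```

So to clean indexed junk out: unblock it, let Google recrawl and see the NOINDEX, and only consider blocking after the pages have dropped out.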

I wrote an article on when to use robots.txt and 301 redirects for Panda SEO recovery.

I redesigned the page template

The design of your site's theme and layout is actually very important for SEO and page rankings. In my case, I had too much stuff (not necessarily ads) pushing page content below the fold.

In addition, I had too much non-original content appearing in the sidebar. This meant that on any given page, the ratio of original, unique content, to content that Google was already aware of was low.

I modified both the page layout to push content up, and cut out a lot of unnecessary teasers and summaries that appeared in the sidebar - making each page leaner, with a higher percentage of original content.

I removed or disavowed poor quality incoming links

Finally, although Penguin is the algorithm responsible for dealing with spammy links, I noted that a few very poor quality websites had generated thousands of incoming links to my site. After attempting to have them removed (without even eliciting a response), I proceeded to disavow them using Google's disavow tool in Google Webmaster Tools.
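For reference, the disavow file is a plain text file uploaded through the disavow tool - one domain: entry or full URL per line, with # for comments. The sites below are placeholders, not real examples from my profile:

```
# Link farms that ignored removal requests
domain:spammy-link-farm.example
domain:another-bad-site.example
# A single offending page rather than a whole domain
http://some-directory.example/links/page42.html
```

Disavowing a whole domain is usually safer than listing individual URLs, since link farms tend to link from many pages at once.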

So that's about it. As you can see I expended a lot of time and effort to restore page rankings and search traffic. I'm now firmly back at the top of Google search results for any number of highly competitive search terms.

With patience, research and application of all your technical and SEO skills, I know that you will be able to do the same. If you have any other suggestions to help people recover from Google Panda penalties, please share them in the comments below.
