How to Recover from a Google Panda Penalty

Released in February 2011, Google Panda was one of the most important algorithm changes of that year. Initially known as the “Farmer Update”, this algorithm change targets low-quality content.

Google doesn’t offer much information about what quality content really means. The posts published on its official blog in 2011 (by Amit Singhal) and 2012 (by Matt Cutts) contain limited details, so it’s difficult to understand the reason behind a penalty if you were affected by this algorithm change. Between February 2011 and May 2014, more than 20 algorithm refreshes were rolled out.


The first generations of Google Panda were characterized by frequent updates; the algorithm was rolled out almost monthly. The latest generation, the Panda 4.0 update, was officially launched on May 20, 2014. It appears to be a rewritten algorithm, which Google characterized as softer and better targeted than previous generations of Panda.

Which ranking factors are involved?

Webmasters and SEO experts have speculated a lot, but the real ranking factors remain a secret. A recently granted Google patent seems to shed some light on the subject, though even so, I’m not completely sure it is 100% related to this algorithm update. Patent US 8682892 was written by Navneet Panda and Vladimir Ofitserov, two Google engineers. A detailed analysis of this patent is available here.

The patent describes a couple of methods for detecting low-quality content. For example, the mention of a brand in search results is interpreted as a sign of high-quality content.


A screenshot showing the main interface of the Google quality rater platform

The Panda update is also influenced by Google’s Quality Raters, a dedicated team tasked with detecting low-quality sites in search results. The latest version of their rating manual is available online on a couple of websites. The ratings given by the quality rater team are fed into the Panda algorithm as signals when a refresh of the update is launched.

The speculated ranking factors included in this filter are:

  • insufficient content on pages;
  • internal duplicate content generated by tags, pagination, indexed searches and filters;
  • duplicate content (usually copied from highly authoritative sources);
  • a large number of adverts on the page, especially in the “above the fold” area;
  • keyword stuffing;
  • difficult navigation through the website;
  • a low percentage of returning visitors;
  • lack of contact information;
  • low-quality, non-informative articles, usually written with a limited vocabulary;
  • a high bounce rate and returns to the SERPs (which can indicate a frustrated user, unable to find what they need);
  • dead pages still included in the website’s navigation or internal linking.

Before moving on to the recovery guide, don’t forget that these points are just speculation, not confirmed ranking factors of Google’s Panda update.

What you should check

The most important thing when you want to recover your website’s positions is to be sure the site was actually affected by Panda. This filter is a sitewide one: it affects the entire website.

Basically, a website affected by this update will lose its positions across multiple pages. Check the moment of the penalty and compare it with the official dates of the Panda updates. You can also check this with the Panguin tool.
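Because Panda is algorithmic rather than a manual action, dating the drop is the whole diagnosis. As a rough sketch of what the Panguin tool automates, here is a minimal comparison of a traffic-drop date against known roll-out dates. The list below is deliberately partial (the two dates are the ones mentioned in this article; a complete list would need to be filled in from a published algorithm changelog):

```python
from datetime import date

# Partial, illustrative list of Panda roll-out dates.
# Fill in the remaining refreshes from a published changelog.
PANDA_UPDATES = {
    "Panda 1.0": date(2011, 2, 24),   # the February 2011 launch
    "Panda 4.0": date(2014, 5, 20),
}

def nearest_update(drop_date, window_days=7):
    """Return (name, date) of any Panda roll-out within `window_days`
    of the observed traffic drop, or None if nothing matches."""
    for name, rolled_out in PANDA_UPDATES.items():
        if abs((drop_date - rolled_out).days) <= window_days:
            return name, rolled_out
    return None

print(nearest_update(date(2014, 5, 22)))  # a drop 2 days after Panda 4.0
print(nearest_update(date(2013, 1, 1)))   # no match in this partial list
```

If the drop lines up with a roll-out date, a Panda problem becomes a plausible hypothesis; if it doesn’t, look at other algorithm changes first.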

If your website is built on massive duplicate content copied from various sources and it was affected by this update, you can pretty much forget about it.

Google doesn’t like duplicate content, and your website will never get back to where it was if you keep that content. You have two options: abandon the site, or delete all the copied content and replace it with high-quality, original material.

If your website has unique articles and it was still affected, check the following:

  1. Was your content copied by other sites?
  2. Is the number of indexed pages roughly equal to the number of pages in your sitemap?
  3. Is the site’s navigation easy to use and understand?
  4. Do you have pages with keyword stuffing / a high percentage of money keywords?
  5. Do you have a large number of adverts on your pages?
  6. Is your site’s bounce rate high?
  7. Do you have a low percentage of returning visitors?
  8. Do you have pages with limited content (short articles, for example)? If so, what percentage of your pages do they represent?
  9. Do you have deleted internal pages that are still featured in the site’s internal linking?
  10. Do you have limited contact information on the site?

If you have problems with some of the above, you’re probably affected by a Panda update. You must fix these issues if you want to recover the site’s former positions and traffic.

How to recover your website’s traffic

1. Uniqueness of content

First of all, you must check the uniqueness of the content published on your site. You can use a dedicated tool, like Copyscape, for example. With its help you can spot-check some important pages of the site (usually the ones most likely to be copied). If you find copied pages, I advise you to check the entire site for plagiarism; Copyscape’s paid service can scan all of your content.


An example of a copied fragment of text identified by Copyscape.

Another way to find scraper sites is to copy distinctive fragments of your text into a quoted Google search (“like this”) and check the results.

If you find your content copied on other sites, the best solution is to contact the owners and ask them to remove those pages. If they don’t answer and the pages stay online, you can file a DMCA takedown request against those sites.

2. The number of indexed pages

Once you’ve solved the problem of external duplicate content, check the number of your pages in Google’s results. Usually, the number of pages submitted for indexation through the sitemap should match the number of articles actually present on the site.

If your site has about 1,000 pages but you find a much higher number in the search results, your website has an internal duplication problem. Usually, these problems are generated by tags, internal filters, pagination and indexed search pages.

If you are in this situation, you must block these pages from indexation or, better yet, remove them completely. These “pages” are duplicates of the real ones and should disappear from Google’s results because they target the same keywords with the same content. If you don’t want to delete them, add a “noindex” directive to them.
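Counting the URLs you actually submit is straightforward. The standard-library sketch below counts `<loc>` entries in a sitemap, so you can compare that figure against a `site:yourdomain.com` search; a large gap suggests index bloat from tags, filters or pagination. (The “noindex” directive mentioned above is the robots meta tag `<meta name="robots" content="noindex">` in the page’s `<head>`.)

```python
import xml.etree.ElementTree as ET

# Sitemaps declare this namespace, so element lookups must include it.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text):
    """Count <loc> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return sum(1 for _ in root.iter(f"{SITEMAP_NS}loc"))

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/article-1</loc></url>
  <url><loc>https://example.com/article-2</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 3
```

If the sitemap says 1,000 and the index shows 5,000, the extra 4,000 URLs are the ones to hunt down and noindex or remove.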

3. Navigation problems

Once that problem is solved, you can move on to the next step of this Google Panda recovery guide: your website’s navigation. The site must be easy to navigate, and the menus must be accessible from every page. Even the 404 page should have a menu, and it should offer useful navigation aids such as a search field, some filters, and so on.

4. Keyword stuffing

On to the next point: is keyword stuffing present on your site? If so, you must take action and remove that content immediately. A lot of webmasters ask me, “Which density is optimal? If about 3% of my article is money keywords, will I rank higher?” This is the wrong approach to the problem. Keyword density doesn’t matter; Google is smarter than it was 5 years ago, and the new Hummingbird algorithm is the best proof of that. Text should be easy to understand, not stuffed with keywords.
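Density is not a target to optimize toward, but measuring it is still a quick way to flag pages that are obviously stuffed. A minimal sketch:

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` matching `keyword` (case-insensitive).
    Useful only as a red flag for extreme stuffing, not as a ranking
    metric to chase."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap shoes cheap shoes buy cheap shoes online cheap shoes"
print(keyword_density(stuffed, "cheap"))  # 0.4 — 4 of 10 words
```

A page where one money keyword makes up a third of the text reads badly to humans long before any algorithm notices; that, rather than a magic percentage, is the real problem to fix.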

5. The above the fold area

Once the keyword stuffing problem is solved, jump to the next checkpoint of this recovery guide: the adverts on your site. Nowadays, the majority of webmasters use adverts as their main method of making money. This is normal, and Google knows about it and tolerates it. Problems appear when a webmaster exaggerates the number of ads or arranges them inappropriately. The page layout algorithm is a separate Google signal, and after its launch a lot of speculation suggested it had been folded into the Panda algorithm. If you are in this situation, try to reduce the number of ads in the above-the-fold area. Don’t forget that many visitors use devices with limited display resolutions, so adapt the amount of ads to those screens.

6. The bounce rate

The next point on our list: is your site’s bounce rate high? Before offering possible solutions, let’s define bounce rate. The answer is simple: the bounce rate is the percentage of visitors who land on the site and leave rather than continue to other sections or articles within the same website. To put it simply: a bounce is someone who visits a single page of the site, then leaves.

Time spent on the site is not relevant to the bounce rate. It doesn’t matter whether a visitor stays 10 seconds or 3 minutes on the page: if they leave without clicking through to another page, the visit counts as a bounce. Typical bounce rate values also depend on the type of site; a blog will have different values than an online shop, for example.
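The definition above reduces to a one-line computation. A sketch, assuming you can export per-session page-view counts from your analytics tool:

```python
def bounce_rate(sessions):
    """`sessions` is a list of page-view counts, one entry per visit.
    A session with exactly one page view is a bounce, regardless of
    how long the visitor stayed on that single page."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return bounces / len(sessions)

# 5 visits: three single-page visits and two multi-page visits
print(bounce_rate([1, 4, 1, 2, 1]))  # 0.6
```

Note that time on page never enters the formula, which is exactly why a long, satisfying single-page visit still counts as a bounce.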

But what should you do in this situation? The answer is relatively easy: increase the internal linking between pages, and add related pages or products, widgets, and so on. Basically, you must offer alternatives on the page being visited and allow the user to reach other sections of the site.

7. Returning visitors / back to SERPs

One of the most controversial supposed signals of the Google Panda penalty is the so-called “back to SERPs” element. It is closely related to the percentage of returning visitors.

Even though it isn’t confirmed by Google, this can be an indicator of a website’s quality. If a user isn’t pleased with a site, they will leave the page and click on the next result offered by Google. If this element is indeed included in the Panda algorithm, these user signals will automatically filter out low-quality sites: at the next roll-out of the algorithm, those sites will be hit by a penalty.

Also, a low percentage of returning visitors can mean users aren’t happy with the information the site provides, or that something else is wrong with it. If users avoid clicking on a website when they see it in the search results, Google can interpret that as a sign of low quality.

If you are in this situation, ask yourself why visitors aren’t happy with your site. Does your meta description promise interesting information that they then can’t find on the page? Is the layout so outdated that they leave quickly? Is the content written in a way that is difficult to understand?

You need to identify the reason for this behavior and fix the underlying problems. You can use a landing page testing tool to observe visitors’ behavior before you decide what to do.

8. Thin content

The signals attributed to Google Panda also seem to be related to the length and quality of the articles on the affected website. If you have pages with insufficient content, or poorly written pages, you’re in danger, and this can be the reason for your penalty.

What should you do in this situation? Identify the short articles (under 300 words, for example) and the low-quality ones. Then decide what to do with them. You have a few options: delete them, add “noindex” to them, or rewrite them.
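If your articles are available as plain text, flagging the short ones is trivial. A sketch (the 300-word cutoff is this article’s rule of thumb, not an official threshold):

```python
import re

def thin_articles(articles, min_words=300):
    """`articles` maps a slug or path to its plain-text body.
    Returns the slugs whose word count falls below `min_words`."""
    counts = {slug: len(re.findall(r"\S+", body)) for slug, body in articles.items()}
    return sorted(slug for slug, n in counts.items() if n < min_words)

posts = {
    "/long-guide": "word " * 450,          # 450 words: fine
    "/stub": "only a few words here",      # 5 words: flagged
}
print(thin_articles(posts))  # ['/stub']
```

WordPress users can get the same report from a word-count plugin, and Screaming Frog exposes word counts per URL; the decision of delete / noindex / rewrite still has to be made page by page.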

If a large percentage of the site’s articles are in this situation, recovering from the penalty will be very difficult.

9. Dead pages still present in the website’s architecture

If you’ve deleted a lot of pages from your site, you probably have dead links pointing to those pages inside the website’s internal architecture. If you use internal link building heavily, the number of dead links will be even higher.

In this situation, you must remove all mentions of these pages from the rest of the site. You can check the entire site with Xenu, a simple but effective website analysis tool. Once the dead links are identified, remove them all.
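Xenu does this over HTTP status codes. As an offline illustration of the same idea, this standard-library sketch extracts internal links from a page and compares them against a known set of live paths:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def dead_links(html, live_paths):
    """Return internal links pointing to pages no longer on the site.
    A crawler would verify each URL's HTTP status instead; this offline
    sketch just checks membership in a known set of live paths."""
    parser = LinkCollector()
    parser.feed(html)
    internal = [h for h in parser.links if h.startswith("/")]
    return [h for h in internal if h not in live_paths]

page = '<p><a href="/about">About</a> <a href="/old-post">Old</a></p>'
print(dead_links(page, live_paths={"/about", "/contact"}))  # ['/old-post']
```

Run over every template and page, this produces the same removal list a crawler-based audit would, minus the redirects and external links a real tool also catches.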

10. Lack of contact data

Regarding site quality, Amit Singhal asks whether you would be comfortable giving your credit card information to the website you’re visiting.

In other words, contact data seems to be another important factor in the Panda algorithm. A site without contact data is hard to trust. If you own an online shop, you should publish detailed information about the company that manages the site, a clear TOS, and so on.

Conclusion

Even though it’s difficult, recovering from a Google Panda penalty is possible. I’ve personally managed multiple recoveries of websites affected by this algorithm.

If your own website went through something like this, you’ll need a lot of patience and hard work to recover its positions and traffic.

If you have ideas about what the Google Panda update is all about, or a successful recovery story, don’t hesitate to contact me or leave a comment in the form below.

Catalin Nichita is the founder of the SEOVerse.com project. He has about 6 years of experience in SEO and specializes in penalty removals and website audits.

16 Comments

  1. Jake

    Oct 27, 2014 at 11:14 am

    I believe recovery is very difficult if you have a large website. Duplicate content can destroy your rankings if somebody uses your content on their website. If you are new on the market and somebody copies your text, there’s a good chance you’ll be outranked by that site.

    Do you have any solutions for this situation? I have a blog affected by this update and I am disappointed. I’ve tried everything, but with no success.

    Can you help me with some advice?

  2. Catalin Nichita

    Oct 27, 2014 at 5:49 pm

    Please check your email. I will take a look at your problem after you send me the URL of your site.

  3. Amit

    Jan 18, 2015 at 11:24 pm

    An impressive analysis of the signals involved in this update. The elements involved in this filtering method are very sophisticated, and recovery seems impossible for a webmaster managing a large website.

    I think the only viable option if you want to regain your traffic and positions is to start from scratch with another project. Sometimes it’s more difficult to repair…

    I’ve tried multiple methods to recover the positions of a site affected by the Panda algorithm update, but so far I haven’t obtained any real results.

  4. Catalin Nichita

    Jan 18, 2015 at 11:32 pm

    If the project is a large one, you can start with an audit using Xenu or Screaming Frog. I’m quite sure you will find a lot of unsolved problems: short articles, duplicate content or keyword stuffing…

    Just check the ranking factors described in the article, point by point, and I’m confident you can obtain positive results and recover from the Panda penalty.

  5. Amit

    Jan 19, 2015 at 1:23 pm

    Thank you for your answer. I’d like to know which are the most important things I should check on my website. I’ve tested Screaming Frog, but I discovered that the free version is just a trial, limited to 500 pages…

    Xenu looks very outdated. I’d like to buy Screaming Frog, but the price seems very high for me (I have a single website).

  6. Catalin Nichita

    Jan 19, 2015 at 8:11 pm

    Indeed, Xenu is not the best solution if you want an in-depth audit of the site. However, it’s still a good option for a quick look at the most important potential problems of a website.

    You can also try the “Website Auditor” solution: http://www.link-assistant.com/website-auditor/

    Even so, the best solution remains Screaming Frog. You can obtain a report listing the shortest articles on the site, which gives you direct access to the “low quality content” present on it.

    If you have a WordPress-based website, you can also try this plugin: https://wordpress.org/plugins/post-admin-word-count/screenshots/

    With its help, you’ll be able to see which articles are shallow. You can rewrite them or add “noindex” to all of them. If you have duplicate content published on your site, you must remove it. That’s the only way to recover a website affected by the Google Panda penalty.

  7. Alan

    Jan 27, 2015 at 9:40 am

    I’ve read the entire Google patent, but I have some doubts that it’s related to this kind of penalty.

    I will run some tests in the coming days, because I want to see the influence of brand-related queries on the entire website if I suggest to Google that my site is a brand.

    Will it be more visible in the search results? Will the penalty be revoked?

  8. Catalin Nichita

    Jan 27, 2015 at 7:38 pm

    Hello Alan,

    I’m interested in a similar test, so please come back with additional details when the test is finished.

    Regarding the brand-related queries, I believe it’s not so simple. The queries must follow a pattern, and the CTR must look legitimate to Google.

  9. Larry May

    Feb 21, 2015 at 11:11 pm

    Do you know any effective Panda penalty checker? I’ve read your article and I have a website affected by a penalty, but I’m not sure whether it’s related to this algorithm change. I’m asking because I think my website is a quality one.

    I’ve tried the Xenu tool on my website, but the server crashes in just a few minutes and I’m not able to get the final report.

    Do you have any solution to decrease the speed of the analysis? Do you offer any recovery or diagnosis service? Thank you!

  10. Catalin Nichita

    Feb 21, 2015 at 11:28 pm

    You can use either of these free Google Panda penalty checkers. The first is called Panguin and is available here: http://www.barracuda-digital.co.uk/panguin-tool/

    The second penalty checker is available at this location: https://fruition.net/google-penalty-checker-tool/

    Regarding your server problems, you can try Screaming Frog. The software allows you to change the crawl speed of the spider, so your server won’t be overloaded with requests and will stay up. However, it’s not normal for a server to crash under such conditions; maybe it’s a good idea to change your hosting company, or at least discuss it with them.

    Regarding your last questions: yes, I offer recovery and diagnosis services for any type of penalty (including Panda). However, the recovery will take a few months if your site is affected by the latest type of penalty. If you’re interested in a recovery service or a diagnosis, use the “contact” page and we’ll discuss it in private.

  11. Janie

    Mar 30, 2015 at 9:15 am

    Maybe it’s a good idea to give us an example of what a complete site audit should look like. The explanations are good, but I think you should offer more details about the weak points of a site and real examples of “low quality content”, because Google refuses to give us examples on their official blog.

    I’m writing because a few months ago my website was affected by a penalty, and so far I haven’t seen any sign of recovery. I’ve checked the site with the tools you suggested, and they told me it was affected by a Panda penalty. I don’t understand the reason, because my content is unique and I believe I have no “low quality content” on the site.

    I’ve deleted the tags and put “noindex” on the categories. The only thing I suspect of being responsible for this penalty is the internal duplicate content generated by those tags and categories. I fixed this potential problem a few months ago.

    Can you tell me when I might see the first signs of recovery? Can you take a look at my website to see what I can do to recover from this Google Panda penalty?

    I’ve just sent you an email using the contact form. Thank you! 🙂

  12. Catalin Nichita

    Apr 6, 2015 at 6:16 pm

    Hello Janie! First of all: I am not a Google employee and I am not Navneet Panda, so these conclusions come from a few successful Panda penalty recoveries. 🙂

    I’ve just looked at your website. The first possible problem seems to be the content itself: are the posts rewritten in some way? I’m asking because the content looks very similar to articles published on larger, more authoritative websites.

    The landing pages are not very intuitive, and the ads are positioned very close to the content. I don’t know your bounce rate or return-to-SERPs percentage, but this can be a very negative signal: users can press the “back” button and click on another Google result. You should work on this area, because a good landing page keeps users on your site and sends Google positive signals that can reverse the Panda penalty.

    Other possible problems are related to dead pages still present in the website’s architecture. Run a Xenu audit and you’ll be able to identify all the problems in that area.

    I recommend you read the original Google Panda patent, located here: http://www.google.com/patents/US8682892 It can shed some light on what the Panda penalty is and how to recover your website.

    I will make a complete audit of a website penalized by Panda and publish it as an example. In my experience, recovery is possible, but you should pay attention to all the signals mentioned in this article and in Google’s Panda patent.

  13. Thomas

    Aug 21, 2015 at 9:52 am

    What do you know about the latest Panda refresh (4.2)? I’ve seen Google engineers claim that the update will take a few months(!?). What kind of update is this?

    I believe this is just another Google joke. It’s very clear that they want to eliminate any kind of competition: the organic SERP is a real problem for their AdWords program. So they will “improve” the algorithm to remove the “low quality sites” from the first pages of results.

    If the update takes a few months, how can we know whether a site is affected by this penalty or by another algorithm change? It’s a kind of smoke screen whose only purpose is to kill the SEO industry.

    So far I haven’t been affected by the latest Panda update, but I believe the penalty can appear at any moment. I also believe the patents published by Google are another way to spread misinformation about their algorithm.

  14. Catalin Nichita

    Aug 22, 2015 at 7:02 pm

    Google claims the latest Panda is just a refresh of the old algorithm. I have some doubts about this statement, because if it were similar to the old Panda it wouldn’t need a few months to propagate.

    Indeed, all these statements coming from Google can cause some panic among SEOs, and they may be another Google strategy.

  15. Emily

    Aug 27, 2015 at 8:03 am

    I have two similar websites (same niche) and one of them was affected by the latest Google Panda update. I don’t understand the reason for the penalty.

    Both sites have almost the same structure and they are not interlinked. They aren’t linked to the same Webmaster Tools account either, so I really don’t understand why just one of them was penalized by this new update.

    I believe Google should fire its “engineers”, because they are responsible for this disaster.

    Even Wikipedia was affected by these updates. This Panda update seems to have negatively impacted their site too:

    http://www.searchenginejournal.com/wikipedias-traffic-from-google-down-11-why-the-drop/138791/

    I have no solution for my affected site. Its traffic decreased by about 40% once this Panda penalty hit. Can you help me with some advice if I send you the website?

  16. Catalin Nichita

    Aug 27, 2015 at 8:22 am

    I’ve read the Search Engine Journal article, but the Panda penalty is not mentioned in it.

    I believe they have a problem with the migration from http:// to https://, because it was implemented a few weeks ago. I don’t know if all the redirects were done correctly.

    I’ve checked with redirectdetective.com and everything looks correctly done. However, their sub-domain structure makes mass verification impossible, so I cannot be sure everything is fine.

    Even with the redirects correctly in place, the link juice is not transferred instantly to the target, so this could be another of Wikipedia’s problems. I believe that in a few weeks, after the link juice has been transferred, the “penalty” will be revoked.

    Anyway, you can send me your website via the “Contact” page. I’ll take a look to see why it was affected by the Google Panda penalty and which measures can be taken to recover.
