The Panda Update is one of the most renowned and far-reaching updates to Google’s search algorithm. First rolled out on February 23rd, 2011, it has since gone through multiple iterations and refreshes released by Google. At its core, the Panda Update assesses the onsite factors of a website, most notably the quality of its content, in order to prevent websites with low quality content from being returned as results for Google searchers. As a byproduct, it also rewards websites with high quality content. Originally named the ‘Farmer Update’ by the SEO community due to the type of websites that were being penalised when it was first rolled out, we now know that Panda does indeed focus on the quality of a site’s content. Content farms are prime targets for Google’s voracious Panda, but you do not have to be spinning thousands of articles a day to be caught!
Unlike other updates, if Google’s Panda identifies low quality content on certain pages of your site its penalty can affect the entire site, not just specific pages. These penalties can be genuinely debilitating for businesses so it is essential that you look to avoid low quality content and focus on providing the most engaging and valuable experience for your users via your content.
Why was the Panda update released?
As with all Google updates, Panda was released to further ensure that Google provides the most relevant and highest quality results when someone types a query into the search engine. To do so, Google needs to address the issue of websites attempting to manipulate rankings by creating thin content or utilising spun content. If searchers are returned results that are not of the highest possible quality and do not offer the most value to them as users, this is detrimental to the value they subsequently place on Google’s search engine.
How can I avoid being penalised by the Panda update?
The Panda update focusses on websites that contain poor quality content; however, ‘low quality content’ is a rather broad term, so here are a number of ways in which you can avoid your content being classed as low quality:
Focus on value for the user. If you haven’t read any of our other content on the site, this is a recurring theme across almost all of our advice. Writing content purely for search engines can inevitably bring you close to the dangerous lands of keyword stuffing and reduce the readability of your content. User and usage data is an important quality indicator for Google, so engaging your user is key.

Be honest with yourself. When Panda was first released, Google’s Amit Singhal wrote a list of questions that content creators should be asking themselves as a way of assessing their own content, most of which are concerned with the value and authority that a user places on the content and the site as a whole.
Eradicate duplicate content, both internally within your site and across other sites. This can be exact match or near match duplication. Do not copy and paste large swathes of content from other sites without referencing the source, and do not simply change a few words thinking that this will fool Google. E-commerce sites are common violators of this through their use of manufacturers’ descriptions or data that has been pulled from another website. If you are featuring another business’ products on your site, invest in rewriting the product descriptions as you see fit.
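To see why changing a few words does not make content unique, consider how near-match duplication can be detected. The sketch below uses word shingling and Jaccard similarity, a common text-comparison technique; Google’s actual duplicate-detection methods are not public, so the code, the example sentences, and the function names are purely illustrative:

```python
# Illustrative near-duplicate detection via word shingles and Jaccard
# similarity. NOT Google's actual method - a common technique used here
# only to show that swapping one word leaves most shingles unchanged.

def shingles(text, k=3):
    """Return the set of overlapping k-word windows in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "This premium kettle boils water quickly and safely every time"
near_copy = "This premium kettle boils water rapidly and safely every time"
rewrite = "A fast safe kettle designed around everyday convenience"

# The near copy (one word changed) still shares most shingles with the
# original; the genuine rewrite shares none.
print(jaccard(shingles(original), shingles(near_copy)))
print(jaccard(shingles(original), shingles(rewrite)))
```

Swapping a single word in a ten-word description still leaves the two texts sharing the majority of their shingles, whereas a genuine rewrite scores close to zero.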
Avoid having large amounts of content that appears on every page of your site, as this reduces the amount of real estate available for content that is unique to each page and runs the risk of being considered duplicate content.
Do your research and focus on producing content that cannot be found anywhere else on the web. Using techniques such as term frequency-inverse document frequency (tf-idf) can help you identify subject matter that is rare and may be of high value to your users.
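As a rough illustration of how tf-idf surfaces rare subject matter, the sketch below scores a term highly when it is frequent within one document but rare across a corpus. The tiny corpus, the terms, and the smoothing choice are invented for demonstration and do not reflect any particular keyword research tool:

```python
# Minimal tf-idf sketch. The corpus and terms are made up for
# illustration; real content research tools use far larger corpora.
import math

def tf_idf(term, doc, corpus):
    """Score a term: high when frequent in `doc` but rare in `corpus`."""
    words = doc.lower().split()
    tf = words.count(term) / len(words)
    docs_with_term = sum(1 for d in corpus if term in d.lower().split())
    # +1 in the denominator is a common smoothing choice to avoid
    # division by zero for unseen terms.
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

corpus = [
    "google panda update targets thin content",
    "panda update penalises duplicate content",
    "shingling detects near duplicate passages",
]
doc = corpus[2]

# "shingling" appears in only one document, so it scores above terms
# like "duplicate" that appear across much of the corpus.
print(tf_idf("shingling", doc, corpus))
print(tf_idf("duplicate", doc, corpus))
```

The intuition for content planning: terms that score highly against a broad corpus point at topics your competitors have barely covered.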
Using automated tools such as Google Translate can also land you in hot water. Automated translation tools have improved dramatically since their inception, but they are still often inaccurate, and therefore any content that has been translated by a computer is likely to be of a lower quality than content that has been professionally translated.
Be vigilant with the work conducted by a third party SEO agency. We have seen examples of websites that appear to have well researched and formulated content, only to copy and paste a paragraph into Google and find that it has been scraped from other sites!
What can we expect future Panda updates to do?
One thing is for sure: Google will continue to wage its war against websites that use thin or spun content as a method of gaining more visibility in search results. At the time of writing, Panda 4.2 is being rolled out over a number of months, and there is much clamour about Google’s attempts to integrate Panda into its core algorithm so that it will eventually update itself continuously instead of having to be rolled out manually. Historically, sites have been able to see reasonably clearly whether they have been affected by updates, but the fact that this update (or refresh) is taking so long to roll out means that identification is substantially more difficult.