The second Phantom update, arriving two years after the first, was later confirmed by Google as a quality update. It was released in May 2015, and many believed it was connected to the highly publicised mobile-friendly update. However, analysis from the likes of Glenn Gabe at G-Squared Interactive quickly discounted this, as many of the sites hit by the update had no mobile compatibility issues.

The update was originally dubbed Phantom 2 due to its unannounced roll-out and its similarity to the original Phantom update in 2013. However, as more data was gathered and Google confirmed some of the related suspicions, the update also became known as the Quality Update and Reverse Panda.

What did the Quality Update entail?

The initial analysis pointed towards a Panda-like refresh, with onsite factors appearing to be the triggers, although Google was quick to establish that this was not connected to Panda and that a refresh to that particular algorithm was coming soon anyway. Google did eventually confirm with Search Engine Land that there had been a change to their core algorithm in how quality signals are factored into rankings. The SEO industry jumped into analysis of the overriding factors that contributed to some sites losing nearly a quarter of their traffic:

Ads and thin content creating poor user experience

There has been a considerable focus on websites with certain types of reasonably thin content; for example, sites such as eHow.com and Answers.com were heavily hit by this update. However, there is a convincing argument that the primary issue was not necessarily the content itself (although particularly low-quality content would naturally run into problems) but rather the associated structure and user experience. Many of these sites had automated ‘related article’ style features that redirected users to pages of very low quality, or displayed excessive ads to the user. For example, full-page ads that appeared as soon as the user visited the page were deemed a signal for this update. I know I find them irritating, and they certainly detract from my user experience!

Poor options for user flow

Many of the sites that were hit also featured on-page sections of related article links, which directed users to pages of even lower quality and little value. Again, this is heavily connected to the user experience and the value of the page to each individual user. No doubt user and usage data were significant contributing factors in raising red flags with this update.

A malicious move by the Knowledge Graph?

There has also been speculation of a potentially malicious move by Google in connection with their Knowledge Graph. Rolled out in 2012, the Knowledge Graph can scrape content from sites to provide instant answers to search queries. Reducing the rankings of sites whose information is displayed in Knowledge Graph results would free up more real estate for Google on the first page of SERPs to return a variety of information for these query types. Many will shout conspiracy theory, but we remain unconvinced; the fact that many of these sites carry articles offering very little unique information is a more likely primary cause of this update.

Considerations

It might be getting tiring to hear, but at least it still rings true. Central to avoiding any penalties associated with Google updates is the webmaster's ability to ensure that all content on the website provides genuine value to the user and contributes to a great user experience. Unlike Panda or Penguin, which require refreshes before any recovery from lost rankings can be seen, this is a core update that runs in real time.

Whilst Glenn Gabe at G-Squared Interactive claimed that a whole domain can suffer because of individual pages with thin content and poor user experience, many other experts believe this to be a page-level update affecting only individual pages. Furthermore, it appears that whilst some websites have seen a considerable loss in rankings, this update does not actively penalise websites for low-quality content as Panda does, but instead rewards websites with awesome content, hence why it is also known as Reverse Panda. Therefore, as a by-product of some websites gaining better rankings, others have to lose out. It is all relative though. Whether an update penalises or rewards, the result is still the same: if you have awesome content and great UX, you have a better chance of ranking, and vice versa. Matt Cutts and Google as a whole have been harping on about great content for years, so nothing has really changed!