What’s with all the updates?
Google certainly releases a lot of updates, over 1,000 every year in fact, but what’s the reason for constantly updating the platform? Well, ultimately it comes down to the search engine wanting to provide the best service for their users. It pays (a lot) for Google to continue their dominance in search. Like any other business, Google needs to stay ahead of the competition (in their case the likes of Bing and potentially Baidu) and provide the best service to retain their loyal customer base that has made them the world’s most popular search engine.
With over 40,000 searches performed every second, a lot of people would be affected if anything at Google were to go wrong. As such, the daily micro-updates often target technical issues to ensure that users have the best possible experience while using Google. A Google searcher wants fast, accurate, helpful results – relevant and valuable results are the keys to making searchers happy. Therefore, the majority of updates are aimed at delivering results that are high-quality and valuable to the searcher’s original query.
So, what do they target?
As we’ve mentioned, the larger Google algorithm updates aim to improve the overall usability of Google as a search engine. They do this by targeting two of the primary building blocks of the web: links and content. By monitoring and improving how these are used, Google can improve the overall user experience of the search engine.
If we break down the issues that Google targets, we can see the ways that the search engine has taken action through its updates.
Google doesn’t have an issue with links themselves. Instead, it takes issue with how black hat SEOs use links to manipulate search results. The main link issues affecting people’s experience with Google come from sites using link spam tactics. Websites that built spammy backlink profiles – by joining link networks, engaging in reciprocal linking, or spamming forums, comments and blogs – were able to rise to the top of the SERPs. This is a problem because such sites were, and often still are, low quality or untrustworthy, meaning searchers weren’t getting the quality content they’d expect Google to provide.
The most prominent update to attack major link issues was the Penguin update in 2012. Since its initial release, Penguin has gone through plenty of its own updates, all penalising sites involved in link spam. Originally this was done by lowering a spammy site’s rank in SERPs, but now it’s done by devaluing the spammy links pointing to specific pages.
Ask any search engine guru and they’ll tell you that writing great content is a core pillar for any successful SEO strategy. If Google was pointing searchers to sites filled with badly written, keyword-ridden or simply heavily plagiarised content, the search engine simply wouldn’t be reliable. As such, the tech giant has released multiple major updates targeting sites producing low-quality content.
Although Google kept its cards close to its chest regarding what the update targeted, it seems that the update that set content on the road to becoming king was ‘Florida’ in 2003. After the Florida update, sites that had climbed to the top of SERPs simply by stuffing their pages with keywords were hit hard. Around 72% of sites dropped out of the top 100 results for an array of short tail search terms. Suddenly, producing quality content was the key to running a successful SEO campaign.
The SEO world was hit with another major content revelation in 2012. Along with the aforementioned Penguin update, Google introduced another algorithm update with an animal namesake: Panda. The Panda update scours websites, hunting for any low-quality or low-value content. If your site uses scraped content, engages in keyword stuffing or otherwise provides a bad user experience through its onsite content, the entire site will be penalised. If Panda discovers that your site is creating particularly spammy or irrelevant content, you could see it deindexed entirely. So, beware of the Panda.
Technically, all Google updates are released to better the overall user experience: if Google provides searchers with the best results available, they’ve had a good search experience. A good UX isn’t only down to getting relevant results to a query, it’s also about the speed at which queries are answered, a focus on local searches and the ability for Google to provide you with relevant results (even if you didn’t search for a specific term).
Google has targeted all of these things and more through their user experience updates. Local search is an incredibly important facet of search; nobody wants to search for “barber” only to have dozens of SEO-savvy stylists from LA to Tokyo appearing in SERPs.
Venice really kicked off local search. A search for a generic term such as “barber”, “supermarket”, “restaurant”, etc. would now return results aimed at your location. Venice was released in 2012 and, with the growing popularity of GPS-enabled smartphones, was very effective in improving the user experience of mobile users. Since the introduction of Venice, it has been important for businesses to ensure they have an up-to-date Google My Business profile, as well as plenty of localised onsite content.
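One common way to reinforce those localised onsite signals is schema.org LocalBusiness structured data, which tells search engines exactly what a business is and where it operates. Below is a minimal, illustrative sketch that builds such a snippet – the business details are entirely hypothetical, and real markup would typically include more properties (opening hours, geo coordinates, and so on).

```python
import json

def local_business_jsonld(name, street, locality, postal_code, phone):
    """Build a schema.org LocalBusiness JSON-LD snippet (illustrative values only)."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "postalCode": postal_code,
        },
    }
    # Wrap the JSON in the script tag a page would embed in its <head>
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

snippet = local_business_jsonld(
    "Example Barber", "1 High Street", "London", "EC1A 1AA", "+44 20 0000 0000"
)
print(snippet)
```

Markup like this doesn’t guarantee local rankings, but it removes ambiguity about a business’s location – exactly the signal that Venice-style local results depend on.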
When the Pigeon update came along in 2014, it aimed to perfect what Venice had started. Local search results were even more specific and accurate, targeting local results towards small neighbourhoods rather than entire towns and cities. This made it even easier for searchers to find the best local businesses, often using only a one-word search term.
After Venice and Pigeon, it felt as though local results had almost been perfected; then the Possum update came along. When Possum was released in 2016, it affected around ⅔ of local search results. The Possum update helped to make local results more encompassing, with businesses that traditionally fell just outside a locational border being included in that location’s searches. Once again, Possum continued and improved on the location-specific results brought to prominence by Venice.
Of course, a better user experience does not only come through accurate local searches. Google aims to improve UX on all fronts, and the introduction of the Hummingbird and RankBrain updates did just that.
Hummingbird was introduced in 2013 as a way for Google’s algorithms to better understand the intent behind a search query, even if it wasn’t phrased in a particularly Google-friendly way. With the introduction of Hummingbird, Google could understand vaguer queries such as “Why can’t I get Windows 10 on my laptop?”. While no single keyword in that query spells out the problem, by drawing on a huge amount of data – along with latent semantic indexing, the Knowledge Graph and intent matching – the algorithms understand that the searcher has a problem downloading Windows 10.
Hummingbird, like the location-based updates above, couldn’t have arrived at a better time. The ever-increasing use of smartphones heralded a new era of voice search, far longer and more casual than a typed search. As such, Google needed to adapt to understand the influx of these kinds of long tail search queries.
Building on Hummingbird, Google later introduced its AI, RankBrain, in 2015. RankBrain is a machine learning system designed to help Google interpret queries that have never been searched before. With around 15% of Google searches being entirely new, RankBrain is a highly effective and useful addition to Google’s algorithms. While RankBrain didn’t necessarily have a direct effect on SEO strategies, it once again elevated the importance of writing detailed, relevant content, while also signalling a trend towards long tail keywords.
As you may have noticed, many of the UX updates in Google’s algorithms are particularly useful for smartphone users. Location-based results and the capability to understand long tail voice search queries are two very prominent features closely tied to the smartphone generation. This is no coincidence. At the time of writing, the most recently released major Google update was Mobile First. Rolled out in March 2018, this update prioritised smartphones over desktop computers for the first time. Sites that are off-putting to mobile users – including those with slow loading times or without a mobile-optimised design – will take a backseat to sites that prioritise mobile users.
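A simple signal of mobile optimisation is a responsive viewport meta tag in the page’s head. The sketch below uses Python’s standard-library HTML parser to check for one; the sample HTML snippets are hypothetical, and a real audit would also look at load times, tap targets and layout.

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Detects a responsive viewport meta tag, one basic mobile-readiness signal."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # A responsive page typically declares width=device-width in its viewport
        if tag == "meta" and attrs.get("name") == "viewport":
            if "width=device-width" in attrs.get("content", ""):
                self.has_viewport = True

def is_mobile_ready(html):
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport

# Hypothetical page snippets for illustration
responsive = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
fixed_width = '<head><meta charset="utf-8"></head>'
print(is_mobile_ready(responsive))   # True
print(is_mobile_ready(fixed_width))  # False
```

A check like this is far from a full mobile audit, but it catches one of the most common reasons a page renders poorly on phones.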
Overall, we can see that Google updates have a major effect on both SEO and users. As an SEO, it’s important to stay on top of the news coming from Google so that you can be prepared for whatever the company throws your way. If you manage to prepare early, you’ll be able to ride out updates with ease. Furthermore, if you focus on relevance and value to the user rather than attempting to cheat the system, you place yourself in far better stead to run sustainable and successful campaigns.
Audit your website
At Yellowball we audit our clients’ sites to ensure that they are aligned with Google’s Webmaster Guidelines, allowing for any future SEO efforts to be sustainable.