The Fred Update is a bit of an enigma. It was released in March 2017 without any official announcements from Google, and as a result we don’t have any concrete information about what it actually does. Fortunately, SEOs are like amateur detectives when it comes to unraveling new algorithm updates, so Fred came as a welcome challenge to the community.

The community's consensus is that the Fred Update takes aim at low-quality content and, much like the Panda Update in 2011, punishes sites which don’t offer users any value. This serves as yet another reminder that Google cares about the user experience above all else, and by penalising weak content they ensure that SERPs prioritise only the most helpful and engaging content for any given query.

What (little) we know

Fred first showed itself when a number of webmasters reported sudden drops in traffic and rankings, with some even claiming that their sites had been completely deindexed. Since these fluctuations were large and widespread compared with Google’s everyday ebb and flow, it quickly became apparent that a major algorithm update was at work. Those in the industry, after their usual bouts of panic had subsided, got to work recording and reporting their findings so that they could change tack if need be.

As the update’s impact became clearer, the sites affected appeared to share a common trait: they were mainly content-driven with a focus on generating revenue. According to Barry Schwartz, sites with this kind of profile saw declines in organic traffic of up to 90 percent – a hard hit indeed! Of course, there’s nothing wrong with trying to make money online through advertising and content marketing. But problems start to brew when a site uses poor-quality content as bait for revenue: such an enterprise sees the user as nothing but a cash cow for cheap clicks. It’s not bothered about what it can do for the user, only what the user can do for it.

You can easily tell when a piece of content was written purely with revenue and rankings in mind. The copy is barely readable, saturated with repetitive keywords and poorly structured. The page is riddled with ads and affiliate links to the point where it’s difficult to navigate. The content has no depth, length, quality or originality, and it reads more like a rushed piece of school homework than the reasoned authority you’d expect in the upper ranks of Google. Most importantly, you come away with a bad taste in your mouth: you didn’t get what you came for, your experience was bereft of any value and you feel completely unfulfilled as a result. Sounds pretty similar to the crappy splogs of old, right?

So who are the winners and losers?

Chances are you don’t operate a large-scale black-hat operation or a spammy affiliate site, in which case there is no reason to worry about the Fred Update. Still, it’s important to understand which sites were targeted and affected by the update so that you can take cues from their mistakes.

  • Spammy websites. With content that is either rubbish for the user (thin, outdated or spun content) or just straight-up manipulative (keyword stuffing and cloaking), spammy websites are always being attacked and penalised by Google’s algorithm.
  • Affiliate websites. In some cases, the sole aim of affiliate sites is to gain revenue by driving clicks to other websites, so the content itself just acts as a vessel for tonnes of affiliate links and advertisements. There is such a thing, however, as an affiliate site that provides high-quality content. As Google’s Gary Illyes once tweeted: “there’s no inherent problem with affiliate links, the problem is when a site’s sole purpose is to be a shallow container for affiliate links.”
  • Websites with poor user experience. Many sites affected by Fred had so many ads that the content was actually quite hard to find on the page. They also featured poorly-written content with low readability, random keywords scattered throughout both the copy and the source code, and user interfaces that made the site difficult to navigate.

The Fred Update joins a long list of updates which target black-hat SEO tactics, the likes of which continue to plague the realm of web content. If it weren’t for Google keeping these kinds of sites in check, search results would be a lot more unreliable on the whole. It’s to the benefit of both Google (as a business) and their huge user base to keep SERPs full of relevant, useful, engaging, high-quality content. Sites with the sole purpose of getting you to click on links only serve to disrupt this long-term objective.

If you take anything away from this little summary, just remember that the main purpose of your on-site content should always be to provide relevance and value to readers – revenue, clicks, traffic, rankings, engagement and the like should all be secondary considerations. Rewire your brain accordingly!