Search Engine Optimisation can be a tricky business at the best of times. With over 200 ranking factors to take into account, it is all too easy to get distracted and miss a few crucial first steps. The result? You can spend weeks, if not months, working on your site only to gain no more visibility on the search engines than you had previously.
Fear not though: not only will much of your effort be beneficial in the long run (except in the case of the last point in this article), but we at Yellowball have put together a list of easy-to-make fatal errors so that you can tick them off! We’re nice like that….
***When talking about search engines I will inevitably start referring to them as Google. No offence, Yahoo! and Bing.
Also, if any of the terminology is alien to you, just visit our SEO Glossary for a simple explanation.
content="noindex"
The ‘noindex’ tag in a webpage’s HTML tells search engine bots not to index that page (a site’s robots.txt file, by contrast, blocks bots from crawling pages in the first place). In plain English? Having ‘noindex’ on any page of your website means that it is practically invisible to Google and other search engines. As a result, search engines will not display that page (or website) in their results because technically it doesn’t exist for them. Bummer.
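For reference, here is what the two usually look like. The noindex directive sits in the page’s `<head>`, while robots.txt sits at the site root and blocks crawling rather than indexing (the paths shown are illustrative):

```html
<!-- In the page's <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```

```
# robots.txt at the site root — blocks crawling of the whole site
# (not the same mechanism as noindex, but a common way dev sites are hidden)
User-agent: *
Disallow: /
```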
You can do as much SEO work on your site as you want, but if it can’t be seen by Google then it won’t turn up in the results pages. Pretty simple logic. Many development agencies noindex sites during the development stage, either to stop people seeing the website whilst it is still being built, or to avoid duplicate content when the new site is developed on a separate domain/URL to the pre-existing site.
The issue is that many will forget to remove the ‘noindex’ tag when they send the new site live, leaving your website visible to human visitors but invisible to search engines. It does not take a genius to figure out that this is a big no-no when it comes to SEO. The good news is that the larger Content Management Systems (CMS) such as WordPress are pretty good at telling you when a webpage is invisible, and the tag is easily removed.
All you have to do is remove the ‘noindex’ tag from your pages, or if you are using the SEO plugin by Yoast (we recommend it) you can set the status of your page to ‘index’ in the advanced section of the plugin. Finally, make sure that in the CMS the developer has not selected noindex. If the website is WordPress, this is in Settings > Reading > Search Engine Visibility.
Have a look at the images on the left for WordPress index options.
No Backlinks or Sitemap Submitted
Search engines document the world wide web by reading and indexing websites, travelling between them via hyperlinks. As such, you can view links between websites as metaphorical bridges which allow these search engine bots to explore new territories of the internet….like 21st century intrepid explorers of our digital age.
Links are a crucial part of how search engines discover and index such an enormous scope of the web. Without this method of getting to your website, Google and other search engines have absolutely no idea that your site exists. They may eventually find it through other clever technical features, but why take the risk?
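Those metaphorical bridges are nothing more exotic than ordinary anchor tags on other people’s pages. The URL below is purely illustrative:

```html
<!-- A plain hyperlink — this is the "bridge" a search engine bot follows -->
<a href="https://www.example.com/">Check out this new website</a>
```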
Building or ‘earning’ backlinks
Building or ‘earning’ backlinks is a large subject in itself (and a rather infamous part of SEO), and something that we will be discussing in greater depth in later articles, so have a read around before you dive headlong into a backlink building campaign. Suffice it to say that they are very important, both in providing a method by which search engines can find your site and in helping them decide which search phrases to associate your site with.
One of the easiest ways to make search engines aware of your existence is to submit a sitemap. For Google this is done through a platform called Webmaster Tools. Submitting an XML sitemap to Google gives them a blueprint of your site and the URLs at which they can find each page. Genius. Once you have created a Webmaster Tools account for your website, simply go to Crawl > Sitemaps > Add/Test Sitemap.
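If you have never seen one, an XML sitemap is a very simple file. A minimal example, with an illustrative domain and date, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Most CMSs and SEO plugins (Yoast included) will generate and update this file for you automatically.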
Incompatible Site
There is no denying that Google’s algorithm is incredibly complex and capable of analysing unfathomable amounts of data….but it is not perfect. Some aspects of websites are as confusing to Google as Larry Page’s (co-founder of Google) original algorithm code would be to a toddler. Humans take for granted their ability to look at a picture and understand what it represents. “A picture is worth a thousand words”….not for Google (or other search engines).
Alternative Text
Google understands text, but the pixels of an image mean nothing to it. The only things Google can read from an image are the title, caption, alternative text and any other text that is added as an attribute or tag to the picture.
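In practice, that text lives in the image tag itself. The filename and alt text below are illustrative, but they show the kind of description search engines can actually read:

```html
<!-- The alt attribute is the main text Google reads for an image -->
<img src="red-running-shoes.jpg"
     alt="Pair of red running shoes on a wooden floor">
```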
Therefore, if your website is very image-led with minimal text, Google will find it very hard to determine what your site is about…..which often results in poor rankings. It might look awesome to humans, but it means next to nothing to Google.
In addition to the issue of image-led sites, Flash sites are even worse. Flash is great for creating dynamic websites, but you are scuppering any chance of appearing in search engine results because search engines cannot make sense of Flash content. Anyway, HTML5 can do much of the cool stuff that Flash does!
*****I realise that Google has made efforts to crawl Flash and is attempting to read images, but as a general rule image-led and Flash sites can be pretty catastrophic for those looking to rank highly on search engines.
If you can think of any other fatal SEO flaws please include them in the comments below.
Thanks for reading, more to come later! If you would like to know more about our services, just visit our SEO Experts page.