Your website nabbing the top spot in a Google search is unquestionably the holy grail of online business presence: the number-one position in organic search results promises a higher click-through rate and better-quality leads for your organisation.
Unfortunately, everybody who knows what they are doing is running the same race, and with more competitors than ever, the chances of grabbing the top spot are growing slimmer. The good news is that, although anyone can be in with a chance, very few racers actually have even the basics in order. Once you have a technically viable website up and running, here are the top mistakes to avoid at all costs in order to secure the success of your SEO campaign.
No sitemap
A sitemap is an XML file that lists a website’s most important pages for search engines, along with metadata such as the date each page was last updated and its priority relative to other pages on the site. This data enables a search engine’s spider to crawl the site more economically and efficiently. While creating a sitemap is no guaranteed recipe for search engine success, it is a quick and easy win.
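As a rough sketch (the domain, dates and priority values here are all placeholders), a minimal sitemap.xml might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per important page -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>   <!-- date of most recent update -->
    <priority>1.0</priority>        <!-- relative importance, 0.0 to 1.0 -->
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2016-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

The file typically sits at the site root and can be submitted via Google Search Console or referenced from robots.txt.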
Failed canonical domain check
If a domain fails a canonical check, it is likely that the homepage can be accessed through multiple URLs. For example, webdesignermag.co.uk/index.php, webdesignermag.co.uk/home.php and www.webdesignermag.co.uk might all load the homepage. Multiple URLs serving identical content is a problem, as inbound link equity gets distributed between them, diluting the site’s overall SEO value. To avoid this, put proper canonical tags in place or set up 301 redirects that reroute the duplicate URLs to one main location.
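The canonical tag itself is a single line in the page’s head; as a sketch (the URL is illustrative), every duplicate version of the homepage would carry the same tag pointing at the one preferred address:

```html
<head>
  <!-- Tells search engines which URL should receive the link equity,
       regardless of which duplicate URL the page was loaded from -->
  <link rel="canonical" href="https://www.webdesignermag.co.uk/">
</head>
```

The 301-redirect route achieves a similar result at server level, sending visitors and spiders alike straight to the main URL.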
Slow load times
Although the UK’s broadband speeds are improving, Google declared back in 2010 that loading times are a factor in its ranking algorithm, and slow-loading pages remain an ongoing issue on both desktop and mobile. With Google’s PageSpeed Insights tool, any page can be examined and the reasons for slow loading identified. Popular fixes include eliminating render-blocking JavaScript and CSS above the fold, leveraging browser caching, optimising images, enabling compression, and minifying JavaScript, CSS and HTML (that is, removing unnecessary characters such as spaces).
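One of the quickest of those fixes is stopping scripts from blocking rendering. As a sketch (the filename is a placeholder), the difference is a single attribute:

```html
<!-- Render-blocking: the browser halts HTML parsing until
     this script has downloaded and executed -->
<script src="/js/site.min.js"></script>

<!-- Deferred: downloaded in parallel and executed only after
     the HTML has been parsed, so first paint happens sooner -->
<script src="/js/site.min.js" defer></script>
```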
No header tags
Header tags are the skeleton of your content, and guide search engines to the important parts of a website. Tags enable search engines to prioritise the content of a webpage, and poor use of them can confuse the system. The fix is to make certain that the main H1 tag is unique and accurately reflects the topic of the page, including relevant keywords. Any subheadings should likewise be tagged (H2, H3 and so on) where appropriate.
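For example (an illustrative product category page), headings should step down in order, with one unique H1 at the top:

```html
<h1>Men's Running Shoes</h1>         <!-- one unique H1 stating the page topic -->
<h2>Road running shoes</h2>          <!-- subheadings in descending order -->
<h2>Trail running shoes</h2>
<h3>Waterproof trail shoes</h3>      <!-- nested under the H2 above it -->
```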
Missing image descriptions
Search engines cannot decipher images, so it is important to attach pertinent descriptive text to them in the form of an alt attribute. This enables the search engine to understand the image, and offers a great chance to make the page more keyword-rich. Being too vague with image descriptions is a classic mistake. Remember, ‘a pair of blue trainers’ can be elaborated into ‘blue and white limited edition Nike running shoes’.
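In markup terms, that elaboration looks like this (the filename is a placeholder):

```html
<!-- Too vague: -->
<img src="shoe-01.jpg" alt="trainers">

<!-- Descriptive and keyword-rich: -->
<img src="shoe-01.jpg" alt="Blue and white limited edition Nike running shoes">
```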
Poor metadata
The title tag is considered one of the top priorities of onsite SEO, not least because it is the first thing users see on search engine results pages. It should be unique and to the point, a maximum of around 70 characters, and keyword-optimised for the page’s content. Meta descriptions are another key element: effectively ad copy, there to tempt the user to click through to the page. Both should be easy to edit in any reputable CMS.
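Put together, a sketch of both tags for an illustrative product page (the store name and copy are invented for the example):

```html
<head>
  <!-- Unique, keyword-optimised, under roughly 70 characters -->
  <title>Blue and White Nike Running Shoes | Example Store</title>

  <!-- Ad copy that earns the click on the results page -->
  <meta name="description"
        content="Limited edition blue and white Nike running shoes.
                 Free UK delivery and returns at Example Store.">
</head>
```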
Keyword spamming
For any copywriter, striking the right balance of targeted keywords in website content is a challenge at the best of times. Too many keywords risk the page looking spammy, but too few may leave search engines unsure what to rank you for. A good approach is to use a wide range of targeted variations of the page’s keywords, but never to the extent of sacrificing good, readable content. Engage your readers first and the spiders will soon find you.
Duplicate content
One of the most prevalent mistakes in online content is duplicated pages. This often happens on larger sites or eCommerce sites with many product listings, and can have a detrimental effect if not handled properly. Duplication usually occurs when filters are applied to product listings or when several variations of a single product are listed. Once again, the solution is a canonical tag, which points the duplicates back to the relevant main page, effectively rolling all of your SEO value into one page.
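On an eCommerce site, a sketch of this might look as follows (the domain and query string are illustrative):

```html
<!-- Placed in the <head> of a filtered or variant URL,
     e.g. example.com/trainers?colour=blue&size=9 -->
<link rel="canonical" href="https://www.example.com/trainers/">
```

Every filtered variation carries the same tag, so all the SEO value is consolidated on the main listing page.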
The way you name your links
Anchor text is the clickable text of a link on your page. While crawling, search engines use a site’s anchor text to gauge the content and relevance of the linked pages. Generic anchor text like ‘click here’, especially when linking to internal pages, is a missed opportunity for refined SEO. Again, use the relevant keywords you want your page to be ranked for where you can, but remember that overdoing the keywords can easily tip the scales too far in the other direction, so keep an eye on your keyword balance.
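As a quick sketch (the link target is a placeholder):

```html
<!-- A missed opportunity: -->
<a href="/running-shoes/">Click here</a>

<!-- Descriptive anchor text that tells spiders what the linked page is about: -->
<a href="/running-shoes/">limited edition Nike running shoes</a>
```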
About the author Matt Eldridge:
Matt is the owner and head web design and marketing master of Melt Design. He has previously worked for Entrepreneurs Circle running Botti Creative and now aims to grow Melt Design into one of the most well-respected design, web and marketing agencies in the UK. Find out more: https://www.meltdesign.co.uk/