
How to avoid 4 common SEO issues and get more visibility for your business in search

January 4, 2022

Search engine optimisation (SEO) is a long-term strategy. I say this because there are so many factors that affect a business’s search rankings. Consequently, even the most digitally savvy businesses have to experiment, tweak and revisit their SEO strategies over a period of months or even years.

Although this sounds daunting, there is no reason to be scared of SEO. Many common on-page SEO issues, for example, may prevent a business from achieving the search rankings it wants, and many of these can be fixed in a matter of minutes (although the results may take several weeks to show).

Discover the common SEO issues and how you can fix them

#1. Duplicate content
Duplicate content, put simply, is the same content being displayed across multiple URLs. This creates a problem because Google cannot tell which page is the original and therefore which URL to direct users to. As a result, if you have duplicate content, Google may choose not to display it in the search results.

Businesses often create duplicate content completely by accident. For example, a page may be accessible at both http and https, a printer-friendly version of an existing page may sit at its own URL, and inconsistent capitalisation, e.g. www.example.com/Product-page versus www.example.com/product-page, may also create confusion.

These problems can be fixed in a couple of different ways. One way is to include a canonical tag in your website’s code. Once you decide which URL you want Google to direct users to, add the following tag to the <head> of every duplicate page to let Google know which page is the canon, i.e. the main page for your duplicate content:

<link rel="canonical" href="https://www.example.com/page-a">

It is also considered good practice to add the same tag, as a self-reference, to the main page itself.

If you are not familiar with coding, there are a couple of other ways to achieve the same result. You can either use 301 redirects to divert traffic to your preferred URL or you can set up your preferred domain in Google Search Console.
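If your site runs on an Apache server, for example, an http-to-https 301 redirect can be set up in your .htaccess file. The lines below are a minimal sketch assuming Apache with mod_rewrite enabled; other servers, such as Nginx, use different syntax:

# Permanently redirect all http requests to the https equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]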


#2. Pages cannot be crawled
If you have recently redeveloped or redesigned your website, you may have forgotten to allow search engines to crawl your new site. During the redevelopment process, it is good practice to prevent robots from indexing your new site and displaying its content while the old site is still live. However, once the redevelopment has been completed, many webmasters forget to update their robots.txt file to allow the new pages to be indexed by search engines. To fix this problem, first open your site’s robots.txt file and check what permissions you have enabled. If you’ve blocked access to your site, your robots.txt will look like this:

User-agent: *
Disallow: /

To fix this and allow crawlers to access your entire site, simply replace those two lines with the following (an empty Disallow rule permits everything):

User-agent: *
Disallow:
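If you want to verify the fix, you can test a URL against your robots.txt programmatically. Here is a minimal Python sketch using the standard library’s urllib.robotparser; www.example.com is a stand-in for your own domain:

from urllib.robotparser import RobotFileParser

# Download and parse the live robots.txt
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# True means a crawler matching "*" is allowed to fetch the homepage
print(rp.can_fetch("*", "https://www.example.com/"))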

#3. Lack of mobile-friendliness
Mobile-friendliness is a combination of three important factors: mobile viewport configuration, touchscreen readiness and mobile speed. The lack of any one of these could affect your search rankings considerably, causing you to miss out on a significant source of traffic.

One of the first things you can do to improve your mobile-friendliness is to set your website’s mobile viewport (this is the area of the page that is visible to the user). The viewport varies between devices with different screen sizes. Without a viewport tag, mobile visitors to your website are likely to encounter unreadable text sizes and tiny images. To set the mobile viewport, enter the following tag into the <head> of your page:

<meta name="viewport" content="width=device-width, initial-scale=1.0">

To address the other aspects of mobile-friendliness, please refer to this article on speed tips and this article with other suggestions for building an awesome mobile site. It may also be worth consulting a professional web developer for expert assistance.
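For context, here is a minimal, purely illustrative HTML page showing where the viewport tag sits:

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Example page</title>
</head>
<body>
  <p>Page content goes here.</p>
</body>
</html>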

#4. Keyword stuffing
Google’s search algorithms have changed a lot over the years, and they now heavily penalise sites with excessively high keyword density. Google is working hard to present searchers with quality content, not keyword-stuffed, spammy-sounding articles. A keyword density of 1-3 per cent is generally considered the ideal range for ranking. Any higher and you risk being labelled a potential spammer and penalised by Google. A free tool that you can use to check that your keyword density is within the optimum range can be found here. If you are over the limit, don’t despair: a quick edit or rewrite of some of your content can fix it.
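To make the arithmetic concrete, keyword density is simply the number of times a keyword appears divided by the total word count, expressed as a percentage. Here is a minimal Python sketch; the sample text and keyword are made up for illustration:

def keyword_density(text, keyword):
    words = text.lower().split()
    # Count exact, single-word occurrences of the keyword
    occurrences = words.count(keyword.lower())
    return 100 * occurrences / len(words)

text = "Our bakery sells fresh bread. Visit our bakery for bread and pastries."
print(round(keyword_density(text, "bakery"), 1))  # 2 of 12 words = 16.7 per cent

A real checker would also strip punctuation and handle multi-word phrases, but even this rough figure shows the sample is well above the 1-3 per cent range.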

This post first appeared on KBB on August 20 2020. It was updated January 4 2022.

