5 ways to force Google’s bots to come to your website

As part of your digital marketing strategy, you’ll already be posting awesome content on a regular basis. But you may have noticed that there’s a time lag between when you post and when those updates appear in search results. That’s because, before your new content can appear, it has to be re-crawled by a search bot. And unfortunately, it’s not as simple as clicking a button to get an on-demand, real-time recrawl! So how do we go about requesting one?

1. Awesome Content

Google’s indexing algorithms are extremely complex and use artificial intelligence to determine what content is relevant and what isn’t. Before requesting a recrawl, we need to ensure our content is informative, relevant, engaging, and fresh. Part of remaining relevant is posting regularly. Only Google’s engineers truly know how their algorithms work, but it is clear that if you produce high-quality content on a regular basis, Google’s search bots will recrawl your site more frequently.

2. Search Engine Optimisation (SEO)

Long-time followers of my content will know that following SEO best practices is really important: a well-optimised site gets crawled more regularly. It’s worth repeating, though, because it’s a common pitfall for those newer to e-commerce or the digital space. You have to make sure your content is as friendly as possible for search bots.

For 2018, these are the priorities you should pay particular attention to:

  • use a mobile-first strategy—Google is transitioning to a mobile index so when you develop content you should start by making sure it displays well and is easy to navigate on smartphones and then scale up from there.
  • responsive design—your website should adapt to whatever screen size and device your visitor is using. You can use a free library like Bootstrap to help achieve this.
  • fast page loading times—with the switch to a mobile index, Google have made it very clear that how fast your page loads matters a lot! Optimise images, minimise HTTP requests to external content like web fonts, and use a tool like Google’s PageSpeed Insights to test that it loads in under two seconds.
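Two of these priorities can be sanity-checked before a bot ever visits. The sketch below is a minimal example using only Python’s standard library: it scans a page’s HTML for the mobile viewport meta tag and counts external resources (scripts, stylesheets, images), each of which costs an extra HTTP request. The sample markup is a placeholder, not a real page.

```python
# Rough pre-flight check for two priorities above: a mobile viewport
# tag (mobile-first) and the number of external resources a page
# requests (page-speed proxy). Minimal sketch; sample HTML is made up.
from html.parser import HTMLParser

class CrawlFriendlinessChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False      # mobile-first: viewport meta tag present?
        self.external_requests = 0     # rough proxy for extra HTTP requests

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True
        # scripts, stylesheets, and images each trigger an extra request
        if tag == "script" and attrs.get("src"):
            self.external_requests += 1
        if tag == "link" and attrs.get("rel") == "stylesheet":
            self.external_requests += 1
        if tag == "img" and attrs.get("src"):
            self.external_requests += 1

sample = """
<html><head>
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" href="styles.css">
<script src="app.js"></script>
</head><body><img src="hero.jpg"></body></html>
"""

checker = CrawlFriendlinessChecker()
checker.feed(sample)
print(checker.has_viewport)       # True
print(checker.external_requests)  # 3
```

A page with a viewport tag and only a handful of external requests is a good start; dozens of requests is a sign you should combine or defer resources.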

3. Eliminate Technical Errors

Technical errors often prevent your site from being crawled. To find out whether this is affecting you and your search rankings, head over to Google Search Console and sign in. Add your website if you haven’t already so that you can make use of Google’s free tools. The tools you want to pay particular attention to are located under Crawl. If you see any errors, it means your site isn’t being crawled properly, so resolve these as quickly as possible. I’ve written other articles on the common issues and how to fix them, but if you have difficulty you can get assistance from IT professionals with digital marketing and SEO skill sets.
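One crawl-blocking error worth checking by hand is an over-broad robots.txt rule. As a rough illustration (the rules below are hypothetical), Python’s standard urllib.robotparser module can show which paths a crawler is allowed to fetch:

```python
# Check hypothetical robots.txt rules to confirm they block only what
# you intend — a Disallow that is too broad will keep Googlebot away
# from content you want indexed.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard (*) rules here
print(parser.can_fetch("Googlebot", "https://example.com/blog/new-post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
```

If a page you want indexed comes back False, fix the robots.txt rule before requesting a recrawl.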

Fetch As Google is another very useful tool to find out how Google sees your site.

4. Request Indexing

The previous points were about making sure your website is as friendly as possible for search bots. Once you are sure that you have covered all your bases above, you can then move on to requesting re-indexing. You can manually request a recrawl (if you’ve just updated some content) by following these steps:

  • sign into Google Search Console
  • under Crawl, select Fetch As Google
  • enter the page you wish to fetch, or leave it blank for the homepage
  • select Desktop or Smartphone, then click Fetch
  • in a few moments you’ll be presented with the results; click the Request Indexing button

5. Submit a Site Map

If you’ve made a lot of changes to your website, including adding new pages, changing the navigation menu or structure, or even deleting pages, the most efficient way to request a recrawl is to re-submit your sitemap. The process is similar to the previous section, except that first you need to generate a valid sitemap. Try the free version of XML Sitemaps or search for one of the countless free tools.

Once you have it:

  • sign into Google Search Console
  • under Crawl, select Sitemaps
  • click the red ADD/TEST SITEMAP button at the top right and submit your sitemap. Google will then schedule a recrawl.
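If you’d rather script the sitemap than use an online generator, it’s just a small XML file. Here is a minimal sketch using Python’s standard library; the URLs are placeholders for your own pages.

```python
# Build a minimal, valid sitemap per the sitemaps.org protocol.
# The URLs below are placeholders — substitute your own pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/blog/latest-post/",
]
print(build_sitemap(pages))
```

Save the output as sitemap.xml at your site’s root before submitting it in Search Console. A production sitemap would normally start with an XML declaration and can also include lastmod dates so Google knows which pages changed.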


Although Google will generally crawl your site of its own accord, you don’t want to leave it all up to them! If you rely on this alone, changes can take several months to filter through. Getting familiar with Google Search Console will also give you a tonne of data about your website and how it performs, which is useful not only for search bots but also for you. Going through the steps outlined above may uncover hidden issues with your website that drive your customers crazy and prevent them from engaging with your company or making purchases. Think of reviewing your crawl status and SEO practices as the digital marketing equivalent of a six-monthly check-up with your doctor.



Luke Chaffey
Luke Chaffey is a senior member of the KBB Digital team and heads up the search marketing division. With a keen eye on innovation and developing digital trends, Luke regularly attends the Google Partners Masterclass and is a prolific writer for websites such as Yahoo, The Australian Government (Digital Business sector), Kochie’s Business Builders, Smarter.Digital, and KBB Digital.

