Whenever I think of bots crawling a website, I envision a cute, smart Wall-E-like robot speeding off on a secret mission to find website information, sort it like a librarian and deliver it to the search engines. But as an SEO I know that they are ‘only’ computer programmes that crawl the web and add pages to an index – which is a little disheartening!
But what do these robots actually do?
So these little bots – sorry, programmes – use your sitemaps and internal links to explore your site and return what they find to the search engines’ indexes. The Googlebot, for example, will crawl your site and return what it finds to Google’s index, which then feeds into the ranking results.
These bots determine for themselves how often to crawl your pages. To make sure they can correctly index your site, you need to continually check its crawlability: if your site is friendly to the bots, they will keep coming back, but if they struggle, get confused or have their time wasted, they will visit less frequently.
This can then have a knock-on effect on your organic performance.
Now there are several different types of bots. From Google alone there are AdSense and AdsBot, which check ad quality, while Mobile Apps Android checks Android apps. You need to keep all of these bots happy, especially Google’s mobile bot, which now crawls your site first following Google’s rollout of mobile-first indexing.
So how do you know you are keeping these guys happy? Undertake log file analysis.
What is Log File Analysis?
Every request made to your web server is recorded in a log file (access.log). This file is important yet often overlooked, because it shows exactly which resources the bots are requesting on your site.
Log files are a key source of information: they give access to a wealth of detail on how search engine bots crawl your site and what issues they encounter.
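For illustration, here is what entries in the common Apache/Nginx “combined” log format look like, and how a few lines of Python can pull out just the bot requests. The sample data, file contents and helper name are hypothetical – check your own server configuration for the actual log format and location:

```python
import re

# Hypothetical entries in the "combined" access.log format:
# IP, timestamp, request line, status, size, referrer, user agent.
LOG_LINES = """\
66.249.66.1 - - [10/May/2024:06:25:24 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/May/2024:06:25:30 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows NT 10.0)"
66.249.66.1 - - [10/May/2024:06:26:02 +0000] "GET /old-page HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
""".splitlines()

# Rough pattern for the combined format; real logs may vary slightly.
PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def bot_requests(lines, bot_name="Googlebot"):
    """Yield (path, status) for every request whose user agent mentions bot_name."""
    for line in lines:
        m = PATTERN.match(line)
        if m and bot_name in m.group("agent"):
            yield m.group("path"), m.group("status")

print(list(bot_requests(LOG_LINES)))
# [('/products/widget', '200'), ('/old-page', '404')]
```

Even this small sample shows the idea: the second request came from a regular browser and is ignored, while the bot’s visit to /old-page surfaces a 404 it wasted time on.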
For websites with a very large number of pages (such as ecommerce sites with tens of thousands of URLs), analysis of log file activity is a useful indicator of whether the bots are reviewing the right pages or getting lost along the way.
This process allows SEOs to see which pages can and cannot be crawled, and to identify whether crawl budget is being utilised in the most efficient way.
You can review and analyse the responses encountered by the search engine bots during their crawl, e.g. 302s, 404s and soft 404s.
You can identify crawl shortcomings that might have wider site-wide implications (such as hierarchy or internal link structure).
You can see which pages the search engines prioritise, and might consider the most important, which can then feed into an ongoing strategy.
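As a sketch of that analysis, assuming you have already parsed the bot requests out of access.log into (path, status) pairs (the sample data here is hypothetical), a couple of Counters surface both the responses the bot is hitting and the pages it requests most often:

```python
from collections import Counter

# Hypothetical (path, status) pairs already extracted from access.log
# for a single bot such as Googlebot.
CRAWL_EVENTS = [
    ("/", "200"), ("/products/widget", "200"), ("/products/widget", "200"),
    ("/old-page", "404"), ("/promo", "302"), ("/products/widget", "200"),
]

# Which responses the bot encountered: healthy 200s vs wasted 404s and 302s.
status_counts = Counter(status for _, status in CRAWL_EVENTS)

# Which pages the bot requests most often – a proxy for what it treats as important.
top_pages = Counter(path for path, _ in CRAWL_EVENTS).most_common(3)

print(status_counts)  # Counter({'200': 4, '404': 1, '302': 1})
print(top_pages[0])   # ('/products/widget', 3)
```

If a URL you consider unimportant tops that list, or 404s make up a large share of the responses, that is crawl budget being spent in the wrong place.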
So it is a pretty big deal: these bots matter to your site, and if you have made the right technical choices, they will visit often.
And of course, if you regularly add fresh content they’ll come around more often – even more so in light of the algorithm updates over the past 12 months.
In short, to ensure your organic strategy is fully optimised, keep the bots happy – but also keep an eye on them!