
Streamlining Your Website for Bots: A Guide to Optimising Average Response Time

by Sander van Surksum

As a website owner, you're probably aware that your website receives daily visits not only from human users but also from a multitude of bots. These automated scripts, whether search engines crawling your site for indexing or analytics tools collecting performance data, provide valuable services, but they are a double-edged sword: they can also impose an additional load on your website and slow it down.

Why Streamline for Bots? #

Most bots are uninterested in your real-time tracking tools or browser-specific performance optimisations meant for human users; they only need to fetch and parse your content.
By identifying bot traffic and delivering a more streamlined version of your site, you can optimise performance and make more efficient use of your crawl budget. Part of this is monitoring the average response time to identify any issues that might slow down the website.
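As a starting point, bot traffic can be identified server-side from the user-agent header. Below is a minimal sketch assuming a Node/Express stack; the user-agent pattern and the render helpers are illustrative assumptions, not production-ready detection (robust setups also verify crawler IP ranges, since user agents are easily spoofed).

```typescript
import express from "express";

const app = express();

// Illustrative pattern only: covers common crawler user agents. Robust
// detection should also verify reverse DNS / published IP ranges.
const BOT_PATTERN = /bot|crawler|spider|slurp/i;

// Hypothetical render helpers standing in for your templating layer.
// Both return the same content; the lean variant drops client-side extras.
const renderFullPage = () =>
  '<html><head><script src="/rum.js" defer></script></head><body>Hello</body></html>';
const renderLeanPage = () => "<html><head></head><body>Hello</body></html>";

app.get("/", (req, res) => {
  const isBot = BOT_PATTERN.test(req.get("user-agent") ?? "");
  res.send(isBot ? renderLeanPage() : renderFullPage());
});

app.listen(3000);
```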

Tracking Average Response Time: Search Console #

Before improving the response time, we first need to measure it and find any areas that need work. A key indicator of how well a website loads for crawlers is the average time it takes bots to receive a response from it. Google Search Console reports this as the "average response time", showing how quickly your website responds to search engine crawlers.

Google Search Console is a free service from Google that helps you monitor, maintain, and troubleshoot your site's presence in Google Search results. Here's how to find the average response time:

  1. Sign in to your Google Search Console account.
  2. Select the relevant property (website) from the dropdown menu at the top.
  3. On the left-hand side, navigate to 'Settings' and then 'Crawl stats'.
  4. Under the 'Crawl stats' section, you'll find the 'Average response time' metric. This shows the time taken by Googlebot to receive a response from your site after making a request. It's a great way to gauge how quickly your website is responding to search engine crawlers.

Remember, a low average response time signifies a well-optimised site, while a high response time could indicate potential server issues or site speed problems.
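If you want to spot-check this number outside Search Console, you can measure time to first byte yourself. The sketch below assumes Node 18+ (global fetch) and sends a Googlebot-like user-agent string purely for testing; real Googlebot traffic is verified by IP, not by user agent.

```typescript
// Spot-check time to first byte (TTFB) for a URL: roughly the signal
// behind Search Console's "average response time".
async function checkResponseTime(url: string): Promise<number> {
  const start = performance.now();
  const res = await fetch(url, {
    // Hypothetical test UA to see how your server treats crawler requests.
    headers: { "user-agent": "Mozilla/5.0 (compatible; Googlebot/2.1)" },
  });
  const ttfb = performance.now() - start; // fetch resolves once headers arrive
  await res.arrayBuffer(); // drain the body so the connection closes cleanly
  return ttfb;
}

checkResponseTime("https://example.com/").then((ms) =>
  console.log(`Response time: ${ms.toFixed(0)} ms`)
);
```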

Google Search Console is an excellent free resource that provides basic metrics such as the average response time of your website. However, if you want more comprehensive insights, several commercial tools can help you dive deeper into your bot traffic data.

Botify, for example, is a specialised SEO platform that provides in-depth reports and metrics related to bot traffic. The platform offers data such as response times, and it also presents advanced analyses and optimisation recommendations based on the data. These features can provide you with a better understanding of how bots interact with your site and how this interaction impacts your site's performance.

Additionally, another effective way to analyse bot traffic is by examining your server logs with tools like Screaming Frog's Log File Analyser. Server logs can give you a clear picture of how bots crawl your site, allowing you to spot potential issues such as excessively crawled pages or parts of your site that bots may be missing.
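Even without a dedicated tool, you can get a rough picture from the raw logs. Here is a minimal sketch that tallies Googlebot requests per URL from an Nginx/Apache combined-format access log; the log path and regular expression are assumptions to adjust for your own setup.

```typescript
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// Count Googlebot hits per requested path in a combined-format access log.
async function countBotHits(logPath: string): Promise<Map<string, number>> {
  const hits = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(logPath) });
  for await (const line of lines) {
    if (!/Googlebot/i.test(line)) continue;
    // Combined log format contains e.g. "GET /some/path HTTP/1.1"
    const match = line.match(/"[A-Z]+ (\S+) HTTP/);
    if (match) hits.set(match[1], (hits.get(match[1]) ?? 0) + 1);
  }
  return hits;
}

// Print the ten most-crawled URLs, highest first.
countBotHits("/var/log/nginx/access.log").then((hits) => {
  [...hits.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, 10)
    .forEach(([url, count]) => console.log(`${count}\t${url}`));
});
```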

By combining log file analysis with the data from SEO platforms like Botify, you can effectively manage your bot traffic. This comprehensive approach will ensure that your website is optimised for all types of traffic, providing a better user experience and improving your visibility in search engine rankings.


What to Disable for Bots? #

There are several features that you can disable for bots to optimise your website's performance:

  • Prefetching: Although search engine crawlers account for a significant portion of traffic for long-tail content, they typically don't download linked resources such as images right away; Google, for instance, uses a separate crawler for images. Disabling prefetching for bots can enhance performance and decrease server load.
  • 103 Early Hints: According to Gary Illyes, a renowned webmaster trends analyst, Googlebot can't deal with 103 Early Hints. If Google receives a "bad" response (the early hint), it either inefficiently recrawls the page or drops the URL from its index. Disabling this feature for bots can lead to more efficient crawling.
  • Real User Monitoring (RUM): Bots don't transmit beacons, so disabling RUM for bot traffic can help keep your data clean.

Remember, the key here is accurately identifying bot traffic. With careful identification, these adjustments can improve your website's overall performance without impacting your SEO efforts: search engine bots can still effectively index your content without these additional features. A combined sketch of all three adjustments follows below.
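To make this concrete, here is a minimal Express sketch, under the same assumptions as before, that skips Early Hints, prefetch hints, and the RUM beacon for bot requests while serving identical content. The /rum-beacon.js script and the stylesheet paths are hypothetical placeholders, and writeEarlyHints requires Node 18.11+.

```typescript
import express from "express";

const app = express();
const BOT_PATTERN = /bot|crawler|spider/i; // illustrative, not exhaustive

app.get("/article", (req, res) => {
  const isBot = BOT_PATTERN.test(req.get("user-agent") ?? "");

  // 103 Early Hints: send only to regular browsers, since Googlebot
  // reportedly can't handle them. Available on Node 18.11+.
  if (!isBot) {
    res.writeEarlyHints({ link: "</styles.css>; rel=preload; as=style" });
  }

  // The prefetch hint and RUM beacon (hypothetical paths) are dropped for
  // bots; the actual content stays identical, so indexing is unaffected.
  res.send(`<!doctype html>
<html>
  <head>
    <link rel="stylesheet" href="/styles.css">
    ${isBot ? "" : '<link rel="prefetch" href="/next-article">'}
    ${isBot ? "" : '<script src="/rum-beacon.js" defer></script>'}
  </head>
  <body>The article content is the same for bots and humans.</body>
</html>`);
});

app.listen(3000);
```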

The Big Picture of Website Performance #

Lastly, it's crucial to remember that a faster website isn't just about compressing images or optimising your code. It's about understanding the various types of traffic that visit your site and managing them effectively. By recognising that bot traffic operates differently from human traffic and adjusting your website accordingly, you can ensure a smoother, more efficient website. This mindful approach not only benefits your human users but also makes your site more bot-friendly, leading to better indexing and data collection. In the fast-paced world of the web, understanding and optimising for your entire audience - human and bot - is key to delivering a superior experience and driving success.

