What is Crawl Budget and How to Optimise it

There are around 1.2 billion websites in the world - search engines can't check every site every day. So instead, they have to prioritise what they crawl and when, and this is where crawl budget comes in.

So, let’s explore what crawl budget is, why it matters to you, and how you can optimise yours to ensure it’s being used efficiently. 

What is Crawl Budget? 

Crawl budget is the number of pages a search engine bot (such as Googlebot) will crawl on your site within a given timeframe.

Google defines it as the combination of 'crawl rate limit' and 'crawl demand':

Crawl rate limit - how many simultaneous connections Googlebot can make to your site and how long it waits between fetches. Crawl rate is influenced by your server's performance: if your server slows down or starts returning errors, Google may reduce the crawl rate.

Crawl demand - popular pages, and pages that are updated frequently, tend to be crawled more often because Google sees them as more valuable.

Why Crawl Budget Matters 

Crawl budget is less of a concern for smaller websites, which can usually be crawled in full without issue. For larger sites with thousands of pages, however, it is something to consider.

Crawling is how Google finds new and updated pages to index. If a page isn't indexed, it won't appear in the search results. So, if your site isn't being crawled efficiently, your rankings and visibility may suffer, resulting in lower traffic and reduced revenue.

This is where optimising your crawl budget becomes important. You want to ensure that your best and most current content is prioritised by search engines, and that your crawl budget isn't wasted on low-value or duplicate pages.

How to Optimise Your Crawl Budget

There are several ways to ensure your crawl budget is used efficiently:

Remove Duplicate Content 

Duplicate content is a major cause of wasted crawl budget, yet it is sometimes unavoidable. For example, eCommerce websites often use URL parameters to filter and sort products so the user doesn't have to reload the entire page. Whilst these parameters are helpful to users, crawlers treat each parameterised URL as a separate page and will crawl them individually. Since most of the content on those pages is the same, this is a significant waste of crawl budget.

If you have duplicate content that can't be avoided, such as parameterised URLs, you can use a canonical tag to point to the original URL. This tells search engines which version to index and consolidates ranking signals on the original page. Bear in mind that Google treats canonicals as a strong hint rather than a directive, so duplicates may still be crawled from time to time, but typically far less often.
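For example, a filtered product listing could point back to the clean category URL. A minimal sketch, using placeholder URLs, placed in the <head> of a parameterised page such as https://www.example.com/shoes/?colour=black&sort=price:

    <link rel="canonical" href="https://www.example.com/shoes/" />

Every filtered or sorted variation of the page can carry the same tag, so all of the duplicate signals consolidate on the one URL you want indexed.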

Update Your Sitemap 

An XML sitemap is a file that lists the URLs on your website that are available for crawling and indexing. Keeping your sitemap up to date and removing any broken or redirected URLs helps crawlers know where to focus their efforts.
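As a reference point, here is a minimal sitemap with a single entry (the URL and date are placeholders). Google uses <lastmod> as a signal for when a page last changed, so keep it accurate:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/important-page/</loc>
        <lastmod>2025-01-15</lastmod>
      </url>
    </urlset>

You can submit your sitemap through Google Search Console or reference it from your robots.txt file.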

Reduce Loading Times

Site speed is essential for providing a better user experience and can help rankings. It also matters for crawl budget: a faster server means crawlers can fetch more pages in the same amount of time.

You can reduce loading times by compressing files, reducing HTTP requests, limiting plugins, using a lightweight theme, and enabling browser caching.
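As an illustration, here is how compression and browser caching might be enabled on an nginx server (a sketch only - directive names differ on Apache and other servers, and the file extensions listed are placeholders):

    # Compress text-based responses before sending them
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;

    # Tell browsers to cache static assets for 30 days
    # (expires sets both the Expires and Cache-Control headers)
    location ~* \.(css|js|png|jpg|svg|woff2)$ {
        expires 30d;
    }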

Fix Broken Links and Redirects 

404 errors and redirect chains waste crawl budget and create unnecessary work for search engine bots. Make sure to regularly audit your site for broken links and redirect chains - Google Search Console's indexing reports can help you find them.
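Where a chain exists, point every old URL straight at the final destination rather than hopping through intermediates. A sketch in nginx, with placeholder paths:

    # Before: /old-page -> /new-page -> /final-page (two hops)
    # After: each old URL 301s directly to the final destination
    location = /old-page/  { return 301 /final-page/; }
    location = /new-page/  { return 301 /final-page/; }

Each hop in a chain costs the crawler an extra fetch, so collapsing chains to a single 301 saves budget and passes signals more directly.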

Use Robots.txt Files

The robots.txt file can be used to prevent search bots from crawling pages that don't need to be crawled, such as admin pages or login screens. By blocking these, you free the search engine to focus on more valuable areas of your site. Bear in mind that robots.txt controls crawling, not indexing - a blocked URL can still be indexed if other sites link to it.
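A minimal robots.txt, with placeholder paths, might look like this (it sits at the root of your domain, e.g. https://www.example.com/robots.txt):

    User-agent: *
    Disallow: /admin/
    Disallow: /login/

    Sitemap: https://www.example.com/sitemap.xml

The Sitemap line is optional but ties the two techniques together: crawlers that read the file are steered away from low-value paths and pointed at your list of pages that matter.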

Minimise Waste and Crawl Content That Truly Matters 

Crawl budget can play a crucial role in ensuring your site is indexed properly and performs well in the search results. By understanding what crawl budget is and steering crawlers towards your most important pages, you can make sure the content that really matters gets indexed and ranks.

If you’d like expert help optimising your crawl budget, get in touch with us today.

Marcus Hearn

Marcus has spent his career growing the organic search visibility of both large organisations and SMEs. He specialises in technical SEO but he’s obsessed with curating strategies that leverage expertise and unlock potential.
