Crawl errors

What are Crawl errors?

Crawl errors are errors encountered when a search engine bot or other crawler attempts to fetch a page or resource during the crawling process.

Errors can include DNS errors, server connectivity issues, or errors caused by the unavailability of a resource such as the robots.txt file.

HTTP responses that would be classed as crawl errors include 404 (Not Found), 500 (Internal Server Error), and other 4xx and 5xx error status codes.
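The status-code classification above can be sketched in a few lines of Python. This is an illustrative helper, not part of any search engine's actual tooling; the function name and range check are assumptions based on the convention that 4xx and 5xx responses indicate errors.

```python
# Minimal sketch: classify HTTP status codes the way a crawler might
# when deciding whether a fetch counts as a crawl error.

def is_crawl_error(status_code: int) -> bool:
    """Return True if the HTTP status code indicates a crawl error.

    4xx codes (e.g. 404 Not Found) signal client-side problems such as
    a missing page; 5xx codes (e.g. 500 Internal Server Error) signal
    server-side failures. Both prevent a successful crawl of the URL.
    """
    return 400 <= status_code <= 599

# Example usage: 200 and 301 are not crawl errors, 404 and 500 are.
for code in (200, 301, 404, 500):
    print(code, is_crawl_error(code))
```

Note that 3xx redirects are not errors in themselves, although long redirect chains can still cause crawling problems.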

Crawl errors can occur for a number of reasons, including but not limited to:

  • Resource being requested does not exist or has been deleted
  • Server is down or experiencing problems with network connectivity
  • Domain name is not properly configured (DNS) or has expired
  • The resource is blocking bots/search engines or otherwise preventing crawler access (e.g. with a firewall)
  • Robots.txt file is blocking the search engine bot or crawler from accessing the page or resource
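The last cause above, a robots.txt file blocking a crawler, can be checked with Python's standard library. This is a sketch using hypothetical rules and a hypothetical "ExampleBot" user agent; in practice you would point the parser at a site's live robots.txt.

```python
# Sketch: test whether robots.txt rules block a crawler from a URL,
# using the standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# Normally you would call parser.set_url("https://example.com/robots.txt")
# and parser.read() to fetch the live file; here we parse example rules
# directly so the snippet runs without network access.
parser.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(parser.can_fetch("ExampleBot", "https://example.com/private/page"))  # False (blocked)
print(parser.can_fetch("ExampleBot", "https://example.com/public/page"))   # True (allowed)
```

A URL blocked this way is not strictly a server error, but crawlers will report it because the resource cannot be crawled.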

Crawl errors can be identified and monitored through webmaster tools provided by search engines such as Google Search Console or Bing Webmaster Tools.

It is often important to address the causes of crawl errors, as, left unfixed, they can impair the ability of search engines to properly crawl, index, and rank a website's pages.


What is this content? This article is part of an SEO glossary and reference guide created by Search Candy - an SEO consultancy based in the UK. The Search Candy team is committed to providing content that adheres to the highest editorial standards. The date this article was created and last checked for accuracy is posted above. To reuse this content please get in touch via our contact form.