What is a Robots.txt file?
A robots.txt file is a plain text file that tells search engine crawlers (bots) which parts of a website they may or may not crawl.
Rules can allow or disallow crawling for all bots at once, or for specific search engines and bots, on specific areas of a website.
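As an illustration of per-bot rules (the user-agent name Googlebot and the paths shown are examples, not taken from the file later in this article), a robots.txt file might block one crawler from a directory while leaving all others unrestricted:

```
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
```

Here the first group applies only to Googlebot, while the second group's empty Disallow line permits all other bots to crawl the entire site.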
The robots.txt file should be found at the root of a website, for example https://www.domain.com/robots.txt.
Robots.txt files can also include references to sitemaps.
The following example robots.txt file contains a rule blocking all search engines from a specific page, and a reference to an XML sitemap file.
User-agent: *
Disallow: /blocked-page/
Sitemap: https://www.domain.com/sitemap.xml
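To see how a crawler interprets these rules, the example above can be checked with Python's standard-library robots.txt parser. This is a minimal sketch: it parses the rules inline rather than fetching them, and the /allowed-page/ path is a hypothetical URL used only to show the contrast.

```python
from urllib.robotparser import RobotFileParser

# The same rules as the example robots.txt file above.
# To check a live site instead, you would call
# parser.set_url("https://www.domain.com/robots.txt") followed by parser.read().
rules = """\
User-agent: *
Disallow: /blocked-page/
Sitemap: https://www.domain.com/sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The Disallow rule blocks this path for every bot ("*").
print(parser.can_fetch("*", "https://www.domain.com/blocked-page/"))  # False

# Any path not matched by a Disallow rule remains crawlable.
print(parser.can_fetch("*", "https://www.domain.com/allowed-page/"))  # True
```

Well-behaved crawlers perform exactly this kind of check before requesting a page; note that robots.txt is advisory, so it controls crawling by compliant bots, not access in general.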