What is a minimum valid robots.txt file?

As indicated here, create a text file named robots.txt in the top-level directory of your web server. You can leave it empty, or add:

User-agent: *
Disallow:

These two lines tell robots they may crawl everything. If that is not what you want, see the link above for more examples.
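For contrast, the opposite policy reverses the Disallow line: a path of / asks all compliant robots to stay out of the entire site (standard Robots Exclusion Protocol syntax, not specific to any server):

User-agent: *
Disallow: /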


The best minimal robots.txt is a completely empty file.

Any other "null" directive, such as an empty Disallow: or Allow: *, is a no-op: it changes nothing and only adds unneeded complexity.

If you don't want the file to be completely empty, or you want to make it more human-readable, add a comment beginning with the # character, such as # blank file allows all. Crawlers ignore lines starting with #.
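Putting that together, a non-empty robots.txt that still allows everything could look like this (the comment text here is only an example):

# blank file allows all
User-agent: *
Disallow: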
