How Can I Encourage Google to Read a New robots.txt File?

Solution 1:

In case anyone else runs into this problem, there is a way to force Googlebot to re-download the robots.txt file.

In Google Webmaster Tools, go to Health -> Fetch as Google [1] and have it fetch /robots.txt

That will re-download the file, and Google will also re-parse it.

[1] In the previous Google UI this was 'Diagnostics -> Fetch as GoogleBot'.
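
Before asking Google to re-fetch, it can help to confirm that the live file already serves the rules you expect. Here is a minimal sketch using Python's standard urllib.robotparser; https://example.com is a placeholder for your own domain, and note that this applies generic robots.txt semantics rather than Google's exact parser:

    # Check how a standard robots.txt parser reads your live file.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder domain
    rp.read()  # downloads and parses the current live robots.txt

    # Verify that the pages you care about are no longer disallowed.
    print(rp.can_fetch("Googlebot", "https://example.com/"))
    print(rp.can_fetch("Googlebot", "https://example.com/some-page"))

If can_fetch returns False for pages that should be crawlable, fix the file on your server before triggering the re-fetch.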

Solution 2:

I know this is very old, but... If you uploaded the wrong robots.txt (disallowing all pages), you can try the following:

  • first correct your robots.txt to allow the correct pages (see the example after this list), then
  • upload a sitemap.xml with your pages
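
For the first step, a minimal corrected robots.txt that allows all crawlers to fetch every page might look like this; the Sitemap line assumes your sitemap lives at the placeholder URL https://example.com/sitemap.xml:

    # Allow all crawlers to fetch everything.
    User-agent: *
    Disallow:

    # Optional: point crawlers at your sitemap (placeholder URL).
    Sitemap: https://example.com/sitemap.xml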

When Google tries to read the XML sitemap, it will check the listed URLs against robots.txt, forcing Google to re-read your robots.txt.
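
For the second step, a minimal sitemap.xml listing the pages you want crawled could look like the following; the URLs are placeholders for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Placeholder URLs; list the pages you want re-crawled. -->
      <url>
        <loc>https://example.com/</loc>
      </url>
      <url>
        <loc>https://example.com/some-page</loc>
      </url>
    </urlset>

Submit it through the sitemap section of Webmaster Tools, or reference it from robots.txt as in the example above, so that Google fetches it and re-checks your robots.txt in the process.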