Google Webmaster Tools tells me that robots is blocking access to the sitemap

It would seem that Google has probably not yet updated its cache of your robots.txt file. Your current robots.txt file (above) does not look as if it should be blocking your sitemap URL.
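For comparison, a robots.txt along these lines would not block a sitemap (a minimal sketch; the example.com host, the /wp-admin/ rule and the sitemap location are placeholders, not your actual file):

    User-agent: *
    Disallow: /wp-admin/

    Sitemap: http://www.example.com/sitemap.xml

As long as no Disallow rule matches the sitemap URL's path, Googlebot can fetch it.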

I guess Google just hasn't updated its cache.

There is no need to guess. In Google Webmaster Tools (GWT) under "Health" > "Blocked URLs", you can see when your robots.txt was last downloaded and whether it was successful. It will also inform you of how many URLs have been blocked by the robots.txt file.

(Screenshot: robots.txt reference in Google Webmaster Tools)

As mentioned in my comments, GWT has a robots.txt checker tool ("Health" > "Blocked URLs"), so you can immediately test changes to your robots.txt without changing your actual file. Paste the robots.txt contents into the upper textarea and the URLs you would like to test into the lower textarea, and it will tell you whether they would be blocked or not.
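If you want to run the same kind of check outside of GWT, Python's standard urllib.robotparser applies plain robots.txt matching (without every Google-specific extension). A rough sketch, with placeholder rules and URLs:

    import urllib.robotparser

    # Placeholder robots.txt rules to test against.
    robots_lines = [
        "User-agent: *",
        "Disallow: /wp-admin/",
        "",
        "Sitemap: http://www.example.com/sitemap.xml",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.modified()   # record that rules are loaded, so can_fetch() will answer
    parser.parse(robots_lines)

    # Test the sitemap itself and a URL that should be blocked.
    for url in ("http://www.example.com/sitemap.xml",
                "http://www.example.com/wp-admin/options.php"):
        allowed = parser.can_fetch("Googlebot", url)
        print(url, "->", "allowed" if allowed else "blocked")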


Caching of robots.txt

A robots.txt request is generally cached for up to one day, but may be cached longer in situations where refreshing the cached version is not possible (for example, due to timeouts or 5xx errors). The cached response may be shared by different crawlers. Google may increase or decrease the cache lifetime based on max-age Cache-Control HTTP headers.

Source: Google Developers - Robots.txt Specifications
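Since the cache lifetime can follow max-age, it is worth checking which Cache-Control header your server sends for robots.txt. A quick sketch (the hostname is a placeholder):

    import urllib.request

    # HEAD request to see the caching headers served with robots.txt.
    req = urllib.request.Request("http://www.example.com/robots.txt", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.reason)
        print("Cache-Control:", resp.headers.get("Cache-Control"))
        print("Last-Modified:", resp.headers.get("Last-Modified"))

A long max-age can keep an outdated copy around longer than the usual day; a short one still does not force an immediate refetch.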


I had the same problem with my site because, during the WordPress install, I had selected the option to discourage search engines from indexing the site (or a similar option).
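With that setting enabled, WordPress (at least at the time) served a virtual robots.txt that disallows everything, roughly like this (the exact output can vary by version):

    User-agent: *
    Disallow: /

A blanket Disallow: / like that is what blocks the sitemap URL as well.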

To resolve this problem:

  1. In Webmaster Tools, go to the crawl section's "Remove URLs" tool and submit www.example.com/robots.txt, choosing the option to remove the cached copy because its content has changed.
  2. Wait a minute or so.
  3. Resubmit your sitemap URL (see the sketch after this list).
  4. Done.
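For step 3, besides the "Sitemaps" page in Webmaster Tools, Google has also accepted sitemap submissions via a plain HTTP "ping". A minimal sketch, assuming your sitemap lives at www.example.com/sitemap.xml:

    import urllib.parse
    import urllib.request

    # Full URL of the sitemap to resubmit (placeholder).
    sitemap_url = "http://www.example.com/sitemap.xml"

    # The ping endpoint takes the sitemap URL as a query parameter.
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

    with urllib.request.urlopen(ping) as resp:
        # A 200 response means the ping was received, not that the sitemap is valid.
        print(resp.status, resp.reason)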