Are files like favicon.ico, robots.txt, and sitemap.xml vulnerable to XSS?

We will be deprecating that passive scan rule ("Header XSS Protection", which checks the X-XSS-Protection header) shortly.

Here's the issue: https://github.com/zaproxy/zaproxy/issues/5849
Here are the related PRs:

  • https://github.com/zaproxy/zap-extensions/pull/2297
  • https://github.com/zaproxy/zaproxy/pull/5850

You should really have CSP in place. As for whether it matters for those files: that depends on whether they exist and, if not, on how the error is handled (see the sketch below).
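
To make "how the error is handled" concrete, here is a minimal sketch (the server and all names are illustrative, not part of ZAP or any particular framework): a custom 404 handler becomes an XSS sink if it reflects the requested path unescaped, even though the requested "file" never existed.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import html

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Vulnerable pattern would be: f"<p>Not found: {self.path}</p>",
        # which reflects /sitemap.xml/<script>alert(1)</script> verbatim.
        # Escaping the reflected path closes that hole.
        body = f"<p>Not found: {html.escape(self.path)}</p>".encode()
        self.send_response(404)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```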

Further, as pointed out on MDN:

  • Chrome has an "Intent to Deprecate and Remove the XSS Auditor"
  • Firefox has not implemented, and will not implement, X-XSS-Protection
  • Edge has retired its XSS filter

This means that, unless you need to support legacy browsers, the recommendation is to use Content-Security-Policy (without allowing unsafe-inline scripts) instead.
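
For illustration, a minimal sketch of such a setup, assuming Python's stdlib http.server; the policy string is an example baseline to adapt, not a drop-in recommendation:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Example policy only: script-src has no 'unsafe-inline', so inline <script>
# blocks and inline event handlers are refused by the browser.
CSP = "default-src 'self'; script-src 'self'; object-src 'none'"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<!doctype html><title>ok</title>"
        self.send_response(200)
        # Send the header on every response, including error pages, so paths
        # like /robots.txt are covered whether or not the file exists.
        self.send_header("Content-Security-Policy", CSP)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```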

There is further discussion of this topic (X-XSS-Protection vs CSP) in this Q&A as well.

Update: The "Header XSS Protection" passive scan rule has been removed as of Passive Scan Rules v27: https://github.com/zaproxy/zap-extensions/releases/tag/pscanrules-v27


favicon.ico and robots.txt: No. Browsers treat these as an image and plain text respectively, and do not execute JavaScript within them.

In theory, sitemap.xml could be an issue; there are plenty of nasty tricks you can play with XML (see the illustration below). In reality, it would be a very, very difficult attack to pull off, especially given the likely scenario that the file is static.
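
As a hedged illustration of those tricks (the payload is hypothetical; a truly static sitemap.xml offers no injection point): if attacker-controlled text were ever reflected into an XML document that the browser renders, an element placed in the XHTML namespace can carry live script.

```python
# Hypothetical payload: a sitemap with an XHTML-namespaced <script> injected.
# If a browser renders this document directly (e.g. served as text/xml), the
# script element is recognized via its namespace and may execute.
SITEMAP_WITH_INJECTION = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <h:script xmlns:h="http://www.w3.org/1999/xhtml">alert(document.domain)</h:script>
  </url>
</urlset>"""

print(SITEMAP_WITH_INJECTION)
```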

Tags: xss, zap