Prevent XML sitemaps from showing up in Google search results

Google does index XML sitemaps, just like any other XML file. If Google is aware of a URL and it returns a valid response, it can pass Google's inclusion rules and get indexed. Personally, I only submit the sitemap through Google Webmaster Tools (GWT) and include a Sitemap: reference in robots.txt, and that alone has been enough to get it indexed.
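
For reference, the robots.txt reference mentioned above is a single Sitemap: directive pointing at the sitemap's absolute URL (the URL below is only a placeholder):

# robots.txt: the Sitemap: directive takes an absolute URL (placeholder shown)
Sitemap: https://example.com/sitemap.xml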

The recommended method to prevent these files from being indexed by Google is to include an X-Robots-Tag HTTP response header when serving the XML sitemap. For example:

X-Robots-Tag: noindex

The X-Robots-Tag header behaves just like a robots META tag in an HTML file, but it can be served with any type of file.
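
A quick way to confirm the header is actually being served is to request the sitemap and inspect the response headers, e.g. with curl (the URL is again a placeholder):

# HEAD request; print the response headers and filter for X-Robots-Tag
curl -sI https://example.com/sitemap.xml | grep -i x-robots-tag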

Reference: This document (from Nov 2008!) appears to quote our very own John Mueller (Google) with regard to the use of the X-Robots-Tag response header when dealing with XML sitemaps.
Yes, Google Will Index & Rank Your XML Sitemap File

For more information see Google's developer guide:
Robots meta tag and X-Robots-Tag HTTP header specifications


MrWhite's answer about using the X-Robots-Tag header appears to be the correct way to do this.

Here is code that can be used in .htaccess or Apache configuration files to do so. (Reference: WebmasterWorld - Sitemaps showing up in SERP - How to prevent this?)

<Files ~ "sitemap.*\.xml(\.gz)?$">
  Header append X-Robots-Tag "noindex"
</Files>
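
Note that the Header directive is provided by mod_headers, which is not always enabled by default. On Debian/Ubuntu-style installs (an assumption about your layout) it can be enabled like this:

# Enable mod_headers and reload Apache (Debian/Ubuntu layout assumed)
sudo a2enmod headers
sudo systemctl reload apache2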

Under nginx the configuration would be as follows. (Reference: Yoast X-Robots-Tag examples)

# Case-insensitive match for sitemap.xml, sitemap-news.xml, and .gz variants
location ~* sitemap.*\.xml(\.gz)?$ {
    add_header X-Robots-Tag "noindex";
}
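
With either configuration in place, a request for the sitemap should come back with the extra header; the relevant response headers would look roughly like this (values illustrative):

HTTP/1.1 200 OK
Content-Type: application/xml
X-Robots-Tag: noindex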

Why does it matter?

If you can actually find your sitemap in the SERPs, you have bigger problems.

I would focus on publishing pages with useful content instead. Once those rank, you will have a hard time even finding your sitemap in the results. Not that you would care at that point anyway.

P.S.

Pretty much everyone keeps sitemaps in the same place, so if someone wants to find where you keep yours, they will :)