Google Search Console is making up URLs which don't exist in my sitemap and then complains that these pages have errors

You have a misconception of what a sitemap is.

A sitemap is used to audit how the search engine bot crawls your site. The sitemap and the crawl are two different and independent things. Google will continue to crawl your site regardless of any sitemap. The sitemap is used to check whether Google is able to crawl your site properly. For example, if a page is listed in your sitemap and Google has not seen it, Google may add the page to the fetch queue so it can be included.

The converse is not true. If a page is not found in the sitemap, Google will not remove it from its index. Why? Because Google found it by crawling the site.

What you seem to believe is that the sitemap is the be-all and end-all authority Google uses to know what pages exist on a site. It is not. The crawl is. The sitemap only helps Google know whether it can properly crawl your site and, if not, which pages it is missing that should be added to the fetch queue.

Your expectation that Google will no longer try to access pages because they are no longer in your sitemap is incorrect. Sitemaps are cached and only checked periodically. Why? Because it is an audit process.

You do have a real problem you need to solve.

You are returning a 500 error for pages that are not found. This is bad. Your site should be returning a 404 Not Found error. A 500 is a server error, and Google will treat the condition as temporary. If your site returns a 404, Google will still retry the page a number of times over a period of time before deciding that the page no longer exists. If at all possible, you want to issue a 410 Gone for pages that you have deliberately removed. If that is too much work or not possible, a 404 will amount to the same thing over time.

You do need to fix your 500 error.
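
For illustration only, here is a minimal Flask-style sketch of the idea (the route, the category data, and the list of removed IDs are all hypothetical stand-ins, not your actual code): when a lookup fails, answer 404, and answer 410 for categories you know were deliberately removed, instead of letting the failure surface as a 500.

from flask import Flask, abort

app = Flask(__name__)

# Hypothetical data: category IDs that used to exist but were deliberately removed.
REMOVED_IDS = {"2", "5"}
# Hypothetical data: the category pages that still exist.
PAGES = {("fashion", "women", "tops-and-shirts"): "Tops & Shirts"}

@app.route("/home/browse/<cat>/<sub>/<leaf>")
def browse(cat, sub, leaf):
    if cat in REMOVED_IDS:
        abort(410)      # Gone: the page was removed on purpose
    title = PAGES.get((cat, sub, leaf))
    if title is None:
        abort(404)      # Not Found, rather than crashing into a 500
    return title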


Closetnoc is correct about sitemaps. Don't expect them to limit what URLs Google will crawl and index. In fact, sitemaps have little to no influence on SEO. See The Sitemap Paradox.

Google won't complain about errors from your old URLs if you redirect them. When you change your site's URL structure, it is best to redirect all the old URLs to their corresponding new URLs. Redirecting is better for search engines because it usually preserves your SEO value and rankings. It is better for users because if they do land on an old URL, they are automatically taken to the new one.

So make sure your site implements proper redirects that use the "301 Moved Permanently" status:

/home/browse/2/45/139 -> /home/browse/fashion/women/tops-and-shirts
/home/browse/5/60/160 -> /home/browse/sports/team-sports/football
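
One way to issue those 301s, sketched with a Flask-style route and a hand-maintained old-to-new mapping (both assumed; the mapping below contains only the two examples above). You could just as well do this in your web server or your framework's routing layer:

from flask import Flask, redirect

app = Flask(__name__)

# Assumed mapping from old numeric paths to the new descriptive paths.
OLD_TO_NEW = {
    "2/45/139": "/home/browse/fashion/women/tops-and-shirts",
    "5/60/160": "/home/browse/sports/team-sports/football",
}

@app.route("/home/browse/<int:a>/<int:b>/<int:c>")
def old_browse(a, b, c):
    new_url = OLD_TO_NEW.get(f"{a}/{b}/{c}")
    if new_url:
        return redirect(new_url, code=301)   # 301 Moved Permanently keeps the SEO value
    return ("Not Found", 404)                # old URL with no new equivalent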

Google doesn't add random parameters to URLs. Every URL it crawls, it found somewhere. It likely found the links to the pagination on your own site. Googlebot also has crude heuristics that scan JavaScript for string literals that look like URLs and crawl those. The parameters could also come from external links; occasionally other sites link to your site in weird, broken ways.

If you no longer have pagination, it is fine to redirect those requests too. Even if you never had pagination, it would probably be fine to redirect to remove pagination parameters.
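
A sketch of that, again Flask-style and assuming the stray parameter is called "page" (substitute whatever Google is actually appending): 301 to the same path with the parameter stripped, so crawlers converge on the canonical URL.

from urllib.parse import urlencode
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/home/browse/<path:rest>")
def browse(rest):
    # "page" is an assumed parameter name; adjust to match the URLs in Search Console.
    if "page" in request.args:
        kept = {k: v for k, v in request.args.items() if k != "page"}
        target = request.path + ("?" + urlencode(kept) if kept else "")
        return redirect(target, code=301)   # canonical URL without the pagination parameter
    return f"Browsing {rest}"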