Allow a folder and disallow all subfolders in robots.txt

User-agent: *
Allow: /news/$
Disallow: /news/

Explanation:

Google's robots.txt spec (https://developers.google.com/search/reference/robots_txt), which is more up to date than the "official" spec, states that:

/fish/ will match anything in the /fish/ folder but will not match /fish (no wildcard is necessary, since "the trailing slash means this matches anything in this folder"). Reverse-engineering that for your case:

User-agent: * (or whichever user agent you want to address)
Allow: /news/$ (allows /news/ itself; the $ anchors the match at the end of the URL, so the allow cannot extend beyond /news/)
Disallow: /news/ (disallows everything inside the /news/ folder; see the sketch below for how the two rules combine)

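To see how these two rules interact, here is a minimal, self-contained Python sketch of Google-style matching. It is not Google's implementation; it only assumes the behavior described in the spec: * matches any sequence of characters, $ anchors the end of the URL, the rule with the longest path wins, and on a tie Allow beats Disallow.

import re

def pattern_to_regex(pattern):
    """Turn a robots.txt path pattern ('*' wildcard, '$' end anchor) into a regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

def is_allowed(path, rules):
    """The longest matching rule wins; on a tie, Allow beats Disallow."""
    best = None  # (pattern length, allowed?)
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), directive.lower() == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

rules = [("Allow", "/news/$"), ("Disallow", "/news/")]
print(is_allowed("/news/", rules))              # True  - the folder URL itself is allowed
print(is_allowed("/news/some-article", rules))  # False - anything deeper is disallowed
print(is_allowed("/contact", rules))            # True  - nothing matches, so the default is allow
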
Test it in Google Search Console or with Yandex's robots.txt checker (https://webmaster.yandex.com/tools/robotstxt/) to confirm it works for your site.
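
You can also check locally from code. One option is the third-party Protego parser (the one Scrapy uses), which implements the * and $ rules from Google's spec; the sketch below assumes its documented parse/can_fetch interface. The standard-library urllib.robotparser only does prefix matching and ignores the $ anchor, so it is not a reliable check for this pattern.

from protego import Protego

robots = """
User-agent: *
Allow: /news/$
Disallow: /news/
"""

rp = Protego.parse(robots)
# Expected: the section URL itself is fetchable, anything deeper is not.
print(rp.can_fetch("https://example.com/news/", "Googlebot"))              # True
print(rp.can_fetch("https://example.com/news/some-article", "Googlebot"))  # False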

Tags:

robots.txt