I need to control which URLs are allowed to be indexed. To do this, I want Google to index only the URLs listed in my sitemap(s) and to disallow it from indexing anything else.
The easiest way to solve this would be to configure robots.txt to disallow everything:
User-agent: *
Disallow: /
And at the same time allow every URL that is listed in:
Sitemap: sitemap1.xml
Sitemap: sitemap2.xml
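One workaround I can think of is to generate robots.txt programmatically: read the paths out of each sitemap and emit an explicit Allow rule per path, followed by a catch-all Disallow. A rough sketch in Python (assuming the sitemaps use the standard sitemaps.org XML format; the file names are the same placeholders as above):

import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Standard sitemaps.org namespace used by <urlset>/<url>/<loc> documents.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_paths(filename):
    # Yield the URL path of every <loc> entry in the sitemap file.
    tree = ET.parse(filename)
    for loc in tree.getroot().findall("sm:url/sm:loc", NS):
        yield urlparse(loc.text.strip()).path

rules = ["User-agent: *"]
for sitemap in ("sitemap1.xml", "sitemap2.xml"):
    rules.extend(f"Allow: {path}" for path in sitemap_paths(sitemap))
# Google applies the most specific matching rule, so the explicit
# Allow lines above take precedence over this catch-all Disallow.
rules.append("Disallow: /")

print("\n".join(rules))

This would have to be re-run every time the sitemaps change, which is why I am hoping there is a native robots.txt solution instead.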
Can robots.txt be configured to do this directly, or are there any other workarounds?