Yes, that would be the shortest way to do it, but it's not necessarily correct. Not all bots support the Allow directive, and some bots are confused about how to interpret robots.txt when both a User-agent: * section and a User-agent: Specific-bot section apply.
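For reference, the short Allow-based version under discussion would look something like the following (a sketch, assuming your goal is to allow only index.html):

User-agent: *
Allow: /index.html
Disallow: /

Crawlers that honor Allow and use longest-match or first-match semantics will permit index.html and block everything else; crawlers that ignore Allow will simply see Disallow: / and block the whole site.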
To be sure it would work, you'd want to do something like this:
User-agent: Googlebot
Disallow: /file1
Disallow: /file2
Disallow: /file3
# etc. until you have blocked every path except index.html
User-agent: Slurp
Disallow: /file1
Disallow: /file2
Disallow: /file3
# etc. until you have blocked every path except index.html
User-agent: msn
Disallow: /file1
Disallow: /file2
Disallow: /file3
# etc. until you have blocked every path except index.html
User-agent: *
Disallow: /
If you don't want to do all that work, the best approach is to test each of the engines you're interested in and see whether it accepts the robots.txt file you proposed. If any of them don't, fall back to the longer version.
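As a quick sanity check before testing against real crawlers, you can see how one well-known parser handles the Allow-based file. Python's urllib.robotparser does support Allow, so it models the behavior of the more capable bots (the bot name and URLs below are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The short, Allow-based robots.txt under discussion.
rules = [
    "User-agent: *",
    "Allow: /index.html",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# A parser that honors Allow permits index.html but nothing else.
print(parser.can_fetch("AnyBot", "http://example.com/index.html"))  # True
print(parser.can_fetch("AnyBot", "http://example.com/file1"))       # False
```

This only tells you how an Allow-aware parser behaves; a bot that ignores Allow would still block index.html, which is exactly why testing each engine individually matters.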