views: 74
answers: 1
Hello guys. I have a few doubts about this robots.txt file.

User-agent: *
Disallow: /administrator/
Disallow: /css/
Disallow: /func/
Disallow: /images/
Disallow: /inc/
Disallow: /js/
Disallow: /login/
Disallow: /recover/
Disallow: /Scripts/
Disallow: /store/com-handler/
Disallow: /store/img/
Disallow: /store/theme/
Disallow: /store/StoreSys.swf
Disallow: config.php

This will block crawlers from all files inside each folder, right? Or do I have to add an asterisk at the end of each folder name?

I think this should do it, but I'm not sure whether I have to add Allow: / right after the User-agent line; I suppose it isn't needed.

Is anything wrong with this robots.txt file?

PS: If someone can suggest a validation tool for local use, I would be glad.

Thanks.

+2  A: 

It's fine as is, if I understand what you want. E.g.

/administrator/
/css/subpage

are both blocked, but

/foo

is allowed. Note that Allow is a less widely supported extension, designed only to counter a previous Disallow. You might use it if, for instance, despite your

Disallow: /images/

you decide you want a particular image allowed. So,

Allow: /images/ok_image

All other images remain blocked. You can see http://www.searchtools.com/robots/robots-txt.html for more info, including a list of checkers.

Matthew Flaschen
Yes, I think you have. I want all crawlers to index the website, with the exception of those folders, the files inside them, and that last PHP file.
Fábio Antunes
Forgot to say: thanks.
Fábio Antunes
Thanks for this last-minute edit. It proved useful for another doubt I was having just now, and it sure cleared my mind of any doubts about the Allow directive. Thanks :D
Fábio Antunes