Hello everyone. I have a few questions about this robots.txt file.
User-agent: *
Disallow: /administrator/
Disallow: /css/
Disallow: /func/
Disallow: /images/
Disallow: /inc/
Disallow: /js/
Disallow: /login/
Disallow: /recover/
Disallow: /Scripts/
Disallow: /store/com-handler/
Disallow: /store/img/
Disallow: /store/theme/
Disallow: /store/StoreSys.swf
Disallow: /config.php
Will this block crawlers from all files inside each folder, or do I have to add an asterisk at the end of each folder name?
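From what I've read, Disallow values are path prefixes, so a rule like /css/ should already cover every file under that folder without a trailing asterisk. Here is a quick check I ran with Python's standard urllib.robotparser (the example.com URLs are just placeholders, and I only pasted a few of the rules):

from urllib.robotparser import RobotFileParser

# A few of the rules from the file above, pasted as a string for testing.
ROBOTS = """\
User-agent: *
Disallow: /css/
Disallow: /store/com-handler/
Disallow: /config.php
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Disallow is a prefix match, so files inside a folder are covered
# without any trailing asterisk.
print(rp.can_fetch("*", "https://example.com/css/style.css"))        # False
print(rp.can_fetch("*", "https://example.com/store/com-handler/x"))  # False

# Paths that match no Disallow line stay crawlable by default.
print(rp.can_fetch("*", "https://example.com/index.html"))           # True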
I think this should do it, but I'm not sure whether I also have to add Allow: / right after the User-agent line. My understanding is that allow is the default, so it shouldn't be needed.
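To convince myself, I also parsed a tiny file with no Allow line at all; anything not matched by a Disallow still comes back as fetchable, which suggests allow really is the default:

from urllib.robotparser import RobotFileParser

# A group with one Disallow and no Allow line: everything else
# should default to allowed, with no explicit "Allow: /" needed.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /login/"])
print(rp.can_fetch("*", "https://example.com/login/"))  # False
print(rp.can_fetch("*", "https://example.com/about"))   # True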
Is there anything wrong with this robots file?
PS: If someone can suggest a validation tool I can run locally, I would be glad.
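In the meantime, this is the closest I've gotten to a local validator: a small stdlib-only script that loads robots.txt from disk and reports which of a few sample URLs would be blocked. The file name, site root, and test paths are just my assumptions; adjust as needed.

from urllib.robotparser import RobotFileParser

# Minimal local robots.txt checker (stdlib only).
# Assumes the file is saved as "robots.txt" in the current directory
# and that the site root is https://example.com (both placeholders).
ROBOTS_PATH = "robots.txt"
SITE_ROOT = "https://example.com"

TEST_PATHS = [
    "/",
    "/administrator/index.php",
    "/css/main.css",
    "/store/StoreSys.swf",
    "/config.php",
]

def main():
    rp = RobotFileParser()
    with open(ROBOTS_PATH, encoding="utf-8") as f:
        rp.parse(f.read().splitlines())
    for path in TEST_PATHS:
        verdict = "allowed" if rp.can_fetch("*", SITE_ROOT + path) else "BLOCKED"
        print(f"{verdict:8} {path}")

if __name__ == "__main__":
    main()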
Thanks.