views: 41

answers: 2
Hi,

I want to block search engines like Google and Yahoo from crawling user subdomains like user.example.com. How can I do it?

+4  A: 

Use a robots.txt file on your web server.

So, at the root of your subdomain, put a robots.txt file that looks like this:

User-agent: *
Disallow: /
Pablo Santa Cruz
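If you want to sanity-check the rules before relying on them, Python's built-in urllib.robotparser evaluates a robots.txt the way a well-behaved crawler would. A minimal sketch (the user.example.com URLs and paths are placeholders):

from urllib.robotparser import RobotFileParser

# Parse the disallow-all rules exactly as they would be served from
# http://user.example.com/robots.txt (hypothetical subdomain).
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every path should now be off-limits to every crawler.
print(parser.can_fetch("Googlebot", "http://user.example.com/"))        # False
print(parser.can_fetch("Slurp", "http://user.example.com/some/page"))   # False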
Do I have to add the subdomain name in the file, or just what you have put up here? Can I also use `Disallow: all`? Thanks!
AAA
@AAA: Nope, all you need is the above. "Disallow all" is simply `Disallow: /`, since it forbids robots from visiting any page from the root directory down (all pages); compare the empty-value form below.
Andrew Moore
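For contrast, under the robots.txt convention an empty Disallow value means the opposite: rules like the following block nothing and leave the whole site crawlable:

User-agent: *
Disallow: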
Great. Thanks Andrew.
AAA
Facebook has done it in a very different way: http://facebook.com/robots.txt. Why have they specified particular paths?
AAA
A: 

Sites list specific paths because those are exactly the parts they don't want crawled by search engines; anything not listed stays open to crawlers (see the example below)...

AAA
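For instance, a hypothetical site that wants its content indexed but not its search results or admin pages might publish rules like this (the paths are illustrative, not Facebook's actual file):

User-agent: *
Disallow: /search
Disallow: /admin/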