views:

51

answers:

2

Hello all,

I have a domain with a lot of indexed pages; I use it as an online test domain. I understand that I should be testing on an intranet or similar, but over time Google has indexed a few sites on it that are no longer relevant.

Does anyone know how to get a domain completely de-indexed from most search engines?

+1  A: 

Place a robots.txt file in the root directory of your site. It can be used to control how much access search engine spiders have to your content. You can mark certain areas of your site as off limits to indexing, on a directory-by-directory basis.
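As a sketch, a robots.txt that blocks all well-behaved crawlers from the entire site, with a commented-out variant restricting only specific directories (the directory names are hypothetical):

```
# Block all compliant crawlers from the whole site
User-agent: *
Disallow: /

# Or restrict only certain directories (hypothetical paths):
# User-agent: *
# Disallow: /test/
# Disallow: /old-pages/
```

Note that crawlers are not required to obey robots.txt; it is a convention that major search engines honor, not an access control mechanism.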

Evan Meagher
+1  A: 

There are a couple of things you can do.

  1. Set up a restrictive robots.txt file
  2. Password protect the domain root
  3. Request removal directly from SEs
  4. If you have a static IP and you are the only one accessing the site, you can simply deny access to all IPs other than yours.
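Assuming the site runs on Apache, options 2 and 4 above might look something like this in an .htaccess file in the document root (the .htpasswd path and IP address are placeholders):

```
# Option 2: password-protect the whole docroot
# (create the user file first, e.g. with: htpasswd -c /path/to/.htpasswd youruser)
AuthType Basic
AuthName "Private test site"
AuthUserFile /path/to/.htpasswd
Require valid-user

# Option 4 (alternative): allow only your own static IP
# Order Deny,Allow
# Deny from all
# Allow from 203.0.113.1
```

Unlike robots.txt, either of these actually blocks access, so crawlers get a 401/403 rather than simply being asked to stay away.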
code_burgar
I have read about this; I made this robots.txt about 3 weeks ago, but all pages from that domain are still indexed. Its content is: `User-agent: *` followed by `Disallow: /`
Chris
You need to understand that search engine indexing doesn't work in real time. Your domain is an inactive one and is probably seen as having very low importance in Google's eyes; it may take weeks for Googlebot to re-crawl your domain, and then some time more for it to actually remove your pages from the main index.
code_burgar
OK, that makes sense then. Is this robots.txt sufficient?
Chris
1 and 2 will almost certainly have the desired effect.
code_burgar