views:

63

answers:

3

I don't want the search engines to index my imprint page. How could I do that?

+9  A: 

You need a simple robots.txt file. Basically, it's a plain text file that tells search engine crawlers which pages or directories they should not index.
You don't need to include it in the header of your page; as long as it's in the root directory of your website it will be picked up by crawlers.
Create it in the root folder of your website and put the following text in:

User-agent: *
Disallow: /imprint-page.html

Note that you'd replace imprint-page.html in the example with the actual name of the page (or the directory) that you wish to keep from being indexed.

That's it! If you want to get more advanced, you can check out here, here, or here for a lot more info. Also, you can find free tools online that will generate a robots.txt file for you (for example, here).
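To see how a crawler would interpret the rules above, you can sketch the check with Python's standard-library robots.txt parser (the filename here is just the example from this answer):

```python
from urllib import robotparser

# Parse the same rules as the example robots.txt above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /imprint-page.html",
])

# A compliant crawler is blocked from the imprint page
# but still allowed to fetch everything else.
print(rp.can_fetch("Googlebot", "/imprint-page.html"))  # False
print(rp.can_fetch("Googlebot", "/index.html"))         # True
```

This also works against a live site: call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` instead of `parse()`.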

Donut
Here's a good tutorial: http://www.javascriptkit.com/howto/robots.shtml
Sam T.
Thanks Sam! Added your link next to the other tutorial.
Donut
Thanks a lot! Must I include robots.txt somewhere in the header? Or is it enough to just drop it into the root of the website?
BugAlert
Nope, you don't need to include it in a header; it's enough to just put it in your root directory.
Donut
+3  A: 

You can set up a robots.txt file to try to tell search engines to ignore certain directories.

See here for more info.

Basically:

User-agent: *
Disallow: /[directory or file here]
Bryan Denny
+5  A: 

Also, you can add the following meta tag to the HEAD of that page:

<meta name="robots" content="noindex,nofollow" />
seriyPS
Good idea. I did this as well.
BugAlert