I have a large directory of individual names along with generic, publicly available, category-specific information that I want indexed as much as possible in search engines. Having these names listed on the site itself is not a concern to people, but some don't want to appear in search results when they "Google" themselves.

We want to continue listing these names within a page AND still index the page BUT not index specified names or keywords in search engines.

Can this be done page-by-page, or would setting up two pages be a better workaround?

Options available:

  • PHP can censor keywords if user-agent=robot/search engine
  • htaccess to restrict robots to non-censored content, but allowing to a second censored version
  • meta tags defining words not to index?
  • JavaScript could hide keywords from robots but otherwise viewable
A: 

You can tell robots to skip indexing a particular page by adding the ROBOTS meta tag:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

UPDATE: The ways to restrict indexing of particular words I can think of are:

  1. Use JS to add those to the page (see below).
  2. Add module to the server that would strip those words from the rendered page.

JavaScript could be something like this:

<p>
  <span id="secretWord">
    <script type="text/javascript">
    <!--
       document.write('you can protect the word by concatenating strings/using HEX codes etc.')
    //-->
    </script>
  </span>
</p>
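The string-concatenation idea in the snippet above could look like this. This is only a sketch: "John Doe" is a placeholder name, and the `secretWord` id refers to the span in the example. A crawler that does not execute JavaScript only ever sees an empty span, since the word never appears verbatim in the page source:

```javascript
// Assemble the sensitive word at runtime so it never appears
// verbatim in the HTML source.
function obscuredWord() {
  // 'Jo' + 'hn' via concatenation, plus hex escapes for ' Doe'
  return 'Jo' + 'hn' + '\x20\x44\x6f\x65';
}

// In the page this would be written into the span, e.g.:
// document.getElementById('secretWord').textContent = obscuredWord();
```

Note this is obfuscation, not protection: any crawler that does run JavaScript (or any person viewing the page) still sees the assembled word.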


The server module is probably the best option. In ASP.NET it should be fairly easy to do; I'm not sure about PHP, though.

Dmytrii Nagirniak
Caps Lock break?
MitMaro
I know that with meta tags, robots.txt or htaccess you can restrict indexing of a page, but I'm asking if certain words can be ignored. Kind of like the keywords meta tag: is there something like a meta "anti-keywords"?
Peter
Sorry, missed that. Updated the answer a bit.
Dmytrii Nagirniak
+1  A: 

I will go through the options and tell you some problems I can see:

PHP: If you don't mind trusting the user-agent header, this will work well. I am unsure how some search engines will react to their bots being served different content than regular visitors; this kind of cloaking can be penalized.

htaccess: You would probably need to redirect the bot to a different page. You could use URL parameters, but that would be no different than a pure PHP solution, and the bot would index the page it is redirected to rather than the original page. You may be able to use the rewrite engine to overcome this.
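The rewrite-engine idea might be sketched like this (bot patterns and filenames are placeholders). Because an internal rewrite is not a redirect, the bot still indexes the original URL while being served the censored copy:

```apache
RewriteEngine On
# Internally rewrite (not redirect) known bots to a censored copy,
# so the indexed URL remains /names.html
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
RewriteRule ^names\.html$ names-censored.html [L]
```

This has the same caveat as the PHP option: it trusts the User-Agent header.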

meta tags: Even if meta tags could tell a bot to ignore certain words, there is no guarantee search engines would honor them, since there is no set "standard" for meta tags. But that is moot, because I don't know of any way to make a bot ignore certain words or phrases using meta tags.

JavaScript: No bot I have ever heard of executes (or even reads) JavaScript when looking at a page, so I don't see this working. You could use JavaScript to display the content you want hidden; bots won't be able to see it, but neither will users who have JavaScript disabled.

I would go the PHP route.
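The PHP route described above could be sketched as follows. This is shown as plain JavaScript so it runs anywhere; the censored names and bot patterns are placeholders, and in PHP the equivalent check would read the `$_SERVER['HTTP_USER_AGENT']` value. As noted, trusting this header is the main caveat:

```javascript
// Hypothetical list of names to withhold from search engines.
const CENSORED = ['John Doe', 'Jane Roe'];

// Crude bot detection by user-agent substring.
function isBot(userAgent) {
  return /googlebot|bingbot|slurp|duckduckbot/i.test(userAgent);
}

// Replace each censored name before the page is served to a bot;
// regular visitors get the page unchanged.
function renderForAgent(html, userAgent) {
  if (!isBot(userAgent)) return html;
  return CENSORED.reduce(
    (out, name) => out.split(name).join('[name withheld]'),
    html
  );
}
```

A usage sketch: `renderForAgent('<p>John Doe</p>', 'Googlebot/2.1')` returns the censored markup, while the same call with a browser user-agent returns the original.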

MitMaro
- Interesting that you note the potential issues with "tricking" bots into redirecting to different content, even though it's legitimate content.
- My thought with meta tags was something like the opposite of the keywords meta tag, an "anti-keyword" to ignore, although I know there's no such thing.
- I guess it depends how many users have JavaScript; it's a relatively high percentage, so I still see this as a potential option. But surely bots read JavaScript? What about AJAX content run via JavaScript...
Peter
Content loaded using AJAX is often not read by bots because it requires executing JavaScript, which no bot does at the moment. Google will attempt to follow links found in JavaScript but doesn't index any words from it.
MitMaro
A: 

What's not clear from your post is whether you want to protect your names and keywords against Google, or against all search engines. Google is generally well-behaved: you can use the ROBOTS meta tag to prevent a page from being indexed. But that won't stop search engines that ignore the ROBOTS tag from indexing your site.

Other approaches you did not suggest:

  • Having the content of the page fetched with client-side JavaScript.
  • Forcing the user to solve a CAPTCHA before displaying the text. I recommend the reCAPTCHA package, which is easy to use.

Of all these, the reCAPTCHA approach is probably the best, as it will also protect against ill-behaved spiders. But it is the most onerous for your users.

vy32