I have a large directory of individual names along with generic, publicly available, category-specific information that I want indexed as much as possible in search engines. Listing these names on the site itself isn't a concern for most people, but some don't want to show up in search results when they "Google" themselves.
We want to continue listing these names on the page AND still have the page indexed, BUT keep the specified names or keywords out of the search engines' indexes.
Can this be done page-by-page, or would setting up two pages be a better workaround?
Options available:
- PHP could censor the keywords when the user agent is a robot/search engine (first sketch below)
- .htaccess could keep robots away from the uncensored page and send them to a second, censored version instead (second sketch below)
- a meta tag that defines words not to index, if such a thing even exists?
- JavaScript could hide the keywords from robots while keeping them viewable to everyone else (third sketch below)
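
Something like this is what I have in mind for the PHP option, as a minimal sketch. The bot-detection regex and the `$censoredNames` list are made-up placeholders, not a real crawler list:

```php
<?php
// Sketch of the PHP option: swap opted-out names for a placeholder,
// but only when the request looks like it comes from a crawler.
$censoredNames = ['Jane Doe', 'John Smith']; // hypothetical opted-out names

$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isBot = (bool) preg_match('/googlebot|bingbot|slurp|duckduckbot/i', $userAgent);

function renderName($name, $isBot, array $censored) {
    if ($isBot && in_array($name, $censored, true)) {
        return '[name withheld]'; // crawlers never see the real name
    }
    return htmlspecialchars($name, ENT_QUOTES, 'UTF-8');
}

// While printing the directory listing:
foreach (['Jane Doe', 'Alex Example'] as $name) {
    echo '<li>' . renderName($name, $isBot, $censoredNames) . '</li>';
}
```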
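For the .htaccess option, I picture something like this, assuming Apache with mod_rewrite enabled; `directory.php` and `directory-censored.php` are hypothetical file names:

```apache
# Silently rewrite known crawlers to a censored copy of the page.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
RewriteRule ^directory\.php$ directory-censored.php [L]
```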
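And for the JavaScript option, the idea would be to ship placeholders in the HTML and only fill in the real names client-side, on the assumption that crawlers don't execute the script. A sketch in PHP that emits both the markup and the script (the names and element IDs are made up):

```php
<?php
// Sketch of the JavaScript option: the HTML source only contains
// placeholders; the real names are inserted in the browser.
$optedOut = ['name-1' => 'Jane Doe', 'name-2' => 'John Smith']; // id => name

echo '<ul>';
foreach (array_keys($optedOut) as $id) {
    echo '<li id="' . $id . '">[name withheld]</li>';
}
echo '</ul>';

// Emit the names as a JS object and swap them in after the list renders.
echo '<script>';
echo 'var names = ' . json_encode($optedOut) . ';';
echo 'for (var id in names) { document.getElementById(id).textContent = names[id]; }';
echo '</script>';
```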