views: 222

answers: 4

I recently implemented a fix that creates separate landing pages depending on whether or not the user has JavaScript enabled. Basically, it works like this.

The default page is a plain HTML page with no JavaScript: a basic version of the site. On that page there is a script that redirects the browser to another page if JavaScript is enabled. That second landing page is generated by sending the user's request through a JSP file that renders the page (header, footer, etc.). So the final landing page is http://whatever.com/home.jsp if the user has JavaScript enabled.
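Roughly, the setup looks like this (index.html here just stands in for whatever the default page is called; home.jsp is the real target):

    <!-- index.html: basic no-JS version of the site -->
    <!DOCTYPE html>
    <html>
    <head>
      <title>Whatever.com</title>
      <script type="text/javascript">
        // Runs only when JavaScript is enabled; bots and no-JS
        // browsers simply stay on this basic HTML page.
        window.location.replace("/home.jsp");
      </script>
    </head>
    <body>
      <!-- basic, crawlable HTML content of the site -->
    </body>
    </html>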

My question is whether this will hurt SEO. Considering that 99% of the world has JavaScript enabled, I would hate to compromise any SEO benefit to accommodate the 1% who don't.

Hope that makes sense.

+1  A: 

http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Short version: if your JS sends them to entirely different content, it's probably bad, and Google may give you a hard time. Other than that, you should be good.

Dustman
It looks like Google only frowns on this if it is deceptive. Sending someone to a different page if JavaScript is not enabled is hardly "delivering different content to search engines". Would you agree?
bgadoci
Oh, absolutely. That's why I mention the "entirely different content" part. So I suppose it all depends on what the content of the two pages is. Out of curiosity, why can't the JSP detect JS use and render the appropriate content? That would also avoid any possibility of SEO issues.
Dustman
Not that familiar with JSP. Something interesting, though: I followed Google's advice on testing. Given the setup mentioned above, using http://www.delorie.com/web/lynxview.html, the page the search engine sees stayed the basic HTML page (as if the bot didn't have JS enabled); it never moved on to the "if you DO have JavaScript" page. So I changed it back and am just using an alert telling people to enable JavaScript to get the full experience. Thanks for the help.
bgadoci
A: 

If the alternative version is an (almost) full-featured, full-content version, then it's perfectly OK.

Google even advises making alternatives for Flash-only sites, for example, with regard to usability.

Read the Google FAQ.

Bozho
+2  A: 

In general, searchbots should be treated as browsers with JS disabled. I think you can now imagine where they'll land.

This whole question is, by the way, completely unrelated to JSP. JSP is just a server-side view technology which provides a template to write HTML/CSS/JS in, along with the ability to control the page flow dynamically with taglibs and to access backend data with EL. All that web browsers and bots see (and thus all that counts for SEO) is its generated HTML output.
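A trivial, made-up example to illustrate the point: the JSP below is what lives on the server, and the plain HTML underneath is all that a browser or bot ever receives.

    <%-- products.jsp (hypothetical): EL and a JSTL taglib, server side only --%>
    <%@ taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core" %>
    <ul>
      <c:forEach items="${products}" var="p">
        <li>${p.name}</li>
      </c:forEach>
    </ul>

    <!-- What the browser or bot actually gets back is plain HTML: -->
    <ul>
      <li>Widget</li>
      <li>Gadget</li>
    </ul>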

BalusC
A: 

You touch on two topics: one is described as "cloaking", the other as "duplicate content". With cloaking, you present different (optimized-with-bad-intentions) content based on the identity of the client accessing it, e.g. by inspecting the User-Agent header (Googlebot versus browser). You are not doing this; you just want to present content in a way that suits your client best, like a redirect to a page optimized for mobile clients ("m.example.com").
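Purely as a made-up illustration of the difference (and you would not want the first one): cloaking keys on who is asking, while the mobile redirect keys on what the client is.

    <%-- Cloaking (don't do this): content keyed on the crawler's identity --%>
    <%
        String ua = request.getHeader("User-Agent");
        if (ua != null && ua.contains("Googlebot")) {
            // serve specially optimized markup only to the crawler
        }
    %>

    <%-- Fine: redirect keyed on the client's device/capabilities --%>
    <%
        String agent = request.getHeader("User-Agent");
        if (agent != null && agent.contains("Mobile")) {
            response.sendRedirect("http://m.example.com/");
            return;
        }
    %>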

The other thing is how to avoid duplicate content. One way is to indicate the original content source with a canonical tag; see here: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
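For completeness, the canonical hint is just a link element in the head of the duplicate page, pointing at the version you want indexed (the exact target URL here is made up):

    <!-- in the <head> of home.jsp, pointing at the preferred URL -->
    <link rel="canonical" href="http://whatever.com/" />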

initall