views: 165
answers: 3
+1  Q: 

SEO & Ajax

I'm experimenting with building sites dynamically on the client side, using JavaScript and a JSON content server: the JS retrieves the content and builds the page client-side. The content won't be indexed by Google this way, so is there a workaround? For example, serving a crawler version and a user version, or keeping some sort of static archive? Has anyone done this already?

+4  A: 

You should always make sure that your site works without JavaScript. Make links that point to static versions of the content, then add JavaScript click handlers to those links that prevent the default action and make the AJAX request instead. For example, using jQuery:
HTML:

 <a href='static_content.html' id='static_content'>Go to page!</a>

Javascript:

 $('#static_content').click(function(e) {
   e.preventDefault(); // stop the browser from following the link
   // make the AJAX request here
 });

That way the site is usable for crawlers and for users without JavaScript, and still has fancy AJAX for people with JavaScript enabled.
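To flesh out the "make AJAX request" step, here is a minimal sketch. The `/content/about.json` endpoint, the `{title, body}` response shape, and the `#content` container are all made-up for illustration; the rendering is pulled into a plain function so the same markup logic works regardless of how the data arrives:

```javascript
// Pure helper: turn the assumed {title, body} JSON payload into an HTML fragment.
function renderPage(data) {
  return '<h1>' + data.title + '</h1><p>' + data.body + '</p>';
}

// Wire it up only in a browser where jQuery is loaded.
if (typeof $ !== 'undefined') {
  $('#static_content').click(function(e) {
    e.preventDefault(); // stop the browser from following the link
    $.getJSON('/content/about.json', function(data) { // hypothetical endpoint
      $('#content').html(renderPage(data));
    });
  });
}
```

Crawlers and no-JS users follow the `href` to `static_content.html`; everyone else gets the in-page update.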

Pim Jager
The concern is more about content than interaction; this isn't something the user will be aware of. I have a lot of content to load, so I only want to load it when necessary. That's a nice trick, though, which will prove useful.
cloudhead
+2  A: 

You could serve a server-rendered version and then replace it on load with the AJAX version. But if you are going to do that, why not build the entire site that way and use AJAX only for interaction where the client supports it, i.e. unobtrusive JavaScript?

Matt
+2  A: 

If the site is meant to be indexed by Google, then the information you want searchable and public should be available without JavaScript. You can always add the dynamic stuff with JavaScript after the page loads. This will not only make the page indexable but will also make it load faster.
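As a sketch of that "enhance after load" approach: the crawler-visible article stays in the server-rendered HTML, and secondary content is fetched once the DOM is ready. The `/api/comments.json` endpoint, the `[{author, text}]` response shape, and the `#comments` element are assumptions for the example:

```javascript
// Pure helper: format an assumed list of {author, text} comment objects as <li> items.
function commentList(comments) {
  return comments.map(function(c) {
    return '<li><b>' + c.author + ':</b> ' + c.text + '</li>';
  }).join('');
}

// Run only in a browser where jQuery is loaded: on DOM ready,
// fetch the comments and append them below the static article.
if (typeof $ !== 'undefined') {
  $(function() {
    $.getJSON('/api/comments.json', function(comments) { // hypothetical endpoint
      $('#comments').html('<ul>' + commentList(comments) + '</ul>');
    });
  });
}
```

Google indexes the static article either way; only the late-loading extras depend on JavaScript.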

On the other hand, if the site is more of an application, à la Gmail, then you probably don't want Google indexing it anyway.

Jeremy Wall