
Many aspects of my site are dynamic, and I am using jQuery.

I have a div which, once the DOM is ready, is populated using load().

Then, if a button is clicked, that content is replaced with other content, again using load().

This kind of setup is common across my site. My homepage is essentially lots of dynamically loaded, refreshed, and changeable content.

What are the repercussions of this for SEO?

I've seen sites where each page is loaded using load() and then displayed using the animation functions... it looks awesome!

People have posed this question before, but no one has answered it properly.

So, any ideas? jQuery and SEO?

Thanks

EDIT

Very interesting points. I don't want to overdo my site with JavaScript, just use it where necessary to make things look good. My homepage, however, is one place of concern.

So when the DOM is ready, it loads content into a div. On clicking a tab, this content is changed. In other words: no JS, no content.

The beauty here, for me, is that there is no duplicated code. Is the suggestion that I should simply 'print' some default content, then have the tabs link to pages (with the same content) in case JS is disabled? That is, accept a little duplicated code for the sake of SEO?

As far as degrading gracefully goes, my only other place of concern is tabs on the same page. I have three divs, all containing content. On this page, two divs are hidden until a tab is clicked. I used this method before I started playing with JS. Would it perhaps be best to load() these tabs, then have the tab buttons link to the pages the content is pulled from?
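For what it's worth, the hidden-divs approach described above can degrade gracefully without load() at all: keep all three panels in the HTML (so crawlers and no-JS visitors see everything) and let jQuery do the hiding only when it actually runs. A minimal sketch, with hypothetical ids:

```html
<!-- All three panels exist in the markup, so the content is crawlable.
     Without JavaScript, nothing is hidden and the anchors jump in-page. -->
<ul id="tabs">
  <li><a href="#panel-1">Tab 1</a></li>
  <li><a href="#panel-2">Tab 2</a></li>
  <li><a href="#panel-3">Tab 3</a></li>
</ul>
<div id="panel-1" class="panel">First panel content</div>
<div id="panel-2" class="panel">Second panel content</div>
<div id="panel-3" class="panel">Third panel content</div>

<script>
$(function () {
  // Only runs when JS is enabled: hide all but the first panel.
  $('.panel').not('#panel-1').hide();
  $('#tabs a').click(function (e) {
    e.preventDefault();
    $('.panel').hide();
    $($(this).attr('href')).show(); // the href doubles as the jQuery selector
  });
});
</script>
```

Because the panels are served in the initial HTML rather than fetched with load(), there is nothing extra for a crawler to discover.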

Thanks

+3  A: 

None of the content loaded via JavaScript will be crawled.

The common and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> to actual pages so that your site "makes sense" to a search spider; and the click() event overrides the normal functionality with load() so normal users with JavaScript enabled will see the "enhanced" version of your site.

Rex M
Here's one way to implement this: if you have sub-pages A, B, and C, the links point to "/A.html", "/B.html", and "/C.html", but the .load() function pulls from "/ajax/A", "/ajax/B", and "/ajax/C". That way you control exactly how the content differs between the static and dynamic versions (namely, the dynamic versions don't need the header and so on). Alternatively, you can use jQuery to load a page fragment, e.g. `load('/A.html #container')` (<http://api.jquery.com/load/>).
Thr4wn
Or, easier in my opinion, use the jQuery Ajaxify plugin: <http://plugins.jquery.com/project/jquery-ajaxify>
Thr4wn
Please see my edits above.
Thomas Clowes
A: 

If your content is navigable with JavaScript turned off, you'll be well on your way to being visible to search engines.

Note that search engine crawlers won't be submitting any forms on your site, so if any forms or dropdowns are what navigate between your site's content pages, that content is not reachable by search engines.

Chris
+1  A: 

Here are Google's guidelines on making content loaded with Ajax crawlable: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
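The core of that scheme is a URL mapping: a "pretty" hash-bang URL like `http://example.com/#!page=products` is requested by Googlebot as an "ugly" URL carrying an `_escaped_fragment_` query parameter, which your server must answer with an HTML snapshot. A simplified sketch of the mapping (the helper name and URLs are mine, and the real spec has more detailed escaping rules):

```javascript
// Simplified illustration of Google's AJAX-crawling URL mapping:
// "pretty" hash-bang URL -> "ugly" URL that the crawler actually requests.
function toCrawlerUrl(prettyUrl) {
  var parts = prettyUrl.split('#!');
  if (parts.length < 2) return prettyUrl; // no hash-bang: nothing to map
  var base = parts[0];
  var fragment = encodeURIComponent(parts[1]);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}

// toCrawlerUrl('http://example.com/#!page=products')
// -> 'http://example.com/?_escaped_fragment_=page%3Dproducts'
```

The server then recognises the `_escaped_fragment_` parameter and serves the fully rendered content for that state.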

codez