I am converting a "bells and whistles" Flash template into an HTML+CSS+jQuery template, mostly for SEO considerations.

The client is very fond of the original template, so I have to reproduce the look and feel as closely as possible.

I would like to load the content using jQuery without ever leaving the page, to give the website the same feel as the Flash template, without visible page reloads: click -> animation -> display.

My question is whether Google and other search engines would consider the website to be a single page, or take into account all the content that gets loaded through AJAX calls.

Is it safer, SEO-wise, to make separate pages that load when the user clicks the links (the usual way)?

There is very little point in going through the conversion if the website ends up being as SEO-unfriendly as the original Flash site, but if that's not an issue, I know the customer would prefer something as close to the Flash template as possible.

The way I was considering addressing the problem would be to create standard links on the page, but stop them from loading using jQuery and load the data directly into the page instead.

Without JavaScript, the links would load actual pages, with a page reload; with JavaScript, the same data would be loaded into the page directly.

Does anyone know how Google would deal with something like that?

+1  A: 

Without JavaScript, the links would load actual pages, with a page reload; with JavaScript, the same data would be loaded into the page directly.

This is the best strategy. Google follows the "without JavaScript" path; to this day, AJAX and SEO don't get along.

Jaanus
+2  A: 

It's not only bad for SEO to do this, it's also bad for accessibility. However, as long as you provide real links to each page as a fallback, it will be both SEO-friendly and accessible.

For example:

<a href="/url/of/a/page.html" onclick="return your_ajax_function(this);">Page Name</a>

Whereby your_ajax_function() would prevent the link from being followed when clicked (by returning false). This way search engines can follow the link and find and index page.html, while most of your visitors will experience it the fancy AJAX way.
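For instance, your_ajax_function() might look something like this minimal sketch. The #content container id and the use of jQuery's .load() are assumptions for illustration, not part of the answer:

// A sketch of your_ajax_function(): pulls the linked page's content
// into the current page instead of navigating to it.
// "#content" is a placeholder id for the element holding the page body.
function your_ajax_function(link) {
    // Load the children of the remote page's #content into the local #content.
    $('#content').load(link.href + ' #content > *');
    // Returning false cancels the normal navigation; crawlers and
    // no-JS browsers never run this, so they follow the href instead.
    return false;
}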

Edit: I see now you had added

Without JavaScript, the links would load actual pages, with a page reload; with JavaScript, the same data would be loaded into the page directly.

which is exactly what I said ;D.. yeah. Silly me.

lucideer
Thanks, that wasn't added. It was part of the original post.
Sylverdrag
Ha yeah, that is what I meant - sorry it was unclear - clarified.
lucideer
+1  A: 

Google and any other spider will not process your JavaScript; they simply see the page and follow any anchors. Making every individual page a real page, then loading it into the current page, would be fine, since the spider will follow the anchor tags to the real pages but ignore the JS attached to them. It would also help if by some chance the user has JS disabled in their browser.

Tad
+2  A: 

The Google crawler does support AJAX crawling, described here. There is also some info here on how to do it with the jQuery Address plugin.
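In short, that scheme uses "#!" URLs: when the crawler sees example.com/#!about, it fetches example.com/?_escaped_fragment_=about instead and expects a static HTML snapshot from the server. Here is a rough client-side sketch of the idea; the #content id and the /fragments/ URL layout are made up for illustration:

// On page load, restore deep links like /#!about by loading that
// section via AJAX. The server must separately answer
// ?_escaped_fragment_=about with equivalent static HTML for crawlers.
$(document).ready(function () {
    var hash = window.location.hash;
    if (hash.indexOf('#!') === 0) {
        var section = hash.substring(2); // e.g. "about"
        // "#content" and "/fragments/" are placeholders for this sketch.
        $('#content').load('/fragments/' + section + '.html');
    }
});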

Igor Zevaka
Thanks a lot for the links.
Sylverdrag