On the home page of my site I use jQuery's ajax function to pull down a list of users' recent activity.

The recent activity is displayed on the page, and each line of the recent activity includes a link to the user profile of the user who did the activity.

Will Google actually make the ajax call to pull down this info and use it in calculating page relevancy / link juice flow?

I'm hoping that it does not, because the user profile pages are not very Google-index-worthy, and I don't want all those links to the user profile pages diluting my home page's link juice away from other, more important links.
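Roughly, the call looks something like this sketch (the endpoint, element ID, and field names here are made up for illustration, not my real ones):

$(document).ready(function() {
    // Illustrative only: pull recent activity and render one profile link per entry.
    $.ajax({
        url: '/recent-activity',          // hypothetical endpoint
        dataType: 'json',
        success: function(items) {
            var html = $.map(items, function(item) {
                return '<li><a href="/users/' + item.userId + '">' +
                       item.userName + '</a> ' + item.action + '</li>';
            }).join('');
            $('#recent-activity').html(html);
        }
    });
});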

A: 

No, it won't make an Ajax call.

But if it discovers the URL your Ajax function is calling in the plain-text page source, it will definitely try to look at what's there.

You may wish to add a "noindex" meta tag to those user profile pages.
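For example (assuming you can edit the profile page templates), a robots meta tag in the <head> of each user profile page would look like this:

    <meta name="robots" content="noindex">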

Developer Art
I think the noindex tag might still dilute pagerank. Google used to suggest the "nofollow" attribute for pagerank sculpting, but within the last year has announced that even "nofollow" links dilute the link juice. I assume they would do the same with noindex and still dilute the link juice.
Doug
Nofollow is for external links you don't trust. Noindex is a legit option to say some pages would make no sense to index.
Developer Art
+2  A: 

Maybe. If you want to guarantee Google won't spider your JSON responses, put them in robots.txt. That isn't security, though; indeed, it's the first place a cracker will look for "interesting" pages. And other crawlers may ignore it.
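For example, if the JSON responses were all served from a path like /ajax/ (the path here is just illustrative), the robots.txt entry would be:

    User-agent: *
    Disallow: /ajax/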

Craig Stuntz
Hmm, maybe I could do that with robots.txt. That wouldn't be considered blackhat SEO, would it, since the only reason I'm restricting it is to sculpt my pagerank flow?
Doug
It's not "blackhat SEO" to put a page in robots.txt. But this won't guarantee that the googlebot won't see that data if *you* include it in a non-excluded page. Generally, *any* attempt to make the page substantively different to the googlebot from what a real user sees is not allowed. But asynchronous content is a special case, since 1) it's common, and 2) Google is still figuring out how to deal with it. I suspect that sooner or later it will be indexed.
Craig Stuntz
+4  A: 

No, it will not crawl AJAX content by default.

http://code.google.com/web/ajaxcrawling/ has instructions on how to make AJAX content crawlable, but those are explicit steps you need to take; it isn't automatic.

Chi
A: 

Here is some info on AJAX crawling from the makers of the jQuery Address plugin.

Igor Zevaka
A: 

Google is definitely crawling content in our page that is only referenced within an AJAX request.

I'm wondering if Google is going through the page source, and looking for potentially valid URLs, and testing to see if they've got content.

Here is what our request looks like... which might offer some insight into what's going on.

$(document).ready(function() {

    // block()/unblock() are not core jQuery; they come from a blocking
    // plugin such as jQuery BlockUI.
    $("#theDiv").block({ message: 'Getting latest content...' });
    $.ajax({
        url: '/content/pages/articles?count=4&part=true',
        success: function(data) {
            $('#theDiv').html(data);
            $("#theDiv").unblock();
        }
    });
});
adamb0mb
A: 

Google now has a way for you to enable crawling of AJAX pages. If your links contain "#!", then Google will change that to "?_escaped_fragment_=" and request that document from your server. However, when it shows that page in the search results, it will show the original URL with the "#!".

http://code.google.com/web/ajaxcrawling/docs/learn-more.html
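For example (the domain and fragment are made up), a link like

    http://example.com/activity#!page=2

would be fetched by Googlebot as

    http://example.com/activity?_escaped_fragment_=page=2

while the search results would still show the "#!" version of the URL.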

abjennings