views: 94
answers: 3

The question I have is a bit of an ethical one.

I read here that Google gives a little more weight to sites that are optimized to load quickly. Obviously this makes Google's job easier, uses fewer resources, and is a better experience for everyone, so why not reward it?

The actual process of finding bottlenecks and improving page load speed is well understood these days. Using tools like YSlow and reducing the number of files is becoming standard practice (which is great!).

So is it fair / smart / kosher to serve the googlebot (or any other search bot) custom content that will download faster (i.e. no JavaScript, images, or CSS)? Or would that flag you as a cheater and throw your site into limbo, unsearchable from Google?

Personally, I'd rather not risk it; I'd actually like to improve performance for my visitors regardless. But as it stands there isn't much info on the topic, so I figured I'd throw it out there.


EDIT:

I found some new information which might factor in.

From Google's Webmaster Tools: http://www.google.com/support/webmasters/bin/answer.py?answer=158541&hl=en

Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature.

There is no guarantee that they would use the same algorithm for ranking pages in search results, but it might indeed show that it is the actual user experience that matters most.

+1  A: 

Check out the Google webmaster guidelines: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769

Basically it breaks down to this: you need to give the googlebot and human viewers the exact same experience, except where the googlebot could not participate in that experience. One notable example would be logins. News sites frequently skip login pages for the googlebot, because the googlebot cannot/does not sign up for accounts.
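As a purely illustrative sketch of that one exception (the Googlebot user-agent test, the cookie check standing in for a real session check, and the paths are all assumptions, not anything Google prescribes; the actual page content must still be identical for everyone):

    // Hypothetical Node.js sketch: let a crawler read an article page that
    // normally redirects anonymous visitors to a login form. The content
    // served must otherwise match what a logged-in user would see.
    var http = require('http');

    http.createServer(function (req, res) {
        var ua = req.headers['user-agent'] || '';
        var isCrawler = /Googlebot/i.test(ua);

        if (!req.headers.cookie && !isCrawler) {
            // Anonymous human visitor: send them to the login page.
            res.writeHead(302, { 'Location': '/login' });
            res.end();
            return;
        }

        // Logged-in users and crawlers get the same article content.
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<html><body>Full article content here.</body></html>');
    }).listen(8080);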

I would imagine Google is not actively looking for pages optimized/prioritized for the googlebot, but if they ever found one they would come down on the violator like a hammer.

Sqeaky
+1  A: 

No one can say for certain exactly what Google will detect and ding your site for; they keep their algorithms secret. However, they generally frown upon anyone who serves the googlebot different content than they serve everyone else; and if they catch you at it, they are likely to reduce your PageRank for it.

Really, the question becomes: why would you want to do this? If you can make your page load faster for the googlebot, you can make it load faster for your customers as well. Research has shown that even a tenth of a second of extra load time can lose you customers; why would you want to pull more customers in from Google just to lose them to a slow site?

I'd say this is not worth the risk at all; just improve your site and make it load faster for everyone, rather than trying to serve googlebot different pages.

Brian Campbell
+2  A: 

(i.e. no JavaScript, images, or CSS)

Make your JS and CSS external. Google will not touch them (very often). (Or you can block them via robots.txt, but that's unnecessary.)
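If you did want the robots.txt route mentioned above (again, probably unnecessary), a minimal sketch might look like this; the /js/ and /css/ directory names are just placeholders for wherever your external files live:

    User-agent: Googlebot
    Disallow: /js/
    Disallow: /css/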

Make all repeating images into CSS sprites. Load the big image asynchronously via JS after the onload event of the document body.
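A minimal sketch of that deferred load, assuming a hypothetical placeholder element id and image path of your own:

    // Minimal sketch: start downloading the large image only after the page
    // has finished loading. "bigImagePlaceholder" and "/images/big-image.jpg"
    // are hypothetical names; use your own element and asset.
    window.addEventListener('load', function () {
        var img = new Image();
        img.onload = function () {
            document.getElementById('bigImagePlaceholder').appendChild(img);
        };
        img.src = '/images/big-image.jpg'; // kicks off the download after onload
    });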

This is also good for the user, as the site renders faster and they see something sooner.

As long as the main content is the same for Google and the average first-time visitor, and there is no misleading intent, it's OK and a great strategy.

Don't worry too much about any possible penalties. As long as there is no misleading intent, it's mostly OK.

What is not OK is to deliver Google something majorly different based on the user agent. (Here it's better to be safe than sorry.)

Franz