The question I have is a bit of an ethical one.
I read here that Google gives a little more influence to sites that are optimized to load quickly. Obviously this makes Google's job easier, since it uses fewer resources, and it's a better experience for everyone, so why not reward it?
The actual process of finding bottlenecks and improving page load speed is well understood these days. Using tools like YSlow and reducing the number of files requested is becoming standard practice (which is great!).
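By "reducing the number of files" I mean things like concatenating stylesheets so the browser makes one HTTP request instead of several. A trivial sketch (the file names here are made up for illustration):

```shell
# Hypothetical stylesheets standing in for a real site's CSS.
printf 'body{margin:0}\n' > reset.css
printf '#main{width:960px}\n' > layout.css
printf 'a{color:#06c}\n' > theme.css

# Combine them so the page links one file instead of three.
cat reset.css layout.css theme.css > combined.css
```

In practice you'd also minify the combined file, but the request count is the part YSlow tends to flag first.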
So is it fair / smart / kosher to serve Googlebot (or another search bot) custom content that will download faster (i.e. no JavaScript, images, or CSS)? Or would that flag you as a cheater and throw your site into limbo, unsearchable on Google?
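To be concrete, the kind of bot-specific serving I'm asking about would look something like this (a minimal Python sketch with made-up template names; I am not actually doing this, and it's exactly the pattern that I suspect counts as cloaking):

```python
def is_search_bot(user_agent: str) -> bool:
    """Crude check for common crawler user-agent substrings."""
    bots = ("googlebot", "bingbot", "slurp")
    ua = user_agent.lower()
    return any(bot in ua for bot in bots)

def choose_template(user_agent: str) -> str:
    # Serve a stripped-down page (no JS/CSS/images) to crawlers,
    # and the full page to everyone else. Template names are hypothetical.
    return "lite.html" if is_search_bot(user_agent) else "full.html"
```

The question is whether sniffing the user agent like this and handing the bot a lighter page is rewarded, ignored, or penalized.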
Personally, I'd rather not risk it; I'd actually like to improve performance for my visitors regardless. But as it stands there isn't much info on the topic, so I figured I'd throw it out there.
EDIT:
I found some new information which might factor in.
From Google's Webmaster Tools: http://www.google.com/support/webmasters/bin/answer.py?answer=158541&hl=en
Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature.
There's no guarantee that they use the same measurement for ranking pages in search results, but it might indeed show that it's the actual user experience, not what the bot sees, that matters most.