views: 47
answers: 3

I am working on a Rails application that uses big JavaScript libraries (e.g. jQuery UI), and I also have a handful of my own JavaScript files. I'm using asset_packager to package up my own JavaScript. I'm considering two ways of serving these files:

  1. Link to the jQuery libraries from Google Code as described at http://code.google.com/apis/ajaxlibs/documentation/#jquery , and separately package up and serve my javascript files using asset packager.

  2. Host the jQuery libraries myself, and package them together with my own JavaScript as one big merged JavaScript file.
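For what it's worth, option #2 with asset_packager would be configured roughly like this in `config/asset_packages.yml` (the file names are illustrative, not from the question):

```yaml
javascripts:
  - base:
    - jquery        # self-hosted copy of the library
    - jquery-ui
    - application   # your own code
```

A single `<%= javascript_include_merged :base %>` in the layout then emits one merged, minified file in production, if I recall the plugin's helper API correctly.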

My hosting solution is of course not going to beat out Google's content delivery network, so at first I assumed that end users would experience faster page loads via option #1.

However, it also occurred to me that if I serve them myself, users would only need to issue one request to get all the JavaScript (as opposed to one request for my merged JavaScript and another for the libraries served by Google).

Which approach will provide the better end-user experience (presumably in the form of faster load times)?

+4  A: 

The nice thing about Google's CDN is that, because it is used by many websites, chances are a user will already have visited a site that uses the Google-hosted libraries, in which case the file is already cached on their machine and they may not need to download it at all.

Personally, I would stick with Google (via google.load()) instead of combining the files and serving them from my own server. (You can also use Google's loader to lazy-load libraries only when you need them, rather than loading all of them up front and using just one.)
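The loader approach looks roughly like this (the version number is illustrative):

```html
<script src="http://www.google.com/jsapi"></script>
<script>
  // Pull jQuery from Google's CDN; the callback fires once it has loaded.
  google.load("jquery", "1.4.2");
  google.setOnLoadCallback(function() {
    // safe to use $ here
  });
</script>
```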

webdestroya
+4  A: 

I would say "it depends", but in most cases I'd go with option #1 (Google hosting) for an internet-facing site. For an intranet I'd host everything internally for a number of reasons, but that looks to be outside the scope of your question.

There are a few things to consider overall:

  • Your users won't download the file at all (except on a forced refresh) if it's cached correctly.
  • Google has more servers than you :) lots more, and they're geo-located to best serve any given request; I'd guess you're hosting from one or a few locations.
  • The browser parallelizes downloads even though it executes scripts sequentially, so it'll download from you and Google at the same time, increasing throughput.
  • Other sites use Google to host jQuery (you're on one now); if the user has visited any of those, they already have the file cached, meaning no request is made at all.
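As an aside (a common pattern, not from this answer): if you go with Google hosting, you can still hedge against a CDN outage by falling back to a self-hosted copy. The local path below is hypothetical; the decision itself is trivial:

```javascript
// After the CDN <script> tag, window.jQuery is undefined only if the
// CDN load failed; in that case, inject a self-hosted copy.
function needsLocalFallback(global) {
  return typeof global.jQuery === "undefined";
}

// In the page, right after the CDN script tag, the usual one-liner is:
//   window.jQuery || document.write('<script src="/javascripts/jquery.min.js"><\/script>');
console.log(needsLocalFallback({}));             // CDN failed  -> true
console.log(needsLocalFallback({ jQuery: {} })); // CDN loaded  -> false
```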

You can serve everything as one file, but you have to weigh a few things against that:

  • How large will that one file be, and do your users need to download the entire thing again whenever you change something in your script?
  • Are the multiple requests (and DNS lookups) cheaper than the download time for that one file?
  • Do you pay for bandwidth? :)

Depending on what percentage of your code is custom and how much is framework, Google's CDN can take a substantial part of your static JS traffic off your server, leaving it free to serve and do other things (a huge benefit for a high-traffic site). And when you change your script (much more common than a new framework release), the client downloads only that, not the entire framework again.
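Concretely, the split means the page ends up with two script tags, and only the second URL changes on deploy (Rails appends an asset timestamp as the cache-busting query string; the values here are illustrative):

```html
<!-- Framework: served by Google, almost never changes -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<!-- Your code: only this file is re-downloaded after a deploy -->
<script src="/javascripts/application.js?1268263561"></script>
```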

Nick Craver
A: 

Besides the practical issues of bandwidth and download speed, there are also potential legal, or at least moral, ones, depending on your privacy policy and its requirements.

<tinfoilbeanie>

When you use Google's (or somebody else's) CDN, a Referer header gets passed along containing the address of your page, as well as a tracking cookie. Whoops! Google now knows which site your users were looking at when they downloaded the JS. This is mitigated somewhat by browser caching: if you've already got the file, you won't re-download it, and Google uses fairly aggressive cache controls.

However, if you have published a privacy policy saying that you won't share tracking information with third parties, then you are now lying to your users. And if you are required to include such a privacy guarantee because, say, you are developing a government website, you may now be violating legislation.

</tinfoilbeanie>

The exact same thing happens with ad networks, offsite images, etc., so it might not be a big deal to you, but check your privacy policy requirements.

For the specific case of Google, they explicitly state in the terms of service for googleapis:

... which includes storing uniquely identifiable tracking cookies on your users' computers.


Note that even StackOverflow's privacy policy never mentions that Google may get pinged to let them know you've visited the site, nor that there is a QuantServe tracking image embedded in the page. It says "We" do this and that, but presumably "We" is not intended to include QuantServe or Google. Privacy can be a hairy thing.

kibibu
Seeing as SO gets the vast majority of its traffic from Google in the first place, I would use another example for that argument :)
Nick Craver