views:

322

answers:

8

Hi everybody,

A theoretical question:
We all know about the pros of minifying and combining JavaScript files in order to reduce HTTP requests and speed up a website. But when popular JavaScript libraries are used (jQuery, for instance), it isn't too stupid to assume that these have already been downloaded to the client's computer from another page.

So which should be preferred? How do the big guys in the industry handle it?

A) Combine and minify all scripts into one massive file and serve it from my own CDN.

B) Combine all "self-written" scripts into one file and utilize available CDNs for libraries where possible.

Thanks!

+5  A: 

I think it depends on your site:

  • If your site consists mainly of pages of the same type which need the same scripts, I would go for A).
  • If you have a lot of scripts that differ between the sub-sites of your site, I would go for B). Combine the most-used scripts together in one script. If you have large scripts that are not used on every page, make a separate script for each of them.

The best way to really know what to do is to test which combination of techniques saves you the most traffic / connections.

P.S.: I personally do not like the idea of letting other people serve files for my webpage, because what happens if the CDN fails but your server stays alive? If this is not a problem for you, try to serve all the libraries you use from a reliable CDN.
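A common mitigation for that failure mode is a local fallback: after the CDN `<script>` tag runs, check whether the library's global was defined, and if not, load your own copy. A minimal sketch (the helper name and file paths are hypothetical, not from the discussion above):

```javascript
// Hypothetical helper: returns true if the CDN script failed to define
// the library's global (e.g. window.jQuery), so a local copy is needed.
function needsLocalFallback(windowObj, globalName) {
  return typeof windowObj[globalName] === "undefined";
}

// In the page, right after the CDN script tag:
// <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
// <script>
//   if (needsLocalFallback(window, "jQuery")) {
//     document.write('<script src="/js/jquery.min.js"><\/script>');
//   }
// </script>
```

This way the CDN serves the common case, and a CDN outage degrades to one extra request against your own server instead of a broken page.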

Hippo
+1 for testing it out and using real metrics to make the choice
Anurag
+1 for potential CDN problems. I see those issues, too, every now and then; what's even worse is that most visitors will think your site is crap, because your CSS/JS can't be loaded.
Marcel Korpel
+3  A: 

I think it comes down to a couple of things:

  1. How many pages use the code throughout your site
  2. The quality of the CDN
  3. How much code it is

There's also a difference between using popular JavaScript packages, such as jQuery, and using a personal package, which only visitors who have already been to your site will have cached.

The performance enhancement can come from two places: 1) the browser cache, and 2) network-level caching: even if the file isn't stored locally, the DNS lookup may already be resolved, which minimizes the request time, and an intermediate cache may even temporarily serve the file.

I would advise using a CDN while also hosting the files locally as a backup. Depending on your resources (hardware/bandwidth), you might need to use a CDN anyhow. It'd be nice to use server-side schedulers to check on the CDN's status and reroute the path when applicable.

Also, remember that some users choose to turn off their browser cache, so minifying your JS is always a plus. You should separate your JS into two files: 1) needed on load and 2) not needed on load. Basically, get the necessary code out there first to improve the perceived load time, then load all the other extras (e.g. slideshows, color changers, etc.).
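One way to sketch that split (function and file names are hypothetical, used only for illustration): load the critical bundle with a normal `<script>` tag, then inject the "not needed on load" scripts once the page has loaded.

```javascript
// Hypothetical: inject non-critical scripts after the page has loaded,
// so they don't block initial rendering.
function loadDeferredScripts(doc, urls) {
  return urls.map(function (url) {
    var s = doc.createElement("script");
    s.src = url;
    doc.body.appendChild(s);
    return s;
  });
}

// Usage in a page:
// window.onload = function () {
//   loadDeferredScripts(document, ["/js/slideshow.js", "/js/color-changer.js"]);
// };
```

The browser fetches the extras in the background while the user already sees and can use the page.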

One last point is to make use of Expires headers, since none of this matters if you don't optimize that. That is what will really speed things up for returning visitors with cache enabled. YSlow is a nice Firefox addon that will help you evaluate your load performance.
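For reference, a far-future Expires policy might look like this in Apache (a sketch assuming mod_expires is enabled; tune the lifetimes to your release cycle, and change filenames when the content changes so clients pick up new versions):

```apacheconf
# Cache static JS/CSS aggressively; bust the cache by renaming files on release.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType text/css "access plus 1 year"
</IfModule>
```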


To answer your question: Reduce HTTP requests, but do your own evaluation on the file size of the JS.

(Being extreme:) you don't want one 10 MB JS file, or your site will take too long to load; nor do you want 1000 10 KB files, because of the HTTP overhead. The point is that you want a balance between file size and number of files - and as I said earlier, package them into performance needed vs. wanted.

vol7ron
To the object size point: "We found that if the size of a component is greater than 25 KB, the iPhone's browser does not cache the component. Thus, web pages designed specifically for the iPhone should reduce the size of each component to 25 Kbytes or less for optimal caching behavior." (http://www.yuiblog.com/blog/2008/02/06/iphone-cacheability/) IE6 SP1, the last IE version made available for Win2k, has a limit on the size of the largest JavaScript file it can handle as well. IIRC, if it encounters a JavaScript file that's above the limit, it'll simply truncate it.
Frank Farmer
@Frank good point, client hardware should also be considered. @Industrial, you may wish to use a mobile version of your site though, which generally requires less JS overhead.
vol7ron
@Frank - Thanks a lot for your information about iphone and IE6. We will definitely head for a mobile version on this site so your comment came really handy!
Industrial
The other good thing to note from the blog is that "Safari for iPhone is able to cache a maximum of 19 external components, placing a maximum cache limit at around 475 KB." Also, if your file is larger than 25 KB it will not bump anything out of the current cache - only files that are considered cacheable will displace entries already in the cache. And Safari decodes the file before caching it, so the 25 KB limit applies to the unzipped file size (important to know if you gzip your files).
vol7ron
+2  A: 

I think the best approach is to use a minified 'application.js' file (containing all application specific javascript) and then use a service such as Google AJAX Libraries API (found here) to load jQuery, Prototype, etc.
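In practice that might look like the following in the page head (the jQuery version and paths are illustrative):

```html
<!-- Shared library from Google's CDN; likely already cached by visitors -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<!-- Your own combined, minified code -->
<script src="/js/application.min.js"></script>
```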

Kevin Sylvestre
+1  A: 

Particular to your question about how the big guys of the industry handle client-side scripts, you could always look and see. stackoverflow.com seems fine relying on Google's version of the jQuery lib. Others most decidedly do not...

LesterDove
IE8 didn't like google's minified jQuery lib for one of the image effects I was doing. So don't always rely on established code ~ still perform quality checks in the different browsers.
vol7ron
+8  A: 
  • "Combine and minify each script into a massive one and serve it from my own CDN"

    If you have a CDN, what are we talking about? :) You mean server, right?

  • "How does the big guys in the industry handle it?"

    The big guys always use their own servers.

  • "...it isn't too stupid to assume that these already have been downloaded to the clients computer from another page."

    Unfortunately it is. Facts:

    • 40-60% of users have an empty cache experience
    • browsers' cache limits are small
    • different versions of libraries are in use; caching only helps if they match exactly
    • a resource from a new domain requires a DNS lookup, which is slow
    • plus you need to manage dependencies
galambalazs
+1 for the empty cache link, and +1 for the DNS lookup
fmark
The actual quote is "40-60% of _Yahoo!’s users_ have an empty cache experience". The numbers are probably very different for any site that isn't Yahoo. Notably, yahoo.com is a very common browser homepage. This results in a very different usage profile. Which brings us to the most important point: track your own statistics. You won't know what works best for *your* site until you measure it yourself.
Frank Farmer
"Notably, yahoo.com is a very common browser homepage" - LoL. That's why the chance of the same user hitting yahoo home page twice a day is actually higher than the chance of a user having exactly the same library with exactly the same version from exactly the same CDN in his cache (and not replaced by other content since then).
galambalazs
Well, I did not distinguish between a CDN and a server in this case. Whatever storage has the highest speed should be used, and that will most certainly be a CDN. Thanks a lot for your post!
Industrial
+2  A: 

Omar al Zabir runs a site called pageflakes and has a slew of articles on how to improve performance. One of them in particular explains how to compress and combine all kinds of things at the server side before sending to the client.

You can also use CSS image sprites to reduce HTTP requests. They help a LOT for commonly used images.
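For illustration, a sprite combines several small images into one file and crops each one out via background-position (class names, sprite path, and offsets below are hypothetical):

```css
/* One HTTP request for all icons; each class shows one 16x16 region. */
.icon        { background: url(/img/sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
.icon-user   { background-position: -32px 0; }
```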

I certainly don't have a huge production site, but these things definitely helped save on bandwidth costs.

StingyJack
Yep, already aware of sprites, a really handy technique to use
Industrial
+1  A: 

There's no particular reason that getting one script would be faster than getting it split into several. Splitting only hurts because browsers limit concurrent downloads, but that limit is greater than one.

I think the idea should be to handle synchronous UI scripts first, and then the "user activity response" scripts (like validation, etc).

All else being equal, option B looks like the best one.

Pavel Radzivilovsky
A: 

The most important thing is to combine all scripts into a few .js files. When downloading, most of the time is spent establishing the connection to the server.

KP

Web developer