views:

232

answers:

4

Hi people,

On all the "speed up your website" sites and in all the books, they always tell us to minimize HTTP requests at all costs. This is fine and nice, but what if that means that on every page you have to reload 120kb again and again and again, because the user's cache is empty?

If I use 5 JS files on every page of my website, wouldn't it be better to put them in one file and load that file on every page, instead of bundling them together with all the other page-specific files into one big file to save one HTTP request? From which point, or file size, on is it OK to "cache" a file and accept another HTTP request?

Let me give you an example of 3 pages, where I use only one HTTP request for one minified JS file per page:

  1. jquery, jquery ui, thickbox, lavalamp menu => minified together into one file = 300kb
  2. jquery, jquery ui, cycle plugin => minified together into one file = 200kb
  3. jquery, jquery ui, galleria plugin => minified together into one file = 250kb

And now the other possibility, with always 2 HTTP requests: one file consisting of jquery and jquery ui => 150kb; let's call it "jui.js" for now:

  1. jui.js, thickbox, lavalamp = again 300kb at the start, BUT now jui.js is cached for the other 2 pages
  2. (jui.js is cached now, so not loaded), only the cycle plugin => only 50kb to load, but one more HTTP request, as I load jui.js and the cycle plugin separately
  3. (jui.js is already cached), only load the galleria plugin => only 100kb more to load, but again 2 HTTP requests, where one request is already cached

So at which point, or at which KB size, is it OK to have another HTTP request on a normally responsive web server?

Does anybody have any best practices, or is it really just "Minimize HTTP requests at all costs!"?

(I hope I made myself clear :) And I will vote people up as soon as I have some points!)

EDIT:

It is basically a simpler question: how long does an extra HTTP roundtrip for a cached JS file take? If the HTTP request is slower than the time I would need to download the extra non-cached parts on every page, then I would put everything in 1 big file per page (a different big file on every page).

If the HTTP request for a cached JS file costs nearly nothing, then I would split the parts that every page needs into an extra JS file (minified, of course) and include the dynamic parts of every page in different (again minified) JS files.

So if on most pages I need 100kb extra (the dynamic part), how do I test the time for a cached HTTP request? Are there any numbers? Has anybody tested something like this already?

Thanks for the great answers already!

+1  A: 

In short, there is no rule of thumb here. Depending on your web server settings, you may want to try optimizing by merging files into one larger file. I know Apache can be configured to use the same connection to stream several files. The best thing to do is use a benchmarking tool such as Apache's ab to simply test your changes.

As for the jQuery stuff, you may include your scripts from a publicly hosted domain such as Google's to 1) avoid connections to your own server, and 2) because many people already have them cached in their browser.

e.g.: http://code.google.com/p/jqueryjs/
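For example, a page could pull jQuery from Google's CDN instead of your own server (a sketch; the version number here is just an example):

```html
<!-- served from Google's CDN: likely already in the visitor's browser cache -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```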

Mohammad
I was hoping for a rule of thumb, one that you can use in general. And I generally don't like grabbing everything from Google, but that's a personal thing :)
Tschef
There isn't one :) Testing is your best friend. I have sites with different setups, simply as a result of testing.
Mohammad
Testing is the one friend I want to avoid at all costs on every web server where I install my software. So it's really just a guess as to whom I should optimize for: broadband or "slow"band? Life is full of choices :)
Tschef
A: 

You'll really have to do your own analysis based on your own traffic. Initial load times matter too, so if users land on a page with a single JS file, you may want to split that out. However, if users end up clicking around your site a bit, the net benefit of loading it all at once is obvious.

That said, my users land on "content" which needs more scripts, and so I usually lean towards minimizing what I can on the assumption that users will click around.

I'll leave the argument about linking to Google's copy of your scripts to a previous discussion:

http://stackoverflow.com/questions/936399/should-i-link-to-google-apis-cloud-for-js-libraries/936467#936467

Paul McMillan
Thanks for the link. So actually you're saying you prefer my option number 2, with 1 cached file?
Tschef
+1  A: 

This is a big, complex subject. They write whole books on it ;)

For resources (JavaScript, CSS, etc.) it is sometimes better to download them individually, since the browser will download them in parallel. If page A needs resources x, y, and z, but page B only needs x and z, separating them out is a good thing. Other times, a resource that is needed on every page might be better downloaded all at once. It depends.

But with JavaScript, the browser downloads the JS before it renders the rest of the page (if the script tag is in the head section), so you may see better performance if you add a defer attribute, or include the script at the bottom of the page and trigger your JavaScript with an onload handler on the body.
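The two placements mentioned above might look like this (a minimal sketch; `app.js` is a made-up file name):

```html
<!-- option 1: defer keeps the download from blocking rendering -->
<head>
  <script src="app.js" defer="defer"></script>
</head>

<!-- option 2: script at the end of the body, so the page content renders first -->
<body>
  ...page content...
  <script src="app.js"></script>
</body>
```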

Remember too that you can set caching headers on resources, so the browser will cache them in memory or on disk. This makes a huge difference in many cases.
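On Apache, far-future caching headers for static resources can be set with mod_expires, roughly like this (a sketch, assuming mod_expires is enabled; rename the file to force an update once it's cached):

```apache
# .htaccess sketch: let browsers cache JS and CSS for a year
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType text/css "access plus 1 year"
</IfModule>
```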

There are really no hard and fast rules, just some guidelines. Your best bet is to test! What works better over dialup doesn't necessarily work as well over broadband.

Fiddler is a nice program that will show you your loading times, e.g. if you are on a modem.

Byron Whitlock
I have the JavaScript at the bottom of the page of course, no matter if there is one file or two. I don't really test on dialup anymore, and I also refuse to work around IE6 quirks. I'm over that :) Thanks for the Fiddler hint!
Tschef
A: 

I think how you handle this situation heavily depends on the type of traffic your site gets. If it is a site where people hit only a few (fewer than 3) pages and leave, then you can split up the files more liberally, on the assumption that you are giving users only the minimum of what they need. However, if your site gets users who view a lot of pages, then just bundle most of it up and ship it over once.

Also, take a look at where the JavaScript is being used before putting it into the bundle. If it is only used on a page or two that aren't frequently visited, you can make it a separate file.

In practice, since you are gzipping your scripts when they are sent out (you are doing that, right?), it's often faster to just include the scripts, since you avoid the extra round-trip time. As Byron mentioned, downloading JavaScript blocks everything else from loading (unless it's done asynchronously), so you want to do your best to minimize that time.

Start playing around with the Net tab in Firebug to see how performance is affected.

jkupferman
I edited my post above to clarify things a bit; thanks for your answer.
Tschef