views: 110

answers: 3

Hi,

Which is better?

Say you have 20 JavaScript files. 10 are shared among all pages, so they would go in the master page. Now what about the other 10? Say one page uses 4 of them, another uses a different 4, and another uses 2.

Would it be better to a) combine them all into one file (dynamically, of course) so only one HTTP request is made,

or

b) have the 10 shared files combined in the master page, and then have each of the other pages serve its own files combined?

So on some of my pages I would be looking at 4 or 5 JavaScript requests:

  1. one from master
  2. one for jquery from google CDN
  3. one for jquery ui from google CDN
  4. one for jquery validate from MS CDN
  5. one for anything specific to that page.

Most of my scripts already use jQuery's live() because I use jQuery Ajax tabs, and I have to load the scripts for all of the tabs at the same time.

If I didn't, then every time you went to another tab it would download a new copy of the JavaScript and bind events such as click again, so a handler would fire X times, where X is how many times the user had loaded that tab.
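Here is a rough simulation of that rebinding problem. This is plain JavaScript with no jQuery or DOM; the element object below is invented purely to illustrate why binding on every tab load makes a single click fire the handler several times, which is what delegation via live() avoids.

```javascript
// Minimal stand-in for an element that supports event binding (not jQuery;
// just enough to show the double-binding problem described above).
function makeElement() {
  const handlers = [];
  return {
    bind(fn) { handlers.push(fn); },            // like binding a click handler
    click()  { handlers.forEach(fn => fn()); }, // simulate one user click
    count()  { return handlers.length; }
  };
}

let clicks = 0;
const button = makeElement();

// Each time a tab's script is re-downloaded and re-run, bind() runs again:
button.bind(() => clicks++);  // first tab load
button.bind(() => clicks++);  // same tab loaded a second time

button.click();
console.log(clicks);  // 2 -- the handler fired twice for a single click
```

With delegation the handler is bound once at the document level, so reloading the tab's markup does not add duplicate bindings.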

+1  A: 

Depending on the language/server you are using, there are lots of tools that dynamically combine the JavaScript files required on a page into a single request just for that page. Based on your reference to the Microsoft CDN, it sounds like you're using ASP.NET. Here's a combiner that might work for you; it will combine the requests for your local JS files into a single request. If it were me, I would load:

  1. jQuery & jQuery UI from Google
  2. jQuery Validate from MS CDN
  3. Your local JS files combined (using a tool such as above) into one download.

That way you get 3 parallel downloads.
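Conceptually, a combiner like that does something along these lines (an invented sketch of the idea, not the linked tool's actual code): the server concatenates the page's local scripts into one response, so the browser makes a single request for all of them.

```javascript
// Sketch of a dynamic script combiner: join each page's local JS files
// into one response body served from a single URL.
function combineScripts(files) {
  // files: array of { name, source } objects (names/sources invented here)
  return files
    .map(f => `/* ${f.name} */\n${f.source}`)
    .join(';\n'); // ';' guards against files that omit a trailing semicolon
}

const bundle = combineScripts([
  { name: 'tabs.js',     source: 'var tabs = 1'  },
  { name: 'validate.js', source: 'var valid = 1' }
]);
console.log(bundle.split(';\n').length); // 2 -- both files, one response
```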

Keltex
Well, I use something called HttpCombiner, but my list is already down to as few requests as I can get (which is very few). I could combine 2-3 of them (all the CDN files), but then I think I would lose out on potentially cached versions, so that's why I left it.
chobo2
Remember that requests to different domains run in parallel with one another, so the Google / MS / local site requests will all run at the same time. Also, once a user has downloaded jQuery / jQuery UI / jQuery Validate once, they will most likely have them in their cache. So the key is to get your local resources down to one request that is as small as possible.
Keltex
I will take a look at that; I'm not sure how it differs from the one I am using, http://code.msdn.microsoft.com/HttpCombiner . Now for #1 (the Google stuff), are you combining them or just using the links? Yep, I know that different domains download in parallel; that's why I was trying to find a 3rd site for jQuery UI, so I would have those 3 go off at the same time.
chobo2
Would you still recommend combining those files dynamically, or does that mess up the caching?
chobo2
You can set up dynamically combined files to cache correctly. Just make sure that the handler doing the combining sets Cache-Control headers, an Expires header, and maybe even an ETag.
Jordan
How do you do that? Is that done with the library Keltex recommended?
chobo2
@Jordan: I mean, if User A goes to a site that has jQuery hosted on Google, and then User A comes to my site, will they get the cached version? Or is it treated as a new copy?
chobo2
@chobo2 - They get a cached version.
Keltex
So is there any advantage to combining it? Or do they just get the same cached version, so there's no advantage?
chobo2
A: 

Depends on how big your non-shared files are. If they're not too big, combine them. Making a request usually takes longer than transferring the data. Of course, if you really care a lot, you should run your own benchmarks.

But if in doubt, combine.

Edit: I just read your question again. Option B is better, since the combined master file can be cached across pages.

Mark
How big is too big?
chobo2
Rule of thumb for me is about 75 KB per file. More than that and it would be better to split into at least 2 files for parallel HTTP downloads.
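That rule of thumb can be sketched as a simple partition. The file list, sizes, and greedy strategy below are invented for illustration: if the combined size exceeds the threshold, split the files into two bundles of roughly equal size so they download in parallel.

```javascript
// Split a file list into at most two bundles, keeping each under control
// by dividing files largest-first between the lighter of the two bundles.
function splitBundles(files, limitKb) {
  const total = files.reduce((sum, f) => sum + f.kb, 0);
  if (total <= limitKb) return [files]; // small enough: keep one bundle
  const a = [], b = [];
  let aKb = 0, bKb = 0;
  for (const f of [...files].sort((x, y) => y.kb - x.kb)) {
    if (aKb <= bKb) { a.push(f); aKb += f.kb; }
    else            { b.push(f); bKb += f.kb; }
  }
  return [a, b];
}

const bundles = splitBundles(
  [{ name: 'a.js', kb: 60 }, { name: 'b.js', kb: 30 }, { name: 'c.js', kb: 25 }],
  75
);
console.log(bundles.length); // 2 -- 115 KB total exceeds 75 KB, so split
```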
Jordan
A: 

Depending on the size of your app, you will likely need to divide your scripts at some level.

On larger apps, I will serve all shared scripts (i.e. jQuery and other libraries/plugins) as a single minified file, and all page-specific scripts as a separate minified file. This way you minimize the amount of unused JavaScript over the wire, and you only make two requests per page (parallelized if you have a CDN available).

Christopher Cliff