views:

36

answers:

3

Hi all,

I was thinking about external JavaScript files, and it occurred to me that if I group the functions from several HTML pages into one JavaScript file, this will lead to extra client-side processing.

Basically, I would like some idea of whether this is correct.

Here is my thinking. Suppose I have one JavaScript file for five pages. If the user visits each page, then for each page he has to load not only the JavaScript for that page but also the JavaScript for the other four pages. In the end, the user's browser has loaded about five times as much JavaScript as it normally would.

I think most people group their JavaScript by common functionality. So you can share one JavaScript file across several pages, but you may not use all of that JavaScript on every page. All the JavaScript a page doesn't use is loaded/run without need.

I have a sub-question. I know you don't have to re-download the JavaScript file for each page. Is the JavaScript file run each time? Is the JavaScript reloaded? By reloaded, I mean: what kind of overhead is there each time the browser has to get a file out of the cache?

Thanks, Grae

+1  A: 

If I have a file of 200 lines and separate it out into 5 files of 40 lines each, the total number of lines remains 200. But remember: if I pulled files 1-4 on the previous page, I now only need to pull file 5, since 1-4 are in my cache. Additionally, most modern browsers are going to thread those requests, so instead of a single large download I get 5 parallel downloads of smaller files.
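
The trade-off above can be sketched with a back-of-envelope model (the latency, bandwidth, and file-size numbers below are made up for illustration, not measured):

```javascript
// Rough model, not a benchmark: assumed numbers throughout.
// Time to fetch one file ≈ round-trip latency + size / bandwidth.
var latencyMs = 100;       // assumed round trip to the server
var bandwidthKBps = 500;   // assumed throughput
var totalKB = 50;          // combined size of all the script code

// One combined file: pay the latency once.
var oneFile = latencyMs + totalKB * 1000 / bandwidthKBps;

// Five files of 10 KB each, downloaded fully in parallel:
// wall-clock time is roughly that of one small fetch.
var fiveParallel = latencyMs + (totalKB / 5) * 1000 / bandwidthKBps;

// The same five files fetched one after another pay the latency five times.
var fiveSerial = 5 * (latencyMs + (totalKB / 5) * 1000 / bandwidthKBps);

console.log(oneFile, fiveParallel, fiveSerial); // 200 120 600
```

With these assumed numbers, parallel small downloads win on the first visit, while serialized small downloads are much worse than one combined file; real browsers fall somewhere in between depending on how many connections they open.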

The cache overhead would be pretty browser-specific in how they handle it, and the exact implementation is above my head.

FatherStorm
Actually, browsers won't always download resources asynchronously. It depends on how you've got your HTML structured and what types of files they are, among other things.
coreyward
That's why I qualified it with "most" browsers, although I was specifically thinking of network.http.pipelining in Firefox.
FatherStorm
Really my question is about client processing, though. I am assuming that the user has been to all the pages before and downloaded the needed JavaScript files. Once I am in that state, it seems like having a different JavaScript file for each page is the fastest solution.
Grae
+1  A: 

if the user visits each page, then for each page he has to load not only the JavaScript for that page but also the JavaScript for the other four pages

If caching is set up correctly, the contrary will be true: the file will be loaded only once, at the beginning of the user's visit to the site. The overall amount of data to load will be reduced in most cases.

The JavaScript code for all the pages will be loaded into the browser's memory somehow, maybe even pre-parsed (I don't know the exact specifics of this), but that part of script processing is totally negligible.

It could still be wise to split your JS library into chunks, if the chunks are totally separate across the pages and really huge - it will depend on your script's structure. But mostly, having one external file - and therefore one slightly bigger request the first time, but none afterwards - is the preferable way.
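
If you do go with one combined file, a common pattern is to let each page trigger only its own initializer, so code for the other pages is merely defined but never run. A minimal sketch (the page ids and function names here are made up):

```javascript
// One combined script for the whole site. Each page declares a hook,
// e.g. <body id="checkout">, and only the matching initializer runs.
// Defining functions that a page never calls costs almost nothing.
var pageInit = {
  home: function () {
    // set up homepage-only widgets here
  },
  checkout: function () {
    // wire up the checkout form here
  }
};

function initPage(pageId) {
  var init = pageInit[pageId];
  if (init) {
    init();
    return true;   // this page had page-specific code to run
  }
  return false;    // nothing page-specific on this page; that's fine
}

// In the page, once the DOM is ready: initPage(document.body.id);
```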

For your sub-question, take a look at Firebug's "Net" tab. It will show you which resources it loads and from where, and how long it takes to process them.

Pekka
I would look at that Net tab, but Firebug always crashes for me. I will look for the same thing in the developer tools.
Grae
+1  A: 

It's better to pack the JavaScript for all pages into one file. The file will be cached and not downloaded again by the browser on subsequent requests. The reason is that making a web request is far more expensive, for both your server and the client, than it is for the client to parse the JavaScript file.

Browsers are so fast these days that you don't have to worry about the client having to load some extra JavaScript that might not be used on that specific page.

To make your site fast, you should focus on keeping the number of requests to an absolute minimum.

Merrimack
This is, strange as it may seem, not necessarily correct. The author of [LabJS](http://labjs.com) has done some testing that strongly suggests loading two separate files is *slightly* faster than loading one.
Pointy
I kind of agree; you can slow a web page down with JavaScript fairly easily, especially if you have some framework, such as JSF, generating lots of it for you.
Grae
Unless certain pages need slightly different code while maintaining the core of the other JS. Also, I'm not sure I'd want to have completely disparate JS functional code in the same file.
FatherStorm
@Pointy, the LabJS results indicate that this is true if you defer loading of scripts until everything else has actually been downloaded. In my experience you gain a lot of page-load speed by combining scripts into one file and also packing them with a JavaScript packer. Making requests is expensive, especially if you are geographically distant from the source site.
Merrimack
@Merrimack - Personally I agree with you. It takes some effort to put together a build process that leaves you with one minified script, but even more to make *two* minified scripts :-) I was just making that note for completeness' sake. The LabJS author (Hi Kyle) is kind of obsessed with this stuff.
Pointy
Let's look at one script. I think all will agree it will be slower for the first download, on the first of my five pages. The benefit comes on the next four pages: they download faster. I assume everyone is still on the same page so far. However, after the user has been to all five pages, I think the single script is going to be slower again. Every time the user goes to any of the five pages, his browser will have to load five times as much JavaScript as it normally would. By "load" I mean load by the interpreter, not re-download.
Grae
@Grae: It could be much faster on the first load with just one file. If you're downloading data to Australia from Europe, where the latency is high, it's faster to download everything in one go than to split it up into multiple requests.
Merrimack
@Merrimack: I was assuming I would only have to download the one smaller file for the first page, not all five smaller files.
Grae
Anyway, that is kind of beside the point. The main point is that once everything is downloaded, one large file seems like it will be slower.
Grae