views:

73

answers:

5

I've read some web development material online, and whenever someone asks how to organize a website's JS, CSS, HTML, and PHP files, people suggest a single JS file for the whole website. The argument is speed.

I clearly understand that the fewer requests there are, the faster the page is served. But I have never understood the single-JS argument. Suppose you have 10 webpages and each needs a JS function to manipulate the DOM objects on it. If you put all 10 functions into a single JS file and execute it on every page, 9 out of 10 functions do useless work: CPU time is wasted searching for DOM objects that don't exist.

I know that CPU time on an individual client machine is trivial compared to bandwidth on a single server machine. I am not saying you should load many JS files on a single webpage. But I don't see anything wrong with every webpage referring to 1 to 3 JS files that are cached on the client machine. There are many good ways to do caching: for example, you can set an expiry date, or include a version number in the JS file name. Compared to cramming the functionality for every page of a website into one big JS file, I far prefer splitting the JS code into smaller files.
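For illustration, the version-number approach can be as simple as embedding a release token in the script URL, so the server can send far-future cache headers and a new release changes the URL itself. A minimal sketch (the helper name and version string here are hypothetical):

```javascript
// Minimal sketch of filename-based cache busting.
// Because a new release produces a new URL, /js/* can be served
// with a far-future Expires/Cache-Control header.
function versionedScriptUrl(name, version) {
  return `/js/${name}.${version}.min.js`;
}

// Pages would then reference e.g.:
//   <script src="/js/site.v2.min.js"></script>
```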

Any criticism/agreement on my argument? Am I wrong? Thank you for your suggestion.

A: 

The JavaScript should be designed so that the extra functions don't execute at all unless they're needed.

For example, you can define a set of functions in your script but only call them in (very short) inline <script> blocks in the pages themselves.
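A minimal sketch of that pattern (the `PageInit` name and page keys below are hypothetical): one shared file defines per-page initializers, and each page invokes only its own in a short inline `<script>` block.

```javascript
// site.js -- one shared file, cached by the browser after the first load.
// Nothing here runs until a page explicitly calls its own initializer.
const PageInit = {
  home() {
    // wire up home-page widgets here
    return "home ready";
  },
  contact() {
    // wire up the contact form here
    return "contact ready";
  },
};

// Each page then carries only a one-line inline block, e.g. on contact.html:
//   <script>PageInit.contact();</script>
```

This way the "unused" functions for the other nine pages are merely defined, never called.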

SLaks
I have always followed the rule of keeping your JavaScript in the head and using the DOM to manipulate the document, as opposed to embedding script tags in it like server-side languages.
John
+1  A: 

Two reasons that I can think of:

Less network latency. Each .js requires another request/response to the server it's downloaded from.

More bytes on the wire and more memory. If it's a single file you can strip out unnecessary characters and minify the whole thing.

duffymo
Also, depending on the size of the file, TCP/IP may help out here: it can grow its window over a single longer-lived connection, as opposed to initiating and renegotiating several concurrent requests.
John
+4  A: 

A function does zero work unless it is called. So 9 unused functions cost zero work, just a little extra space.

A client only has to make 1 request to download 1 big JS file, then it is cached on every other page load. Less work than making a small request on every single page.

bwawok
Sorry, I should make it clear. Suppose the 10 functions are $(document).ready()-style handlers that scan for certain DOM objects to hook up events. Then all 10 functions will be called and the DOM will be scanned 10 times, even though each page only contains the objects for 1 of the 10.
Steve
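One common way around the repeated scanning Steve describes is to key each handler off a cheap existence check, so only the handler whose root element is present does any real work. A minimal vanilla-JS sketch (the function name and selectors are hypothetical; the lookup is injected here so the dispatch logic is testable, and in a browser you would pass `s => document.querySelector(s) !== null`):

```javascript
// inits maps a root selector to the initializer for that page.
// elementExists is a predicate; only handlers whose root element
// actually exists on the current page are executed.
function runPageInits(inits, elementExists) {
  const ran = [];
  for (const [selector, init] of Object.entries(inits)) {
    if (elementExists(selector)) {
      init();
      ran.push(selector);
    }
  }
  return ran;
}

// In a browser this would run inside a DOMContentLoaded (or
// $(document).ready) handler, with elementExists backed by querySelector.
```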
A: 

My line of thought is that you have fewer requests. When you make a request in the header of the page, it stalls the output of the rest of the page: the user agent cannot render the rest until the JavaScript files have been obtained. Also, JavaScript files download synchronously; they queue up instead of downloading in parallel (at least that is the theory).

John
+2  A: 

I'll give you the answer I always give: it depends.

Combining everything into one file has many great benefits, including:

  1. less network traffic - you might be retrieving one file, but each request/response exchanges multiple packets, and each TCP connection begins with a SYN, SYN-ACK, ACK handshake. A large part of the transfer time goes to establishing the session, and there is a lot of overhead in the packet headers.

  2. one location/manageability - although you may only have a few files, it's easy for functions (and class objects) to grow between versions. With the multiple-file approach, functions in one file sometimes call functions/objects in another (e.g. Ajax in one file, arithmetic functions in another - your arithmetic functions might grow to need the Ajax call and a certain return type). What ends up happening is that the set of files needs to be versioned as one unit, rather than each file having its own version. Things get hairy down the road without good management in place, and it's easy to fall out of sync with JavaScript files, which are always changing. Having one file makes it easy to manage the version across each of your pages and your (one to many) websites.

Other topics to consider:

  1. dormant code - you might think that the uncalled functions reduce performance by taking up space in memory, and you'd be right; however, this cost is so minuscule that it doesn't matter. Functions are indexed in memory, and while the index table may grow, the overhead is trivial for small projects, especially given today's hardware.

  2. memory leaks - this is probably the strongest reason not to combine all the code; however, it's a small issue given the amount of memory in systems today and the better garbage collection browsers now have. Also, this is something you, as a programmer, can control: quality code leads to fewer problems like this.

Why it depends?

While it's easy to say throw all your code into one file, that would be wrong. It depends on how large your code is, how many functions there are, who maintains it, and so on. Surely you wouldn't pack your locally written functions into the jQuery package, and you may have different programmers maintaining different blocks of code - it depends on your setup.

It also depends on size. Some programmers embed images as ASCII-encoded data in their files to reduce the number of requests, and these can bloat files. Surely you don't want to package everything into one 50 MB file, especially if some core functions are needed for the page to load at all.

So to bring my response to a close: we'd need more information about your setup, because it depends. Three files is surely acceptable regardless of size, combining where you see fit; it probably won't really hurt network traffic, but 50 files is unreasonable. I use the hand rule (no more than five), and you'll surely see a benefit combining five 1 KB files into one 5 KB file.

vol7ron
I wouldn't be so quick to disregard the memory leak issue. You have to remember that your site isn't the only site people are going to visit. If everyone's ignoring javascript memory leaks then it's all going to add up. And in a world where Internet Explorer is still the major web browser, memory leaks should be paid close attention to.
rossisdead
+1, good answer. I've seen plenty of cases in complex web applications where combining all the scripts into one file simply wouldn't work (for a bunch of reasons). It all depends on the situation.
slugster
**@rossisdead:** I agree, which is why I think it's one of the biggest reasons not to combine files; however, newer browsers are more adept at handling this. -- The bigger problem lately with memory issues is poorly written browser add-ons and plugins. Many are JavaScript-based and reduce browsing performance.
vol7ron