I'm working on a website for work that uses one master layout for the whole site, which includes lots (over 40) of JS files. The site is really slow to render. How much overhead is there for the browser to parse and (for lack of a better term) "deal with" all these includes? I know they are cached, so they are not being downloaded on each page view. However, does each include get parsed and executed anew on every page refresh?

At any rate, I imagine there is some overhead in dealing with all these includes, but I'm not sure if it's big or small.

+2  A: 

The best way to understand is to measure. Try merging those 40 JS files into a single one and see if it makes a big difference. Compressing them could also reduce bandwidth costs.
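If you want to script the merge for that test, here is a minimal Node.js sketch; the scripts/ directory and output file name are placeholders for your own layout:

    var fs = require('fs');

    // Collect every .js file in the (hypothetical) scripts/ directory.
    var files = fs.readdirSync('scripts').filter(function (name) {
      return /\.js$/.test(name);
    });

    // Concatenate in directory order; reorder the list manually if your
    // scripts depend on each other. The joining semicolon guards against
    // files that don't end in a statement terminator.
    var combined = files.map(function (name) {
      return fs.readFileSync('scripts/' + name, 'utf8');
    }).join(';\n');

    fs.writeFileSync('site.combined.js', combined);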

There will be some overhead from having multiple includes, but as you say the files are cached, so that cost should only hit the first request. If we ignore this initial overhead, the difference probably won't be enormous compared to the time those scripts spend manipulating the DOM, etc.

Darin Dimitrov
A: 

You could try to put all of the .js files into one file and then compress it.

This will also reduce the number of requests made by the browser by 39 :).
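To illustrate with hypothetical file names, the swap in the master layout would look something like this:

    <!-- before: one request per script, forty times over -->
    <script type="text/javascript" src="/js/menu.js"></script>
    <script type="text/javascript" src="/js/forms.js"></script>
    <!-- ... -->

    <!-- after: a single combined, compressed file -->
    <script type="text/javascript" src="/js/site.min.js"></script>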

Hope this helped.

Kevin
+2  A: 

It depends on what they do. To test, you could put this before they are all loaded:

<script>
  var test_start_time = (new Date()).getTime();
</script>

and this after:

<script>
  alert("took: " + (((new Date()).getTime() - test_start_time) / 1000) + " seconds");
</script>

Adam Butler
A: 

The impact can be significant. Take into account that script downloading blocks page rendering. A couple of things you may try:

  • Combine as many scripts as you can so you download fewer files
  • Minify and compress the combined JS files
  • Try to put as many references as you can at the bottom of the page so they don't block rendering (this is not easy and must be done carefully; you might end up allowing interaction with some controls before the necessary JavaScript has downloaded)
  • Implement parallel download for JS files (by default they are downloaded sequentially); a minimal sketch of one approach follows this list
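A minimal sketch of the dynamic script element technique for that last point; the file names are placeholders:

<script type="text/javascript">
  // Injected scripts download in parallel and don't block rendering,
  // but they may execute out of order, so don't use this for files
  // that depend on each other.
  function loadScript(src) {
    var s = document.createElement('script');
    s.type = 'text/javascript';
    s.src = src;
    document.getElementsByTagName('head')[0].appendChild(s);
  }
  loadScript('/js/widgets.js');   // placeholder
  loadScript('/js/analytics.js'); // placeholder
</script>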
Claudio Redi
+1  A: 

Definitely compare and contrast; that will be the most reliable judge.

Nevertheless, I do my best to load only one or two JS files in the head section, then I use jQuery to test for certain elements which might require additional scripts or CSS once the DOM is loaded. For example, I use the SHJS source-highlighting library to stylize pre tags:

// Only fetch the highlighter if the page actually contains <pre> tags.
if ($('pre').length > 0) {
  // Load the SHJS script from the CDN, then attach its stylesheet and
  // run the highlighter once the script is in place.
  $.getScript(svx_cdns + 'pkgs/shjs-0.6/sh_main.min.js', function() {
    $('<link>', {
      'rel':  'stylesheet',
      'type': 'text/css',
      'href': svx_cdns + 'pkgs/shjs-0.6/css/sh_vim.min.css'
    }).appendTo('head');
    sh_highlightDocument('/s/pkgs/shjs-0.6/lang/', '.min.js');
  });
}

That way the page loads very quickly, and then "adjusts" afterwards.

Docunext
A: 

Even if the files are cached, there's still a request to check whether each file has been modified. You can change your caching strategy and set your files never to expire; that way the browser won't even ask if they've been modified. This means you'll need to add a cache buster to all your URLs so users pick up new versions. Look at Firebug's Net tab to be sure; I get a 304 Not Modified for all my CSS/JS/images.
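For example, assuming a far-future Expires header is configured on the server, a version-stamped URL serves as the cache buster; the ?v= value here is a hypothetical build number you bump on each deploy:

    <script type="text/javascript" src="/js/site.min.js?v=42"></script>

When the file changes, the new version number forces one fresh download; otherwise the browser uses its cached copy without sending a request at all.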

The files are going to have to be parsed every time, but that's probably not the bottleneck.

Try combining all your JS into one file. One of our screens included over 100 JS files; we created a unified, minified file and the screen's load time went from 10 seconds to less than 3.

Juan Mendes