I'm trying to profile the performance of a web site that I'm fairly confident is being slowed down by the loading of JavaScript files on the page.

The same JavaScript files are included several times on the page, and <script /> tags are scattered throughout the page instead of being included at the bottom.

As I suspected, when looking at Firebug's "Net" tab, most of the time (though not all) when JavaScript is being loaded, no other files are requested. The browser waits for the JavaScript to finish loading.

There are a few exceptions however. There are a few occasions where JavaScript is loaded, but then at the same time, other resources appear to get loaded, such as other JavaScript files and images.

I always thought that JavaScript blocks the loading of other resources on the page. Am I incorrect in thinking this, or does this behavior vary depending on the browser or browser version?

UPDATE:
To those who have explained how loading a script blocks the loading of other resources, I'm already aware of this. My question is why a script wouldn't block the loading of other resources. Firebug is showing that some JavaScript files do not block loading other resources. I want to know why this would happen.

A: 

I believe that the content is downloaded, but not rendered, until the JavaScript has finished loading.

From the server's point of view this doesn't make much difference, but to the user it can make a huge difference in perceived speed.

MiffTheFox
According to Yahoo!'s best practices page (that I linked to in my question), browsers don't download content until the JavaScript finishes loading.
Dan Herbert
Javascript requests are blocking, though there are ways around that.
annakata
A: 

If you think about it, a script tag has to finish processing before the browser can continue to render content. What if the tag used document.write or some other wonderfully dumb thing? Until everything within the script tag has finished running, the page can't be sure what it's going to display.
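For example (a minimal, hypothetical snippet), an inline script like this rewrites the markup that follows it, so the browser can't safely get ahead of it:

<script type="text/javascript">
// Anything rendered after this tag depends on what the script writes out,
// so the parser has to wait for it to finish.
document.write('<div id="banner">Hello from the script</div>');
</script>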

Scimon
+7  A: 

Javascript resource requests are indeed blocking, but there are ways around this (to wit: DOM-injected script tags in the head, and AJAX requests), which, without seeing the page myself, is most likely what's happening here.
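A minimal sketch of the DOM-injection technique mentioned above (the file name is made up for illustration):

// Build a script element and append it to the head; the browser fetches it
// without holding up the rest of the page the way an inline <script src> would.
var script = document.createElement('script');
script.type = 'text/javascript';
script.src = '/js/heavy-library.js'; // hypothetical file
document.getElementsByTagName('head')[0].appendChild(script);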

Including multiple copies of the same JS resource is extremely bad but not necessarily fatal, and is typical of larger sites which might have been accreted from the work of separate teams, or just plain old bad coding, planning, or maintenance.

As for Yahoo's recommendation to place scripts at the bottom of the body: this improves perceived response times, and can improve actual loading times to a degree (because all the previous resources are allowed to load first), but it will never be as effective as non-blocking requests (though those come with a higher barrier of technical capability).

Pretty decent discussion of non-blocking JS here.

annakata
Where at the bottom of the page, if I may ask? The head tag should come before the body.
the_drow
@the_drow - I'm not sure I understand what you're asking, but if you refer to scripts at "the bottom" it means placing them literally as the last tags within the body, not within the head at all. Note that this is not what I'd recommend first, but it is a reasonable second best.
annakata
+2  A: 

I'm not entirely sure that Firebug offers a true reflection of what is going on within the browser. Its timing for resource loading seems to be good, but I'm not sure it shows exactly what is happening. I've had better luck using HTTP sniffers/proxy applications to monitor the actual HTTP requests coming from the browser. I use Fiddler, but I know there are other tools out there as well.

In short, this may be an issue with the tool and not with how resources are actually being loaded ... at least it's worth ruling out.

James Conigliaro
A: 

Browsers usually open only a set number of connections to a single domain.
So, if you load all your scripts from the same domain, you will usually load them one after the other.
But if those scripts are loaded from several domains, they will be loaded in parallel.
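As a rough illustration of that point (the host names below are invented), serving scripts from two different host names gives the browser separate connection pools to work with:

<!-- Each host name gets its own pool of connections, so these two requests
     can go out in parallel instead of queuing behind one another. -->
<script src="http://js1.example.com/first.js" type="text/javascript"></script>
<script src="http://js2.example.com/second.js" type="text/javascript"></script>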

Itay Moav
A: 

The reason the browser blocks while downloading JavaScript is that it suspects there will be DOM nodes created inside the script.

For example, there might be "document.write()" calls inside the script.

A way to hint to the browser that the script does not contain any DOM generation is with the "defer" attribute. So,

<script src="script.js" type="text/javascript" defer="defer"></script>

should allow the browser to continue parallelizing the requests.

References:

http://www.w3.org/TR/REC-html40/interact/scripts.html#adef-defer

http://www.websiteoptimization.com/speed/tweak/defer/

John Gietzen
I think defer is only supported by IE so far. I didn't check the latest browser builds, but I'm pretty sure it doesn't work with FF2.
streetpc
I read somewhere that Firefox 3.1 should support defer.
streetpc
+1  A: 

I suppose you're using Firefox 3.0.10 and Firebug 1.3.3 since those are the latest releases.

The Firebug 1.4 beta has made many improvements to the Net tab, but it requires Firefox 3.5. If you want to test it in Firefox 3.0, use one of the previous 1.4 alpha versions. But even with the improvements I still struggle to understand the results. I wish the Firebug developers would document more precisely what each part of a download means. It doesn't make sense to me why queuing comes after connecting.

My conclusion was not to trust the results in Firebug, and I ended up using WebPageTest instead, which was surprisingly good for something coming from AOL ;-)

Also, what kind of resources are being loaded at the same time as the JavaScript? Try tracing down the resources that are loaded at the same time, and see whether they're referenced from a CSS file, an iframe, or HTML fetched via Ajax. I'm guessing the reason nothing else is loaded is that the browser stops parsing the current HTML when it sees a script tag (without defer). Since it can't continue parsing the HTML, it has nothing more to request.

If you could provide a link to the page you're talking about, it would help everyone give a more precise answer.

gregers
Unfortunately, this is a private, internal site so I can't give a link to a test page since there are no public-facing pages.
Dan Herbert
A: 

As others have stated, the script is probably loading other resources through DOM injection.

Script.aculo.us actually loads its child components/scripts itself by doing this -- injecting other <script> tags into the DOM for them.

If you want to see whether or not this is the case, use Firebug's profiler and take a look at what the scripts are doing.

cwash
A: 

Like others said, one non-blocking way is to inject <script> tags in the page head.

But Firefox can also execute loaded <script>s in parallel. Copy the two lines below:

http://streetpc.free.fr/tmp/test.js
http://streetpc.free.fr/tmp/test2.js

Then go to this page, paste in the input textarea, click "JavaScript", then "Load Scripts" (which builds and adds a <script> child element to the head).

Try that in FF: you'll see "test2 ok", then move the dialog box to see "test ok". In other browsers, you should see "test ok" (with no other dialog behind it) then "test2 ok" (except for Safari 4, which shows me test2 before test).

streetpc
A: 

Firefox 3 introduced a connection parallelism feature to improve performance while loading a web page; I bet this is the source of your problem ;)

When you open a web page that has many different objects on it, like images, Javascript files, frames, data feeds, and so forth, the browser tries to download several of them at once to get better performance.

Here's the ZDNET blogpost about it.

SleepyCod