views: 181
answers: 5

We're looking at splitting our .js to be served from two domains, with the intent of enabling concurrent loading.

Question: a) Can we use subdomains for that purpose, and b) will that concurrent loading also hold true over HTTPS?

For instance, we'd like to request two files as such:

https://www.example.com/firstfile.js
https://subdomain.example.com/secondfile.js

Doable? Alternatives?

A: 

a) Yes. Use `document.domain` to avoid Same Origin Policy issues.

b) I don't know, but I can't think of any reason why it shouldn't.

Pekka
Thanks, Pekka. Could you clarify what `document.domain` solves vs. just requesting a file from two different URLs directly?
DA
Nothing directly; `document.domain` just ensures that a script from domain 1 doesn't run into security problems when accessing content on domain 2 (the Same Origin Policy).
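For reference, a minimal sketch of what that looks like; it only matters when pages or frames on the two hostnames need to script each other, a plain `<script src>` include doesn't need it:

```
// Sketch only: run this on both the page served from www.example.com and the
// page/frame served from subdomain.example.com, so their origins are treated
// as equal for Same Origin Policy purposes.
document.domain = 'example.com';
```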
Pekka
A: 

The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames. (source)

Sounds problematic.

camomileCase
@camomileCase: that's the exact reason we're doing this. Note the 'per hostname' part of that. The idea is that by requesting them from multiple domains, we eliminate that problem for those large(r) .js files.
DA
@DA this should be possible but I doubt whether this will give you any performance increase: The scripts' *execution* will still run one after another, and *downloading* only happens once, then the data is cached.
Pekka
The last sentence is the most important one: "While a script is downloading, however, the browser won't start any other downloads, even on different hostnames."
camomileCase
+1  A: 

As far as I am aware, it won't work. Scripts are set up to block parallel downloads. The reason is that parallel loading of scripts can cause race conditions in your JavaScript. Minification or on-demand loading are your best options.

Matt
Please correct me if I am wrong, but I thought the parallel download issue was restricted to a single domain. (FYI, we're minifying as well.)
DA
There is a difference between parallel *downloading* and parallel *execution*, isn't there? I can't see why the former wouldn't work; the latter is definitely impossible for the reasons outlined by Matt.
Pekka
I'm not worried about parallel execution. All execution will be wrapped within jQuery's `$(document).ready`. We're just trying to eke out any performance gains we can find in terms of server requests and download speed.
DA
Yahoo's performance suggestions have a good note on this: http://developer.yahoo.com/performance/rules.html
Really, what it comes down to is the nature of JavaScript: if you load two files in parallel, you can get undefined behavior because one could act on something the other changes, and the order in which the files complete could lead to different results.
If you want to speed it up, look into dynamic loading or on-demand loading: http://www.nczonline.net/blog/2009/06/23/loading-javascript-without-blocking/
Matt
Hmm...I understand that concern. However, we're not actually calling any of the JavaScript until `$(document).ready`. So, at least in theory, none of the .js should be called until the document has fully loaded and all dependent .js files have been retrieved, correct? Specifically, what we hope to do is request jQuery core and jQuery UI separately, since those have the largest file size overhead.
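Roughly this pattern (a minimal sketch; `initApp` is a hypothetical entry point, and it assumes jQuery itself has already loaded by the time this runs):

```
// Nothing runs until the DOM is ready, regardless of which file finished first.
$(document).ready(function () {
    initApp(); // hypothetical application entry point
});
```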
DA
@Matt, not true. Scripts may be downloaded in parallel, but they are never executed in parallel. They will be executed in the order of their `<script>` tags.
Pekka
@DA, I perfectly understand, and it is a valid case; look into the second link I posted. They explain ways to use a small initial script to load other ones without blocking. I have to admit I haven't had the time to completely read through and try it, but it seems like it would fit your needs. @Pekka, please check the references that I posted; the behavior of scripts is just different than that of other components.
Matt
A: 

An alternative presented in the book "Faster Web Sites" or "Even Faster Web Sites" (which I recommend you read) suggests loading the JavaScript files dynamically, using a JavaScript function/method that appends child `<script>` nodes to the document.

You might want to do some research on the topic, but it is a good practice which you might want to consider.
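A rough sketch of that technique, using the second file from the question (the `loadScript` helper and its callback handling are illustrative, not taken from the book):

```
// Create a <script> element and append it to <head>; the download then happens
// without blocking other downloads or page rendering.
function loadScript(url, callback) {
    var script = document.createElement('script');
    script.src = url;
    script.onload = callback; // older IE needs script.onreadystatechange instead
    document.getElementsByTagName('head')[0].appendChild(script);
}

loadScript('https://subdomain.example.com/secondfile.js', function () {
    // secondfile.js has finished loading and executing at this point
});
```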

regards,

andreas
+1  A: 

I think you have to consider the latency of the network (a kind of lost time that adds up with every round trip a call makes). Latency is what kills the responsiveness of HTTP calls.

Personally, I follow the trend of reducing the number of HTTP calls.
I merge all my files into one (+ minify + gzip).
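As an illustration, a minimal concatenation step could look like the Node.js sketch below (file names are placeholders; minification and gzip would be handled by a separate minifier and by the web server):

```
var fs = require('fs');

// Hypothetical list of source files, in dependency order.
var files = ['jquery.js', 'jquery-ui.js', 'app.js'];

var bundle = files.map(function (name) {
    return fs.readFileSync(name, 'utf8');
}).join('\n;\n'); // the ';' guards against files missing a trailing semicolon

fs.writeFileSync('bundle.js', bundle);
```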

Mic
I agree on the minify + gzip part. Also, I'd ensure the server sets all relevant caching headers (Expires, Cache-Control and the like), so friendly browsers would only hit the server once.
Romain