Following Steve (YSlow) Souders' evangelism, my site (LibraryThing.com) splits requests across domains to facilitate parallel loading. We do this for CSS, JS, and images; you can also do it for Flash, etc. We also use Google's hosted version of Prototype, which is cross-domain, not just cross-subdomain.

This is all great for speed, but for a small percentage of users it goes wrong. I think the problem is overzealous security settings, probably in IE, but perhaps in other browsers and/or upstream systems as well. I'm amazed Souders and others don't discuss this, as we see it a lot.

The question is: What is the best way to handle this?

Right now, at the bottom of the page we check whether a JS variable declared in a script that should have loaded is set. If it isn't, we fetch the script from the main domain instead and set a cookie so the next page load won't use the subdomain. But we're only reloading the JS at the bottom, so if the CSS also failed, you're looking at junk.
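Roughly, the bottom-of-page check looks like this (the variable, cookie, and file names are just illustrative placeholders, not our actual code):

```html
<script type="text/javascript">
  // "LT_scriptsLoaded" stands in for a sentinel variable that the last
  // script served from the static subdomain is supposed to set.
  if (typeof window.LT_scriptsLoaded === 'undefined') {
    // The subdomain script never ran; remember that for future page loads...
    document.cookie = 'no_subdomain=1; path=/; max-age=' + (30 * 24 * 60 * 60);
    // ...and pull the same file from the main domain right now.
    document.write('<scr' + 'ipt src="/js/combined.js" type="text/javascript"></scr' + 'ipt>');
  }
</script>
```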

Does anyone have a better or more generalized solution? I'm thinking there could be a general "onload" or "onerror" handler that sets the cookie AND loads the content.
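Something along these lines, perhaps (again only a sketch: the static.example.com subdomain, file names, and handler name are placeholders, and I don't know how reliable onerror on script elements is in the older browsers that seem to be affected):

```html
<script type="text/javascript">
  // Hypothetical generalized fallback: set the cookie, then reload the asset
  // from the main domain. A similar check could be wired up for the CSS.
  function handleSubdomainFailure() {
    document.cookie = 'no_subdomain=1; path=/; max-age=' + (30 * 24 * 60 * 60);
    var s = document.createElement('script');
    s.type = 'text/javascript';
    s.src = '/js/combined.js'; // same file, served from the main domain
    document.getElementsByTagName('head')[0].appendChild(s);
  }
</script>
<script src="http://static.example.com/js/combined.js"
        type="text/javascript"
        onerror="handleSubdomainFailure()"></script>
```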

A: 

Do you have a list of specific user agents that exhibit this behaviour? Maybe Apache conf could solve this problem? (Or create a new problem for you to solve :-).)

Watch out for the cookie frenzy: the more cookies you add (especially on the main domain), the more data your clients have to send along with every request.

Souders talked about this too, but it's always good to check your clients' sent/received ratio for requests.

Brian Clozel
> User agents

No. I think we've replicated it before using security settings in IE. I'm assuming it's a general problem with security: some very tight settings assume that a page shouldn't fetch from outside its domain, JS most of all, but also CSS. Heck, you can set your browser not to fetch images from outside the domain, in case you're really paranoid.

> Cookies

That's very true. Indeed, that's one reason to use a secondary domain: fetch all your static content from a cookie-less domain, as Souders suggests.
LibraryThingTim
A: 

I'm going to take some wild guesses about your problem.

Cache. Did you make changes to the script files recently? These problem users could have older versions cached. IE 6 is extremely bad with overzealous caching.

I notice your scripts don't have a build # in the URL. XYZ.js?version=3 will force the browser not to use an old cached script like XYZ.js?version=2. (This applies to images/CSS as well.)
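For example (file names and version numbers are just placeholders):

```html
<!-- Bump the query string whenever the file changes so clients skip stale cached copies -->
<script src="/js/XYZ.js?version=3" type="text/javascript"></script>
<link rel="stylesheet" type="text/css" href="/css/site.css?version=3" />
<img src="/images/logo.png?version=3" alt="logo" />
```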

You also have inline JavaScript mixed in with your HTML, which would also get cached.

3 domains are likely overkill unless your site has a TON of content on it (huge pages).

DNS lookups can be expensive and have very long timeout values.

I wouldn't be comfortable putting my site's JavaScript on a separate domain because of the possible security conflicts. You have to keep your JavaScript/Ajax calls in sync with your domains. It seems like more of a hassle than it's worth.

I've been using i.domain.com and domain.com for 5+ years with no issues.

I bet putting the JS back on the main domain will fix your problems. It will certainly make the setup less complex and easier to deal with.

But done properly, your 3 domains should work. Unfortunately I don't have enough info in this question to find the issue.

Chad Grant
Let's say LibraryThingTim uses only one domain to "solve" his problem. If he then tries to use a cross-domain loader for his JS (Dojo and other frameworks give you this possibility, even with ONE domain, and trust me, it speeds things up anyway), he hits the same problem again. Security policies can also apply because of the way you're loading your JS.
Brian Clozel
+1  A: 

If this behavior always affects JS files at least, one option would be to keep a cookie indicating whether the user's browser has been tested for this behavior yet. If it hasn't been tested, insert (as the first script element in the <head>) a reference to a cross-domain script that simply sets this cookie to "success". Then, immediately afterward, have some inline JS that checks for this cookie; if it's not set, set it to "failed" and reload the page.

Then on the server side, just check for the same cookie, and make sure cross-domain URLs aren't served to anyone with a "failed" result.
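A rough sketch of the client side of that test (the cookie name, subdomain, and file names are placeholders):

```html
<!-- Only emitted by the server when the "xdomain" cookie is absent. -->
<!-- probe.js lives on the static subdomain and contains a single line:
     document.cookie = 'xdomain=success; path=/'; -->
<script src="http://static.example.com/js/probe.js" type="text/javascript"></script>
<script type="text/javascript">
  // Scripts load synchronously here, so if probe.js was allowed to run,
  // the cookie is already visible by the time this inline block executes.
  if (document.cookie.indexOf('xdomain=success') === -1) {
    document.cookie = 'xdomain=failed; path=/';
    // The server sees "failed" on the next request and emits single-domain URLs.
    window.location.reload();
  }
</script>
```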

This approach should ensure that users whose browsers do support cross-site requests don't see any odd behavior, and should immediately fix the problem for other users, at the cost of an automatic refresh the first time they visit.

bdonlan