views:

10449

answers:

8

What would be a good way to attempt to load the hosted jQuery at Google (or other Google hosted libs), but load my copy of jQuery if the Google attempt fails?

I'm not saying Google is flaky. There are cases where the Google copy is blocked (apparently in Iran, for instance).

Would I set up a timer and check for the jQuery object?

What would be the danger of both copies coming through?

Not really looking for answers like "just use the Google one" or "just use your own." I understand those arguments. I also understand that the user is likely to have the Google version cached. I'm thinking about fallbacks for the cloud in general.


Edit: This part added...

Since Google suggests using google.load to load the ajax libraries, and it performs a callback when done, I'm wondering if that's the key to serializing this problem.

I know it sounds a bit crazy. I'm just trying to figure out if it can be done in a reliable way or not.
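
For illustration, something like this is what I have in mind, assuming google.load's optional settings object really does take a callback (and this obviously still assumes www.google.com/jsapi itself loaded):

<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
// google.load fires the callback once jQuery has actually been loaded,
// so this seems like the natural place to decide whether a fallback is needed.
google.load("jquery", "1.3.2", {
    callback: function() {
        // safe to use jQuery here
    }
});
</script>

The open question is what to do when that callback never fires, or when jsapi itself is blocked.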


Update: jQuery now hosted on Microsoft's CDN.

http://www.asp.net/ajax/cdn/

A: 

I wouldn't rely on a third party to host an important script, even if it is Google.

One reason: It is one more domain to have to resolve on your site.

Another reason: Google has had their share of downtime. They are not perfect, you know. Would you want your site being unusable because a third party stopped serving crucial files? I wouldn't.

Bryan Migliorisi
That's why I asked the question. :-)
Nosredna
Most people should already have Google.com pre-cached in their DNS tables. Resolving domains *shouldn't* be a concern anyway, since it is considered good practice to use several domains on one page to speed up loading of your site.
Dan Herbert
An extra DNS lookup is actually negligible compared to the benefits of fetching a resource on a different domain (more HTTP requests in total) and most likely closer to the user (because Google has datacenters around the world).
Ryan Doherty
Yes, but too many extra DNS lookups will cause a slowdown.
Nate Bross
Based on what? Google's DNS will almost certainly be cached.
micmcg
Not only will Google's DNS most likely already be cached, but since many sites already use this file, the file itself has a higher chance of being in your browser's cache.
Amir
Spreading your downloads among multiple domains improves performance. Web browsers will make only up to 2 concurrent connections to a single domain. That's why many websites host resources on multiple domains (including stackoverflow.com) to overcome this limitation.
lubos hasko
@lubos - not related to Google's hosting which is likely to be cached, but spreading across domains is only beneficial up to a point because the DNS lookups do indeed take time. Only IE7 limits 2 connections per domain, newer browsers have higher limits. For IE7, Yahoo!'s research shows 2-4 (but not more) domains to be best: http://yuiblog.com/blog/2007/04/11/performance-research-part-4/
orip
In South Africa, if your international bandwidth is used up you can only browse local South African sites. If a site uses Google-hosted code (even AdSense and Analytics), it would cause the site to not display properly on a client's computer.
Gerald Ferreira
+132  A: 

One more reason to not use Google-hosted jQuery is that in some countries, Google's domain name is banned.

But this is how you achieve it:

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<script type="text/javascript">
// If the Google-hosted copy didn't load, jQuery will be undefined, so write out a local copy instead.
if (typeof jQuery == 'undefined')
{
    document.write(unescape("%3Cscript src='/path/to/your/jquery' type='text/javascript'%3E%3C/script%3E"));
}
</script>

This should be in your page's <head> and any jQuery ready event handlers should be in the <body> to avoid errors (although it's not full-proof!).

Rony
Yes. As in Iran. "I'm not saying Google is flaky. There are cases where the Google copy is blocked (apparently in Iran, for instance)."
Nosredna
I assume you would want to write an alternate script tag in that document.write('') ?
John JJ Curtis
This may not work because when the browser does the check for the jQuery object, it may be currently downloading the jQuery file and not done parsing it yet. Then you'll end up including it twice.
Ryan Doherty
That's a nice point. I'm not sure how to force these steps to behave in a synchronous manner.
Rony
@Rony, yeah that's the trick.
Nosredna
There must be some way in JavaScript to check the existence of a file, and if it returns a 400-class status message then the local script can be downloaded, but I am not sure how that works. I tried some examples but it's not working for me.
Rony
@Rony, see my edited question. You can use google.load to load jQuery off Google's site. That might help forcing the problem into being synchronous.
Nosredna
@Nosredna the problem with this in turn is that the file which contains the google.load method is itself again hosted with Google.
Rony
Couldn't you have a local version of google.load?
Nosredna
yes you can copy the source from http://www.google.com/jsapi
Rony
I believe that using a self hosted version of jsapi will make some Google services unusable. For example, ClientLocation.
Ionuț G. Stan
In reference to making it synchronous: the fact that Google doesn't even attempt to address this issue in their typical jsapi include setup makes me think that it's not only extremely unlikely to be fixable, but also of minimal impact. (see http://code.google.com/apis/ajax/documentation/#GettingStarted - third code block)
Matchu
Aren't javascript downloads blocking (synchronous) already? Seems to me the double-copy issue would therefore not be a problem.
Matt Sherman
JavaScript downloads should be synchronous already, as Matt Sherman said. Otherwise, many problems would occur if the page tried to execute an inline script that relied on a library that was only half downloaded, or a library extension was executed without the library fully downloaded and executed. That's also one reason why Yahoo YSlow recommends placing JavaScript at the end of pages: so that it doesn't block the downloading of other page elements (including styles and images). At the very least, the browser would have to delay execution to occur sequentially.
GApple
Small fix from a validator fanatic: The string '</' is not allowed in JavaScript, because it could be misinterpreted as the end of the script tag (SGML short tag notation). Do '<'+'/script>' instead. Cheers,
Boldewyn
This example will not work. 1) If the Google ajax library is not available, it'll have to time out first before failing. This may take a while. In my test of disconnecting my computer from the network it just tried and tried and tried and didn't time out. 2) if (!jQuery) will throw an error because jQuery is not defined, so JavaScript doesn't know what to do with it.
RedWolves
To test if jQuery was loaded, (!window.jQuery) works fine, and is shorter than the typeof check.
Jörn Zaefferer
Still works great ^^
path411
I think the expression is "fool-proof" but I could be making this up.
sova
A: 

You can test the effects of both coming through by including jquery twice in your website right now and seeing what that does.

DDaviesBrackett
A: 

I don't think you can do this reliably on the client side. One way to achieve this is to use a simple ashx handler to go download the google version and spit it out to your client side code.

In your ashx handler you can test the Google version; if it happens to be down, you can serve your own version from a file on your server. This way, you get the latest up-to-date version from Google while having the ability to override and insert your own version.
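
Not an ashx handler, but here is a rough sketch of the same proxy-with-fallback idea as a tiny Node.js server, purely to illustrate the flow (the CDN URL, port, and local file path are placeholders):

// Hypothetical sketch: proxy Google's copy, fall back to a local file if the fetch fails.
var http = require('http');
var https = require('https');
var fs = require('fs');

function serveLocalCopy(res) {
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    fs.createReadStream('./jquery-1.3.2.min.js').pipe(res); // local fallback (placeholder path)
}

http.createServer(function (req, res) {
    // Try Google's copy first.
    https.get('https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js', function (cdnRes) {
        if (cdnRes.statusCode === 200) {
            res.writeHead(200, { 'Content-Type': 'application/javascript' });
            cdnRes.pipe(res);        // stream the CDN response straight through
        } else {
            cdnRes.resume();         // drain the failed response
            serveLocalCopy(res);     // Google answered, but not with the file
        }
    }).on('error', function () {
        serveLocalCopy(res);         // Google unreachable: use the local copy
    });
}).listen(8080);

As the comments below point out, proxying it yourself loses the shared-cache benefit of pointing clients at Google directly, so this mainly buys version control, not performance.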

Nate Bross
Yeah, but that eliminates the advantage of the Google one likely already being cached on the client's machine (because so many sites are already using the Google one). I might as well not even use the Google one if I'm just going to pass it along instead of hand off the bandwidth to Google.
Nosredna
I was only addressing the up-to-date version issue. You are correct this will not improve performance.
Nate Bross
Yeah. Good point on the versioning. Although I always lock to a version I've tested with.
Nosredna
+32  A: 

This seems to work for me:

<html>
<head>
<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
// has the google object loaded?
if (window.google && window.google.load) {
    google.load("jquery", "1.3.2");
} else {
    document.write('<script type="text/javascript" src="http://joecrawford.com/jquery-1.3.2.min.js"><\/script>');
}
window.onload = function() {
    $('#test').css({'border':'2px solid #f00'});
};
</script>
</head>
<body>
    <p id="test">hello jQuery</p>
</body>
</html>

The way it works is to use the google object that calling http://www.google.com/jsapi loads onto the window object. If that object is not present, we are assuming that access to Google is failing. If that is the case, we load a local copy using document.write. (I'm using my own server in this case, please use your own for testing this).

I also test for the presence of window.google.load - I could also do a typeof check to see that things are objects or functions as appropriate. But I think this does the trick.

Here's just the loading logic, since code highlighting seems to break when I post the whole HTML page I was testing:

if (window.google && window.google.load) {
    google.load("jquery", "1.3.2");
} else {
    document.write('<script type="text/javascript" src="http://joecrawford.com/jquery-1.3.2.min.js"><\/script>');
}

Though I must say, if this is a concern for your site visitors, I'm not sure you should be fiddling with the Google AJAX Libraries API at all.

Fun fact: I tried initially to use a try..catch block for this in various versions but could not find a combination that was as clean as this. I'd be interested to see other implementations of this idea, purely as an exercise.

artlung
What is the advantage of using google.load in this situation, rather than loading ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js directly, like Rony suggested? I guess loading it directly catches issues with removed libraries as well (what if Google stops serving JQuery 1.3.2). Furthermore, Rony's version notices network problems AFTER www.google.com/jsapi has been fetched, especially when jsapi has been loaded from cache? One might need to use the google.load callback to be sure (or maybe there's some return value to include the google.load in the if(..)).
Arjan
If one is testing for the presence of Google.com, one could make a network call, or one could check for the presence of the "gatekeeper" object. What I'm doing is checking for the google object and its "load" function. If both of those fail, no google, and I need the local version. Rony's version actually ignores the www.google.com/jsapi URL entirely, so I'm not sure why you indicate that it will have been fetched.
artlung
ah, I see how I caused the confusion. "Rony's version notices network problems AFTER www.google.com/jsapi has been fetched" should better read: "Your version does not notice network problems AFTER www.google.com/jsapi has been fetched".
Arjan
We've recently switched to using Google as our jQuery host; if we get any bug reports from blocked users, I'll be using a variant of your answer to refactor our client code. Good answer!
Jarrod Dixon
+8  A: 

There are some great solutions here, but I'd like to take it one step further regarding the local file.

In a scenario where Google does fail, it should load a local source, but maybe a physical file on the server isn't necessarily the best option. I bring this up because I'm currently implementing the same solution, only I want to fall back to a local file that gets generated from a data source.

My reason for this is that I want to have some peace of mind when it comes to keeping track of what I load from Google vs. what I have on the local server. If I want to change versions, I'll want to keep my local copy synced with what I'm trying to load from Google. In an environment where there are many developers, I think the best approach would be to automate this process so that all one would have to do is change a version number in a configuration file.

Here's my proposed solution that should work in theory:

  • In an application configuration file, I'll store 3 things: the absolute URL for the library, the URL for the JS API, and the version number
  • Write a class which gets the file contents of the library itself (gets the url from app config), stores it in my datasource with the name and version number
  • Write a handler which pulls my local file out of the db and caches the file until the version number changes.
  • If it does change (in my app config), my class will pull the file contents based on the version number, save it as a new record in my datasource, then the handler will kick in and serve up the new version.

In theory, if my code is written properly, all I would need to do is change the version number in my app config, then voilà! You have a fallback solution which is automated, and you don't have to maintain physical files on your server.
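
If it helps to picture the flow, here is a very rough sketch in plain JavaScript (the config values, CDN URL, and the in-memory stand-in for the datasource are all made up):

// Hypothetical sketch of the version-keyed fallback store described above.
var https = require('https');

var config = {
    library: 'jquery',
    version: '1.3.2',
    cdnUrl: 'https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js'
};

var store = {}; // stands in for the datasource, keyed by "name@version"

function getLocalFallback(callback) {
    var key = config.library + '@' + config.version;
    if (store[key]) {
        return callback(null, store[key]); // version unchanged: serve the stored record
    }
    // Version changed (or first run): pull a fresh copy and save it as a new record.
    https.get(config.cdnUrl, function (res) {
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            store[key] = body;
            callback(null, body);
        });
    }).on('error', callback);
}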

What does everyone think? Maybe this is overkill, but it could be an elegant method of maintaining your ajax libraries.

Acorn
If you're doing all that work *just* for jQuery, then I'd say it is overkill. However, if you already have some of those components in place for other pieces of your app (e.g. if you already load scripts from a DB) then it looks pretty nice.
Michael Haren
A: 

This is just a thought, but could you maybe include jQuery server-side? So, you have a server-side file called jQuery.php/jQuery.aspx/jQuery.asp (whatever server-side language you use). Then you just include that file in your head in a script tag like so ... <script type="text/javascript" src="jQuery.php"></script>. Inside your server-side file you could do an HTTP request to Google to get their copy of jQuery, and if the HTTP request fails you serve your own copy. Then you only have it included once and you get your failover. Thoughts?

Fred Clown
There is no benefit to this method over hosting JQuery by yourself. The main reason to use Google's JQuery is that people have it cached already in their browser, so no download at all is required. In this case, the browser would have to cache your server-side script, since it's coming from your domain.
zombat
Yeah, that's true. I was thinking too hard I guess and over-thought it to the point of failure.
Fred Clown
Yeah, it defeats the whole purpose. Good thinkin tho
Jonah
A: 

Don't use it! Why give Google all that precious data for free?

AnApprentice
My first reaction was: Duh! Then I mulled it over and discovered that you are right but I am afraid not in the sense you mean... :-) Nosredna's clients are giving their IP addresses AND their Google cookies (!) to Google when they surf to Nosredna's web site and therefore with the referrer Google can know who uses Nosredna's web site.
nalply
Sneaky little Google.
Jonah
Is it really for free, when you let Google pay for the traffic of hosting the API?
BerggreenDK