tags:
views: 1166
answers: 3

Hello,

I have a URL which I want to open in a FancyBox (or any other overlay-type pop-up). I don't know in advance whether the URL is valid, so I want to test it first. If it's invalid, I won't attach the FancyBox plugin to that particular URL; it will just remain a regular link. How can I test a URL before attaching a plugin to it? I tried something like this:

$("a.overlay").each(function() {
    var xhr = $.get(this.href, function(data, status) {
        // never executed in case of a failed request
    });
    if (xhr.status && xhr.status === 404) {
        // not good, do nothing
    } else {
        // can attach plugin here
    }
});

The problem is that xhr.status is not yet set by the time I check it, because JS doesn't wait for the request to complete. Similarly, I cannot rely on the callback function, because it is never executed when the request fails (I can see the failed request in Firebug, but that's not very useful).

Thanks and have a good weekend everyone.

+3  A: 

You can do it like this, although I would question how wise it is to make a request for each link...

$("a.overlay").each(function() {
    var $a = $(this);
    $.ajax({
        type: 'GET',
        url: this.href,
        success: function() {
            // attach plugin here
        },
        error: function() {
            // not good, log it
        }            
    });
});

If you're not going to do anything with the contents of the page, you can switch 'GET' to 'HEAD' to fetch only the response headers, which is faster and still tells you whether the URL is reachable.
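A minimal sketch of that HEAD variant, using the `complete` callback so both outcomes go through one status check. The `isUsable` helper and the `fancybox()` attach call are assumptions, not part of the original answer; substitute whatever call your overlay plugin actually uses:

```javascript
// Treat any 2xx status as a usable link (an assumption; tighten
// this to `status === 200` if you only want exact matches).
function isUsable(status) {
    return status >= 200 && status < 300;
}

// Defined but not invoked here: running it requires jQuery and the
// overlay plugin to be loaded in an actual page.
function probeOverlayLinks() {
    $("a.overlay").each(function() {
        var $a = $(this);
        $.ajax({
            type: 'HEAD',               // headers only, no page body
            url: this.href,
            complete: function(jqXHR) {
                if (isUsable(jqXHR.status)) {
                    $a.fancybox();      // assumption: how the plugin attaches
                } else {
                    // leave it as a plain link; log jqXHR.status if desired
                }
            }
        });
    });
}
```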

Paolo Bergantino
Yes, I could use .ajax (and it would be one, maybe two, such links per page). I just wondered whether there was some other trick I'd overlooked. To answer your question: I'd want to log the cases where the URL is not good.
dalbaeb
+2  A: 

Check out this jQuery plugin. It does a HEAD request; then you just need to check for errors during the request and make sure that the returned status code == 200.

Eli
Hmm. A very interesting plugin indeed. Seems to be more lightweight than pulling the whole page. Thanks!
dalbaeb
+1  A: 

Paolo gave you the answer to what you asked for, but it seems wasteful to me for every client to check the quality of the link every time. Not only does it waste the target's bandwidth, it wastes the client's time.

This is something that could and should be done once, (perhaps daily) on the server side. From there you should generate appropriate html/js. From an engineering standpoint, it just seems like a wiser approach to me.

I don't know what sort of server-side framework you're using, if any, but all of the ones I've worked with have fairly straightforward HTTP clients built in, along with cron-job/scheduled-job facilities.

Alternatively, you could author the page so that the first request of the day does the check, and the result is cached to disk (or memory) for subsequent requests throughout the day. That first load might go a bit slowly, though.
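The once-a-day caching idea could be sketched roughly like this (all names hypothetical; the actual HTTP HEAD probe is left as a pluggable function since it depends on your server stack). The template layer would then emit the overlay class only for links whose cached verdict is good:

```javascript
// Cache of link-check results: url -> { ok: boolean, checkedAt: ms timestamp }
var cache = {};
var ONE_DAY_MS = 24 * 60 * 60 * 1000;

// A cached result is reusable if it was taken within the last day.
function isFresh(entry, now) {
    return !!entry && (now - entry.checkedAt) < ONE_DAY_MS;
}

// Return the cached verdict, or run probe(url) -- your server-side
// HTTP HEAD check -- and cache its result for the rest of the day.
function checkLink(url, probe, now) {
    if (!isFresh(cache[url], now)) {
        cache[url] = { ok: probe(url), checkedAt: now };
    }
    return cache[url].ok;
}
```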

Breton
I don't think this has anything to do with the server side; it's a front-end issue about how to present the information to the user. Besides, the problem with doing it in the back end is that the developers have no time to take care of it, and the URL can be incorrect because it's entered by the marketing team through a CMS (which, again, has no built-in feature to check that the URL is correct). So handling it in the front end is probably the minimal-effort approach.
dalbaeb