I'm implementing click tracking from various pages in our corporate intranet in order to add some sorely needed crowd-sourced popular link features ("most popular links in your department in the last 24 hours", etc.)

I'm using jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire off a pseudo-ajax request with various data to a back-end server before returning true so that the link action fires:

$("#contentarea a").live("mousedown", function(ev) {
    //
    // detect event, find closest link, process it here
    //
    $.ajax({
        url: 'my-url',
        cache: false,
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        success: function() {
            // silence is golden -- server does send success JSONP but
            // regardless of success or failure, we allow the user to continue
        }
    });

    return true; // allow the event to continue; the user leaves the page
});

As you can probably guess from the above, I have several constraints:

  • The back-end tracking server is on a different sub-domain from the calling page. I can't get round this. That's why I am using JSONP (and GET) as opposed to proper AJAX with POST. I can't implement an AJAX proxy as the web servers do not have outbound network access for scripts.
  • This is probably not relevant, but in the interest of full disclosure, the content and script are inside a "main content" iframe (and this is not going to change). I will likely eventually move the event listener to the parent frame to monitor its links and all child content, but step 1 is getting it to work properly in the simplified case of one child window. Parent and child are same-domain.
  • The back-end is IIS/ASP (again, a constraint -- don't ask!), so I can't immediately fork the back-end process or otherwise terminate the response but keep processing like I could on a better platform

Despite all this, for the most part, the system works -- I click links on the page, and they appear in the database pretty seamlessly.

However, it isn't reliable -- a large number of links, particularly off-site links with their target set to "_top", never appear. If the link is opened in a new tab or window, it registers OK.

I have ruled out script errors -- it seems that either:

(a) the request is never making it to the back-end in time; or

(b) the request is making it, but ASP is detecting that the client is disconnecting shortly afterwards, and as it is a GET request, is not processing it.

I suspect (b), since latency to the server is very fast and many links register OK. If I put in an alert pop-up after the event fires, or set the return value to false, the click is registered OK.

Any advice on how I can solve this (in the context that I cannot change my constraints)? I can't make the GET request synchronous as it is not true AJAX.

Q: Would it work better if I was making a POST request to ASP? If (b) is the culprit, would it behave differently for POST vs GET? If so, I could use a hidden iframe/form to POST the data. However, I suspect this would be slower and clunkier, and might still not make it in time. I wouldn't be able to listen to see if the request completes, because it is cross-domain.

Q: Can I just add a delay to the script after the GET request is fired off? How do I do this in a single-threaded way? I need to return true from my function, to ensure the default event eventually fires, so I can't use setTimeout(). Would a tight loop waiting for 'success' to fire and set some variable work? I'm worried that this would freeze up things too much and the response would be slowed down. I assume the jQuery delay() plugin is just a loop too?
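For what it's worth, the kind of tight delay loop described above would look roughly like this. It is a sketch only -- `isDone` stands in for whatever flag the JSONP success callback would set -- and it deliberately demonstrates why the approach is risky: it blocks the single JavaScript thread, so animations and input freeze until it returns.

```javascript
// Busy-wait until a condition holds or a deadline passes.
// WARNING: this blocks the JS thread -- nothing else on the page
// (animations, other handlers) can run while it spins.
function waitFor(isDone, maxMs) {
    var deadline = new Date().getTime() + maxMs;
    while (!isDone() && new Date().getTime() < deadline) {
        // spin
    }
}
```

In the handler, the JSONP success callback would set a flag, and the mousedown handler would call `waitFor(flagIsSet, 1000)` just before `return true`.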

Or is something else I haven't thought of likely to be the culprit?

I don't need bullet-proof reliability. If all links are equally catchable 95% of the time it is fine. However right now, some links are catchable 100% of the time, while others are uncatchable -- which isn't going to cut it for what I want to achieve.

Thanks in advance.

A: 

I would try returning false from the link event handler, remembering the URL, and navigating away only when the JSONP request succeeds. Hopefully it shouldn't add too much latency. Considering you are on the intranet, it might be OK.
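Concretely, something like the sketch below (it assumes plain navigation links and drops any other default behaviour; the selector, callback name, and `timeout` value are illustrative -- the timeout matters because JSONP error handling is unreliable in older jQuery, so without it the user could be stranded if the tracker never responds):

```javascript
// Sketch: cancel the default click, fire the tracking request,
// then navigate manually once the request completes either way.
function deferNavigation(selector, trackUrl) {
    $(selector).live("mousedown", function (ev) {
        var href = this.href;
        $.ajax({
            url: trackUrl,
            cache: false,
            dataType: "jsonp",
            jsonp: "cb",
            data: { link: href },
            timeout: 1000, // guard against the tracker never answering
            // complete fires on success, error, and timeout alike,
            // so the navigation always happens eventually
            complete: function () {
                window.location.href = href;
            }
        });
        return false; // suppress the default navigation for now
    });
}
```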

Igor Zevaka
Thanks -- I had thought of that. Problem is, some of the links have javascript events already tied to them (e.g. show in thickbox). Is there some way I can store the entire event and then get it to execute later? Note that they're not onclick events, they are also dynamically applied using jQuery.
Jhong
A: 

I would try a different approach. You can bind to a different event like:

$(window).unload(function(event) {
  // tracking code here
});
kingjeffrey
Thanks -- good thought. But given that I still need to return true, and can't send the request synchronously, wouldn't this still be subject to "cutting off"? Also, some links open in new windows or in a thickbox-style popup. I assume these wouldn't fire window.onunload, and I need to track them too.
Jhong
Yeah, `unload` may not be the event for you. Without being able to send it synchronously, the only option I can think of is a some kind of delay loop to artificially slow your code. But that is less than desirable.
kingjeffrey
I think I'm going to have to try this. Just a bit scared of locking people's systems up -- many users are still on IE6 on quite old laptops. If I loop waiting for the request to complete or time out, will background threads still execute? I have some on-page animations, etc.; it would be nice for them to carry on in the half-second or so it takes for my loop.
Jhong
Other processes should continue to run, but in lower-performing browsers you may see them lock up. If it's not true ajax, how will you know when the request is complete? Do you just cycle through a basic loop until a setTimeout fires, and hope? That is a tough set of constraints you are working with!
kingjeffrey
JSONP with GET can receive a response -- that's why I'm reticent to use POST/iframe, because the response would be sandboxed. That said, I don't act on the response; I just let the user go on to whatever they were clicking.
Jhong
I am thinking a combination of your original response and a loop will work. Since it is only events that fire window.unload that fail, I can hook into onunload and only add a delay there. I can probably listen for a maximum of, say, 1 second before giving up and moving on. I'm thinking also that wrapping an artificial loop around the ASP code might stop it from abandoning execution. I'll let you know the result. Probably "all of the above".
Jhong
If you get a response, why can't you set `async: false` and avoid the loop hack?
kingjeffrey
A: 

Solved!

The short answer is: there is no reliable way to do this cross-domain with a GET request. I tried all sorts, including storing the event and trying to replay the event later, and all manner of hacks to try to get that to work.

I then tried tight loops, and they weren't reliable either.

Finally, I just gave in and used a dynamically created form that POSTed the results, with its target set to a hidden iframe.

That works reliably -- it seems the browser pauses to finish its POST request before moving on, and ASP honours the POST. Turns out it's not 'clunky' at all. Sure, due to the browser security model I can't see the result... but it doesn't matter in this case.
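For reference, the working approach looks roughly like this sketch (the iframe id/name, the tracking URL, and the field names are all illustrative, not the exact production code):

```javascript
// Sketch of the hidden-iframe POST that finally worked.
// The response lands in the invisible iframe, so the browser's
// security model hides it from us -- which is fine, we don't need it.
function postTracking(actionUrl, data) {
    // Reuse a single hidden iframe as the POST target.
    var frame = $("#tracking-frame");
    if (!frame.length) {
        frame = $('<iframe id="tracking-frame" name="tracking-frame"></iframe>')
            .hide().appendTo("body");
    }
    // Build a throwaway form aimed at the iframe and submit it.
    var form = $('<form method="POST"></form>')
        .attr("action", actionUrl)
        .attr("target", "tracking-frame")
        .hide();
    $.each(data, function (name, value) {
        $('<input type="hidden">').attr("name", name).val(value).appendTo(form);
    });
    form.appendTo("body").submit();
    // Clean up once the submit has been dispatched.
    setTimeout(function () { form.remove(); }, 0);
}
```

The mousedown handler calls this instead of `$.ajax` and still returns true; because the POST is a real navigation inside the iframe, the browser lets it finish before honouring the link's own navigation.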

I am now kicking myself that I didn't try that option first.

Jhong