views: 1139
answers: 4
I'm using jQuery.getJSON(...) to make a call and process the response for a slightly large data set. A response time of a couple of seconds is expected (there's an animated loading graphic to placate the user).

That said, the loading graphic, response, and processing all work fine in every browser. In Internet Explorer (6/7/8), however, the "Stop running this script" error appears. If allowed to proceed, the script completes with no issue.

$(document).ready(function() {
    $("#tree").treeview({ collapsed: true, animated: "slow" });
    $("#tree").hide();

    $("#loadingGraphic").fadeIn("slow");

    $.getJSON("mygenerichandler.ashx", function(data) {
        var items;
        for (var i in data) {
            items = $(buildHierarchy(data[i])).appendTo("#tree");
            $("#tree").treeview({ add: items });
        }

        $("#loadingGraphic").fadeOut("slow", function() {
            $("#tree").slideDown("slow");
        });
    });
});

function buildHierarchy(data) {
    var li;

    li = "<li><span class='folder'>" + data.Name + "</span><ul>";

    for (var i in data.Folders) {
        // recursive call to fill out children
        li += buildHierarchy(data.Folders[i]);
    }

    for (var i in data.Files) {
        li += "<li><span class='file'>" + data.Files[i].Name + "</span></li>";
    }

    li += "</ul></li>";

    return li;
}

I realize Internet Explorer has a preference for this that can be set via the Windows registry; however, I'm curious how other developers handle expectedly large or slow responses to an AJAX request.

EDIT: Updated code snippet

+2  A: 

The slow script notification is most likely not for the actual request, but for the script you run to process the data returned by the request. It could also be the jQuery code that parses your JSON.

If you post your script that is "manipulating the data" (i.e. the commented portion in your snippet), we could perhaps help in optimizing it.

[Edit] You're right; you should consider lazy-loading the tree. How many root nodes do you usually have? You may have some luck taking the appendTo() out of the loop: either build the whole HTML in one go and do a single massive appendTo(), or use an intermediate div not attached to the DOM to accumulate the nodes and then append it to the main #tree element.

var tempDiv = $('<div></div>');
for (var i in data) {
    tempDiv.append($(buildHierarchy(data[i])));
}
$("#tree").append(tempDiv);
$("#tree").treeview({ add: tempDiv }); //don't know if this works? Maybe tempDiv.children() ?
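The "build the whole HTML in one go" variant could be sketched like this (a minimal illustration; buildAllHierarchies is a made-up helper name, and buildHierarchy is repeated from the question so the snippet is self-contained):

```javascript
// buildHierarchy is the question's function, copied here verbatim.
function buildHierarchy(data) {
    var li = "<li><span class='folder'>" + data.Name + "</span><ul>";
    for (var i in data.Folders) {
        li += buildHierarchy(data.Folders[i]);
    }
    for (var i in data.Files) {
        li += "<li><span class='file'>" + data.Files[i].Name + "</span></li>";
    }
    return li + "</ul></li>";
}

// Accumulate all the markup in one string, then touch the DOM once.
function buildAllHierarchies(data) {
    var html = "";
    for (var i in data) {
        html += buildHierarchy(data[i]);
    }
    return html;
}

// Single DOM insertion instead of one per root node:
// var items = $(buildAllHierarchies(data)).appendTo("#tree");
// $("#tree").treeview({ add: items });
```

String concatenation plus one append avoids repeated reflows, which are especially costly in IE6/7.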
Chetan Sastry
Point well taken. I've updated the script snippet above with the full code sample, although I may need to restructure the way requests are being made (i.e., don't populate the entire tree at once).
Alexis Abril
A: 

You could refactor the code to call setTimeout(..., 0) before each recursive call so that the UI thread regains control and stays responsive. Google 'setTimeout threading' for more info.
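That yield-to-the-UI pattern could be sketched as follows (a minimal illustration; processQueue and doWork are made-up names, not from the question's code):

```javascript
// Process one item, then hand control back to the browser before the
// next one, so the page stays responsive. Each timer callback runs as
// a fresh script, so IE's statement counter never accumulates enough
// to trip the "Stop running this script" dialog.
function processQueue(items, doWork) {
    if (items.length === 0) return;
    doWork(items[0]);
    setTimeout(function() {
        processQueue(items.slice(1), doWork);
    }, 0);
}
```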

You could also break up the large data set into chunks handled by successive requests. In IE8, you might want to double-check that the native JSON.parse() method is being used, since it's faster than a JS implementation.
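Feature-detecting the native parser might look like this (a sketch; parseResponse is a made-up helper name, and the $.parseJSON fallback requires jQuery 1.4.1 or later):

```javascript
// Use the browser's native JSON parser when present (IE8+ in
// standards mode); otherwise fall back to jQuery's implementation.
function parseResponse(text) {
    if (typeof JSON !== "undefined" && typeof JSON.parse === "function") {
        return JSON.parse(text); // native, much faster on large payloads
    }
    return $.parseJSON(text);    // fallback for IE6/7
}
```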

Annie
+1  A: 

It's not the request time that is the problem. The request is asynchronous, so there is no script running while you are waiting for the response.

It's the script that is processing the result that is taking too long. Send it to a function that handles a part of the data, and calls itself using a timer to return the control to the browser for a moment:

function handleData(data, offset) {
  // pick the next 1000 items or so:
  var size = Math.min(1000, data.length - offset);
  // process [offset .. offset+size-1] of the data
  // ...
  offset += size;
  if (offset < data.length) {
    window.setTimeout(function() { handleData(data, offset); }, 1);
  } else {
    $("#loadingGraphic").fadeOut("slow", function() {
      $("#tree").slideDown("slow");
    });
  }
}

Call it with:

$.getJSON("mygenerichandler.ashx", function(data) {
  handleData(data, 0);
});
Guffa
+1  A: 

My guess is that it's neither the loading of the data nor the processing you're doing in your code. I think it's the transformation of the JSON data received over HTTP into a JavaScript object.

IE basically does an eval() to build a hash from string data. This is very, very, very inefficient for long strings. I suspect there's a Schlemiel the Painter algorithm behind it. I had exactly the same thing a year or two ago and solved it by removing unused or redundant data from the JSON string.

If you can't shorten the string by removing elements, you can try to chop up the string on the server by breaking it up into component objects ('pages' if you will) and fetch them one by one. That way you don't have to pay the inefficient processing tax for long strings and process several short strings instead.
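The page-splitting idea could be sketched on the client side like this (illustration only; in practice the server would slice the data per request, and the pageSize value is an assumption):

```javascript
// Split a large item list into fixed-size pages; each page would then
// be fetched and parsed by its own short request, keeping every JSON
// string small enough to parse quickly.
function paginate(items, pageSize) {
    var pages = [];
    for (var i = 0; i < items.length; i += pageSize) {
        pages.push(items.slice(i, i + pageSize));
    }
    return pages;
}
```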

Michiel Kalkman