I would ditch the "each" function in favour of a plain for loop, since it is faster. I would also add some waits using "setTimeout", but only every so often and only when needed. You don't want to wait 5 ms for every record, because then processing 3500 records would take roughly 17.5 seconds.

Below is an example using a for loop that processes records in batches of 100 (you can tweak that number) with a 5 ms pause between batches, which adds only about 175 ms of overhead for 3500 records.
var xmlElements = $(xmlDoc).find('Object');
var length = xmlElements.length;
var index = 0;

var process = function() {
    for (; index < length; index++) {
        var toProcess = xmlElements[index];
        // Perform xml processing

        // After every 100 records, yield to the browser for 5 ms
        // and resume from the next record
        if ((index + 1) % 100 == 0 && index + 1 < length) {
            index++;
            setTimeout(process, 5);
            return;
        }
    }
};

process();
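Because "index" lives in the enclosing scope, each call scheduled by "setTimeout" picks up exactly where the previous one stopped. The "return" is what actually hands control back to the browser between batches, so the page stays responsive while the remaining records wait their turn.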
I would also benchmark the different parts of the XML processing to see whether there is a bottleneck somewhere that can be fixed. In Firefox you can use Firebug's profiler, or simply time a block of code and write the result to the console like this:
// start benchmark
var t = new Date();
// some xml processing
// parentheses needed so the dates are subtracted before string concatenation
console.log("Time to process: " + (new Date() - t) + "ms");
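If the console you are using supports them (Firebug and modern browsers generally do), the "console.time" / "console.timeEnd" pair does the same measurement with less code; the label string is arbitrary, it just has to match:

// start benchmark
console.time("xml processing");
// some xml processing
console.timeEnd("xml processing"); // logs something like "xml processing: 42ms"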
Hope this helps.