I'm working on a utility web app to help manipulate some domain-specific XML data.
The flow goes like this:
- Load XML file
- Parse XML file using the browser's native XML objects (not jQuery!) and convert into JavaScript object.
- Store resulting object using $(document).data()
- Iterate through object and extract additional information, storing that in another $(document).data() slot
Because that last step can take a while, I'm using window.setTimeout() to split the work up into chunks.
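For context, the parse-and-store part (steps 2 and 3 above) looks roughly like this. It's a simplified sketch: convertXmlToObject is just a stand-in for my actual XML-to-object conversion code, and the empty lists mirror the fields collected in the function further down.

```javascript
// Simplified sketch of the load/parse/store steps.
// convertXmlToObject is a placeholder for my real DOM-to-object conversion code.
function loadData(xmlText) {
    // Parse with the browser's native parser, not jQuery
    var xmlDoc = new DOMParser().parseFromString(xmlText, 'text/xml');

    // Convert the XML DOM into a plain JavaScript object
    var data = convertXmlToObject(xmlDoc);

    // Stash the object on the document, plus empty lists for the extracted values
    $(document).data('data', data);
    $(document).data('lists', {
        make: [], model: [], module: [],
        doorlock: [], doorlockCombo: [], tHarness: []
    });
}
```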
Here is the chunked iteration function:
```javascript
function explodeDataStep(index, max) {
    var data = $(document).data('data');
    var lists = $(document).data('lists');
    $debug('explodeDataStep', index, $(document).data('data'), $.data(document));

    var count = 0;
    for (; index < data.vehicles.length; index++) {
        var vehicle = data.vehicles[index];

        // Collect the distinct values seen so far for each field
        if ($.inArray(vehicle.make, lists.make) < 0) lists.make.push(vehicle.make);
        if ($.grep(lists.model, function(v) {
                return v.make == vehicle.make && v.model == vehicle.model;
            }).length == 0) {
            lists.model.push({ make: vehicle.make, model: vehicle.model });
        }
        if ($.inArray(vehicle.module, lists.module) < 0) lists.module.push(vehicle.module);
        if ($.inArray(vehicle.doorlock, lists.doorlock) < 0) lists.doorlock.push(vehicle.doorlock);
        if ($.inArray(vehicle.doorlockCombo, lists.doorlockCombo) < 0) lists.doorlockCombo.push(vehicle.doorlockCombo);
        if ($.inArray(vehicle.tHarness, lists.tHarness) < 0) lists.tHarness.push(vehicle.tHarness);

        count++;
        if (count >= max) {
            // Finished a full chunk: schedule the next chunk and yield to the browser
            index++;
            updateExplodeDataStatus(index);
            window.setTimeout(explodeDataStep, 10, index, max);
            return;
        }
    }
    finishExplodeData();
}
```
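It gets kicked off roughly like this once the data is stored. Again a sketch: the chunk size of 100 is illustrative rather than my exact value, and updateExplodeDataStatus / finishExplodeData are my own progress and completion helpers, not shown here.

```javascript
// Kick off the chunked iteration: start at vehicle 0, process up to `max` per chunk.
// The chunk size (100) is illustrative, not necessarily the value I actually use.
function explodeData() {
    explodeDataStep(0, 100);
}
```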
For some reason, when the index gets up to around 480, I'm noticing that some of the data stored in $(document).data('data') just disappears, and I can't for the life of me figure out why.
So, here are some questions that may lead to the answer:
- Is using window.setTimeout() in this fashion an incredibly bad idea?
- Are there limits as to how much can be stored using jQuery.data()? My XML file is ~100KB.