I have a webpage on my site that displays a table, reloads the XML source data every 10 seconds (with an XMLHttpRequest), and then updates the table to show the user any additions to or removals from the data. To do this, the JavaScript function first clears out all elements from the table and then adds a new row for each unit of data.

Recently, I battled through a number of memory leaks in Internet Explorer caused by this DOM destroy-and-create code (most of them having to do with circular references between JavaScript objects and DOM objects, and the JavaScript library we are using quietly keeping a reference to every JS object created with new Element(...) until the page is unloaded).

With the memory problems solved, we've now uncovered a CPU-based problem: when the user has a large amount of data to view (100+ units of data, which equals 100 <tr> nodes to create, plus all of the table cells for each column), the process ties up the CPU until Internet Explorer prompts the user with:

Stop running this script?
A script on this page is causing Internet Explorer to run slowly. If it continues to run, your computer may become unresponsive.

It seems that running the row-and-cell-creation code for 100+ pieces of data is what causes the CPU usage to spike and the function to take "too long" (from IE's perspective) to run, which triggers this warning for the user. I've also noticed that while the "update screen" function runs for the 100 rows, IE does not re-render the table contents until the function completes (presumably because the JS interpreter is using 100% CPU for that period).

So my question is: Is there any way in JavaScript to tell the browser to pause JS execution and re-render the DOM? If not, are there any strategies for handling creating large amounts of DOM nodes and not having the browser choke?

One method I can think of would be to handle the "update table" logic asynchronously; that is, once the Ajax method to reload the XML data completes, put the data into some sort of array, and then set a function (using setInterval()) to run which handles one element of the array at a time. However, this seems a bit like re-creating threading in a JavaScript environment, which could get very complicated (e.g. what if another Ajax data request fires while I'm still re-creating the table's DOM nodes?).
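For what it's worth, a minimal sketch of that queue-based idea (pendingRows, addRowToTable, and the 10ms interval are all placeholder names/values of mine, not from any library):

    var pendingRows = [];
    var timerId = null;

    function onXmlLoaded(rows) {
        // A fresh Ajax response simply replaces whatever is still queued,
        // so a request that fires mid-rebuild never interleaves stale data.
        pendingRows = rows.slice(0);
        // (the existing clear-the-table step would go here)
        if (timerId === null) {
            timerId = window.setInterval(processOneRow, 10);
        }
    }

    function processOneRow() {
        if (pendingRows.length === 0) {
            window.clearInterval(timerId);
            timerId = null;
            return;
        }
        addRowToTable(pendingRows.shift()); // the existing row-building code
    }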


update: Just wanted to explain why I'm accepting RoBurg's answer. In doing some testing, I've found that the new Element() method in my framework (I'm using mootools) is about 2x as slow as the traditional document.createElement() in IE7. I ran a test to create 1000 <span> elements and add them to a <div>: using new Element() takes about 1800ms on IE7 (running on Virtual PC), while the traditional method takes about 800ms.

My test also revealed an even quicker method, at least for a simple test such as mine: using DocumentFragments, as described by John Resig. Running the same test on the same machine with IE7 took 247ms - a 9x improvement over my original method!
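For reference, the fragment version of that test looks roughly like this (the "container" id is just a stand-in for my test page's target div):

    var frag = document.createDocumentFragment();
    for (var i = 0; i < 1000; i++) {
        var span = document.createElement("span");
        span.appendChild(document.createTextNode("item " + i));
        frag.appendChild(span);
    }
    // One appendChild call attaches all 1000 spans, so the live document
    // is touched (and re-rendered) once instead of 1000 times.
    document.getElementById("container").appendChild(frag);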

+1  A: 

I have experienced similar problems at around 3000 table rows of complex data, so there is something not entirely right with your code. How does it run in Firefox? Can you check in several different browsers?

Are you binding to onpropertychange anywhere? This is a really dangerous IE event that has caused me severe IE-specific headaches before. Are you using CSS selectors anywhere? These are notoriously slow in IE.

krosenvold
Negative to your last two questions. Firefox is OK, but noticeably unresponsive during the update. I'll investigate improving the code itself.
matt b
+1  A: 

You could build a string representation of the rows and assign it to a node's innerHTML, so the browser's parser builds the elements instead of script.
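Something along these lines, assuming the data values are already HTML-escaped (the field names and the "tableWrapper" id are made up for the sketch):

    var html = [];
    for (var i = 0; i < data.length; i++) {
        html.push("<tr><td>" + data[i].name + "</td><td>" + data[i].value + "</td></tr>");
    }
    // IE treats the innerHTML of <table> and <tbody> as read-only,
    // so replace the markup of a wrapper element instead.
    document.getElementById("tableWrapper").innerHTML =
        "<table><tbody>" + html.join("") + "</tbody></table>";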

Mehrdad Afshari
+4  A: 

100 <tr>'s isn't really that much... are you still using that framework's new Element()? That might be the cause of it.

You should test the speed of new Element() vs document.createElement() vs .innerHTML
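A crude way to measure it (the alert, the 1000-iteration count, and timeIt are arbitrary choices of mine; new Element() is the mootools constructor from the question):

    function timeIt(label, fn) {
        var start = new Date().getTime();
        fn();
        alert(label + ": " + (new Date().getTime() - start) + "ms");
    }

    timeIt("document.createElement", function () {
        for (var i = 0; i < 1000; i++) {
            document.createElement("span");
        }
    });

    timeIt("new Element", function () {
        for (var i = 0; i < 1000; i++) {
            new Element("span");
        }
    });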

Also try building the DOM tree "in memory" and then appending it to the document at the end.

Finally, watch that you're not reading .length too often, or other bits and bobs like that.
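For example, caching the length once instead of re-reading it from a live DOM collection on every pass (table and processRow stand in for the real code):

    var rows = table.getElementsByTagName("tr");
    // rows is a *live* collection, so rows.length re-queries the DOM
    // each time; read it once into a local variable instead.
    for (var i = 0, len = rows.length; i < len; i++) {
        processRow(rows[i]); // placeholder for the per-row work
    }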

Greg
I've read that building up a sub-tree and then adding it to the document at the end can leak objects as well - have you had that experience?
matt b
Hmm nope. I've had other leaks but never heard of that one.
Greg
My sources: http://msdn.microsoft.com/en-us/library/bb250448.aspx and http://www.codeproject.com/KB/scripting/leakpatterns.aspx (look under "cross-page leaks") - note, though, that these are from 2005.
matt b
A: 

You could cloneNode(false) the element to be repeated, and then use that for the loop, instead of generating the element each time.
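A rough sketch of that idea (data and tbody are assumed to exist; note that cloneNode(true) also copies the child cells, while cloneNode(false) clones the bare <tr> alone):

    // Build the repeated element once...
    var protoRow = document.createElement("tr");
    protoRow.appendChild(document.createElement("td"));

    for (var i = 0; i < data.length; i++) {
        // ...then stamp out copies instead of re-creating it each pass.
        var row = protoRow.cloneNode(true);
        row.firstChild.appendChild(document.createTextNode(data[i].name));
        tbody.appendChild(row);
    }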

Luca Matteis
A: 

Avoid using one large for loop to render all of the data in one big step. Look at breaking up the data into smaller chunks instead.

Render each smaller chunk with a while loop. After that chunk is done, schedule the next chunk with setTimeout (a 1ms delay is enough) to give the browser a "breather".

You can avoid using the while loop altogether and just use a setTimeout per element.

Using the setTimeout technique WILL slow down the execution speed a bit, but you should not get the warning message.
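A sketch of the chunking pattern (the chunk size and the addRowToTable call are placeholders for the real code):

    function renderInChunks(rows, chunkSize) {
        var index = 0;
        function nextChunk() {
            var end = Math.min(index + chunkSize, rows.length);
            while (index < end) {
                addRowToTable(rows[index]); // the existing row-building code
                index++;
            }
            if (index < rows.length) {
                // Yield so the browser can repaint and handle input,
                // then continue with the next slice of rows.
                window.setTimeout(nextChunk, 1);
            }
        }
        nextChunk();
    }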

Another thing to look at: don't append each element individually to the document. Instead, create a new tbody, append the new rows to it, and then append the tbody to the table.
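Roughly (the "dataTable" id and buildRow are placeholders for the real markup and row-building code):

    var newBody = document.createElement("tbody");
    for (var i = 0; i < data.length; i++) {
        newBody.appendChild(buildRow(data[i]));
    }
    // Swap the finished tbody in with a single DOM operation.
    var table = document.getElementById("dataTable");
    table.replaceChild(newBody, table.tBodies[0]);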

There are a lot of other things that can cause a slow application. It can be fun to flush them all out.

There is a neat study of the warning here: http://ajaxian.com/archives/what-causes-the-long-running-script-warning

epascarello