views: 1161

answers: 4

I thought this might be a fast way to remove the contents of a very large table (3000 rows):

$jq("tbody", myTable).remove();

But it's taking around five seconds to complete in Firefox. Am I doing something dumb (aside from trying to load 3000 rows into a browser)? Is there a faster way to do it?

+14  A: 
$(myTable).empty();

That's as fast as you get.

Seb
Hmmm. Frustrating. I would think that deleting would be much faster than insertion. Kind of makes me want to do really ugly stuff like just hide the table and create a new one when I want to update it.
morgancodes
Yeah, well... HTML was not created to show 3k rows in a page :) Can't you think of any paginated solution? That would make it much quicker. Sure it would demand more work, but it will be a much richer user experience.
Seb
I built it paginated originally, but the client insisted on scrolling :( Good news is 3k rows is an edge case. More common will be a few hundred.
morgancodes
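Along the lines of the "ugly" table-swap idea in the comments above, here is a minimal sketch (not from any answer in this thread) of swapping out the whole `<tbody>` in a single DOM operation, so the browser does one tree detach instead of thousands of row removals. It assumes `myTable` is the DOM table element from the question, and note that it deliberately skips jQuery's data/event cleanup, which is part of what makes `empty()` slow:

```javascript
// Swap the populated <tbody> for an empty one in one DOM operation.
// Caveat: rows with jQuery handlers/data bound to them should be
// cleaned up first, or that bookkeeping is leaked.
function clearTableFast(myTable) {
  var oldBody = myTable.tBodies[0];
  var newBody = document.createElement("tbody");
  myTable.replaceChild(newBody, oldBody); // single detach + attach
  return newBody; // append fresh rows here
}
```

This is essentially the "create a new one when I want to update it" approach, limited to the table body so the table element itself (and anything bound to it) stays in place.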
+2  A: 

Two issues I can see here:

  1. The empty() and remove() methods of jQuery actually do quite a bit of work. See John Resig's JavaScript Function Call Profiling for why.

  2. The other thing is that for large amounts of tabular data you might consider a datagrid library such as the excellent DataTables. It loads your data on the fly from the server, increasing the number of network calls but decreasing the size of each one. I had a very complicated table with 1500 rows that got quite slow; switching to the AJAX-based table made the same data seem rather fast.
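As a hedged sketch of the server-side approach described above, here is a DataTables configuration using the legacy (1.9-era) option names `bServerSide`, `sAjaxSource`, and `iDisplayLength`; the `/rows` endpoint is hypothetical:

```javascript
// Legacy DataTables (1.9-era) server-side options. Option names are from
// that API generation; the "/rows" endpoint is a hypothetical example.
var dataTableConfig = {
  bServerSide: true,     // fetch one page of rows per request
  sAjaxSource: "/rows",  // server endpoint returning JSON pages
  iDisplayLength: 100    // rows per network call
};
// In the page: $("#myTable").dataTable(dataTableConfig);
```

With this setup the browser only ever holds one page of rows in the DOM, which sidesteps the slow bulk removal entirely.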

artlung
Thanks artlung. Doing something a bit like that actually, getting all the data at once from the server, but only drawing table rows when needed.
morgancodes
Sounds like a good call. I am wondering if worrying about the number of rows in a table in a browser will always be an issue, or if, as memory in most computers goes up, this will become less of an issue.
artlung
Memory isn't a problem with the amount of data I'm loading. The bottleneck is DOM manipulation.
morgancodes
I think we're saying the same thing. The more data you load, the more DOM nodes you create, so the two are related in terms of memory needed. I hope your situation has improved, regardless.
artlung
+3  A: 

Huge Improvement in Performance with Patch

I applied the patch for the bug and got a huge speed boost. Previously, removing approximately 4800 child elements took 42s; it now takes less than 1s. YMMV, but the changes looked pretty surgical and safe. I'll report back if I see trouble.

As there were other diffs in the development version, this had to be done "by hand."

Michael Mikowski
Cool! Thanks for reporting back!
morgancodes
If you like, I could send you a copy of the patched code. It really is a few orders of magnitude faster.
Michael Mikowski
Nightly builds of jQuery v1.4a2pre have this patch. http://docs.jquery.com/Downloading_jQuery#Nightly_Builds
Luis Melgratti
A: 

If you only want fast removal, you can do it like this:

$( "#tableId tbody tr" ).each( function(){
  this.parentNode.removeChild( this ); 
});

However, if the table contains elements with event handlers bound to them, the code above won't prevent memory leaks in IE, and it isn't any faster in Firefox either. Sorry.
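To address the leak caveat, a sketch that cleans up jQuery's per-element bookkeeping before the raw removal, assuming a jQuery 1.x API where `.unbind()` and `jQuery.removeData()` exist (the function name and `rows` parameter are illustrative):

```javascript
// Remove rows with native removeChild, but first drop jQuery's bound
// handlers and data cache so old IE doesn't leak them.
function removeRowsCleanly(rows) {
  // `rows` is a jQuery collection, e.g. $("#tableId tbody tr")
  rows.each(function () {
    jQuery(this).unbind();    // drop all bound event handlers
    jQuery.removeData(this);  // drop jQuery's per-element data cache
    this.parentNode.removeChild(this);
  });
}
```

This is roughly what jQuery's own `remove()`/`empty()` do internally, which is also why they are slower than the bare loop.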

nayasis