Basically I've got a huge table, which gets even bigger as the user scrolls down (subsequent rows are preloaded automatically). At some point the browser becomes sluggish: it starts to hang for a moment as I click around or try to scroll, and the more rows it gets, the more sluggish it becomes. I wonder if there is any limit on the number of elements a page can hold? Or maybe it's just my JavaScript leaking somewhere (although I've got only one event handler, attached to the tbody of the table, plus a script that parses the bubbled mousedown events).

Update: The delay becomes noticeable after about a thousand loaded rows. The scrolling speed itself is bearable, but highlighting the clicked row, for example (done by that single event handler on the tbody), is painful: it takes at least 2-3 seconds, and the delay grows with the number of rows. I observe the delay in all browsers, and it's not only me but almost everyone who visits the page, so I guess it affects every platform to some extent.

Update: I came up with a simple example here: http://client.infinity-8.me/table.php?num=1000 (you can pass whatever number you want to num). Basically it renders a table with num rows and has a single event handler attached to the parent table. I have to conclude from this that there is actually no noticeable drop in performance caused by the number of child elements alone, so it's probably a leak somewhere else :(
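
For reference, the single-handler setup is roughly this kind of thing (a simplified sketch only; the table id and the highlight class here are placeholders, not the real code):

```
// Simplified stand-in for the real handler: "data-table" and "highlight"
// are made-up names used only for illustration.
var tbody = document.getElementById('data-table').tBodies[0];

tbody.onmousedown = function (e) {
    e = e || window.event;
    var node = e.target || e.srcElement;

    // Walk up from the clicked cell to the enclosing row.
    while (node && node !== tbody && node.nodeName !== 'TR') {
        node = node.parentNode;
    }
    if (!node || node === tbody) { return; }

    // Highlight only the clicked row.
    node.className = (node.className === 'highlight') ? '' : 'highlight';
};
```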

A: 

I don't think there is a limit. However, the longer an HTML file is, the more resources your computer will need. But the table would have to be very large before that matters...

JochenJung
A: 

The limit is really determined by the user agent and the client machine being used. HTML, like XML, is a tree format: the more elements there are, the more of the tree the browser has to traverse to render the page.

I had issues adding more than 100 tables to a div (as an old workaround for IE6 not being able to create table elements dynamically).

Russell
What was the issue with IE6 not being able to create table elements dynamically? There are two issues I know of: one is that if you create rows using `.appendChild()` you need to ensure you have/add a `tbody` before adding `TR` rows. The second is that you can't use innerHTML, e.g. `TableElem.innerHTML = '<tr><td>....</td></tr>';`, a bug that still exists in IE today, stemming from Eric Vasilik's 14-year-old bug: http://www.ericvasilik.com/2006/07/code-karma.html
scunliffe
IE6 didn't allow me to use document.createElement("table"). I think it was also the issue with needing a tbody element. This was quite a while ago (we don't drink 9-year-old milk :P).
Russell
+3  A: 

If you have got JS on each table row, old computers will not handle it well. For the HTML itself you shouldn't worry much.

You should worry about the fact that a normal human being doesn't like large tables; that's what pagination was made for. Split the data into pages for better usability, among other concerns.

Think of a book that has no pages but is one long page: would you want to read it, even if your eyes (the PC in our case) could handle it?

eugeneK
+2  A: 

I don't think there is a limit defined by the standard. There might be a limit hard-coded in each browser implementation, though I would imagine that this limit is likely to be billions of elements. Another limit is the amount of addressable memory.

To solve your problem: As well as automatically loading elements as you scroll down, you could automatically unload the ones that have scrolled up off the screen. Then your program will remain fast even after scrolling a lot.
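
A rough sketch of the unloading idea, under a couple of assumptions (a single tbody, and an empty spacer row at the top that absorbs the removed height so the scroll position doesn't jump); the id is a placeholder:

```
// Remove rows that have scrolled well above the viewport and grow a
// spacer row by the same height, so the scrollbar does not jump.
// "data-table" and the spacer row are assumptions, not existing code.
function pruneOffscreenRows() {
    var tbody = document.getElementById('data-table').tBodies[0];
    var spacer = tbody.rows[0];          // first row is an empty spacer
    var removed = 0;

    while (tbody.rows.length > 1 &&
           tbody.rows[1].getBoundingClientRect().bottom < -1000) {
        removed += tbody.rows[1].offsetHeight;
        tbody.deleteRow(1);
    }

    if (removed > 0) {
        var current = parseInt(spacer.style.height, 10) || 0;
        spacer.style.height = (current + removed) + 'px';
    }
}

window.onscroll = pruneOffscreenRows;
```

The same trick in reverse (re-inserting rows and shrinking the spacer) would handle scrolling back up.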

You may also want to consider an alternative interface such as paging.

Mark Byers
I guess unloading is the way to go (at least it will work with what I have, without a big rewrite). But I'd like to know exactly what's happening, to keep it in mind for the future; that's probably a topic for other questions. With this one I wanted to figure out whether there are any limits on the number of elements within a single page, so I can roughly estimate the possible boundaries.
jayarjo
A: 

What browser are you using? Some browsers can handle things better than others - for instance, if you're using IE, I wouldn't be surprised if it's sluggish; it doesn't handle JavaScript events as well as WebKit-based browsers.

The other thing is obviously your computer (RAM and CPU). But to be honest, most computers shouldn't have a problem with that unless we're talking 10000+ rows... and even then...

Can you post some code?

xil3
A: 

Given that there is a multitude of browsers and rendering engines, all with different performance characteristics, and given that those engines are steadily improved and computer hardware keeps getting faster: no, there is no fixed upper limit on what a browser can handle. There are, however, practical upper limits for specific browser versions on specific hardware.

If you don't say more about what your hardware and browser are, it's hard to help you. Also, no one can say anything about the possible performance of your JavaScript if you don't post the code. Even if it's just one event handler, if it loops infinitely or is called for every element, it can considerably slow down the rendering process.

inflagranti
A: 

Never mind RAM or CPU usage - what's the actual size of the page after it preloads?

If your table really is that huge, you could be forcing your users to download megabytes of data - I've found that some machines tend to hang after 2-4 MB of data.

Damien Dennehy
It loads data dynamically; it was generally not meant to be limited at all (as I wasn't aware of any constraints, I thought any limits would be in the millions or billions). Right now the hang becomes noticeable after one or two thousand rows.
jayarjo
+2  A: 

Another thing you should look at is table sizing. If your table is styled with table-layout: auto (the default), the browser has to measure every single cell in the table in order to size it. This can get insanely slow.

Instead, choose fixed column widths, or at least style the table with table-layout: fixed.
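
For example (a sketch only; the id and the widths are placeholders), you can switch an existing table to fixed layout and give each column an explicit width from script:

```
// Switch to fixed layout so the browser stops re-measuring every cell.
// The id and the widths below are placeholders.
var table = document.getElementById('data-table');
table.style.tableLayout = 'fixed';
table.style.width = '800px';

// With table-layout: fixed, column widths come from a <colgroup>
// (or the first row), so declare them explicitly up front.
var colgroup = document.createElement('colgroup');
var widths = [60, 240, 240, 260];
for (var i = 0; i < widths.length; i++) {
    var col = document.createElement('col');
    col.style.width = widths[i] + 'px';
    colgroup.appendChild(col);
}
table.insertBefore(colgroup, table.firstChild);
```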

Dave Markle
+1 for providing assistance in minimising any limitations or latency experienced by the OP without mentioning CPU, RAM or browser. :)
Russell
Thanks for the interesting observation - is such a thing measured or benchmarked anywhere? My table has a fixed width, and I observe this rising delay in all browsers; it's especially noticeable in IE7/8 and FF (with Firebug turned off). As for CPU, I'm not sure how to measure it; do you have a link to a useful resource on the topic?
jayarjo
A: 

Depends. IE, for example, will not start rendering a table until all of its content is loaded. So if you had a 5,000-row table, it needs to load all 5,000 rows of data before rendering any of it, whereas other browsers start rendering once they have partial data and just adjust a bit (if needed) as the table grows.

Generally speaking, rendering slows with the quantity of nodes but also with their complexity... Avoid nested tables, and if at all possible, try to break huge tables up into chunks, e.g. every 100 rows (if possible); a sketch of that follows.
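
A rough sketch of the chunking idea (the container id and the row-data format are assumptions): append loaded rows into small tables of at most 100 rows each, so only the newest chunk ever needs to be laid out again.

```
var CHUNK_SIZE = 100;

// rowsData is assumed to be an array of arrays of cell text.
// "table-container" is a made-up id for the element holding the chunks.
function appendRows(rowsData) {
    var container = document.getElementById('table-container');
    var last = container.lastChild;
    var tbody = null;

    // Reuse the last chunk table if it still has room.
    if (last && last.nodeName === 'TABLE' && last.rows.length < CHUNK_SIZE) {
        tbody = last.tBodies[0];
    }

    for (var i = 0; i < rowsData.length; i++) {
        if (!tbody || tbody.rows.length >= CHUNK_SIZE) {
            var table = document.createElement('table');
            tbody = document.createElement('tbody');   // IE needs an explicit tbody
            table.appendChild(tbody);
            container.appendChild(table);
        }
        var tr = document.createElement('tr');
        for (var j = 0; j < rowsData[i].length; j++) {
            var td = document.createElement('td');
            td.appendChild(document.createTextNode(rowsData[i][j]));
            tr.appendChild(td);
        }
        tbody.appendChild(tr);
    }
}
```

With multiple small tables, the single delegated mousedown handler would move from the tbody up to the shared container.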

scunliffe
I believe IE, just like all other browsers, does render the table during download if you have specified column widths in the html. If the widths are not specified, the browser has to download the whole table in order to be able to calculate the widths.
PauliL
Incorrect. You can see IE's behavior in action by creating a table (say 20 rows), specifying any widths you want, and inserting a `script` tag at random locations in your table's `td` tags with an `alert('in row X');`. You will get the alerts in IE (all of them) **before a single part** of the table renders, whereas if you load the page in Firefox, Chrome, Safari, or Opera, the alerts fire as the table renders.
scunliffe
A: 

There really isn't any reason in the world to publish an entire huge dataset on a single page. If the requirement is to provide the user with all that data, you should export it to a file that can be read by software better suited to the job than a browser.

Instead, I suggest you make an AJAX-driven page where the user sees only a portion of the data; if they need to see more, you download just that portion of the dataset and replace the current one on the page. This is pagination. Google search is an excellent example of it.
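
A minimal sketch of such a paged request (the URL, parameter names, and response format are assumptions, not a real API):

```
// Fetch one page of rows and hand the parsed result to a callback.
// The endpoint and its parameters are made up for illustration.
function loadPage(pageNumber, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/rows?page=' + pageNumber + '&per_page=50', true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            callback(JSON.parse(xhr.responseText)); // assumed: a JSON array of rows
        }
    };
    xhr.send(null);
}

// Replace the rows currently on the page instead of growing the table forever.
loadPage(1, function (rows) {
    renderRows(rows);  // renderRows() stands in for the page's own row-building code
});
```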

Gert G
A: 

If there are any limits, they depend on the browser. But the problem you have is not about a limit, since the browser still displays the page.

Big tables are always a problem for browsers. Rendering a large table takes a lot of time, so it is often a good idea to split a large table into smaller ones.

Further, you probably want to specify the column widths. Without them, the browser has to download the whole table before it can calculate the width of each column and render it. If you specify the widths in the HTML code, the browser can display the page while it is still downloading. (Note: specifying the width of the whole table is not enough; you need to specify the width of each column.)

If you add a single row to a big table using JavaScript, the browser most likely has to render the whole table again, which is why it becomes so slow. If you have smaller tables, only the one small table needs to be rendered again. Better still, load one sub-table at a time instead of just one row.

But the most effective method is to split the data into multiple pages. The users probably prefer that, too. That is why for example Google displays only so many results on each page.

PauliL