views:

121

answers:

3

I am continually pulling a list of entries (over 200) from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids and update the matching tr DOM entries?

It's already sorted and filtered to be the correct information (server side), so all I'm concerned with is displaying it to the user.

Any suggestions would be helpful. I am using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in a tabular format.

Edit: Pagination is not possible in this scenario.

Edit2: The backend is Python (WSGI) running behind a load balancer and is capable of serving 600 reqs/second. The data is only available as JSON, and the backend is more of an API.
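For reference, the setup described above (polling a JSON API with jQuery) typically looks something like this sketch — the URL, interval, and renderTable function are hypothetical stand-ins, not part of the question:

```javascript
// Sketch of the polling setup described in the question.
// "/api/entries", the 2-second interval, and renderTable are hypothetical.
function renderTable(rows) {
  // placeholder: update the table from the fresh rows
}

function poll() {
  $.getJSON("/api/entries", function (rows) {
    renderTable(rows);       // either rebuild the table or patch changed cells
    setTimeout(poll, 2000);  // schedule the next pull only after this one finishes
  });
}
```

Scheduling the next request from inside the success callback (rather than with setInterval) avoids overlapping requests when the server is slow.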

+1  A: 

I quite like the 'Live Image Search' approach, which loads more data as you scroll down.
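That load-as-you-scroll pattern usually boils down to a threshold check on scroll events — a minimal sketch, with the 200px threshold and loadNextPage being assumptions rather than anything from this answer:

```javascript
// Sketch: decide whether to fetch the next chunk when the user nears the bottom.
function shouldLoadMore(scrollTop, viewportHeight, contentHeight, thresholdPx) {
  return scrollTop + viewportHeight >= contentHeight - thresholdPx;
}

// Typical wiring with jQuery (not executed here):
// $(window).on("scroll", function () {
//   if (shouldLoadMore($(window).scrollTop(), $(window).height(),
//                      $(document).height(), 200)) {
//     loadNextPage();  // hypothetical fetch of the next slice of rows
//   }
// });
```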

daddywoodland
+1  A: 

If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.

You don't mention what server side technology you are using. If you are in .Net, there are a couple ASP.Net controls that use a data pager (like a grid view). There is also a PagedDataSource object you might look into using (which can be used with any ASP.Net control that has a DataSource property). Both will break up your data into pages, and only the viewed page will be rendered at one time. This decreases latency dramatically.

Matt Hamsmith
+1  A: 

Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.

In fact, my opinion here is that updating values will be faster than re-rendering elements in the page. To reduce your latency, you could keep a hash of objects mapped by id, so you don't have to look up the DOM every time you update the values.
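A minimal sketch of that idea, assuming each record carries an id and a flat array of column values (that record shape is hypothetical):

```javascript
// Sketch: cache each record's <tr> by id, then patch only the cells that changed.
// Assumes records look like { id: "...", cols: [...] } -- a hypothetical shape.
var rowById = {};

function renderRows(records, tbody) {
  for (var i = 0; i < records.length; i++) {
    var rec = records[i];
    var tr = rowById[rec.id];
    if (!tr) {
      // first time we see this id: build the row once and remember it
      tr = document.createElement("tr");
      for (var c = 0; c < rec.cols.length; c++) {
        tr.appendChild(document.createElement("td"));
      }
      tbody.appendChild(tr);
      rowById[rec.id] = tr;
    }
    for (var j = 0; j < rec.cols.length; j++) {
      var td = tr.childNodes[j];
      var text = String(rec.cols[j]);
      if (td.textContent !== text) {
        td.textContent = text;  // write to the DOM only when the value changed
      }
    }
  }
}
```

Rows whose ids disappear from the data would still need to be removed from the table and the map; that cleanup is omitted here for brevity.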

wtaniguchi
Rendering tables is slow. The final size depends on many factors, and it takes quite a few recalculations to arrive at the final dimensions.
Dykam
It is always good practice to separate data from display. Keep your data in an array, then only update the table fields that changed. You can easily bind a cell to an array value: for instance, array[4][2] contains the value from the 4th row / 2nd column.
Mike
Checking for updated/dirty values might take longer than just updating everything, I guess. But mapping a position (array[4][2]) to its DOM element (2nd td of the 4th tr) would make it faster, since you won't have to look it up over and over again.
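The dirty-check being discussed is cheap on its own, since it only compares the new data snapshot against the previous one in plain JS — a sketch (function name is hypothetical):

```javascript
// Sketch: compare two 2-D data snapshots and collect the [row, col]
// positions whose values changed, so DOM writes can be limited to those cells.
function changedCells(oldData, newData) {
  var dirty = [];
  for (var r = 0; r < newData.length; r++) {
    var oldRow = oldData[r] || [];   // rows that are new count as fully dirty
    for (var c = 0; c < newData[r].length; c++) {
      if (oldRow[c] !== newData[r][c]) {
        dirty.push([r, c]);
      }
    }
  }
  return dirty;
}
```

Comparing in-memory arrays avoids touching the DOM at all during the check; the (much more expensive) DOM writes then happen only for the returned positions.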
wtaniguchi