I'm using a jQuery plugin called Tablesorter to do client-side sorting of a log table in one of my applications. I am also making use of the tablepager add-in.
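For context, my setup is essentially the standard tablesorter + tablesorterPager initialization, roughly like the sketch below (the element ids, sort order, and page size here are illustrative rather than my exact code):

$(function () {
    // Assumes jquery.tablesorter.js and jquery.tablesorter.pager.js are loaded.
    $("#logTable")
        .tablesorter({ sortList: [[0, 1]] })   // sort the first column descending
        .tablesorterPager({
            container: $("#pager"),            // element holding the pager controls
            size: 20                           // rows per page
        });
});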

I really like the responsiveness that client-side sorting and paging brings to the party. I also like how you don't have to hit the web server or database repeatedly.

However, I can see that, in time, the log I'm displaying could grow quite large. I'm sure there comes a point where client-side paging and sorting becomes impractical. At what point will this technique begin to collapse under its own weight? 500 records? 2,000 records? 10,000 records?

EDIT: In a nutshell, what criteria would you use to decide between client-side sorting/paging and server-side paging? Does the size of the expected result set factor into your decision? Where is the tipping point?

+2  A: 

This technique will probably collapse when the browser or client host can't take it.

Use server-side pagination to prevent this.

I would first consider the amount of data I am sending to the client, since that drives the loading time.

Say each row of the table is 200 bytes and I am sending 10,000 rows to the client (which is what allows client-side sorting and pagination): that is 200 * 10,000 = 2,000,000 bytes, i.e. 2 MB. The browser will take quite some time to download that from the server, then more time for the sorting plugin to sort everything, and more time again for the pager to page it.

In fact, your server load will also increase, because you have to send ALL the rows to the client.

With that much data and iteration for JavaScript to handle, the browser (Firefox or similar) will typically lock up and look as though it has crashed.

If you use server-side sorting + pagination, the client also sees accurate, up-to-date information. Take the same 10,000 rows of 200 bytes each, with 20 rows per page: you are only sending 20 * 200 = 4,000 bytes (4 KB) per request, which is small and easily handled by both the browser and the server.
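As a rough sketch of the client side of that approach (the /logs endpoint, its parameters, and the row fields are hypothetical), you only ever pull one page at a time:

function loadPage(page, sortColumn, sortDir) {
    // Hypothetical endpoint that sorts and slices on the server (e.g. ORDER BY ... LIMIT 20 OFFSET ...).
    $.getJSON("/logs", { page: page, size: 20, sort: sortColumn, dir: sortDir }, function (data) {
        var rows = $.map(data.rows, function (r) {
            return "<tr><td>" + r.timestamp + "</td><td>" + r.message + "</td></tr>";
        });
        $("#logTable tbody").html(rows.join(""));   // only ~20 rows (~4 KB) ever reach the browser
    });
}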

thephpdeveloper
I was looking for an answer that gave me a little more guidance. I'll edit my question to clarify that.
Aheho
Updated. Thanks for the extra info; that's what we needed.
thephpdeveloper
+2  A: 

A few hundred is probably okay, depending on the number of columns. This will most certainly break down when you're dealing with data on the order of 10^3 (thousands) of records.

These have been my empirical findings across different browsers, but I was usually on beefy hardware. I would limit your data set to hundreds.

Stefan Kendall
+2  A: 

However, I can see that, in time, the log I'm displaying could grow quite large. I'm sure there comes a point where client-side paging and sorting becomes impractical. At what point will this technique begin to collapse under its own weight? 500 records? 2,000 records? 10,000 records?

This really depends on a lot of different things: table size, number of columns, and which browser and version the person is using. I can routinely sort up to 1,000 records before seeing a real problem. If you start approaching that number, I would definitely start looking at server-side sorting. With AJAX, server-side sorting can be quite efficient and still give a decent user experience.

The best way to evaluate your particular situation is to try it out and see. Browsers, although not really designed to handle such large amounts of data, can still manage it; the user experience will be abysmal, but the number of records they can handle is quite high.
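If you want a quick way to try it, something like this sketch will generate N synthetic rows and time a client-side sort (the markup, column contents, and timing method are illustrative, not specific to your table):

function benchmarkSort(n) {
    var rows = [];
    for (var i = 0; i < n; i++) {
        rows.push("<tr><td>" + (n - i) + "</td><td>entry " + i + "</td></tr>");
    }
    $("#logTable tbody").html(rows.join(""));
    $("#logTable").trigger("update");              // tell tablesorter its row cache is stale

    var start = new Date().getTime();
    $("#logTable").trigger("sorton", [[[0, 0]]]);  // sort the first column ascending
    alert(n + " rows sorted in " + (new Date().getTime() - start) + " ms");
}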

Kevin
Indeed, it's (almost) totally dependent on the memory size of the client and speed of the hardware (and even speed of the browser's JavaScript engine).
Marcel Korpel