I have a Perl CGI program that lets the user select how many rows of data to view in an HTML table. I loop through the data with nested foreach loops and print each row.

The problem is that when the script prints over 3,000 rows of data, my Firefox window becomes unresponsive. I am also using the jQuery DataTables plugin on the table.

What approaches can I take to prevent the browser window from freezing?

+1  A: 

Possible answers are (in order of personal preference):

  1. paginate the results
  2. avoid jQuery events on such a large table (or use them very sparingly)
  3. build a better browser.
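For option 1, here is a minimal sketch of server-side pagination in a Perl CGI script. The `page` parameter name, the `$per_page` value, and the `@data` array are all illustrative, not from the question, and it assumes the CGI module is installed:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;

# Illustrative data set; in the real script this comes from the back end.
my @data = map { [ $_, "value_$_" ] } 1 .. 3000;

my $per_page = 100;                           # rows shown per page
my $page     = $q->param('page') || 1;
$page = 1 if $page !~ /^\d+$/ || $page < 1;   # sanitize user input

my $last_page = int((@data + $per_page - 1) / $per_page);
$page = $last_page if $page > $last_page;

my $start = ($page - 1) * $per_page;
my $end   = $start + $per_page - 1;
$end = $#data if $end > $#data;

print $q->header('text/html');
print "<table>\n";
for my $row (@data[$start .. $end]) {
    print "<tr><td>$row->[0]</td><td>$row->[1]</td></tr>\n";
}
print "</table>\n";
printf "<p>Page %d of %d</p>\n", $page, $last_page;
```

The browser then only ever has to render 100 rows at a time; "next/previous" links just re-request the script with a different `page` value.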
Wrikken
+1 for #3 :) ..
DVK
+3  A: 

Most likely, the browser window freezes due to resource consumption in the browser itself when displaying a large table; it has nothing to do with your back-end Perl CGI code.

The easiest way to confirm that is to add a log statement (e.g. a print to STDERR) at the very end of your CGI script that prints a timestamp; then run the script and look at your web server's error log to see when the script completed. Almost certainly, you will find that it finished very quickly.
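A sketch of that log statement (the message text is arbitrary; under Apache, STDERR from a CGI script normally ends up in the error log):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# ... existing CGI code that prints the table goes here ...

# At the very end of the script: log a timestamp to the web server's
# error log so you can see when generation actually finished.
print STDERR sprintf "[%s] table CGI finished (pid %d)\n",
    scalar(localtime), $$;
```

Compare that timestamp against when the browser finally becomes responsive; a large gap means the browser, not the script, is the bottleneck.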

Another (though less reliable) indicator that the browser is the bottleneck is watching memory and CPU consumption in your favorite process monitor on your OS (Task Manager/Process Explorer/ps/top).

Now, as far as the problems with the browser display go, Wrikken gave some good suggestions. Avoiding complicated event handling (including the jQuery stuff) on a very large table is a Good Idea if you can; use classes instead of IDs where possible.

Some others would be:

  • Use pre-defined pixel widths for all columns (and ideally all rows into the bargain). Browsers render a table a lot faster when they don't have to compute column widths on the fly for every row.

  • Consider switching to a DIV-based layout instead of a TABLE. To be honest, I don't know whether that would help speed-wise - it would make a good SO question, or better yet, try it and benchmark. It may not be a philosophically pure solution for presenting tabular data, though.

  • As Wrikken said, paginate. No user can process a 3,000-row table efficiently even if it rendered blindingly fast, so actually drawing it all is rather useless unless it's somehow filterable/hideable à la real spreadsheets.
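On the fixed-widths point, a sketch of what the printed markup could look like from Perl. It uses the CSS `table-layout: fixed` property plus explicit `<col>` widths, which lets the browser lay out every row from the first one instead of measuring all cells first (the widths and `@rows` data here are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative rows; the real data comes from the back end.
my @rows = map { [ "cell ${_}a", "cell ${_}b", "cell ${_}c" ] } 1 .. 5;

# With table-layout: fixed and explicit <col> widths, the browser can
# lay out every row immediately instead of measuring all cells first.
print qq{<table style="table-layout: fixed; width: 360px">\n};
print qq{<col style="width: 120px">\n} for 1 .. 3;
for my $r (@rows) {
    print "<tr>", (map { "<td>$_</td>" } @$r), "</tr>\n";
}
print "</table>\n";
```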


There's one other possibility: if the table is very large in terms of the sheer volume of HTML text produced, and the network connection is not fast, part of the delay could be the download itself.

If that's the case, it's VERY easy to test for: enclose your entire table in an HTML comment (<!-- <TABLE> ... </TABLE> -->, assuming your table contains no comment tags of its own) and print some small piece of text after the comment closes.

Then re-run the page.
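In the CGI script, the change could look like this sketch, where `print_table()` is a stand-in for whatever code currently emits the table (the question uses nested foreach loops, not a named sub):

```perl
#!/usr/bin/perl
use strict;
use warnings;

print "Content-Type: text/html\n\n";

# Temporary diagnostic: ship the same bytes, but stop the browser from
# rendering the table. Assumes the generated table contains no "-->".
print "<!--\n";
print_table();      # stand-in for the existing table-printing loops
print "\n-->\n";
print "<p>table commented out for timing test</p>\n";

sub print_table {
    # placeholder for the existing <table> ... </table> output
    print "<table><tr><td>example</td></tr></table>";
}
```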

If the small text shows up in your browser very quickly, downloading the table was NOT the culprit (and this also re-confirms that the CGI script is not the bottleneck), since you'd be generating and downloading the same large file but not rendering the actual table in the browser.

If, on the other hand, it takes a long time to download the page with the commented-out table (and the CGI script was confirmed to be fast via the log print), you need to work on slimming down your page's size. There are many techniques for that, but they are too big to fit in the margins of this answer.

DVK
Thanks for the comments.
Gordon
I did a test, and the CGI script has no problem creating the thousands of lines of data. It seems the browser freezes due to resource consumption when displaying the large table.
Gordon
I will probably have to paginate the data.
Gordon
+1  A: 

Obviously pagination is the best solution here, but there are a number of other things that can be done to deal with this kind of issue:

  • See if your web server has compression enabled. Most modern browsers (IE and FF) send an Accept-Encoding header advertising gzip/deflate, so if compression is enabled on the web server you get a highly compressed response. In my case, a 1 MB response went down to 87 KB.

  • Install Firebug, if you haven't already, and track the details of your request: how long it took to generate the response, how long the download took, and how much time is spent in rendering and client-side scripts. All of this is available under its "Net" tab.

  • I have tested this, and DIV-based tables are not the solution here; use DIVs only for layout and tables for tabular data. One reason DIV-based table layouts are sometimes used is that IE can display them as they are received, whereas it displays a table only after the closing table tag arrives (that is IE's default behaviour, as opposed to FF, which uses progressive rendering).

  • Once you have those timings, the issue could turn out to be your jQuery code eating up resources on the client side.

  • If you just want to display a simple table without much fancy stuff, use Perl's printf to push data to the client (browser) already formatted as a table, and display it inside a PRE tag. I am displaying somewhere around 4,800 rows with 9 columns this way and the browser remains responsive.

  • Here I am pushing data in batches of 500 rows, so that the response starts coming in as soon as possible and rendering seems smooth.
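The last two points could be sketched like this: printf-formatted fixed-width rows inside a PRE tag, printed 500 rows per write with autoflush on so each batch reaches the browser right away (the row counts and column widths here are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;    # autoflush: each print is sent to the browser immediately

# Illustrative data: 4800 rows of 9 columns.
my @rows = map { my $r = $_; [ map { "r${r}c$_" } 1 .. 9 ] } 1 .. 4800;

print "Content-Type: text/html\n\n";
print "<pre>\n";

# Push the data in batches of 500 rows so the page starts filling in
# as soon as possible instead of arriving all at once.
my $batch = 500;
while (my @chunk = splice @rows, 0, $batch) {
    print join("\n", map { sprintf "%-10s" x 9, @$_ } @chunk), "\n";
}
print "</pre>\n";
```

Since a PRE block renders line by line, the browser never has to lay out a 4,800-row table; it just paints preformatted text as it streams in.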

awake416