I have an application where I need to retrieve a large number of rows from the database and then print them all on the screen. I've checked the MySQL queries, and that's not where the problem is. The problem is that all the rows need to be printed on the same page, with no pagination, and it takes a long time (I'm talking about a table with several thousand rows). Is there any way to speed things up? The only thing I've found on Google is using "," instead of "." with echo. I'll be testing this to see if there is any improvement, but I'm not sure it will make that big a difference.
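For reference, the difference between the two echo styles looks like this (using a hypothetical $row variable): with "." PHP concatenates everything into one string before printing, while with "," echo takes multiple arguments and prints each one directly.

<?php
// Concatenation: PHP builds one combined string in memory first.
echo '<td>' . $row['name'] . '</td>';

// Comma-separated arguments: echo prints each piece directly,
// skipping the intermediate string. The gain is usually tiny.
echo '<td>', $row['name'], '</td>';
?>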

+2  A: 

PHP is not your bottleneck. Here is a presentation on PHP performance that I attended, given by Rasmus Lerdorf, the creator of PHP. PHP is one of the last things you have to worry about in terms of performance: you can print millions of row entries and do complex calculations without a problem. Before blaming PHP, check your code, your queries, and your web server settings.

As Mark suggested, you can use flush() to send output to the browser while still fetching rows from the database.
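A minimal sketch of that idea (the connection details and query are hypothetical, and I'm using mysqli just for illustration):

<?php
// Hypothetical connection and query; adjust to your own setup.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $db->query('SELECT var1, var2 FROM big_table');

$i = 0;
while ($row = $result->fetch_assoc()) {
    echo '<tr><td>', htmlspecialchars($row['var1']),
         '</td><td>', htmlspecialchars($row['var2']), '</td></tr>';

    // Every 100 rows, push the output gathered so far to the browser
    // so it can start rendering before the loop finishes. If output
    // buffering is enabled, call ob_flush() before flush().
    if (++$i % 100 === 0) {
        flush();
    }
}
?>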

Elzo Valugi
This is the video of that presentation http://about.digg.com/blog/rasmus-lerdorf-php-performance
AntonioCS
Thank you for the presentation; it is very useful. I've been checking the code and the queries and did some optimization, but that doesn't seem to be enough. I'll look into using flush and see if that makes things better.
A. M.
+3  A: 

The problem is unlikely to be PHP; it's the massive amount of HTML being sent to the browser. You can verify this yourself by saving the page you get, then loading the static file from your hard drive.

If you are outputting a large table, the browser will often struggle to "redraw" it as content loads, because it keeps recalculating column widths as new rows arrive. This can be minimized by setting a fixed width for all the columns; you only need to do this on the first row.
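One way to make this explicit is the CSS table-layout: fixed property (a suggestion of mine, beyond the width attributes used in the example below), which tells the browser to take column widths from the first row instead of re-measuring the whole table as rows arrive:

<style>
/* Column widths come from the first row (or <col> elements),
   so rows can be rendered as they arrive without a full reflow.
   A table width must be set for fixed layout to apply. */
table.big { table-layout: fixed; width: 600px; }
</style>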

You can also jump out of the PHP block to output most of the HTML, then just print the variables where appropriate. Here's an example, assuming all your data is in the $rows variable:

<?php
  // some code here
?>

<table>
<thead>
<tr>
  <th width="50">Header 1</th>
  <th width="200">Header 2</th>
  <th width="150">Header 3</th>
  <th width="100">Header 4</th>
</tr>
</thead>
<tbody>
<?php foreach ( $rows as $r ) : ?>
<tr>
  <td><?=$r['var1']?></td>
  <td><?=$r['var2']?></td>
  <td><?=$r['var3']?></td>
  <td><?=$r['var4']?></td>
</tr>
<?php endforeach; ?>
</tbody>
</table>

<?php
  // continue here if necessary

Note: if you don't have PHP short tags enabled, you'll need to write <?php echo $r['var1']; ?> instead.

Finally, you can try adding gzip compression. It's best done at the server level, but in PHP you can add the line ob_start("ob_gzhandler"); as the very first line of PHP code, right after <?php.
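As a minimal sketch (assuming compression isn't already enabled in php.ini via zlib.output_compression):

<?php
// Must run before any output is sent. ob_gzhandler checks the
// browser's Accept-Encoding header and only compresses if gzip
// (or deflate) is supported.
ob_start('ob_gzhandler');

// ... rest of the page ...
?>

Be aware that ob_gzhandler buffers the page before sending it, so it can work against the incremental flush() approach from the other answer; pick one or the other.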

If this still does not help, then it's a simple fact that your table is way too big. You'd be much better off using pagination or filtering to keep the size down.

DisgruntledGoat
After several tests it seems that, although some optimization here and there speeds things up a little, the main problem is the size of the table, so I'll just have to convince them to accept some sort of pagination. I hadn't thought about the column widths before; some columns in the table don't have a fixed width. I'll change that and see if it helps. Thank you.
A. M.
A minor improvement might be to break it up into smaller tables: every 50 rows or so, close the table and start a new one. You can style it so it looks seamless (see the sketch below).
Mark
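A rough sketch of that chunking idea (the $rows data and the "seamless" CSS class are hypothetical):

<?php
// Close the current table and open a new one every 50 rows, so the
// browser can lay out each chunk independently.
foreach (array_chunk($rows, 50) as $chunk) {
    echo '<table class="seamless">';
    foreach ($chunk as $r) {
        echo '<tr><td>', htmlspecialchars($r['var1']),
             '</td><td>', htmlspecialchars($r['var2']), '</td></tr>';
    }
    echo '</table>';
    flush(); // optionally push each chunk to the browser as it completes
}
?>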
Good idea, I'll try that.
A. M.