views: 113 · answers: 6

With my limited experience/knowledge I am using the following structure to generate HTML tables on the fly from MySQL queries:

$c = 0;
$t = count($results);

$table = '<table>';

while ($c < $t) {
   $table .= "<tr><td>{$results[$c][0]}</td><td>{$results[$c][1]}</td> (etc etc) </tr>";
   ++$c;
}

$table .= '</table>';

This works, obviously. But for tables with 300+ rows there is a noticeable delay in page load while the script builds the table. Currently the maximum result list is only about 1,100 rows, and the wait isn't long, but there's clearly a wait.

Are there other methods for outputting an HTML table that are faster than my WHILE loop? (PHP only please...)

A: 

This might be a step in the right direction.

http://eaccelerator.net/

Shyam
I don't think we need to cache a 4-line WHILE loop, but thanks anyway...
Andrew Heath
@Andrew On one of my projects I hit a similar problem: a huge table with many cells. Just using an opcode cache greatly sped things up. That's not even page or query caching, just the opcodes. With query caching I got another huge improvement, and caching cells with expensive-to-render content yielded a few more percent.
Gordon
+1  A: 

The first thing I'd try is - don't concatenate your strings. In every language I have used there has always been a bit of a performance penalty for continually adding strings together and spitting out one huge string at the end.

Try using multiple echo statements instead, if you can.

Another one to try: based on my own experience, I'm pretty sure a for loop (not foreach) is faster than a while. I'm afraid I have no numbers or proof to back that up, just what I've observed over the years.
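If in doubt, measure rather than guess. A throwaway benchmark along these lines (not from the answer; absolute timings will vary by PHP version and machine) shows how to compare the two string-building strategies yourself:

```php
<?php
// Hypothetical micro-benchmark: string concatenation vs. buffering into
// an array and imploding once at the end. Both build identical output.
$n = 100000;

$t0 = microtime(true);
$s = '';
for ($i = 0; $i < $n; $i++) {
    $s .= "<td>$i</td>";          // repeated concatenation
}
$concat = microtime(true) - $t0;

$t0 = microtime(true);
$parts = [];
for ($i = 0; $i < $n; $i++) {
    $parts[] = "<td>$i</td>";     // buffer pieces
}
$s2 = implode($parts);            // join once
$implode = microtime(true) - $t0;

printf("concat: %.4fs  implode: %.4fs\n", $concat, $implode);
```

Whichever wins on your setup, the gap is usually tiny next to network and rendering costs, which is the point the comments below make.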

Chris
Both suggestions are ridiculous. You won't get noticeable numbers out of either; the same goes for concatenation.
Col. Shrapnel
True, but these are micro-optimizations that yield no substantial improvement. See http://www.phpbench.com/
Gordon
I think this relates more to memory consumption than raw speed. foreach is indeed slower, but by so little it's hardly noticeable; foreach does, however, create a copy of the iterated array in memory, so memory usage increases.
ChrisR
+4  A: 

Hi,

First, the slowness is probably a result of the HTML rendering, not the PHP script. Second, very large tables aren't very usable anyway; it is better to use paging.

You can improve your script performance in the PHP side in a few ways:

  1. use ob_start() and ob_get_clean() - this way the data is sent to the browser all at once:

    ob_start();
    // your code here
    echo ob_get_clean();
    
  2. collect the strings in an array and implode them:

    $str = array();
    $str[] = 'add the strings';
    echo implode($str);
    

BR.

aviv
+1  A: 

This is a browser issue, not PHP. Save your table as static HTML and open it - the result will be the same.

Just avoid such huge HTML tables. I'd suggest using some pagination, i.e. splitting your table into smaller pages, like SO does with its question list.
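A minimal sketch of that pagination idea (the `pageBounds` helper, the table, and the column names are all hypothetical, not from the thread):

```php
<?php
// Hypothetical helper: compute the LIMIT/OFFSET pair for a given page.
function pageBounds(int $page, int $perPage): array
{
    $page = max(1, $page);                 // clamp nonsense page numbers
    return [$perPage, ($page - 1) * $perPage];
}

// Usage with an assumed PDO connection $pdo:
// [$limit, $offset] = pageBounds((int)($_GET['page'] ?? 1), 100);
// $stmt = $pdo->prepare('SELECT col_a, col_b FROM results LIMIT :l OFFSET :o');
// $stmt->bindValue(':l', $limit, PDO::PARAM_INT);
// $stmt->bindValue(':o', $offset, PDO::PARAM_INT);
// $stmt->execute();
// $rows = $stmt->fetchAll(PDO::FETCH_NUM);
```

With 100 rows per page the loop in the question only ever renders 100 `<tr>` elements, regardless of how large the full result set grows.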

Col. Shrapnel
A: 

Load it from an XML file. Below is the logic.

You know when a table gets updated. When it does, write everything that has to be displayed to an XML file. When a user loads the page, load the data from the XML file. This way you do not have to run queries and iterate loops while the user is loading the page. It is faster this way. Check and post back if it works.

If you feel the above logic is still slow and you have even more data, do the following: when the page first loads, show only the first 100 rows. Then fire an AJAX request, get the next 100 nodes from the XML file, and display them. Once that finishes, fire another AJAX request for the next 100 nodes, show them in the page, and so on.
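A rough sketch of the write-on-update idea, using a plain HTML cache file rather than XML for simplicity (the `renderTable` and `cachedTable` names are hypothetical, not from the answer):

```php
<?php
// Hypothetical cache sketch: re-render the table only when the data
// changes; otherwise serve the file written last time.
function renderTable(array $rows): string
{
    $html = '<table>';
    foreach ($rows as $row) {
        $html .= '<tr><td>' . htmlspecialchars($row[0]) . '</td></tr>';
    }
    return $html . '</table>';
}

function cachedTable(string $cacheFile, array $rows, bool $dataChanged): string
{
    if ($dataChanged || !is_file($cacheFile)) {
        file_put_contents($cacheFile, renderTable($rows)); // rewrite on update
    }
    return file_get_contents($cacheFile);                  // serve cached copy
}
```

As the comment below points out, this only pays off when many requests share the same rendered output; per-user customized results defeat it.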

Taha
Perhaps I should have noted that these are customized search results that will have different content, different headers, and even different #s of columns from query to query depending on the users' preference. In this way, I don't think writing to a static file and updating as needed is suitable, as it would literally have to be updated every single time.
Andrew Heath
Then I think you should not be loading 300+ rows of table data. Instead draw only the first 100 rows and give a "more" link, so the user knows from the interface that there are more results for the query and has to click the more button.
Taha
Do it the way Twitter does: a "More" button at the bottom of the page.
Taha
+1  A: 

Are there other methods for outputting an HTML table that are faster than my WHILE loop?

Potentially yes. PHP is a templating language. Use it as such, don't fight it! Native templating is likely to be faster than laboriously sticking strings together manually — and IMO more readable.

<table>
    <?php foreach($results as $result) { ?>
        <tr>
            <td><?php echo htmlspecialchars($result[0]); ?></td>
            <td><?php echo htmlspecialchars($result[1]); ?></td>
        </tr>
    <?php } ?>
</table>

(Note the use of htmlspecialchars. If you don't use this function every time you insert plain text into HTML, your code has HTML-injection flaws, leading to potential cross-site-scripting vulnerabilities. You can define a function with a short name like h that does echo htmlspecialchars(...) to save some typing, but you must HTML-encode your strings.)
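A minimal version of that shorthand, following the answer's suggested name `h` (the ENT_QUOTES/UTF-8 flags are my assumption, not stated in the answer):

```php
<?php
// Shorthand suggested by the answer: echo an HTML-escaped string.
function h(string $s): void
{
    echo htmlspecialchars($s, ENT_QUOTES, 'UTF-8');
}
```

The template then shrinks to `<td><?php h($result[0]); ?></td>` while staying safe against HTML injection.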

However: whilst this is prettier and more secure, it is unlikely to be significantly faster in practice. Your client-side rendering slowness is almost certainly going to be caused more by:

  1. network transmission speeds for a lot of table data. You can improve this by deploying with zlib.output_compression, mod_deflate or other compressing filter for your web server, if you have not already.

  2. rendering speeds for large tables. You can improve this by setting the CSS style table-layout: fixed on your <table> element and adding <col> elements with explicit width styles for the columns you want to have a fixed width.
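For point 2, the markup might look something like this (the widths are purely illustrative):

```html
<table style="table-layout: fixed; width: 100%">
    <col style="width: 25%">
    <col style="width: 75%">
    <tr>
        <td>first column</td>
        <td>second column</td>
    </tr>
</table>
```

With `table-layout: fixed`, the browser can lay out each row as it arrives instead of waiting for the whole table to decide column widths, which is where much of the perceived delay on a 1,000-row table comes from.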

bobince