views: 130
answers: 4

Hi,

I'm wondering about speed optimization in PHP.

I have a series of files that are requested on every page load; on average there are 20 of them. Each file must be read and parsed if it has changed. And this is excluding the standard files required for a web page (HTML, CSS, images, etc.).

E.g.: client requests page -> server outputs HTML, CSS, images -> server outputs dynamic content (20 +/- files, combined and minified).

What would be the best way to serve these files as fast as possible?

+4  A: 

Before wondering about speed optimization, one should wonder about profiling, which consists of two parts:

  • Decide whether we need any speed optimization at all.
  • If so, determine the specific part of the application that has become a "bottleneck" and demands optimization, unlike any other part.

The bottleneck can lie surprisingly far, far away from the one you dreamed about.
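
For a crude first pass (a dedicated profiler such as Xdebug gives far more detail), you can simply time the suspect sections; a minimal sketch:

<?php
    // crude timing of one suspect section with microtime()
    $start = microtime(true);
    // ... run the code under suspicion ...
    $elapsed = microtime(true) - $start;
    error_log(sprintf('suspect section took %.4f seconds', $elapsed));
?>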

Col. Shrapnel
+1 for profiling first. Would be even better if specifics were mentioned (Xdebug, Firebug, etc.)
Manos Dilaverakis
A: 

If you are talking about PHP files, use eAccelerator. If you are talking about other files, check the filemtime to see if they have changed and whether you have to parse them again.
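
A minimal sketch of that filemtime check (the file names here are just placeholders):

<?php
    $source = 'data/settings.ini';     // hypothetical file that needs parsing
    $cache  = 'cache/settings.cached'; // hypothetical parsed copy

    // re-parse only when the source is newer than the cached copy
    if (!file_exists($cache) || filemtime($source) > filemtime($cache)) {
        $parsed = parse_ini_file($source);             // the expensive step
        file_put_contents($cache, serialize($parsed));
    } else {
        $parsed = unserialize(file_get_contents($cache));
    }
?>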

Also, use YSlow to determine why your website is slow.

Sjoerd
+1  A: 

You've not provided enough information here to give a sensible answer. By far the biggest benefit is going to come from effective caching of content, but you probably need to look at more than just the headers you are sending: you probably need to start tweaking filenames (or the query part of the URL) to let the browser use newer content in place of cached (and not yet expired) content.
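
One common way to do that tweaking is to append the file's modification time to the URL, so a changed file automatically gets a fresh URL while an unchanged one stays cacheable (a sketch; the helper name is mine):

<?php
    // append the asset's mtime so its URL changes whenever the file does
    function versioned_url($path) {
        return $path . '?v=' . filemtime($_SERVER['DOCUMENT_ROOT'] . '/' . $path);
    }
?>
<script type="text/javascript" src="<?php echo versioned_url('js/combined.min.js'); ?>"></script>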

I have a series of files that are requested every page load. On average there are 20 files.

Are these all PHP scripts? Are they all referenced by the HTML page?

Each file must be read and parsed if it has changed

Do you mean they must be read and parsed on the server? Why? Read this post for details on how to identify and supply new versions of cacheable static content.

Have you looked at server-side caching? It's not that hard:

<?php

    // directory holding the cached copies (must be writable by the web server)
    $cache_dir = '/tmp/cache/';
    $cached = md5($_SERVER['REQUEST_URI']);
    $lastmod = time() - @filemtime($cache_dir . $cached);
    // serve the cached copy while it is less than 60 seconds old
    if (file_exists($cache_dir . $cached) && ($lastmod < 60)) {
       print file_get_contents($cache_dir . $cached);
       exit;
    } else {
       ob_start();
       // ... do slow expensive stuff ...
       $output = ob_get_clean();
       print $output;
       file_put_contents($cache_dir . $cached, $output);
    }
?>

Note that server-side caching is not nearly as effective as client-side caching.
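
Client-side caching, by contrast, just means sending headers that let the browser skip the request entirely (a sketch; the one-week lifetime is arbitrary):

<?php
    // allow the browser to reuse this response for a week
    // without contacting the server at all
    header('Cache-Control: public, max-age=604800');
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');
?>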

Once you've got the caching optimal, you then need to look at serving your content from multiple hostnames, even if they all point to the same server: a browser limits the number of connections it makes to each hostname, so requests are effectively queued up when they could be running in parallel.
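
A deterministic mapping keeps each asset on the same hostname across pages, so it is only cached once (a sketch; the hostnames are placeholders):

<?php
    // spread assets across hostnames that all point to the same server,
    // so the browser opens more connections in parallel
    function sharded_url($path) {
        $hosts = array('static1.example.com', 'static2.example.com');
        return 'http://' . $hosts[abs(crc32($path)) % count($hosts)] . '/' . $path;
    }
?>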

To help any further with your problem, we'd need to know a lot more about your application.

C.

symcbean
A: 

There are several ways of keeping an opcode cache for PHP files which automatically checks for file modifications. APC is one that I like very much.
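
Whether APC re-checks files on disk is controlled in php.ini (a minimal sketch; apc.stat=1 enables the modification check):

    ; load and enable the APC opcode cache
    extension=apc.so
    apc.enabled=1
    ; stat each script on request and recompile it if it changed on disk
    apc.stat=1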

Wrikken