I am working on optimizing my site, and I have had the MySQL slow queries log on for a few days now, but after going through >260M queries, it only logged 6 slow queries, and those were special ones executed by me on phpMyAdmin. I am wondering if there is something to log slow PHP page execution time so that I can find certain pages that are hogging resources, rather than specific queries.
You could wrap your scripts in a simple timer, like this:
/*in your header or at the top of the page*/
$time_start = microtime(true);
/* your script goes here */
/*in your footer, or at the bottom of the page*/
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "It took $time seconds\n";
Note that this will add two function calls and a tiny bit of arithmetic as overhead.
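The same pattern can be wrapped in a pair of small helpers so it is easy to drop into many pages. This is just a sketch; the function names are my own, not from the answer above:

```php
<?php
// Hypothetical helpers wrapping the microtime() pattern shown above.
function timer_start(): float {
    return microtime(true);
}

function timer_report(float $start): string {
    $elapsed = microtime(true) - $start;
    return sprintf("It took %.4f seconds", $elapsed);
}

// Usage:
$t = timer_start();
usleep(1000); // stand-in for "your script goes here"
echo timer_report($t), "\n";
```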
First, there is Xdebug, which has a profiler, but I wouldn't use it on a production machine, since it instruments your code and slows everything to a crawl. It is very good for testing environments, though.
If you want to measure speeds in a production environment, I would just do the measuring manually. microtime()
is the function for this in PHP. Assuming you have a header.php and a footer.php which are included by all PHP scripts:
# In your header.php (or tpl)
$GLOBALS['_execution_start'] = microtime(true);

# In your footer.php (or tpl)
file_put_contents(
    '/tmp/my_profiling_results.txt',
    (microtime(true) - $GLOBALS['_execution_start']) . ':' .
        print_r($_SERVER, true) . "\n",
    FILE_APPEND
);
Could you not register a shutdown function that ends the timer? http://us3.php.net/register%5Fshutdown%5Ffunction That way you only need to start the timer wherever you think there might be a problem.
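A minimal sketch of that idea; the log path and the 1.0 s threshold are my own choices, not from the thread:

```php
<?php
// Start timing as early as possible (e.g. in a common include).
$GLOBALS['_execution_start'] = microtime(true);

// The callback runs after script execution finishes, even on exit().
register_shutdown_function(function () {
    $elapsed = microtime(true) - $GLOBALS['_execution_start'];
    if ($elapsed > 1.0) { // only record "slow" pages, like MySQL's slow log
        file_put_contents(
            '/tmp/php_slow_pages.log',
            sprintf("%s %.4fs %s\n",
                date('c'), $elapsed, $_SERVER['REQUEST_URI'] ?? 'cli'),
            FILE_APPEND
        );
    }
});
```

Unlike a footer include, the shutdown function also fires when a script terminates early via exit() or a fatal error, so slow pages that die partway through still get logged.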
What about auto_prepend_file and auto_append_file? I just wrote a post about it: http://xrado.hopto.org/post/php-slow-log
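For context: those two php.ini directives make PHP include a file before and after every request, so no application code has to change. A sketch of what the two files could contain (paths and the log format are assumptions, not taken from the linked post; shown together here for brevity, though in reality they are two separate files):

```php
<?php
// --- prepend.php (wired up with: auto_prepend_file = /path/to/prepend.php) ---
$GLOBALS['_page_start'] = microtime(true);

// --- append.php (wired up with: auto_append_file = /path/to/append.php) ---
$elapsed = microtime(true) - $GLOBALS['_page_start'];
error_log(sprintf('page took %.4fs %s',
    $elapsed, $_SERVER['REQUEST_URI'] ?? 'cli'));
```

One caveat worth knowing: auto_append_file is skipped when the script terminates with exit(), so the shutdown-function approach above is more robust for scripts that bail out early.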