views: 91
answers: 5
What is the best way to record errors experienced by the user?

My initial thought was to make a function that recorded the error with a unique number and maybe a dump of the variables into a record on the database.

Is there a better approach? Should I use a text file log instead?

A: 

Well, writing to text files on the local system should at least be less error-prone than logging to the database, which also allows you to catch DB errors :)

I would prefer to write a decent dump of the current state to a simple log file. In addition to your "own" state (i.e. your application's variables and objects), you might consider doing a phpinfo() to get inspiration as to which environment and request variables to include.

jensgram
+3  A: 

How about overriding the default PHP error handler? These pages should give you the basic information: http://www.php.net/manual/en/function.set-error-handler.php and the first comment on http://www.php.net/manual/en/function.set-exception-handler.php
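As a rough sketch of what those two overrides could look like (the log file location and handler function names here are my own assumptions; in production the log should live outside your public html root):

```php
<?php
// Minimal sketch of overriding the default handlers; the log location is
// an assumption -- in production, point it outside your public html root.
$logFile = sys_get_temp_dir() . '/app-errors.log';

function my_error_handler($errno, $errstr, $errfile, $errline) {
    global $logFile;
    error_log(date('c') . " [$errno] $errstr in $errfile:$errline\n", 3, $logFile);
    return true; // true = don't run PHP's internal handler as well
}

function my_exception_handler($e) {
    global $logFile;
    error_log(date('c') . ' Uncaught: ' . $e->getMessage() . "\n"
              . $e->getTraceAsString() . "\n", 3, $logFile);
    echo 'Sorry, something went wrong.'; // nice message for the user
    exit(1);
}

set_error_handler('my_error_handler');
set_exception_handler('my_exception_handler');
```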

You might also want to store database errors, perhaps some kind of custom function that allows you to use code like:

<?php
$objQueryResult = mysql_query("query here") or some_kind_of_function_here();
?>

You might want to store the recorded errors in a file outside your public html root folder, to make sure people can't access it by accident. I would also assume you'd want to store a complete stack trace in such a file, because then you can actually debug the problem. When overriding the default error handlers, make sure you don't forget to send a nice message to the user (and exit the script, when needed).

I would recommend storing:

  • $_POST
  • $_GET
  • A complete dump of debug_print_backtrace()
  • Possibly the SQL that triggered this?

I would suggest using debug_print_backtrace() to make sure you get a summary of the data. The debug_backtrace() function gives roughly the same information, but it can sometimes just give you too much. The code you could use to capture the backtrace:

<?php
ob_start();
debug_print_backtrace();
$trace = ob_get_contents();
ob_end_clean(); 
?>
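Building on that snippet, the items recommended above could be assembled into a single log entry; `dump_request_state()` is a hypothetical helper name, not part of any existing API:

```php
<?php
// Hypothetical helper that gathers the recommended items into one entry:
// $_POST, $_GET, the triggering SQL (if any), and a full backtrace.
function dump_request_state($sql = null) {
    ob_start();
    debug_print_backtrace();
    $trace = ob_get_clean();

    return date('c') . "\n"
         . 'POST: ' . var_export($_POST, true) . "\n"
         . 'GET: '  . var_export($_GET, true) . "\n"
         . ($sql !== null ? 'SQL: ' . $sql . "\n" : '')
         . "Backtrace:\n" . $trace;
}
```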

To store this, you could use plain text output if you don't get too many errors; otherwise, perhaps use something like SQLite. Just don't use the same SQL connection to store the errors, as that might trigger more problems if you're having webserver-to-SQL connection errors.
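A sketch of that SQLite option, using PDO so the error store has its own connection, entirely separate from the application's MySQL link (the file path and table layout are my assumptions):

```php
<?php
// Log errors to a dedicated sqlite file via its own PDO connection,
// so a MySQL outage can still be recorded. Path and schema are assumptions.
function log_error_sqlite($message, $trace) {
    $db = new PDO('sqlite:' . sys_get_temp_dir() . '/errors.sqlite');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $db->exec('CREATE TABLE IF NOT EXISTS errors (
                   id        INTEGER PRIMARY KEY AUTOINCREMENT,
                   logged_at TEXT,
                   message   TEXT,
                   trace     TEXT)');
    $stmt = $db->prepare('INSERT INTO errors (logged_at, message, trace)
                          VALUES (?, ?, ?)');
    $stmt->execute(array(date('c'), $message, $trace));
}
```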

Icheb
+1. In regard to the `mysql_query(...) or ...`: you might consider writing a custom `db_query(...)` which itself handles the error catching and logging. That way, you'll be able to log the failing SQL as well.
jensgram
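A sketch of that `db_query()` idea, written against PDO rather than the old `mysql_*` API used above (the function name and log path are assumptions):

```php
<?php
// Hypothetical db_query() wrapper that logs the failing SQL on error.
function db_query(PDO $pdo, $sql, array $params = array()) {
    try {
        $stmt = $pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt;
    } catch (PDOException $e) {
        // The failing SQL is logged together with the driver's message.
        error_log(date('c') . ' SQL failed: ' . $sql
                  . ' -- ' . $e->getMessage() . "\n",
                  3, sys_get_temp_dir() . '/db-errors.log');
        return false;
    }
}
```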
A: 

PEAR::Log is handy for this kind of logging. e.g.

$logger->alert("your message");
$logger->warning("your message");
$logger->notice("your message");

etc.

You can log to a file or to a database. I wrote a PDO-enabled SQLite handler for it; pretty simple.

These are handy to put into exception handling code too.

PEAR::Log

Records: id, logtime, identity, severity 1-7 (e.g. "Alert"), and your message.

Cups
A: 

I think @Icheb's answer covers it all.

I have tried something new this year in a project that I thought I'd share.

For a PHP based content aggregation / distribution service, an application that runs quietly in the background on some server and you tend to forget, we needed an error reporting system that makes sure we notice errors.

Every error that occurs has an Error ID that is specified in the code:

$success = mysql_query($this_and_that);
if (!$success) {
    log_error("Failed Query: " . mysql_error(), "MYSQL_123");
}

Errors get logged in a file, but more importantly sent out by mail to the administrator, together with a full backtrace and variable dump.

To avoid flooding with mails - the service has tens of thousands of users on a good day - error mails get sent out only once every x hours for each error code. When an error of the same code occurs twice within that timespan, no additional mail will be sent. It means that every kind of error gets recorded, but you don't get killed by error messages when it's something that happens to hundreds or thousands of users.

This is fairly easy to implement; the art is getting the error IDs right. You can, for example, give every failed mySQL query in your system the same generic "MYSQL" error ID. In most cases, that will be too generic and block too much. If you give each mySQL query a unique error ID, you might get flooded with mails and the filtering effect is gone. But when grouped intelligently, this can be a very good setup.
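A minimal sketch of that throttling, under my own assumptions about the function names, state directory, and a 6-hour window (the actual mail() call is stubbed out):

```php
<?php
// Send at most one mail per error code per time window, tracked with a
// per-code timestamp file; every error still gets logged regardless.
function should_send_mail($errorCode, $windowSeconds, $stateDir) {
    $stamp = $stateDir . '/mail-' . preg_replace('/\W/', '_', $errorCode);
    if (file_exists($stamp)
            && time() - (int) file_get_contents($stamp) < $windowSeconds) {
        return false; // a mail for this code went out recently
    }
    file_put_contents($stamp, (string) time());
    return true;
}

function log_error($message, $errorCode) {
    $dir = sys_get_temp_dir();
    // Every error is recorded...
    error_log(date('c') . " [$errorCode] $message\n", 3, $dir . '/app-errors.log');
    // ...but mail goes out at most once per window per code.
    if (should_send_mail($errorCode, 6 * 3600, $dir)) {
        // mail($adminAddress, "Error $errorCode", $message . $backtrace);
    }
}
```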

Pekka
A: 

From the usability point of view, the user should never experience errors. Depending on the error, you should adopt different strategies:

  • errors that are non-catchable, or difficult to catch from PHP: read the logs of each application
    • Apache
    • MySQL and DB errors, transactions
    • prepare a "site being updated" page or error controllers for emergencies
  • PHP errors
    • these should be detected through exceptions
    • silenced for the user, but not forgotten; don't try to fix them on the fly
    • log them and treat them
  • interface errors
    • one piece of advice: allow the user to submit suggestions or bug reports

I know this doesn't cover everything; it is only an addendum to what the others have suggested.

Elzo Valugi