I'm currently using the following code to scan files that have been uploaded as part of an application form:

$safe_path = escapeshellarg($dir . $file);
$command = '/usr/bin/clamscan --stdout ' . $safe_path;
$out = array();
$int = -1;
exec($command, $out, $int);

// clamscan exit codes: 0 = clean, 1 = virus found, 2 = an error occurred
if ($int == 0) {
    // all good
} else {
    // VIRUS!
}

It works, but it's slow. Has anyone got any suggestions that would (a) speed things up and (b) improve the script generally? For instance, I'm not entirely clear on the benefits of exec() vs. system(), etc.

If the speed can't be improved, then I'd ideally like to display some kind of interim "Please be patient, your files are being scanned" message, but I'm not sure how to go about that either.

EDIT: Sorry, I should have said: the scan needs to be done at submission time, as the application in question won't be accepted without valid (i.e. virus-free) files.

+1  A: 

If you don't need to display the results to the user instantly, you could add the file to a database table for scanning later.

Then you could fork a new process to run the scan and update the results in the table. There's a good example here: http://robert.accettura.com/blog/2006/09/14/asynchronous-processing-with-php/.
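A minimal sketch of that approach, assuming a PDO connection in $pdo and a hypothetical "scans" table with id, path and status columns (the table, column and worker script names are all placeholders, not from your code):

<?php
// Upload handler: record the file, then kick off a background worker.
$stmt = $pdo->prepare("INSERT INTO scans (path, status) VALUES (?, 'pending')");
$stmt->execute(array($dir . $file));
$scan_id = $pdo->lastInsertId();

// Redirecting output and appending & detaches the worker, so exec()
// returns immediately instead of waiting for the scan to finish.
$cmd = '/usr/bin/php /path/to/scan_worker.php ' . escapeshellarg($scan_id);
exec($cmd . ' > /dev/null 2>&1 &');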

If you absolutely need to display the results within the same request, you could do it exactly as above, but output an interim page that requests the results via AJAX; once the scan is over, redirect the user to the results page.
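The endpoint the AJAX call polls could be as simple as this (same hypothetical "scans" table and $pdo connection as above; the client-side script would hit it every few seconds via XMLHttpRequest):

<?php
// status.php - returns the current scan status as JSON.
$stmt = $pdo->prepare('SELECT status FROM scans WHERE id = ?');
$stmt->execute(array($_GET['id']));
$status = $stmt->fetchColumn();

header('Content-Type: application/json');
echo json_encode(array('status' => $status ? $status : 'unknown'));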

If you don't want to use JavaScript, then a simple meta refresh tag would do the trick.
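For the meta refresh variant, the interim page might look something like this (again assuming the hypothetical "scans" table; the accepted/rejected URLs are placeholders):

<?php
// interim.php - reloads every 5 seconds until the scan finishes.
$stmt = $pdo->prepare('SELECT status FROM scans WHERE id = ?');
$stmt->execute(array($_GET['id']));
$status = $stmt->fetchColumn();

if ($status === 'clean') {
    header('Location: /application/accepted.php');
    exit;
}
if ($status === 'infected') {
    header('Location: /application/rejected.php');
    exit;
}
?>
<html>
<head><meta http-equiv="refresh" content="5"></head>
<body>Please be patient, your files are being scanned...</body>
</html>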

Seb
A: 

Set up a separate application, ideally on a different box, where you can batch these scans. That box can update its status in the database, where your frontend service can read it and report it back to the user.
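As a sketch, the batch worker on that box could be a simple PHP CLI loop (same hypothetical "scans" table and $pdo connection as in the previous answer). Using clamdscan instead of clamscan also addresses the speed problem: it talks to the running clamd daemon, so the signature database isn't reloaded on every invocation, which is where most of clamscan's per-file time goes.

<?php
// scan_worker.php - run on the scanning box, e.g. from cron or as a
// long-running process. Table/column names are hypothetical.
while (true) {
    $stmt = $pdo->query("SELECT id, path FROM scans WHERE status = 'pending' LIMIT 1");
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    if (!$row) {
        sleep(5); // nothing queued; wait and poll again
        continue;
    }

    $out = array(); // exec() appends to $out, so reset it each pass
    exec('/usr/bin/clamdscan --no-summary ' . escapeshellarg($row['path']), $out, $ret);

    // Exit codes match clamscan: 0 = clean, 1 = virus found, 2 = error
    $status = ($ret === 0) ? 'clean' : (($ret === 1) ? 'infected' : 'error');
    $upd = $pdo->prepare('UPDATE scans SET status = ? WHERE id = ?');
    $upd->execute(array($status, $row['id']));
}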

Jesse Weigert