I am attempting to download fairly large files (up to, and possibly over, 1 GB) from a remote HTTP server through a PHP script. I am using fgets() to read the remote file line by line and write the contents into a local file created with tempnam(). However, the downloads of very large files (several hundred MB) are failing silently. Is there any way I can rework the script to catch the errors that are occurring?

Because the download is only part of a larger overall process, I would like to be able to handle the downloads and deal with errors in the PHP script rather than having to go to wget or some other process.

This is the script I am using now:

$tempfile = fopen($inFilename, 'wb');
$handle = @fopen("https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID . "&format=" . $format . "&zipped=true", "rb");
$firstline = '';
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        // fgets() returns false on error (and sometimes at EOF)
        if ($buffer === false) break;
        if ($firstline === '') $firstline = $buffer;
        fwrite($tempfile, $buffer);
    }
    fclose($handle);
    fclose($tempfile);
    return $firstline;
} else {
    fclose($tempfile);
    throw new Exception('Unable to open remote file.');
}
Answer (score +3):

I'd say you're looking for stream_notification_callback (especially the STREAM_NOTIFY_FAILURE and STREAM_NOTIFY_COMPLETED constants).
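A minimal sketch of this approach, assuming the same $server, $keyID, $format, and $inFilename variables as in the question's script (the callback and variable names here are just illustrative):

```php
<?php
// Notification callback: PHP invokes this as the HTTP stream progresses,
// so failures can be surfaced instead of the download dying silently.
function download_notify($notification_code, $severity, $message,
                         $message_code, $bytes_transferred, $bytes_max) {
    switch ($notification_code) {
        case STREAM_NOTIFY_FAILURE:
            throw new Exception("Download failed: $message ($message_code)");
        case STREAM_NOTIFY_PROGRESS:
            // Could log $bytes_transferred here to monitor large files.
            break;
        case STREAM_NOTIFY_COMPLETED:
            // Transfer reported as finished.
            break;
    }
}

// The callback is registered via the context *parameters*
// (second argument), not the wrapper options.
$context = stream_context_create(
    [],                                    // no wrapper options needed
    ['notification' => 'download_notify']  // context parameters
);

$url = "https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID
     . "&format=" . $format . "&zipped=true";
$handle = fopen($url, 'rb', false, $context);
```

Note that not every PHP version emits every notification for the http wrapper, so treat STREAM_NOTIFY_COMPLETED as a best-effort signal and keep the feof()/false checks in the read loop as well.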

Wrikken
+1 didn't know that one.
nikic
Excellent, thanks! (As a side note, I just added that code, and all of the test files have succeeded since, so I haven't been able to really test it.)
Wige