views:

52

answers:

1

I'm trying to scrape some pages from a single domain, listed in a text file, and save them onto my server.

I have the following code (with the domain obscured). It reads the list of file paths from the text file, fetches each page, and saves it locally under the same file name with .html appended.

For some reason, it's creating the files without actually writing anything to them. What am I doing wrong?

<?php
$file = fopen("list.txt", "r");

while (!feof($file)) {
    $line = fgets($file);
    $url = "http://www.????.com" . $line;
    $homepage = @file_get_contents($url);
    $newname = rtrim(substr(strrchr($line, "/"), 1)) . ".html";
    $fh = fopen($newname, 'w') or die("can't open file");
    $stringData = $homepage;
    fwrite($fh, $stringData);
    fclose($fh);
}

fclose($file);
echo "success!";
?>
+1  A: 

You should remove the @ before file_get_contents. If that call is raising a warning or error, the @ suppresses it, so you never see what went wrong.

Maybe you don't have URL fopen wrappers enabled (allow_url_fopen), or the URL returned a 404. You can't tell unless you see the warning or error from that call.
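As a minimal sketch, here is your original loop with the @ removed and the return value of file_get_contents checked (the echoed error messages are just illustrations, not required wording):

<?php
$file = fopen("list.txt", "r");

while (!feof($file)) {
    $line = fgets($file);
    $url = "http://www.????.com" . $line;

    // No @ here, so any warning (404, DNS failure, allow_url_fopen disabled) is visible.
    $homepage = file_get_contents($url);

    if ($homepage === false) {
        // file_get_contents returns false on failure; skip writing an empty file.
        echo "Failed to fetch $url\n";
        continue;
    }

    $newname = rtrim(substr(strrchr($line, "/"), 1)) . ".html";
    $fh = fopen($newname, 'w') or die("can't open file");
    fwrite($fh, $homepage);
    fclose($fh);
}

fclose($file);
echo "success!";
?>

Run it like that once and the warning PHP prints for the failing request should tell you exactly why the files come out empty.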

Byron Whitlock