I'm trying to scrape some pages from a domain, using a list of paths stored in a text file, and save them onto my server.
I have the following code (with the domain obscured). It reads each path from the text file, fetches the page at that path, and saves it locally under the original file name with .html appended.
For some reason, it creates the files but never actually writes anything to them. What am I doing wrong?
<?php
$file = fopen("list.txt","r");
while (!feof($file)) {
    // Read the next path from the list and build the full URL
    $line = fgets($file);
    $url = "http://www.????.com" . $line;

    // Fetch the remote page (the @ suppresses any errors)
    $homepage = @file_get_contents($url);

    // Name the local file after the last path segment, with .html appended
    $newname = rtrim(substr(strrchr($line, "/"), 1)) . ".html";
    $fh = fopen($newname, 'w') or die("can't open file");
    fwrite($fh, $homepage);
    fclose($fh);
}
fclose($file);
echo "success!";
?>
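
For what it's worth, here is a variant I've been considering, on the assumption that the newline fgets() leaves at the end of each line is ending up inside the URL and making file_get_contents fail silently (because of the @). It trims the line and checks the return value instead of suppressing errors. I haven't confirmed this is actually the problem:

<?php
$file = fopen("list.txt", "r");

while (!feof($file)) {
    // Trim the trailing newline that fgets() keeps, so it doesn't
    // end up inside the URL
    $line = trim(fgets($file));
    if ($line === "") {
        continue; // skip blank lines in list.txt
    }

    $url = "http://www.????.com" . $line;

    // Don't suppress errors; check the return value instead
    $homepage = file_get_contents($url);
    if ($homepage === false) {
        echo "failed to fetch $url\n";
        continue;
    }

    // Same naming scheme as above: last path segment plus .html
    $newname = substr(strrchr($line, "/"), 1) . ".html";
    $fh = fopen($newname, 'w') or die("can't open file");
    fwrite($fh, $homepage);
    fclose($fh);
}

fclose($file);
echo "success!";
?>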