Hello!
I was wondering how I could download a webpage in PHP for parsing?
You can use something like this:

$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
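Note that file_get_contents() returns false on failure, so if the request can fail it's worth checking the result first. A minimal sketch:

$homepage = file_get_contents('http://www.example.com/');
if ($homepage === false) {
    die('Could not fetch the page.');
}
echo $homepage;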
Since you will likely want to parse the page with DOM, you can load it directly:

$dom = new DOMDocument;
$dom->loadHTMLFile('http://www.example.com');

provided allow_url_fopen is enabled in your PHP configuration. (DOMDocument::load() expects well-formed XML, so for ordinary HTML pages loadHTMLFile() is the right call.)
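Once the document is loaded you can query it like any DOM tree. For example, here is a sketch that lists every link on the page (the libxml call just silences the warnings that real-world, non-well-formed HTML tends to trigger):

// Suppress warnings from messy real-world HTML.
libxml_use_internal_errors(true);

$dom = new DOMDocument;
$dom->loadHTMLFile('http://www.example.com');

// Print the href of every <a> element.
foreach ($dom->getElementsByTagName('a') as $link) {
    echo $link->getAttribute('href'), "\n";
}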
But basically, any function that supports HTTP stream wrappers can be used to download a page.
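For instance, plain fopen() works against a URL when allow_url_fopen is on. A minimal sketch:

// Open the URL through the HTTP stream wrapper, read it all, close it.
$handle = fopen('http://www.example.com/', 'rb');
if ($handle !== false) {
    $page = stream_get_contents($handle);
    fclose($handle);
    echo $page;
}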
You can use this code:

$url = 'your url';

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
$data = curl_exec($ch);
curl_close($ch);

// You can now process $data, e.g. with explode() or a preg_match()
// regex, to pull out the exact information you need.
// $data = strip_tags($data);
echo $data;
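As a concrete (if crude) example of that last comment, here is a sketch that pulls the page title out of $data with preg_match(). Keep in mind that regexes are brittle against real-world HTML; a DOM parser is safer for anything non-trivial:

// Grab whatever sits between <title> and </title>.
if (preg_match('/<title>(.*?)<\/title>/is', $data, $matches)) {
    echo 'Page title: ' . trim($matches[1]), "\n";
}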
Just to add another option, since it exists even if it's not the best: you can use file(). It's an option I don't see anyone else listing here.
$array = file("http://www.stackoverflow.com");
It's nice if you want the page as an array of lines, whereas the already-mentioned file_get_contents() puts it in a string.
Just another thing you can do.
Then, if it suits your goal, you can loop through each line:
foreach ($array as $line) {
    echo $line;
    // do other stuff here
}
This comes in handy when certain APIs spit out plain text or HTML with a new entry on each line.
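For example, if a hypothetical endpoint returns one record per line, the standard flags to file() keep the array clean (a sketch; the URL is a placeholder):

// Strip trailing newlines and skip blank lines while reading.
$lines = file(
    'http://api.example.com/list.txt',
    FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES
);

foreach ($lines as $line) {
    echo $line, "\n"; // each $line is one entry, newline already stripped
}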