I am using TwitterScript to retrieve Twitter data inside a Flash site. Due to Twitter's crossdomain policy, I need to set up a PHP proxy. First I made a simple one:

<?php
// Fetch the requested URL and stream it straight to the output
$url = $_GET['url'];
readfile($url);
?>

but then I get this error:

URL file-access is disabled in the server configuration

which can only be resolved by asking my host to enable allow_url_fopen, which I don't want to do.
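(As an aside, you can check whether that setting is enabled from PHP itself; a minimal sketch using ini_get():)

<?php
// Minimal sketch: report whether allow_url_fopen is enabled on this server
if (ini_get('allow_url_fopen')) {
    echo "allow_url_fopen is enabled";
} else {
    echo "allow_url_fopen is disabled";
}
?>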

Then I found this

<?php

function get_content($url)
{
    $ch = curl_init();

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 0);

    ob_start();

    curl_exec($ch);
    curl_close($ch);
    $string = ob_get_contents();

    ob_end_clean();

    return $string;
}

#usage:
$url = $_GET['url'];
$content = get_content($url);
var_dump($content);
?>

which solves that problem, but while the data is the correct XML, it now looks like this:

string(39950) "<?xml version="1.0" encoding="UTF-8"?>
<statuses type="array">
<status>
...
</statuses>"

How do I get the XML data out of that string?

+1  A: 

Your problem is that you're using var_dump() where you want to output the result. Just use echo instead. Also, you don't need output buffering to get the content of the site: pass CURLOPT_RETURNTRANSFER to curl_setopt() and curl_exec() will return the response as a string.

Fixed Code

<?php

function get_content($url)
{
    $ch = curl_init();

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // Return the response from curl_exec() instead of printing it
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $string = curl_exec($ch);
    curl_close($ch);

    return $string;
}

#usage:
$url = $_GET['url'];
$content = get_content($url);
echo $content;
?>
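
One design note: since the proxy fetches whatever URL it is given, you may want to restrict it to Twitter's API so it can't be used as an open proxy. A minimal sketch of such a check, reusing get_content() from above (the allowed hosts here are just illustrative assumptions):

<?php
// Hypothetical hardening sketch: only proxy requests to the Twitter API host.
// The allowed hosts below are assumptions for illustration.
$url  = $_GET['url'];
$host = parse_url($url, PHP_URL_HOST);

if ($host !== 'twitter.com' && $host !== 'api.twitter.com') {
    header('HTTP/1.1 403 Forbidden');
    exit('URL not allowed');
}

echo get_content($url);
?>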
svens
thanks svens, it works perfectly!!
daidai