Hi

I am doing some automation for a client and was planning on running some PHP scripts via cURL in the process. However, URLs that work fine in a browser come up 404 when hit by cURL. I have seen mention elsewhere that some servers are configured to block cURL in this way.

Is this a setting I could edit in httpd.conf? It doesn't seem to be in php.ini.

My code is like so:

  $url = "http://site.com/xxx/curl.php?cID=$c->cID&db=$c->db&un=$c->un&/";
  echo $url;

  // spoofing Firefox 2.0 (built with concatenation so no newline ends up in the header)
  $useragent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) "
    . "Gecko/20061204 Firefox/2.0.0.1";

  $ch = curl_init($url);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
  curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
  $res = curl_exec($ch);
  curl_close($ch);

When I echo $res I get the HTML of the default 404 error page. If I copy the echoed $url and paste it into a browser, I get the page fine. I have also tried using a relative path and the full server path.

The user agent spoofing was added today to see whether claiming not to be cURL would help, but nothing in the response changed.
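
One way to dig a bit further is to look at the status code and final URL that cURL itself reports, e.g. (a rough sketch that reuses $url and $useragent from above):

  $ch = curl_init($url);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
  curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow any redirects a browser would follow
  curl_setopt($ch, CURLOPT_VERBOSE, true);          // dump request/response headers to STDERR
  $res   = curl_exec($ch);
  $code  = curl_getinfo($ch, CURLINFO_HTTP_CODE);      // numeric status, e.g. 404
  $final = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);  // URL actually fetched after redirects
  curl_close($ch);
  echo "HTTP $code for $final\n";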

Any thoughts on how to track down the root cause of this or better yet solve it?

TIA

JG

+1  A: 

It turns out that the kind folks at The Planet have some internal routing issues with this account: bad hostname settings in /etc/sysconfig/network and bad resolvers in /etc/resolv.conf, among other things. This site and a number of others were recently migrated from one of their servers to another. It's not fixed yet, but that was why the 404.
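
If anyone hits something similar, a quick sanity check along these lines (a sketch; site.com stands in for the real hostname) shows whether the box the script runs on resolves the name to the IP you expect:

  // Compare what this server resolves the hostname to against the IP
  // your browser machine / DNS zone says it should be.
  $host = 'site.com';            // placeholder for the real hostname
  $ip   = gethostbyname($host);  // returns the unmodified name if the lookup fails
  echo "$host resolves to $ip\n";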

Thanks.

jerrygarciuh
A: 

I'm dealing with a similar site. It first sent a 404 for file_get_contents, but by using -A 'Mozilla/5.0 (X11; U; Linux i686; tr-TR; rv:1.9.2.10) Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10' I was able to get the landing page. I think if I try a bit more I can get the same content Firefox gets :D You just need to keep trying.
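
For the file_get_contents case, something like this (a sketch; the URL is a placeholder) sends that same Firefox user agent via a stream context:

  // Send a browser-like User-Agent with file_get_contents via a stream context.
  $context = stream_context_create(array(
      'http' => array(
          'user_agent' => 'Mozilla/5.0 (X11; U; Linux i686; tr-TR; rv:1.9.2.10) '
                        . 'Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10',
      ),
  ));
  $html = file_get_contents('http://example.com/page', false, $context);  // placeholder URL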

nerkn