We have a script which pulls some XML from a remote server. If this script is running on any server other than production, it works.

Upload it to production, however, and it fails. It uses cURL for the request, but it doesn't matter how we make it - fopen, file_get_contents, sockets - it just times out. The same thing happens if I use a Python script to request the URL.
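For reference, the request is roughly shaped like the sketch below - the URL and parameter names are placeholders, not the real ones:

    <?php
    // Placeholder URL and query parameters - the real request has 7 of them.
    $url = 'http://remote.example.com/feed?param1=foo&param2=bar';

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body rather than printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);     // seconds allowed to establish the connection
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);            // seconds allowed for the whole transfer

    $xml = curl_exec($ch);

    if ($xml === false) {
        // On the production server this is where we end up: a timeout reported by cURL.
        error_log('cURL error: ' . curl_error($ch));
    }
    curl_close($ch);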

The same script, supplied with another URL to query, works - every time. Obviously it doesn't return the XML we're looking for, but it DOES return SOMETHING - it CAN connect to the remote server.

If the URL is requested from the command line using, say, curl or wget, data is again returned. It's not the data we're looking for (in fact, it returns an empty root element), but something DOES come back.

Interestingly, if we strip query string elements out of the URL (the full URL has 7 query string elements and runs to about 450 characters in total), the script returns the same empty XML response. Certain combinations of query string elements once again cause the script to time out.
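To show how we've been narrowing it down, this is roughly the kind of test we run - the base URL and parameter names below are invented for illustration:

    <?php
    // Hypothetical base URL and parameters - the real URL has 7 query string elements.
    $base   = 'http://remote.example.com/feed';
    $params = array('a' => '1', 'b' => '2', 'c' => '3');

    // Add one parameter at a time and note which combination starts timing out.
    $subset = array();
    foreach ($params as $key => $value) {
        $subset[$key] = $value;
        $url = $base . '?' . http_build_query($subset);

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        $response = curl_exec($ch);

        echo $url . ' => ' . ($response === false ? curl_error($ch) : strlen($response) . ' bytes') . "\n";
        curl_close($ch);
    }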

This, as you can imagine, has me utterly baffled - it seems to work in every circumstance EXCEPT the one it needs to work in. We can get a response on our dev servers, we can get a response on the command line, we can get a response if we drop certain QS elements - we just can't get the response we want with the correct URL on the LIVE server.

Does anyone have any suggestions at all? I'm at my wits' end!

+1  A: 

Run Wireshark and see how far the request gets. It could be a firewall issue, a DNS resolution problem, or any number of other things.

Also, try bumping your curl timeout to something much higher, like 300s, and see how it goes.
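Something along these lines, with a placeholder URL and log path - adjust to however your script sets up the handle:

    <?php
    $url = 'http://remote.example.com/feed?param1=foo';  // placeholder - use the real URL

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 60);    // time allowed to establish the connection
    curl_setopt($ch, CURLOPT_TIMEOUT, 300);          // time allowed for the entire transfer
    curl_setopt($ch, CURLOPT_VERBOSE, true);         // write handshake/transfer details...
    $log = fopen('/tmp/curl_verbose.log', 'w');
    curl_setopt($ch, CURLOPT_STDERR, $log);          // ...to this log file

    $response = curl_exec($ch);
    if ($response === false) {
        error_log('cURL error: ' . curl_error($ch));
    }
    curl_close($ch);
    fclose($log);

The verbose log will show whether the connection is even being established, which pairs well with what you see in Wireshark.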

Josh Davis