Hi, all!

So, I am writing a script to get a document from the internet. The document is around 200 KB. This is the code:

#!/usr/local/bin/perl -w
use strict;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $url = "SOME URL";
my $req = HTTP::Request->new(GET => $url);
my $res = $ua->request($req);

if ($res->is_success) {
    print $res->content . "\n";
}
else {
    print "Error: " . $res->status_line;
}

Unfortunately, I can't share the actual URL here.

However, the output is: "Error: 500 read timeout". When I check the same link outside the script, the document downloads in under 5 seconds.

I even raised the timeout to 1000 seconds, but it still fails. How should I go about getting more information about the response? The file (around 200 KB) is not big enough to warrant a read timeout, and the server is not a busy one; it has never given me a problem when I open the link in a browser.
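
For reference, this is roughly how I set the longer timeout, plus the extra tracing (show_progress and a header dump) that I am not sure is the right way to get more detail out of the response; the URL is still a placeholder:

#!/usr/local/bin/perl -w
use strict;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->timeout(1000);      # the longer read timeout I tried (LWP's default is 180s)
$ua->show_progress(1);   # print a trace line while the request runs

my $res = $ua->get("SOME URL");

unless ($res->is_success) {
    print "Status : ", $res->status_line, "\n";
    # LWP adds "Client-Warning: Internal response" when the error was
    # generated by LWP itself (e.g. a timeout) rather than by the server.
    print "Headers:\n", $res->headers->as_string, "\n";
}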

Thanks.

A: 

Make sure the web server is not configured to drop requests from scripts, in this case Perl.
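
One quick way to check that is to retry the request with a browser-like User-Agent string instead of LWP's default "libwww-perl/..." one. This is only a sketch, and the agent string below is just an example:

use strict;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
# Pretend to be a browser; if this succeeds where the default agent
# times out, the server is treating script clients differently.
$ua->agent("Mozilla/5.0 (X11; Linux x86_64)");
my $res = $ua->get("SOME URL");
print $res->is_success ? "OK\n" : "Error: " . $res->status_line . "\n";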

Rohit
What is the downvote for? This is one thing worth checking.
Snake Plissken
A: 

When network applications give you trouble, debug using Wireshark.

daxim