I need to make some concurrent XML feed requests in Perl. What is the fastest way to do this?
+4
A:
I would probably use AnyEvent, perhaps like this:
use AnyEvent;
use AnyEvent::HTTP;

sub get_feeds {
    my @feeds = @_;
    my $done  = AnyEvent->condvar;
    my %results;
    $done->begin( sub { $done->send(\%results) } );
    for my $feed (@feeds) {
        $done->begin;
        http_get $feed, sub { $results{$feed} = \@_; $done->end };
    }
    $done->end;
    return $done;
}

my $done   = get_feeds(...);
my $result = $done->recv; # block until all feeds are fetched
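The trick above is the condvar's begin/end counter: each begin increments it, each end decrements it, and the callback passed to the first begin fires only when the count returns to zero. A minimal sketch of that same pattern, with timers standing in for the HTTP requests so it runs without network access (the payload strings are made up for illustration):

```perl
use strict;
use warnings;
use AnyEvent;

my $done = AnyEvent->condvar;
my %results;

# The callback fires once every matching end() has been called.
$done->begin( sub { $done->send(\%results) } );

for my $id (1 .. 3) {
    $done->begin;
    my $w; $w = AnyEvent->timer(
        after => 0.1 * $id,
        cb    => sub { $results{$id} = "payload $id"; undef $w; $done->end },
    );
}

$done->end;    # matches the initial begin()

my $result = $done->recv;    # blocks until all three timers have fired
print scalar( keys %$result ), " results\n";
```

Because begin is called once before the loop and once per request, the condvar cannot fire early even if the first request completes before the last one is registered.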
jrockway
2009-02-24 18:22:53
The link to AnyEvent::HTTP is http://search.cpan.org/~mlehmann/AnyEvent-HTTP/.
gpojd
2009-02-24 19:26:29
+3
A:
I used LWP::Parallel::UserAgent for something similar. An example from the POD:
require LWP::Parallel::UserAgent;
my $ua = LWP::Parallel::UserAgent->new();
...
$ua->redirect (0); # prevents automatic following of redirects
$ua->max_hosts(5); # sets maximum number of locations accessed in parallel
$ua->max_req (5); # sets maximum number of parallel requests per host
...
$ua->register ($request); # or
$ua->register ($request, '/tmp/sss'); # or
$ua->register ($request, \&callback, 4096);
...
$ua->wait ( $timeout );
...
sub callback { my($data, $response, $protocol) = @_; .... }
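Filled out for the feed-fetching case, the POD fragment above might look like the sketch below. The register/wait/response calls are taken from the LWP::Parallel::UserAgent POD; the feed URLs and the 30-second timeout are placeholders.

```perl
use strict;
use warnings;
use HTTP::Request;
use LWP::Parallel::UserAgent;

# Placeholder URLs -- substitute your real feeds.
my @feeds = (
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
);

my $ua = LWP::Parallel::UserAgent->new();
$ua->redirect(0);     # do not follow redirects automatically
$ua->max_hosts(5);    # at most 5 hosts accessed in parallel
$ua->max_req(5);      # at most 5 parallel requests per host

for my $feed (@feeds) {
    # register() returns undef on success, or an error response on failure
    if ( my $err = $ua->register( HTTP::Request->new( GET => $feed ) ) ) {
        warn "failed to register $feed: ", $err->error_as_HTML;
    }
}

# wait() blocks until every registered request has finished (or timed out)
# and returns a hash ref of entry objects.
my $entries = $ua->wait(30);
for my $key ( keys %$entries ) {
    my $res = $entries->{$key}->response;
    print $res->request->url, " => ", $res->code, "\n";
}
```

Unlike the AnyEvent version, this blocks the whole program in wait() rather than returning control to an event loop, which is simpler if fetching feeds is all the program does.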
gpojd
2009-02-24 18:26:56
+1
A:
Fastest in terms of execution time or implementation time?
See also WWW::Agent (which is built on top of POE).
Mark Johnson
2009-02-24 19:33:00