views: 243

answers: 1
I'm building software that needs to do massive amounts of file transfer via both HTTP and FTP. Oftentimes, I get faster HTTP downloads with a multi-connection download accelerator like axel or lftp with pget. In some cases, I've seen 2x-3x faster file transfers using something like:

axel http://example.com/somefile

or

lftp -e 'pget -n 5 http://example.com/somefile;quit'

vs. just using wget:

wget http://example.com/somefile

But other times, wget is significantly faster than lftp. Strangely, this is true even when I use lftp's pget with a single connection, like so:

lftp -e 'pget -n 1 http://example.com/somefile;quit'

I understand that downloading a file via multiple connections won't always result in a speedup, depending on how bandwidth is constrained. But why would it be slower? Especially when calling lftp/pget with -n 1?
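
For reference, here is roughly how I'm timing the two cases (example.com/somefile is just a placeholder URL, and I repeat each run several times since network conditions vary):

time wget http://example.com/somefile

time lftp -e 'pget -n 1 http://example.com/somefile;quit'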

+1  A: 

Is it possible that the HTTP server is compressing the stream using gzip? I can't remember whether wget handles gzip Content-Encoding or not. If it does, then this might explain the performance boost. Another possibility is that there is an HTTP cache somewhere in the pipeline. You can try something like

wget --no-cache --header="Accept-Encoding: identity" http://example.com/somefile

and compare this to your FTP-based transfer times.
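
To see what the server is actually sending, checking the response headers should help; something along these lines works (curl's -I issues a HEAD request, which some servers handle differently from a full GET, and example.com/somefile is again a placeholder):

curl -sI -H "Accept-Encoding: gzip" http://example.com/somefile | grep -iE 'content-encoding|content-length|via|x-cache|age'

A Content-Encoding: gzip line would point to compression, while Via, X-Cache, or Age headers would suggest a cache or proxy in the path.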

D.Shawley
Not to mention things like content distribution networks, QoS routing (particularly in relation to non-passive FTP), proxy caching...
symcbean