I have a large file I want to download from a server I have root access to. I also have several different, concurrent internet connections from my machine to the server at my disposal.

Do you know of any protocol, (S)FTP client, HTTP client, AFP client, or any other file-transfer server and client combination that supports multithreaded downloads spread over different connections?

+2  A: 

Any of these? You'll need a webserver hosting the same file on all the interfaces, though.

genpfault
Thanks for pointing out Axel; both look like great downloaders, but I couldn't find a way to make either of them use multiple interfaces concurrently.
hornairs
A: 

HTTP: check out one of the various download managers (e.g. Firefox with the http://www.downthemall.net/ extension). There are also FTP downloaders that support multiple streams.

Niko
I'm fairly sure the "multi-threaded" HTTP/FTP clients will most likely use the same interface. Such features are generally there to get around per-transfer speed limits, not to use multiple connections.
dbr
Actually, they use multiple connections. They request the file with a start byte and an end byte (an HTTP Range request) and then either write the data into one file with random access, or create multiple parts which are pasted together afterwards (roughly as sketched below).
Niko
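For illustration, this is roughly what such a ranged download looks like with curl; the URL, file names, and part sizes here are made up for the example:

# fetch the first and second 50 MiB of the file as separate parts
# (curl's -r/--range flag sends an HTTP Range request)
curl -r 0-52428799 -o part_1 http://server.example.com/hugefile
curl -r 52428800-104857599 -o part_2 http://server.example.com/hugefile

# paste the pieces back together
cat part_1 part_2 > hugefile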
DownThemAll is what I use now, but I can't find any way to force it to use threads on different interfaces. It also freezes Firefox every time it allocates disk space; that bug has been known for years and hasn't been fixed. I'm tired of it, to say the least.
hornairs
+1  A: 

One option would be the "old-fashioned" multi-part file:

split -b 50m hugefile multiparthugefile_

That will create multiparthugefile_aa, multiparthugefile_ab, and so on (split uses two-letter suffixes by default). To rejoin them, use the cat command:

cat multiparthugefile_* > hugefile_rejoined

To actually transfer the files using different interfaces, the wget --bind-address=ADDRESS flag should work:

--bind-address=ADDRESS    bind to ADDRESS (hostname or IP) on local host.
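Putting the pieces together, a minimal sketch of the client side, assuming the parts are served over HTTP and that 192.168.1.2 and 10.0.0.2 stand in for the local addresses of two different interfaces (the addresses and URL are placeholders):

# fetch different parts over different local addresses, in parallel
wget --bind-address=192.168.1.2 http://server.example.com/multiparthugefile_aa &
wget --bind-address=10.0.0.2 http://server.example.com/multiparthugefile_ab &
wait  # block until both background downloads have finished

# then rejoin as above
cat multiparthugefile_* > hugefile_rejoined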

This problem seems like something BitTorrent is positioned to handle well, but I'm not sure exactly how you would set it up.

Perhaps create a temporary tracker (or use something like OpenBitTorrent.com) and run multiple clients locally. As long as the clients support the LAN transfer feature (local peer discovery), each client would grab different parts from the server and share them with the other (local) clients. You'd end up with multiple copies of the file locally, but it would only be transferred over the internet once. A rough sketch of the idea follows.
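This is one hedged way it might look, assuming mktorrent and aria2c are available; the tracker URL, addresses, and file names are placeholders, and the aria2c flags are taken from its documentation but untested here:

# on the server: build a torrent pointing at a public tracker, then seed it
mktorrent -a udp://tracker.openbittorrent.com:80/announce -o hugefile.torrent hugefile
aria2c --check-integrity=true --seed-ratio=0.0 hugefile.torrent &

# on the client: run one client per interface, with local peer discovery
# enabled so the local clients can trade pieces over the LAN
aria2c --interface=192.168.1.2 --bt-enable-lpd=true -d copy1 hugefile.torrent &
aria2c --interface=10.0.0.2 --bt-enable-lpd=true -d copy2 hugefile.torrent &
wait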

dbr
So far this is the only answer I can find that will actually facilitate concurrent transfers over multiple interfaces; however, it's still "old-fashioned", as you aptly put it. Thanks!
hornairs