Hi!

I want to download a huge file from an FTP server in chunks of 50-100 MB each. At each point, I want to be able to set the "starting" point and the length of the chunk I want. I won't have the "previous" chunks saved locally (i.e. I can't ask the program to "resume" the download).

What is the best way of going about that? I use wget mostly, but would something else be better?


Hi there! I'm really interested in a prebuilt/inbuilt function rather than using a library for this purpose... Since wget/FTP (also, I think) allow resumption of downloads, I don't see why that would be a problem... (I can't figure it out from all the options, though!)


Hi noinflection - I had a look at that and it wouldn't work... I don't want to keep the entire huge file at my end, just process it in chunks... FYI all - I'm having a look at http://stackoverflow.com/questions/1177102/continue-ftp-download-afther-reconnect which seems interesting.
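
For illustration, here is a minimal, untested sketch of that chunked approach using Python's built-in ftplib (host, credentials and path are placeholders, and it assumes the server supports the REST command):

    # Untested sketch: fetch one arbitrary chunk of a remote file over FTP
    # using only Python's built-in ftplib. Host, credentials and path are
    # placeholders; the server must support the REST (restart) command.
    from ftplib import FTP

    def fetch_chunk(host, user, password, path, offset, length):
        ftp = FTP(host)
        ftp.login(user, password)
        ftp.voidcmd('TYPE I')                      # binary mode, needed for REST offsets
        conn = ftp.transfercmd('RETR ' + path, rest=offset)
        parts, received = [], 0
        while received < length:
            data = conn.recv(min(65536, length - received))
            if not data:                           # reached end of the remote file
                break
            parts.append(data)
            received += len(data)
        conn.close()                               # drop the data connection mid-transfer
        ftp.close()                                # reconnect for the next chunk
        return b''.join(parts)

    # e.g. the third 100 MB chunk:
    # chunk = fetch_chunk('ftp.example.com', 'anonymous', 'a@b.c',
    #                     '/pub/huge.iso', offset=2 * 100 * 2**20, length=100 * 2**20)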

A: 

I'd recommend interfacing with libcurl from the language of your choice.
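
For example, via the pycurl binding, libcurl's CURLOPT_RANGE lets you request just the byte range you need (a minimal sketch; the URL is a placeholder, and range requests over FTP only work if the server honours the REST command):

    # Sketch: grab the second 100 MB of a remote file through libcurl (pycurl).
    # The URL is a placeholder; the FTP server must support the REST command.
    import io
    import pycurl

    buf = io.BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, 'ftp://ftp.example.com/pub/huge.iso')
    c.setopt(pycurl.RANGE, '104857600-209715199')  # start-end byte range, inclusive
    c.setopt(pycurl.WRITEFUNCTION, buf.write)      # collect the chunk in memory
    c.perform()
    c.close()

    chunk = buf.getvalue()   # ~100 MB chunk; shift the range forward for the next one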

Yann Ramin
A: 

Use wget with the -c option.

Extracted from the man page:

-c / --continue

Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. For instance:

               wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z

If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
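
Note that -c assumes the earlier part of the file is already on disk, which the question rules out. If your wget is 1.16 or newer, --start-pos starts the retrieval at an arbitrary offset instead, and piping through head caps the chunk length (sketch only; host/path are placeholders, and the server must support REST):

               wget --start-pos=104857600 -O - ftp://ftp.example.com/pub/huge.iso | head -c 104857600 > chunk2.bin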

noinflection