I need to download a very large file via PHP. The last time I did it manually over HTTP it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow.

Previously I have used

file_put_contents($filename, file_get_contents($url));

Will this be OK for such a large file? I will want to untar the file after downloading and then perform analysis on the various files inside the tarball.

regards,

Greg

A: 

You will have to adapt your php.ini to accept larger files and raise your memory limit.
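
For instance, the relevant directives could look something like this (illustrative values; the memory limit only matters if the whole file is held in memory at once):

    ; php.ini (illustrative values)
    memory_limit = 2560M        ; room for a ~2.2 GB string plus overhead
    max_execution_time = 0      ; disable the time limit (already 0 on the CLI)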

Guillaume Lebourgeois
+3  A: 

file_get_contents() is handy for small files but it's totally unsuitable for large files: since it loads the entire file into memory, you would need some 2 GB of RAM for each script instance!

You should resort to good old fopen() + fread() instead.
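
Something along these lines, as a minimal sketch (the URL and target path are placeholders, and allow_url_fopen must be enabled for fopen() to read a URL):

    <?php
    // Stream the remote file to disk in small chunks so memory use
    // stays constant regardless of file size.
    $url      = 'http://example.com/archive.tar.gz'; // placeholder
    $filename = '/tmp/archive.tar.gz';               // placeholder

    $in  = fopen($url, 'rb');
    $out = fopen($filename, 'wb');
    if ($in === false || $out === false) {
        die('Could not open source or destination');
    }
    while (!feof($in)) {
        fwrite($out, fread($in, 8192)); // copy 8 KB at a time
    }
    fclose($in);
    fclose($out);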

Also, don't rule out a third-party download tool like wget (installed by default on many Linux systems) together with a cron task to run it. It's possibly the best way to automate a daily download.
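
For example, a crontab entry along these lines (the URL, target path and schedule are only illustrative) would fetch the file quietly every day at 03:00:

    # m h dom mon dow  command
    0 3 * * * wget -q -O /data/archive.tar.gz http://example.com/archive.tar.gz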

Álvaro G. Vicario
thanks, I like the idea of doing this via wget and cron
kitenski