Don't use file_get_contents() and then echo or print to output the file. That loads the full contents of the file into memory at once, so a large file can easily exceed your script's memory_limit and kill the script.
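For illustration, this is the pattern to avoid (a minimal sketch; $tar_path stands in for wherever your file actually lives):

// Anti-pattern: file_get_contents() reads the ENTIRE file into a PHP
// string before anything is sent, so a big tar blows past memory_limit.
$data = file_get_contents($tar_path); // $tar_path is a hypothetical placeholder
echo $data;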
For dumping a file's contents to the client, it's best to use readfile() - it will properly slurp up file chunks and spit them out at the client without exceeding available memory. Just remember to turn off output buffering before you do so, otherwise you're essentially just doing file_get_contents() again.
So, you end up with this:
$tar = 'somefile.tar';
$tar_path = '/the/full/path/to/where/the/file/is/' . $tar; // note the trailing slash
$size = filesize($tar_path);

// Turn off any active output buffering first, as noted above.
while (ob_get_level() > 0) {
    ob_end_clean();
}

header("Content-Type: application/x-tar");
header("Content-Disposition: attachment; filename=\"$tar\"");
header("Content-Length: $size");
header("Content-Transfer-Encoding: binary");
readfile($tar_path);
If your tar file is actually gzipped, then use "application/x-gtar" instead.
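If the same script serves both kinds, you can pick the type at runtime (a sketch that assumes gzipped archives end in .gz or .tgz):

$ext = strtolower(pathinfo($tar_path, PATHINFO_EXTENSION));
// Assumption: gzipped tars end in .gz or .tgz; anything else is plain tar.
$type = ($ext === 'gz' || $ext === 'tgz') ? 'application/x-gtar' : 'application/x-tar';
header("Content-Type: $type");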
If the file still comes out corrupted after download, do some checking on the client side:
- Is the downloaded file 0 bytes, even though the download took much longer than a 0-byte transfer would? Then it's something client-side preventing the download. Virus scanner? Trojan?
- Is the downloaded file partially present, but smaller than the original? Something killed the transfer prematurely. Overeager firewall? Download manager having a bad day? Output buffering active on the server and the last buffer bucket not being flushed properly?
- Is the downloaded file the same size as the original? Do an md5/sha1/crc checksum on both copies (see the sketch after this list for grabbing the server-side sums). If those are the same, then something's wrong with the app opening the file, not the file itself.
- Is the downloaded file bigger than the original? Open it in Notepad (or something better like Notepad++, which doesn't take years to open big files) and check whether any PHP warning messages, or stray whitespace from outside your script's PHP tags, got inserted at the start or end of the download.
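For the checksum comparison above, getting the server-side sums is a one-liner per algorithm (a minimal sketch; compare against md5sum/sha1sum output on the client):

// Print checksums of the original file on the server, for comparison
// with the downloaded copy on the client (md5sum / sha1sum there).
echo 'md5:  ' . hash_file('md5', $tar_path) . PHP_EOL;
echo 'sha1: ' . hash_file('sha1', $tar_path) . PHP_EOL;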