In implementing the backup script I described in this serverfault question, I ran into some timeout issues that prompted a couple of optimizations: the script now backs up only one file per execution, and I do everything I can to minimize the number of hashes I calculate over the very large data files.
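To illustrate, here is a minimal sketch of that approach (not my actual script; hashes.json, the directory path, and the omitted upload step are placeholders): an mtime check skips re-hashing unchanged large files, and the loop stops after the first file that needs transferring.

```php
<?php
// Sketch of the one-file-per-run approach with a cached-hash manifest.

$sourceDir = '/path/to/data';           // placeholder: directory being backed up
$cacheFile = __DIR__ . '/hashes.json';  // placeholder: cached mtimes/hashes

$cache = is_file($cacheFile)
    ? json_decode(file_get_contents($cacheFile), true)
    : array();

foreach (glob($sourceDir . '/*') as $file) {
    $mtime = filemtime($file);

    // Only re-hash a large file when its mtime has changed since the last run.
    if (isset($cache[$file]) && $cache[$file]['mtime'] === $mtime) {
        continue;
    }

    $hash = md5_file($file);
    if (isset($cache[$file]) && $cache[$file]['hash'] === $hash) {
        $cache[$file]['mtime'] = $mtime;   // touched but content unchanged
        continue;
    }

    $cache[$file] = array('mtime' => $mtime, 'hash' => $hash);

    // ... transfer exactly this one file here (upload logic omitted) ...
    break;                                 // only one file per execution
}

file_put_contents($cacheFile, json_encode($cache));
```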
So far, that seems to have solved the timeout issue, but given the size of the files, there is certainly room for a transfer to take longer than PHP's default 30-second execution limit. If that happens, I assume the file will simply be cut off, leaving a partial copy on the destination. Is there any way to protect against this?
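One possible safeguard would be something like the sketch below (assuming a plain FTP transfer; the host, credentials, and paths are placeholders): upload to a temporary name and only rename it over the real backup after the size checks out, so a timed-out run leaves behind only a junk .part file rather than a truncated backup. I don't know whether this is the right way to handle it.

```php
<?php
// Sketch: upload to a temporary name, verify, then promote.
// If the script is killed mid-transfer, the rename never happens,
// so the previous backup is never replaced by a partial file.

$local  = '/path/to/data/bigfile.dat';   // placeholder paths
$remote = 'backups/bigfile.dat';
$temp   = $remote . '.part';

$ftp = ftp_connect('backup.example.com'); // placeholder host/credentials
ftp_login($ftp, 'user', 'pass');
ftp_pasv($ftp, true);

if (ftp_put($ftp, $temp, $local, FTP_BINARY)
    && ftp_size($ftp, $temp) === filesize($local)) {
    ftp_rename($ftp, $temp, $remote);   // promote only after the size check
} else {
    ftp_delete($ftp, $temp);            // discard the partial upload
}

ftp_close($ftp);
```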
Note that I am operating in a shared-hosting environment, so editing the php.ini file is not an option.
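For completeness, the only runtime alternative to php.ini that I'm aware of is something like the following, and I don't know whether this host honours it, since shared hosts often disable these calls:

```php
<?php
// Attempt to raise the limit at runtime; may be silently ignored
// (e.g. under safe mode or if the functions are disabled by the host).

@set_time_limit(0);                      // 0 = remove the per-script limit, if allowed
@ini_set('max_execution_time', '300');   // fallback attempt

var_dump(ini_get('max_execution_time')); // shows whether anything actually changed
```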