I have a script which iterates over a database table, downloads a file for each row, appends the result to an in-memory results table, and then bulk-uploads all the results back to the database once it finishes.
The problem is that there could be thousands of files to download, and if the script times out or errors halfway through, everything buffered in memory is lost.
Is there a better approach to this, maybe involving threading or asynchronous I/O?
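For context, here is a minimal sketch of the kind of restructuring I'm considering (Python assumed; `fetch_rows`, `download`, and `bulk_upload` are placeholders for my real database and HTTP calls): downloads run on a thread pool, and results are flushed to the database in batches instead of being held in memory until the very end, so a crash partway through doesn't lose everything.

```python
import concurrent.futures

uploaded = []      # stands in for the real results table
BATCH_SIZE = 4     # flush results every N downloads instead of holding all in memory

def fetch_rows():
    # Placeholder: in the real script this is a SELECT over the source table.
    return [{"id": i, "url": f"https://example.com/file/{i}"} for i in range(10)]

def download(row):
    # Placeholder for the actual HTTP download; returns the result for one row.
    return {"id": row["id"], "status": "ok"}

def bulk_upload(batch):
    # Placeholder for the bulk insert back into the results table.
    uploaded.extend(batch)

def run():
    pending = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
        # pool.map yields results as the worker threads finish each download
        for result in pool.map(download, fetch_rows()):
            pending.append(result)
            if len(pending) >= BATCH_SIZE:
                bulk_upload(pending)  # partial progress survives a later crash
                pending = []
    if pending:
        bulk_upload(pending)  # flush the final partial batch

run()
```

Is something along these lines reasonable, or would `asyncio` (or splitting the job into resumable chunks keyed by row id) be a better fit?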