You could use a 'log file' to keep track of the zipped files and of how many still remain.
The procedure would be like this:
- Count the number of files and write it to a text file in a format like totalfiles.filesprocessed
- Every time you zip a file, simply update the log file
So, if you have to zip 3 files, the log file will grow like this:

3.0 -> begin, no files processed yet
3.1 -> 1 of 3 files processed, 33% complete
3.2 -> 2 of 3 files processed, 66% complete
3.3 -> 3 of 3 files processed, 100% complete
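The loop above can be sketched in Python with the standard zipfile module (the function and file names here are my own, just for illustration):

```python
import os
import zipfile

def zip_with_progress(paths, archive_path, log_path):
    """Zip `paths` into `archive_path`, recording progress in `log_path`.

    The log holds a single value in the form "total.processed",
    e.g. "3.1" after 1 of 3 files.
    """
    total = len(paths)
    # Write the initial "total.0" state before any work is done
    with open(log_path, "w") as log:
        log.write(f"{total}.0")
    with zipfile.ZipFile(archive_path, "w") as archive:
        for done, path in enumerate(paths, start=1):
            archive.write(path, arcname=os.path.basename(path))
            # Reopen and rewrite the log on every file, so the
            # polling AJAX request always sees a fresh value
            with open(log_path, "w") as log:
                log.write(f"{total}.{done}")
```

Opening and closing the log on every update is deliberate: it flushes the value to disk so the polling request can read it.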
Then a simple AJAX function (on an interval) can check the log file every second.
In Python, opening, reading and writing such a small file should be very quick, but it could cause trouble if many users do it at the same time. You will obviously need to create a separate log file for each request, perhaps with a random name, and delete it once the task is completed.
One caveat: for the AJAX request to be able to read the log file, you need to open and close the file handle in Python every time you update it.
Finally, for a more accurate progress meter, you could even use the file sizes instead of the number of files as the parameter.
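A minimal sketch of that size-based variant (the helper name is my own): compute the total byte count up front, then report progress as bytes processed so far.

```python
import os

def size_progress(paths, bytes_done):
    """Percentage complete, weighted by file size rather than file count."""
    total_bytes = sum(os.path.getsize(p) for p in paths)
    if total_bytes == 0:
        return 100.0  # nothing to do counts as finished
    return 100.0 * bytes_done / total_bytes
```

With this, zipping one large file out of three counts for more than a third of the bar, which matches the time the task actually takes.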