views: 553
answers: 3

Sorry for my English.

I am creating a Django application that performs various long computations on uploaded files. I don't want to make the user wait for the file to be handled - I just want to show them a page like 'file is being parsed'.

So, how can I make an asynchronous function call from a view?

Something that may look like that:

def view(request):
    ...
    if form.is_valid():
        form.save()
        async_call(handle_file)
    return render_to_response(...)
A: 

Unless you specifically need to use a separate process, which seems to be the gist of the other questions S.Lott is indicating as duplicate of yours, the threading module from the Python standard library (documented here) may offer the simplest solution. Just make sure that handle_file is not accessing any globals that might get modified, nor especially modifying any globals itself; ideally it should communicate with the rest of your process only through Queue instances; etc, etc, all the usual recommendations about threading;-).
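A minimal sketch of that approach; `handle_file`, the path, and the `Queue` handoff are placeholders standing in for your own parsing code:

```python
import queue
import threading
import time

# Thread-safe channel back to the rest of the process, as recommended above.
results = queue.Queue()

def handle_file(path):
    # Stand-in for the real long-running parser. It touches no globals
    # except the Queue it reports through.
    time.sleep(0.1)
    results.put((path, 'parsed'))

def async_call(func, *args):
    # Fire-and-forget: the caller returns immediately while the thread works.
    t = threading.Thread(target=func, args=args)
    t.start()
    return t

t = async_call(handle_file, '/tmp/upload.dat')
# The view could return its response here, long before parsing finishes.
t.join()  # only for this demo; a real view would not wait
result = results.get()
print(result)  # ('/tmp/upload.dat', 'parsed')
```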

Alex Martelli
Thank you for your answer. Am I understanding it right that the main application may quit even if the thread handling some file is still running, or will the thread also be terminated? If so, it is kind of... useless, because many webservers spawn manage.py processes when needed and kill them when too many of them are idle. I'm now really looking into just adding something like os.system('python handle_file.py path/to/file')...
DataGreed
Just figured out that it will still be a subprocess... I guess I just need to use subprocess.Popen.
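For reference, the difference is that the blocking helpers wait for the child while Popen does not; the real command line with handle_file.py is only suggested in a comment here, and a sleeping child stands in for it:

```python
import subprocess
import sys

# Stand-in child process; in the real app this would be something like
# [sys.executable, 'handle_file.py', 'path/to/file'].
child = [sys.executable, '-c', 'import time; time.sleep(0.5)']

# subprocess.call(child) would block until the child exits.
# Popen starts the child and returns immediately:
proc = subprocess.Popen(child)
print(proc.poll())   # None -> still running; the view can respond now
proc.wait()          # only for this demo; a real view would not wait
```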
DataGreed
If you need a subprocess, you need a subprocess, and S.Lott was right (guess there's a reason the guy's got almost twice my rep;-). But, for completeness: a Python process terminates "naturally" when the last NON-DAEMON thread terminates, not before; so, threads whose jobs are important to terminate should be non-daemon (the default) OR explicitly .join'd by the main thread at process end, while "worker threads" doing background tasks should be daemon ones (and not .join'd to;-) iff they may be terminated harmlessly.
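A sketch of that distinction, with a trivial stand-in job:

```python
import threading
import time

done = []

def important_job():
    time.sleep(0.2)
    done.append('finished')

# Non-daemon (the default): Python waits for it before the process can exit.
t = threading.Thread(target=important_job)
t.start()

# Daemon: Python will NOT wait; the thread is abandoned at interpreter exit,
# so only use daemon threads for work that may be terminated harmlessly.
d = threading.Thread(target=time.sleep, args=(60,), daemon=True)
d.start()

t.join()     # explicit join, as recommended above for important work
print(done)  # ['finished']
# When the process ends, d is simply dropped, not waited 60 seconds for.
```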
Alex Martelli
thanks for the explanation :)
DataGreed
Hmmm, tried out subprocess, but it is still synchronous and the user has to wait till the page loads.
DataGreed
+3  A: 

Rather than trying to manage this via subprocesses or threads, I recommend you separate it out completely. There are two approaches: the first is to set a flag in a database table somewhere, and have a cron job running regularly that checks the flag and performs the required operation.

The second option is to use a message queue. Your file upload process sends a message on the queue, and a separate listener receives the message and does what's needed. I've used RabbitMQ for this sort of thing, but others are available.
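The pattern, sketched here with the standard library's queue.Queue standing in for a real broker like RabbitMQ; with a real broker the producer and the listener would normally be separate processes, and all names below are illustrative:

```python
import queue
import threading

messages = queue.Queue()
handled = []

def handle_file(path):
    handled.append(path)  # stand-in for the real work

def upload_view(path):
    # The web request only enqueues a message and returns at once.
    messages.put({'path': path})

def listener():
    # A long-running consumer; with RabbitMQ this would be its own
    # process subscribed to the queue.
    while True:
        msg = messages.get()
        if msg is None:       # sentinel used here to shut the demo down
            break
        handle_file(msg['path'])

worker = threading.Thread(target=listener)
worker.start()
upload_view('/tmp/upload.dat')
messages.put(None)            # demo only: stop the listener
worker.join()
print(handled)                # ['/tmp/upload.dat']
```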

Either way, your user doesn't have to wait for the process to finish, and you don't have to worry about managing subprocesses.

Daniel Roseman
Oh, why haven't I thought of this! Thanks, that's really a good point. I had almost forgotten about cron entirely. That should solve the problem neatly :)
DataGreed
The issue is solved via cron (and a simple scheduler Python script on the debugging machine that acts like a simple cron :) )
DataGreed
A: 

I have tried to do the same and failed after multiple attempts, due to the nature of Django and asynchronous calls.

The solution I have come up with, which could be a bit over the top for you, is to have another asynchronous server in the background processing message queues from the web requests and sending back chunked JavaScript, which gets parsed directly by the browser in an asynchronous way (i.e. Ajax).

Everything is made transparent to the end user via a mod_proxy setting.

Chmouel Boudjnah
Thanks for the idea, but I guess I just have to use cron to start a manage.py command that parses all queued files every couple of minutes, because the project isn't big or public enough to run a second server :)
DataGreed