I'm writing a simple browser-based front end that should be able to launch a background task and then get progress from it. I want the browser to receive a response saying whether the task launched successfully, and then poll to determine when it is done. However, the presence of a background task seems to be stopping the XMLHttpRequest response from being sent immediately, so I can't report the success of launching the process. Consider the following (simplified) code:
import SocketServer
import SimpleHTTPServer
import multiprocessing
import time

class MyProc(multiprocessing.Process):
    def run(self):
        print 'Starting long process..'
        for i in range(100): time.sleep(1)
        print 'Done long process'

class Page(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            print >>self.wfile, "<html><body><a href='/run'>Run</a></body></html>"
        if self.path == '/run':
            self.proc = MyProc()
            print 'Starting..'
            self.proc.start()
            print 'After start.'
            print >>self.wfile, "Process started."

httpd = SocketServer.TCPServer(('', 8000), Page)
httpd.serve_forever()
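For context, the polling part isn't shown in the simplified code above; the idea is that the front end would hit a second path, something like /status, via XMLHttpRequest to check whether the process has finished. A rough sketch of what I have in mind, building on the code above (the /status path, the PageWithStatus name, and the module-level handle are just my intended approach, not working code):

current_proc = None  # module-level handle so later requests can see the running process

class PageWithStatus(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        global current_proc
        if self.path == '/run':
            current_proc = MyProc()
            current_proc.start()
            print >>self.wfile, "Process started."
        if self.path == '/status':
            # the browser would poll this path to see whether the task is done
            if current_proc is not None and current_proc.is_alive():
                print >>self.wfile, "running"
            else:
                print >>self.wfile, "done"

None of that works, though, until the response to /run actually reaches the browser.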
When I run the first snippet and browse to http://localhost:8000, I get a link labelled Run. When I click it, the terminal displays:
Starting..
After start.
However, the browser view does not change; in fact, the loading cursor keeps spinning. Only when I press Ctrl-C in the terminal to interrupt the program does the browser update with the message Process started.
The message After start. is clearly being printed, so I can assume that do_GET is returning after starting the process. Yet the browser doesn't get a response until after I interrupt the long-running process. I have to conclude that something is blocking between do_GET returning and the response being sent, and that something is inside SimpleHTTPServer.
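For reference, as far as I can tell the buffered response is only flushed after do_GET returns, in SocketServer.StreamRequestHandler.finish(); paraphrased from the Python 2 standard library source (so the exact code may differ slightly):

# SocketServer.StreamRequestHandler.finish(), paraphrased:
def finish(self):
    if not self.wfile.closed:
        self.wfile.flush()   # the buffered response is pushed to the client here
    self.wfile.close()
    self.rfile.close()

So whatever is blocking seems to happen during or after this cleanup, after my handler has already returned.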
I've also tried this with threads and subprocess.Popen but ran into similar problems. Any ideas?