views: 212

answers: 5
I want to write a program that spawns processes like this: process -> n processes -> n processes.

Can the second level spawn processes with multiprocessing? I'm using the multiprocessing module of Python 2.6.

Thanks

+1  A: 

Yes, it can.

vilalian
+3  A: 

@vilalian's answer is correct, but terse. Of course, it's hard to supply more information when your original question was vague.

To expand a little, you'd have your original program spawn its n processes, but they'd be slightly different from the original in that you'd want each of them (if I understand your question) to spawn n more processes. You could accomplish this either by having them run code similar to your original process, but spawning a new set of programs that perform the task at hand without further processing, or by using the same code/entry point and just providing different arguments - something like

from multiprocessing import Process

def main(level, n=4):
    if level == 0:
        do_work()  # do_work() is whatever real work the leaf processes perform
    else:
        children = [Process(target=main, args=(level - 1, n)) for _ in range(n)]
        for p in children:
            p.start()
        for p in children:
            p.join()

and start it off with level == 2.
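For instance, a minimal usage sketch (do_work and the fan-out of 4 here are illustrative placeholders, not part of the original answer):

def do_work():
    print("doing the real work in a leaf process")

if __name__ == '__main__':   # needed on platforms that spawn rather than fork
    main(2)                  # 1 process -> 4 children -> 16 grandchildren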

Blair Conrad
A: 

Sure you can. Especially if you are using fork to spawn child processes, they work as perfectly normal processes (just like the parent). Thread management is quite different, but you can also use "second level" sub-threading.

Be careful not to over-complicate your program, though; for example, programs with two levels of threads are rarely needed in practice.
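To illustrate the first point, here is a minimal, hypothetical sketch (the function names are just placeholders): a child process started with multiprocessing can itself start another Process, exactly as the parent did.

from multiprocessing import Process

def grandchild():
    print("grandchild doing its work")

def child():
    # A (non-daemonic) child process can spawn its own children,
    # just like the parent that started it.
    p = Process(target=grandchild)
    p.start()
    p.join()

if __name__ == '__main__':
    p = Process(target=child)
    p.start()
    p.join()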

Emilio
A: 

You can structure your app as a series of process pools communicating via Queues at any nested depth, though it can get hairy pretty quickly (probably due to the required context switching).

It's not Erlang, though, that's for sure.

The docs on multiprocessing are extremely useful.

Here (a little too much to drop in a comment) is the approach I use to increase throughput in a program that updates my feeds. I have one process polling for feeds that need to be fetched; it stuffs its results into a queue. A Process Pool of 4 workers picks those results up and fetches the feeds, and their results (if any) are then put in a queue for another Process Pool to parse and put into a queue to shove back into the database. Done sequentially, this process would be really slow, because some sites take their own sweet time to respond, so most of the time the process was waiting on data from the internet and would only use one core. Under this process-based model, it seems I'm actually waiting on the database the most, my NIC is saturated most of the time, and all 4 cores are actually doing something. Your mileage may vary.
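A stripped-down sketch of that kind of queue-connected pipeline (the worker functions, queue names, pool size, and URLs below are illustrative placeholders, not the actual feed-updating code):

from multiprocessing import Process, Queue

def fetch_worker(in_q, out_q):
    # Stage 1: pull URLs off the input queue and "fetch" them.
    for url in iter(in_q.get, None):      # None is the shutdown sentinel
        out_q.put("fetched: " + url)      # stand-in for the real fetch logic

def parse_worker(in_q):
    # Stage 2: consume fetched results; stand-in for parsing + DB insert.
    for item in iter(in_q.get, None):
        print(item)

if __name__ == '__main__':
    urls_q, fetched_q = Queue(), Queue()
    fetchers = [Process(target=fetch_worker, args=(urls_q, fetched_q))
                for _ in range(4)]
    parser = Process(target=parse_worker, args=(fetched_q,))
    for p in fetchers + [parser]:
        p.start()
    for url in ["http://example.com/a", "http://example.com/b"]:
        urls_q.put(url)
    for _ in fetchers:
        urls_q.put(None)                  # one sentinel per fetcher
    for p in fetchers:
        p.join()
    fetched_q.put(None)                   # all fetchers done; stop the parser
    parser.join()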

Tom Willis
+1  A: 

Yes - but you might run into an issue that requires the fix I committed to Python trunk yesterday. See bug http://bugs.python.org/issue5313

jnoller