I have subclassed Process like so:

from multiprocessing import Process

class EdgeRenderer(Process):
    def __init__(self, starter, *args, **kwargs):
        Process.__init__(self, *args, **kwargs)
        self.starter = starter

Then I define a run method which uses self.starter.

That starter object is an instance of a State class that I define.

Is it okay that I do this? What happens to the object? Does it get serialized? Does that mean that I always have to ensure the State object is serializable? Does the new process get a duplicate copy of this object?

+3  A: 

On Unix systems, multiprocessing uses os.fork() to create the children; on Windows, it spawns a fresh interpreter and pickles the Process object (attributes included) to recreate it in the child. So to be cross-platform, yes, it must be picklable. The child gets its own copy.
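One quick way to check that up front is to try pickling the object yourself. This is just a sketch; the State class and its ready attribute here are hypothetical stand-ins for whatever your real class holds:

import pickle

class State(object):
    def __init__(self):
        self.ready = False  # hypothetical attribute, for illustration

state = State()
# pickle.dumps raises if the object can't be serialized
# (e.g. it holds an open file, a lock, or a lambda).
pickle.dumps(state)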

That being said, here's an example:

from multiprocessing import Process
import time

class Starter(object):
    def __init__(self):
        self.state = False

class EdgeRenderer(Process):
    def __init__(self, starter, *args, **kwargs):
        Process.__init__(self, *args, **kwargs)
        self.starter = starter

    def run(self):
        # Runs in the child: this mutates the child's copy only.
        self.starter.state = "HAM SANDWICH"
        time.sleep(1)
        print(self.starter.state)

if __name__ == '__main__':  # required on Windows, where the child re-imports this module
    x = Starter()
    a = EdgeRenderer(x)
    a.start()
    x.state = True  # mutates the parent's copy only
    a.join()
    print(x.state)

When run, you will see:

HAM SANDWICH
True

So changes the parent makes after the fork aren't communicated to the child, and changes the child makes aren't communicated back: each process ends up with its own independent copy of the object. You have to live with that limitation, or use an explicit sharing mechanism.
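If the two processes really do need to see each other's changes, one option (a sketch, not something the question asked for) is multiprocessing.Value, which allocates the value in shared memory instead of copying it:

from multiprocessing import Process, Value

def flip(flag):
    # Runs in the child, but writes to shared memory,
    # so the parent sees the change after join().
    flag.value = 1

if __name__ == '__main__':
    flag = Value('i', 0)  # 'i' = C int, initial value 0
    p = Process(target=flip, args=(flag,))
    p.start()
    p.join()
    print(flag.value)  # prints 1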

jnoller
You're my hero :-p
fuzzyman
Okay, I understand. But that leaves one weird thing: When I pass a queue in the same way as I passed `starter`, it works, so I assume it does not get duplicated. So how come a normal object gets duplicated but a queue doesn't?
cool-RR
I assume you're talking about multiprocessing.Queue, which is handled differently from other objects; take a look at: http://svn.python.org/view/python/trunk/Lib/multiprocessing/queues.py?view=markup (there's a short sketch of the difference below).
jnoller
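To make the difference concrete, here is a minimal sketch (not from the original thread): a multiprocessing.Queue is backed by an operating-system pipe, so an item put in the child really does arrive in the parent, unlike a plain attribute on a copied object:

from multiprocessing import Process, Queue

def worker(q):
    # Runs in the child; the item is sent over a pipe
    # to the parent instead of staying in the child's copy.
    q.put("HAM SANDWICH")

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    print(q.get())  # prints HAM SANDWICH
    p.join()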