I'm trying to share a composite structure through a multiprocessing manager, but I ran into a "RuntimeError: maximum recursion depth exceeded" when trying to use just one of the Composite class methods.

The class is taken from code.activestate and was tested by me before including it in the manager.

When retrieving the class in a process and invoking its addChild() method I keep getting the RuntimeError, while outside the process it works.

The composite class inherits from a SpecialDict class that implements a __getattr__() method.

Could it be that, while calling addChild(), the Python interpreter looks up a different __getattr__() because the right one is not proxied by the manager?

If so, it's not clear to me what the right way is to make a proxy for that class/method.
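To illustrate the suspicion, here is a minimal sketch (not the actual SpecialDict code, just an assumed shape) of how a __getattr__() that reads instance state can exceed the recursion limit when an instance arrives on the other side without that state, for example because unpickling bypasses __init__():

class SpecialDictSketch(object):

    def __init__(self):
        self.data = {}

    def __getattr__(self, name):
        # self.data triggers __getattr__ again whenever 'data' is missing
        # from __dict__, so the lookup recurses until the limit is exceeded
        return self.data[name]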

The following code reproduces exactly this condition:

1) This is manager.py:

from multiprocessing.managers import BaseManager
from CompositeDict import *

class PlantPurchaser():

    def __init__(self):
        self.comp  = CompositeDict('Comp')

    def get_cp(self):
        return self.comp

class Manager():

    def __init__(self):

        self.comp  = PlantPurchaser().get_cp()

        BaseManager.register('get_comp', callable=lambda:self.comp)

        self.m = BaseManager(address=('127.0.0.1', 50000), authkey='abracadabra')
        self.s = self.m.get_server()

        self.s.serve_forever()

2) I want to use the composite in this consumer.py:

from multiprocessing.managers import BaseManager

class Consumer():

    def __init__(self):

        BaseManager.register('get_comp')

        self.m = BaseManager(address=('127.0.0.1', 50000), authkey='abracadabra')
        self.m.connect()

        self.comp = self.m.get_comp()
        ret = self.comp.addChild('consumer')

3) Everything is run by launching a controller.py:

from multiprocessing import Process

class Controller():
    def __init__(self):
        for child in _run_children():
            child.join()

def _run_children():

    from manager import Manager
    from consumer import Consumer as Consumer

    procs = (
             Process(target=Manager,  name='Manager' ),
             Process(target=Consumer, name='Consumer'),
            )

    for proc in procs:
        proc.daemon = 1
        proc.start()
    return procs

c = Controller()

Take a look at this related question on how to make a proxy for the CompositeDict() class, as suggested by AlberT.

The solution given by tgray works, but it cannot avoid race conditions.

A: 

Python has a default maximum recursion depth of 1000. But you can change the default behavior like this:

import sys
sys.setrecursionlimit(n)

Where n is the number of recursions you wish to allow.

Edit:

The above answer does nothing to solve the root cause of this problem (as pointed out in the comments). It only needs to be used if you are intentionally recursing more than 1000 times. If you are in an infinite loop (like in this problem), you will eventually hit whatever limit you set.

To address your actual problem, I re-wrote your code from scratch starting as simply as I could make it and built it up to what I believe is what you want:

import sys
from multiprocessing import Process
from multiprocessing.managers import BaseManager
from CompositeDict import *

class Shared():
    def __init__(self):
        self.comp = CompositeDict('Comp')

    def get_comp(self):
        return self.comp

    def set_comp(self, c):
        self.comp = c

class Manager():
    def __init__(self):
        shared = Shared()
        BaseManager.register('get_shared', callable=lambda:shared)
        mgr = BaseManager(address=('127.0.0.1', 50000), authkey='abracadabra')
        srv = mgr.get_server()
        srv.serve_forever()

class Consumer():
    def __init__(self, child_name):
        BaseManager.register('get_shared')
        mgr = BaseManager(address=('127.0.0.1', 50000), authkey='abracadabra')
        mgr.connect()

        shared = mgr.get_shared()
        comp = shared.get_comp()
        child = comp.addChild(child_name)
        shared.set_comp(comp)
        print comp

class Controller():
    def __init__(self):
        pass

    def main(self):
        m = Process(target=Manager, name='Manager')
        m.daemon = True
        m.start()

        consumers = []
        for i in xrange(3):
            p = Process(target=Consumer, name='Consumer', args=('Consumer_' + str(i),))
            p.daemon = True
            consumers.append(p)

        for c in consumers:
            c.start()
        for c in consumers:
            c.join()
        return 0


if __name__ == '__main__':
    con = Controller()
    sys.exit(con.main())

I did this all in one file, but you shouldn't have any trouble breaking it up.

I added a child_name argument to your consumer so that I could check that the CompositeDict was getting updated.

Note that there is both a getter and a setter for your CompositeDict object. When I only had a getter, each Consumer was overwriting the CompositeDict when it added a child.

This is why I also changed your registered method to get_shared instead of get_comp, as you will want access to the setter as well as the getter within your Consumer class.

Also, I don't think you want to try joining your manager process, as it will "serve forever". If you look at the source for the BaseManager (./Lib/multiprocessing/managers.py:Line 144) you'll notice that the serve_forever() function puts you into an infinite loop that is only broken by KeyboardInterrupt or SystemExit.

Bottom line is that this code works without any recursive looping (as far as I can tell), but let me know if you still experience your error.

tgray
Yes, but this way I can neither avoid the problem nor work out its cause!
DrFalk3n
This does not eradicate the cause; at best it would only mask the problem.
AlberT
You are correct, I answered too quickly without understanding the full problem.
tgray
I've updated my answer to (hopefully) be more helpful.
tgray
Thanks, +1, but what I get from your code is "AttributeError: 'AutoProxy[get_shared]' object has no attribute 'get_comp'". Why doesn't AutoProxy work? See also this other related question of mine: http://stackoverflow.com/questions/1478351/python-multiprocessing-proxy
DrFalk3n
Ok it's working
DrFalk3n
The Shared() class and its use in the consumer are the key! Really, many thanks!
DrFalk3n
Just another observation: with your code you cannot avoid race conditions between concurrent consumers! So again, the best approach would be to proxy the CompositeDict class directly.
DrFalk3n
Ah, I see. I generally just use a Queue object to avoid race conditions.
tgray
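For concreteness, here is a rough sketch of the Queue pattern tgray mentions (the names and structure are illustrative, not code from the answer): consumers push child names onto a multiprocessing.Queue and a single writer process drains it, so only one process ever mutates the composite.

from multiprocessing import Process, Queue
from CompositeDict import *

def writer(queue):
    comp = CompositeDict('Comp')
    while True:
        name = queue.get()
        if name is None:        # sentinel: no more children to add
            break
        comp.addChild(name)

def consumer(queue, child_name):
    queue.put(child_name)       # request the addition instead of doing it

if __name__ == '__main__':
    q = Queue()
    w = Process(target=writer, args=(q,))
    w.start()
    consumers = [Process(target=consumer, args=(q, 'Consumer_%d' % i))
                 for i in range(3)]
    for c in consumers:
        c.start()
    for c in consumers:
        c.join()
    q.put(None)                 # tell the writer to stop
    w.join()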
A: 

Is it possible there is a circular reference between the classes? For example, the outer class has a reference to the composite class, and the composite class has a reference back to the outer class.

The multiprocessing manager works well, but when you have large, complicated class structures you are likely to run into an error where a type/reference cannot be serialized correctly. The other problem is that errors from the multiprocessing manager are very cryptic, which makes debugging failure conditions even more difficult.

Casey
It doesn't take long to stumble on a type/reference that is not serialized correctly and, yeah, they are difficult to debug. It doesn't seem to be a circular reference like the one in your example, or at least not an obvious one. +1 for the interest in my question :-)
DrFalk3n
A: 

I think the problem is that you have to instruct the Manager on how to manage your object, which is not a standard Python type.

In other words, you have to create a proxy for your CompositeDict.

You could look at this doc for an example: http://ruffus.googlecode.com/svn/trunk/doc/html/sharing_data_across_jobs_example.html

AlberT
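As a rough illustration of what AlberT suggests (a sketch only, assuming the CompositeDict class and its addChild() method from the question; this is not code from the linked doc), the class itself can be registered with a BaseManager subclass so that its methods run in the manager process behind a proxy:

from multiprocessing.managers import BaseManager
from CompositeDict import *

class CompositeManager(BaseManager):
    pass

# Registering the class gives the manager a factory that builds the object
# in the server process and hands back a proxy to it.
CompositeManager.register('CompositeDict', CompositeDict)

if __name__ == '__main__':
    mgr = CompositeManager(address=('127.0.0.1', 50000),
                           authkey='abracadabra')
    mgr.start()                        # the manager runs in a child process
    comp = mgr.CompositeDict('Comp')   # proxy to one server-side instance
    comp.addChild('consumer')          # the call executes on the server

Passing the comp proxy to the consumer processes (for example as an argument to Process) would then let them all work on the same server-side instance.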