I need to share a huge dictionary (around 1 GB in size) between multiple processes. Since all processes will only ever read from it, I don't need locking.

Is there any way to share a dictionary without locking?

The multiprocessing module in Python provides an Array class which allows sharing without locking by setting
lock=False
However, there is no such option for the dictionary provided by a Manager in the multiprocessing module.
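
For reference, a minimal sketch of the Array approach mentioned above might look like this (the small integer array is just a stand-in for the real data):

from multiprocessing import Process, Array

def reader(arr):
    # Pure reads: no synchronization wrapper exists because the
    # Array was created with lock=False.
    print(sum(arr))

if __name__ == '__main__':
    # lock=False returns a raw shared ctypes array with no lock at all.
    arr = Array('i', range(100), lock=False)
    procs = [Process(target=reader, args=(arr,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()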

A: 

Well, in fact the dict on a Manager has no locks at all! I guess this is true for the other shared objects you can create through the manager too. How do I know this? I tried:

from multiprocessing import Process, Manager

def f(d):
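    # Each process increments the shared value 10000 times; the
    # read-modify-write in d['blah'] += 1 is not atomic across processes.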
    for i in range(10000):
        d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()

    d = manager.dict()
    d['blah'] = 0
    procs = [ Process(target=f, args=(d,)) for _ in range(10) ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(d)

If there were locks on d, the result would be 100000. But instead the result is pretty random, and so this is just a nice illustration of why locks are needed when you modify stuff ;-)
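
For contrast, here is a minimal sketch (not part of the original answer) of the same loop guarded by a manager.Lock(); with every increment done under the lock, the count does come out at 100000:

from multiprocessing import Process, Manager

def f(d, lock):
    for i in range(10000):
        # Holding the lock makes the read-modify-write atomic
        # with respect to the other processes.
        with lock:
            d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()
    d = manager.dict()
    d['blah'] = 0
    lock = manager.Lock()
    procs = [Process(target=f, args=(d, lock)) for _ in range(10)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(d)   # prints {'blah': 100000} because every += runs under the lock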

So just go ahead and use manager.dict().
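
Applied back to the original question, read-only sharing might look roughly like this; the tiny dict and the lookup keys are placeholders for the real 1 GB data:

from multiprocessing import Process, Manager

def worker(shared, key):
    # Pure reads: no lock is needed because nothing is modified.
    print(shared.get(key))

if __name__ == '__main__':
    manager = Manager()
    # Placeholder for loading the real, much larger dictionary.
    shared = manager.dict({'a': 1, 'b': 2, 'c': 3})
    procs = [Process(target=worker, args=(shared, k)) for k in ('a', 'b', 'c')]
    for p in procs:
        p.start()
    for p in procs:
        p.join()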

THC4k