I have a class that is only ever accessed externally through static methods. Those static methods then create an object of the class to use within the method, then they return and the object is presumably destroyed. The class is a getter/setter for a couple config files and now I need to place thread locks on the access to the config files.

Since I have several different static methods that all need read/write access to the config files that all create objects in the scope of the method, I was thinking of having my lock acquires done inside of the object constructor, and then releasing in the destructor.

My coworker expressed concern that this could leave the class locked forever if something went wrong. He also mentioned something about how the destructor in Python is called by the garbage collector, but we're both relatively new to Python so that's an unknown.

Is this a reasonable solution or should I just lock/unlock in each of the methods themselves?


import threading

class A:
    rateLock = threading.RLock()
    chargeLock = threading.RLock()

    @staticmethod
    def doZStuff():
        a = A()
        a.doStuff('Z')

    @staticmethod
    def doYStuff():
        a = A()
        a.doStuff('Y')

    @synchronized(lock)   # `synchronized` would be a locking decorator; not defined here
    def doStuff(self, type):
        if type == 'Z':
            otherstuff()
        elif type == 'Y':
            evenmorestuff()

Is it even possible to get it to work that way, with the decorator on doStuff() instead of doZStuff()?

Update

Thanks for the answers, everyone. The problem I'm facing is mostly due to the fact that it doesn't really make sense to access my module asynchronously, but this is just part of an API, and the team accessing our stuff through the API was complaining about concurrency issues. So I don't need the perfect solution; I'm just trying to make it so they can't crash our side or get garbage data back.

+1  A: 

Your coworker is right about the garbage collection, so releasing locks in a destructor is not a good idea. Look into decorators for writing synchronized functions.

Example: http://code.activestate.com/recipes/465057-basic-synchronization-decorator/
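The linked recipe boils down to a decorator that takes a lock and wraps the call in acquire/release. A minimal sketch of that idea (the `read_config` function and `config_lock` name are just illustrative):

```python
import threading
from functools import wraps

def synchronized(lock):
    """Return a decorator that holds `lock` for the duration of each call."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            with lock:  # acquired on entry, released even on exception
                return func(*args, **kwargs)
        return wrapper
    return decorator

# Hypothetical usage:
config_lock = threading.RLock()

@synchronized(config_lock)
def read_config():
    return "value"
```

Because the lock is released in a finally clause under the hood of the with statement, an exception inside the decorated function cannot leave the lock held.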

edit I'm still not 100% sure what you have in mind, so my suggestion may be wrong:

import threading

class A:
    lockZ = threading.RLock()
    lockY = threading.RLock()

    @staticmethod
    @synchronized(lockZ)
    def doZStuff():
        a = A()
        a.doStuff('Z')

    @staticmethod
    @synchronized(lockY)
    def doYStuff():
        a = A()
        a.doStuff('Y')

    def doStuff(self, type):
        if type == 'Z':
            otherstuff()
        elif type == 'Y':
            evenmorestuff()
tauran
I've never really understood decorators very well. I'll update my question, can you show me an example of how to modify that code to be able to conditionally lock on either lock?
Falmarri
What are your critical sections? What are rateLock and chargeLock? You should lock at the lowest possible level.
tauran
This class can be used to get/set config options on 2 different config files that need to be locked entirely separately.
Falmarri
A: 

I don't know which platform you are on, but if you need to lock a file, you should probably use flock() where it is available instead of rolling your own locking routines.

Since you've mentioned that you are new to Python, I must say that most of the time threads are not the solution in Python. If your activity is CPU-bound, you should consider using multiprocessing, because the GIL prevents concurrent execution of Python bytecode in most cases. If your activity is I/O-bound, which I guess is the case, you should perhaps consider an event-driven framework like Twisted. That way you won't have to worry about deadlocks at all, I promise :)
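For illustration, here is a minimal POSIX-only sketch of the flock() approach (the `append_locked` helper is hypothetical; note flock() is advisory, so every cooperating process must use it too):

```python
import fcntl  # POSIX-only; fcntl/flock are unavailable on Windows

def append_locked(path, data):
    """Append `data` to `path` while holding an exclusive advisory flock()."""
    with open(path, "a") as f:
        fcntl.flock(f, fcntl.LOCK_EX)  # blocks until no other process holds the lock
        try:
            f.write(data)
            f.flush()
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

Unlike an in-process threading.Lock, this also serializes writers in separate processes, which matters if the config files are touched by more than one program.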

shylent
This is one part of a much larger project, and I only have real knowledge of my section. I can't really change anything; I'm just told that we need to lock. The problem is I need read-process-write locking, so simply locking on the file isn't good enough.
Falmarri
@Falmarri: Well, I see that you have your requirements, etc, etc, but you never stated them in the question. I guess I'll just leave my answer here, because I think that my point re threads is still very valid.
shylent
@Falmarri: by the way, what's with the staticmethods? Do you have any reason why they should be used instead of a plain function?
shylent
I'm not sure if there's a 100% legitimate reason, but that's what I'm working with and that's how the project is built. I think it's mostly just trying to use purely OO paradigms.
Falmarri
Sounds like a very inexperienced designer.
Glenn Maynard
@Falmarri: don't get me wrong, but instead of 'using purely OO paradigms', one should perhaps try to *use python*. I understand, that you are not the one in charge of design decisions so, well, all I can do is wish you luck :/ and perhaps you should point your superiors to pep8, although it is too late for that, I guess.
shylent
+2  A: 
import threading

class A:
    rateLock = threading.RLock()
    chargeLock = threading.RLock()

    def doStuff(self,ratefile,chargefile):
        with A.rateLock:
            with open(ratefile) as f:
                # ...
        with A.chargeLock:
            with open(chargefile) as f:
                # ...

Using the with statement guarantees that the (R)Lock is acquired and released in pairs. The release is called even if an exception occurs within the with-block.

You might also want to place your locks as tightly as you can around the file-access block (with open(...) as ...) so that the locks are not held longer than necessary.

Finally, the creation and garbage collection of a=A() will not affect the locks if (as above) the locks are class attributes (as opposed to instance attributes). The class attributes live in A.__dict__, rather than a.__dict__. So the locks will not be garbage collected until A itself is garbage collected.
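A quick way to confirm the class-attribute point: attribute lookup on an instance falls through to the class, so every instance shares the single lock stored in A.__dict__ (stripped-down sketch, not the full class above):

```python
import threading

class A:
    rateLock = threading.RLock()  # class attribute: one lock for all instances

a1, a2 = A(), A()
# Both instances resolve `rateLock` to the same object on the class,
# so creating and destroying instances never touches the lock itself.
print(a1.rateLock is a2.rateLock is A.rateLock)  # True
```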

unutbu
+1 The with statements could be wrapped in a decorator if they are used often.
tauran
A: 

Releasing locks in the destructor is risky, as has already been mentioned, because of the garbage collector: when __del__() is called on an object is decided entirely by the GC (usually when the refcount reaches zero), and in some cases, such as circular references, it might never be called at all, even when the program exits.

If you are handling one specific config file inside a class instance, then you might put a lock object from the threading module inside it. Some example code:

from threading import Lock

class ConfigFile:
    def __init__(self, file):
        self.file = file
        self.lock = Lock()

    def write(self, data):
        self.lock.acquire()
        <do stuff with file>
        self.lock.release()

# Function that uses the ConfigFile object

def write_config():
    config = ConfigFile('myconfig.conf')
    config.write('some data')

You can also use locks in a with statement, like:

def write(self, data):
    with self.lock:
        <do stuff with file>

and Python will acquire and release the lock for you, even if an error happens while doing stuff with the file.

Espen Rønnevik
@Espen: I like your idea of using the lock in a ConfigFile class. However, I think the Lock() needs to be passed to the ConfigFile.__init__ method, instead of being made an instance attribute. As it stands, each ConfigFile instance would have its own lock, defeating the purpose...
unutbu
Every thread that calls the static function will get a new object and a new unlocked lock!!!
tauran
+1  A: 

However, if you HAVE TO acquire and release locks in constructors and destructors, then you really, really, really should give your design another chance. You should change your basic assumptions.

In any application, a LOCK should be held for as short a time as possible. That means that in probably 90% of all cases, you will acquire the lock in the same method that also releases it.

There should hardly ever be a reason to lock/unlock an object in RAII style. That is not what these primitives were meant for ;)

Let me give you an example: you manage some resources; those resources can be read from many threads at once, but only one thread may write to them.

In a "naive" implementation you would have one lock per object, and whenever someone wants to write to it, they LOCK it. When multiple threads want to write, access is synchronized fairly, all safe and well. BUT: when a thread says "WRITE", it will stall until the other threads decide to release the lock.

But please understand that locks, mutexes, all these primitives were created to synchronize only a few lines of your source code. So, instead of making the lock part of your writeable object, hold a lock only for the very short time where it is really required. You have to invest more time and thought in your interfaces, but locks and mutexes were never meant to be held for more than a few microseconds.

frunsi
Those are good points. I guess locks aren't the BEST way to handle the resources I need to manage, but I'm kind of stuck in what I can do. I guess my idea with the constructor/destructor locking was a hackish way to implement a queue
Falmarri