The combination of coroutines and resource acquisition seems like it could have some unintended (or unintuitive) consequences.
The basic question is whether or not something like this works:
def coroutine():
    with open(path, 'r') as fh:
        for line in fh:
            yield line
Which it does. (You can test it!)
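One way to see why it works: closing a suspended generator raises GeneratorExit at the paused yield, which unwinds the with block and runs its cleanup. A minimal sketch (Python 3 syntax; the Cleanup class is a made-up stand-in for a real resource):

```python
class Cleanup:
    """Stand-in resource that records when it is released."""
    def __init__(self):
        self.released = False
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        self.released = True
        return False  # don't swallow GeneratorExit

resource = Cleanup()

def coroutine():
    with resource:
        yield 'working'

gen = coroutine()
next(gen)                      # suspend inside the with block
assert not resource.released   # still held while suspended
gen.close()                    # raises GeneratorExit at the yield...
assert resource.released       # ...which runs __exit__
```

So the with block's cleanup is deferred while the generator is suspended, and runs when the generator finishes, is closed, or is garbage-collected.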
The deeper concern is that with is supposed to be an alternative to finally, where you ensure that a resource is released at the end of the block. Coroutines can suspend and resume execution from within the with block, so how is the conflict resolved?
For example, suppose you open a file for reading and writing both inside and outside a coroutine while the coroutine hasn't yet returned:
def coroutine():
    with open('test.txt', 'r+') as fh:
        for line in fh:
            yield line

a = coroutine()
assert a.next()  # Open the filehandle inside the coroutine first.

with open('test.txt', 'r+') as fh:  # Then open it outside.
    for line in fh:
        print 'Outside coroutine: %r' % line
assert a.next()  # Can we still use it?
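What actually happens is that both handles coexist: the coroutine's file stays open while the generator is suspended, and the second open proceeds independently. A sketch that makes this observable (Python 3 syntax; the scratch file created with tempfile is just for the demonstration, and the coroutine yields its handle so we can inspect it):

```python
import os
import tempfile

def coroutine(path):
    with open(path, 'r') as fh:
        for line in fh:
            yield line, fh   # yield the handle too, so we can inspect it

# Scratch file for the demo.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, 'w') as f:
    f.write('one\ntwo\n')

gen = coroutine(path)
line, fh = next(gen)
assert not fh.closed     # the coroutine's handle stays open while suspended

# A second open of the same path succeeds independently...
with open(path, 'r') as outer:
    assert outer.read() == 'one\ntwo\n'

assert not fh.closed     # ...and the suspended handle is unaffected
gen.close()
assert fh.closed         # GeneratorExit unwound the with block
os.remove(path)
```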
Update
I was going for write-locked file handle contention in the previous example, but since most OSes allocate file handles per-process, there is no contention there. (Kudos to @Miles for pointing out the example didn't make too much sense.) Here's my revised example, which shows a real deadlock condition:
import threading

lock = threading.Lock()

def coroutine():
    with lock:
        yield 'spam'
        yield 'eggs'

generator = coroutine()
assert generator.next()

with lock:  # Deadlock!
    print 'Outside the coroutine got the lock'
assert generator.next()
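If the outer code really must take the same lock, one escape hatch is to close (or exhaust) the generator first: GeneratorExit unwinds the with block inside it and releases the lock before the outer acquire. A sketch in Python 3 syntax:

```python
import threading

lock = threading.Lock()

def coroutine():
    with lock:
        yield 'spam'
        yield 'eggs'

gen = coroutine()
assert next(gen) == 'spam'
assert lock.locked()     # the suspended coroutine still holds the lock

gen.close()              # GeneratorExit releases the lock inside the with
assert not lock.locked()

with lock:               # now this acquire succeeds instead of deadlocking
    print('Outside the coroutine got the lock')
```

Alternatively, a threading.RLock would let the same thread reacquire without blocking, though that silently changes the locking semantics rather than resolving the underlying conflict.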