Is it safe to yield from within a "with" block in Python (and why)?
Solution 1:
I don't really understand what conflict you're asking about, nor the problem with the example: it's fine to have two coexisting, independent handles to the same file.
One thing I didn't know, and learned in response to your question, is that there is a new close() method on generators:
close() raises a new GeneratorExit exception inside the generator to terminate the iteration. On receiving this exception, the generator’s code must either raise GeneratorExit or StopIteration.

close() is called when a generator is garbage-collected, so this means the generator’s code gets one last chance to run before the generator is destroyed. This last chance means that try...finally statements in generators can now be guaranteed to work; the finally clause will now always get a chance to run. This seems like a minor bit of language trivia, but using generators and try...finally is actually necessary in order to implement the with statement described by PEP 343.

http://docs.python.org/whatsnew/2.5.html#pep-342-new-generator-features
So that handles the situation where a with statement is used in a generator that yields in the middle but never returns: the context manager's __exit__ method will be called when the generator is garbage-collected.
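Here is a minimal sketch of that behavior; the Resource class is something I made up for illustration, not anything from the standard library:

class Resource(object):
    # Made-up context manager, used only to show when __exit__ runs.
    def __enter__(self):
        print('acquired')
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        print('released')  # runs when the generator is closed or collected
        return False       # let GeneratorExit propagate

def gen():
    with Resource():
        yield 'spam'
        yield 'eggs'

g = gen()
g.next()   # prints 'acquired'; the generator is suspended inside the with block
g.close()  # raises GeneratorExit at the yield, so __exit__ prints 'released'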
Edit:
With regards to the file handle issue: I sometimes forget that there exist platforms that aren't POSIX-like. :)
As far as locks go, I think Rafał Dowgird hits the nail on the head when he says "You just have to be aware that the generator is just like any other object that holds resources." I don't think the with statement is really that relevant here, since this function suffers from the same deadlock issue:
import threading

lock = threading.Lock()

def coroutine():
    lock.acquire()
    yield 'spam'
    yield 'eggs'
    lock.release()  # never reached if the caller abandons the generator early

generator = coroutine()
generator.next()  # the suspended generator now holds the lock
lock.acquire()    # whoops! deadlocks: the lock was never released
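For completeness, here is a sketch of the safe pattern both answers point toward: let a with block own the lock inside the generator, and have the caller close() the generator before touching the lock again. The threading.Lock and the explicit close() call are my additions to the example:

import threading

lock = threading.Lock()

def coroutine():
    with lock:  # released via GeneratorExit when the generator is closed
        yield 'spam'
        yield 'eggs'

generator = coroutine()
try:
    generator.next()   # the suspended generator holds the lock
finally:
    generator.close()  # runs the with block's __exit__, releasing the lock
lock.acquire()         # safe now: the generator no longer holds the lock
lock.release()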
Solution 2:
I don't think there is a real conflict. You just have to be aware that the generator is just like any other object that holds resources, so it is the creator's responsibility to make sure it is properly finalized (and to avoid conflicts/deadlocks over the resources held by the object). The only (minor) problem I see here is that generators don't implement the context-management protocol (at least as of Python 2.5), so you cannot simply write:
with coroutine() as cr:
    doSomething(cr)
but instead have to:
cr = coroutine()
try:
    doSomething(cr)
finally:
    cr.close()
The garbage collector calls close() eventually anyway, but it's bad practice to rely on that for freeing resources.
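That said, contextlib.closing (also added in Python 2.5) wraps any object that has a close() method in a context manager, so the try/finally above can be spelled as a with statement after all:

from contextlib import closing

with closing(coroutine()) as cr:
    doSomething(cr)  # cr.close() is guaranteed to run on the way out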