Generators (functions with yield instead of return) can indeed be seen as "lazy" (and itertools.chain can chain them just as well as any other iterator, if that's what you mean by "chainable").
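For instance, here's a minimal sketch of chaining two generators lazily with itertools.chain (the generator names and bounds are just for illustration):

    import itertools

    def evens(n):
        # Lazily yields the even numbers below n.
        for i in range(0, n, 2):
            yield i

    def odds(n):
        # Lazily yields the odd numbers below n.
        for i in range(1, n, 2):
            yield i

    # No generator body runs until the chained iterator is consumed.
    chained = itertools.chain(evens(6), odds(6))
    print(list(chained))   # [0, 2, 4, 1, 3, 5]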
But if by "chainable" (and lazy) you mean you want to call fee().fie().fo().fum()
and have all the "hard work" happen only in fum
(which seems closer to what SQLAlchemy does), then generators won't help -- what you need, rather, is the "Promise" design pattern, where each function/method (except the one actually doing all the work) returns an object which records all the conditions, parameters, and constraints on the operation, and the one hard-working function uses that information to finally perform the work.
To give a very simple example, say that "the hard work" is performing an RPC call of the form remote(host, **kwargs). You could dress this up in "lazy chainable clothing" as follows:
class RPC(object):
    def __init__(self, host):
        self._host = host
        self._kws = {}      # keyword arguments accumulated lazily along the chain

    def doit(self, **morekws):
        # The only method that does real work: merge everything recorded
        # so far with any last-minute kwargs and make the actual call.
        return remote(self._host, **dict(self._kws, **morekws))

    def __getattr__(self, name):
        # Any unknown attribute becomes a "setter" that records one
        # keyword argument and returns self, so calls can be chained.
        def setkw(value):
            self._kws[name] = value
            return self
        return setkw
Now, RPC(x).foo('bar').baz('bap').doit() calls remote(x, foo='bar', baz='bap') (and of course you can save intermediate stages of the chain, pass them around as arguments, etc, etc, until the call to doit).
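If you want to try this end to end, here's a sketch with a stand-in remote (the host name and the stub's behavior are just for illustration; the real remote is assumed to exist on your side):

    def remote(host, **kwargs):
        # Stand-in for the real RPC call, so the example actually runs.
        print('calling %s with %s' % (host, kwargs))

    call = RPC('example.com').foo('bar')   # an intermediate stage; no work done yet
    call.baz('bap').doit()                 # calling example.com with {'foo': 'bar', 'baz': 'bap'}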