views: 91
answers: 3

I'd like to know what happens when I pass the result of a generator function to Python's enumerate(). Example:

def veryBigHello():
    i = 0
    while i < 10000000:
        i += 1
        yield "hello"

numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word

Is the enumeration iterated lazily, or does it slurp everything in first? I'm 99.999% sure it's lazy, so can I treat it exactly the same as the generator function, or do I need to watch out for anything?

+7  A: 

It's lazy. It's fairly easy to prove that's the case:

>>> def abc():
...     letters = ['a','b','c']
...     for letter in letters:
...         print letter
...         yield letter
...
>>> numbered = enumerate(abc())
>>> for i, word in numbered:
...     print i, word
...
a
0 a
b
1 b
c
2 c
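
The same interleaving shows up one item at a time if you pull values by hand with the iterator's .next() method (a quick sketch, reusing the abc() generator above):

>>> numbered = enumerate(abc())
>>> numbered.next()
a
(0, 'a')
>>> numbered.next()
b
(1, 'b')

Each .next() call pulls exactly one letter out of abc(), so the print inside the generator fires just before the corresponding (index, letter) pair comes back.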
Dave Webb
+4  A: 

Since you can run the following without getting out-of-memory exceptions, it definitely is lazy:

def veryBigHello():
    i = 0
    while i < 1000000000000000000000000000:
        i += 1
        yield "hello"

numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word
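
Another way to see this without sitting through the whole loop is to take just a few pairs with itertools.islice (a small sketch along the same lines; islice comes from the standard library and is used here only for illustration):

from itertools import islice

numbered = enumerate(veryBigHello())
for i, word in islice(numbered, 3):  # only the first three pairs are ever produced
    print i, word

This prints three lines and returns immediately, which it couldn't do if enumerate had to build the whole sequence up front.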
Nikolaus Gradwohl
+4  A: 

It's even easier to tell than either of the previous answers suggests:

$ python
Python 2.5.5 (r255:77872, Mar 15 2010, 00:43:13)
[GCC 4.3.4 20090804 (release) 1] on cygwin
Type "help", "copyright", "credits" or "license" for more information.
>>> abc = (letter for letter in 'abc')
>>> abc
<generator object at 0x7ff29d8c>
>>> numbered = enumerate(abc)
>>> numbered
<enumerate object at 0x7ff29e2c>

If enumerate didn't perform lazy evaluation, it would return [(0, 'a'), (1, 'b'), (2, 'c')] or something (nearly) equivalent.

Of course, enumerate is really just a fancy generator:

def myenumerate(iterable):
    # yield (index, item) pairs one at a time, just like the builtin
    count = 0
    for item in iterable:
        yield (count, item)
        count += 1

for i, val in myenumerate(letter for letter in 'abc'):
    print i, val
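
As for the "do I need to watch out for anything" part of the question: like the generator it wraps, the enumerate object is a one-shot iterator, so it is exhausted after a single pass (a small sketch):

>>> numbered = enumerate(letter for letter in 'abc')
>>> list(numbered)
[(0, 'a'), (1, 'b'), (2, 'c')]
>>> list(numbered)
[]

Otherwise you can treat it exactly like the generator itself: iterate it once, and build a fresh one if you need to start over.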
Wayne Werner