Hi folks,
I've written a small snippet that computes the path length of a given node (i.e., its distance to the root node):
def node_depth(node, depth=0, colored_nodes=set()):
    """
    Return the length of the path in the parse tree from C{node}'s position
    up to the root node. Effectively tests whether C{node} is inside a cycle
    and, if so, returns -1.
    """
    if node.mother is None:
        return depth
    mother = node.mother
    if mother.id in colored_nodes:
        return -1
    colored_nodes.add(node.id)
    return node_depth(mother, depth + 1, colored_nodes)
Now something strange is happening with that function (at least it is strange to me): calling node_depth for the first time returns the right value. However, calling it a second time with the same node returns -1. The colored_nodes set is empty in the first call, but in the second call it already contains all the node IDs that were added during the first one:
print node_depth(node) # --> 9
# initially colored nodes --> set([])
print node_depth(node) # --> -1
# initially colored nodes --> set([1, 2, 3, 38, 39, 21, 22, 23, 24])
print node_depth(node, colored_nodes=set()) # --> 9
print node_depth(node, colored_nodes=set()) # --> 9
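Since passing a fresh set explicitly works, I'm using the following workaround for now: create the set inside the function body instead of in the argument list. This is just a sketch under the assumption that the default value is somehow the culprit (node_depth2 is only my name for the variant, to keep both versions around):

def node_depth2(node, depth=0, colored_nodes=None):
    """
    Same as node_depth above, but with a None default so that a new
    set is built on every top-level call. (Hypothetical variant,
    renamed only to distinguish it from the original.)
    """
    if colored_nodes is None:
        # Fresh set per top-level call; the recursive calls below
        # still share this one set, as intended.
        colored_nodes = set()
    if node.mother is None:
        return depth
    mother = node.mother
    if mother.id in colored_nodes:
        return -1
    colored_nodes.add(node.id)
    return node_depth2(mother, depth + 1, colored_nodes)

This version seems to behave like passing colored_nodes=set() explicitly, but I still don't understand why the original doesn't.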
Am I missing some Python-specific thing here, or is this really supposed to work that way?
Thanks in advance,
Jena