Function Annotations: PEP-3107

I ran across a snippet of code demonstrating Python 3's function annotations. The concept is simple, but I can't think of why they were implemented in Python 3 or any good uses for them. Perhaps SO can enlighten me?

How it works:

def foo(a: 'x', b: 5 + 6, c: list) -> max(2, 9):
    ... function body ...

Everything following the colon after an argument is an 'annotation', and the information following the -> is an annotation for the function's return value.

foo.func_annotations would return a dictionary:

{'a': 'x',
 'b': 11,
 'c': list,
 'return': 9}

What's the significance of having this available?

+5  A: 

I think this is actually great.

Coming from an academic background, I can tell you that annotations have proved invaluable for enabling smart static analyzers for languages like Java. For instance, you could define semantics like state restrictions, which threads are allowed to access, architecture limitations, etc., and there are quite a few tools that can then read these and process them to provide assurances beyond what you get from the compiler. You could even write things that check preconditions/postconditions.
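
A minimal sketch of the precondition idea, assuming the annotations hold callable predicates and using a hypothetical checked decorator (this is not an existing tool):

import functools
import inspect

def checked(function):
    # Inspect the signature once so arguments can be matched to their names.
    signature = inspect.signature(function)
    @functools.wraps(function)
    def wrapper(*args, **kwargs):
        bound = signature.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            predicate = function.__annotations__.get(name)
            # Treat a callable annotation as a precondition on that argument.
            if callable(predicate) and not predicate(value):
                raise ValueError("precondition failed for argument {!r}".format(name))
        return function(*args, **kwargs)
    return wrapper

@checked
def sqrt(x: lambda value: value >= 0):
    return x ** 0.5

print(sqrt(4))   # 2.0
# sqrt(-1) would raise ValueError: precondition failed for argument 'x'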

I feel something like this is especially needed in Python because of its dynamic typing, but until now there was really no construct that made this straightforward and part of the official syntax.

There are other uses for annotations beyond assurance. I can see how I could apply my Java-based tools to Python. For instance, I have a tool that lets you assign special warnings to methods and gives you indications when you call them that you should read their documentation (e.g., imagine you have a method that must not be invoked with a negative value, but that is not intuitive from its name). With annotations, I could technically write something like this for Python. Similarly, a tool that organizes methods in a large class based on tags can be written now that there is an official syntax.
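
As a rough sketch of how that could look with annotations (the withdraw function and describe_caveats helper below are made up for illustration), a string annotation carries the caveat and a small helper reports it:

import warnings

def withdraw(amount: "must not be negative; see the account documentation"):
    return amount

def describe_caveats(function):
    # Surface any string annotations as warnings for the caller to read.
    for name, note in function.__annotations__.items():
        if isinstance(note, str):
            warnings.warn("{}({}): {}".format(function.__name__, name, note))

describe_caveats(withdraw)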

Uri
+2  A: 

Uri has already given a proper answer, so here's a less serious one: so you can make your docstrings shorter.
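
For instance (a made-up example), the type information that would otherwise live in the docstring can move into the signature:

# Without annotations, the types end up in the docstring:
def scale(vector, factor):
    """Scale a vector.

    vector: a list of floats
    factor: a float
    returns: a list of floats
    """
    return [x * factor for x in vector]

# With annotations, the same information sits in the signature:
def scale(vector: list, factor: float) -> list:
    """Scale a vector."""
    return [x * factor for x in vector]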

JAB
Love it, +1. However, in the end, writing docstrings is still the number one way I make my code readable; that said, if you were to implement any kind of static or dynamic checking, it is nice to have this. Perhaps I might find a use for it.
Warren P
+1  A: 

Just to add a specific example of a good use from my answer here: coupled with decorators, a simple mechanism for multimethods can be built.

# This is in the 'mm' module

registry = {}

class MultiMethod(object):
    def __init__(self, name):
        self.name = name
        self.typemap = {}
    def __call__(self, *args):
        types = tuple(arg.__class__ for arg in args) # a generator expression!
        function = self.typemap.get(types)
        if function is None:
            raise TypeError("no match")
        return function(*args)
    def register(self, types, function):
        if types in self.typemap:
            raise TypeError("duplicate registration")
        self.typemap[types] = function

def multimethod(function):
    name = function.__name__
    mm = registry.get(name)
    if mm is None:
        mm = registry[name] = MultiMethod(name)
    # The annotation values form the dispatch key (assumes no return annotation).
    types = tuple(function.__annotations__.values())
    mm.register(types, function)
    return mm

and an example of use:

from mm import multimethod

@multimethod
def foo(a: int):
    return "an int"

@multimethod
def foo(a: int, b: str):
    return "an int and a string"

if __name__ == '__main__':
    print("foo(1,'a') = {}".format(foo(1,'a')))
    print("foo(7) = {}".format(foo(7)))

The same thing can be done by passing the types to the decorator, as Guido's original post shows, but annotating the parameters themselves is better because it avoids the possibility of mismatching parameters and types.

Note: In Python 3 you access the annotations as function.__annotations__ rather than function.func_annotations, as the func_* attributes were removed in Python 3.
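
For example, a quick check on Python 3 with the foo from the question:

def foo(a: 'x', b: 5 + 6, c: list) -> max(2, 9):
    pass

print(foo.__annotations__)
# prints something like: {'a': 'x', 'b': 11, 'c': <class 'list'>, 'return': 9}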

Muhammad Alkarouri