views: 125
answers: 4

Python is so dynamic that it's not always clear what's going on in a large program, and looking at a tiny bit of source code does not always help. To make matters worse, editors tend to have poor support for navigating to the definitions of tokens or import statements in a Python file.

One way to compensate might be to write a special profiler that, instead of timing the program, would record the runtime types and paths of objects of the program and expose this data to the editor.

This might be implemented with sys.settrace(), which sets a callback for each line of code and is how pdb is implemented, or by using the ast module and an import hook to instrument the code. Or is there a better strategy? How would you write something like this without making it impossibly slow, and without running afoul of extreme dynamism, e.g. side effects on property access?
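A minimal sketch of the sys.settrace() approach, assuming you only want to record the concrete types of local variables as each line executes (the function name `demo` and the `observed_types` store are illustrative, not part of any real tool):

```python
import sys
from collections import defaultdict

# Maps (function name, variable name) -> set of observed type names.
observed_types = defaultdict(set)

def type_tracer(frame, event, arg):
    # On every executed line, record the type of each local variable.
    if event == "line":
        for name, value in frame.f_locals.items():
            observed_types[(frame.f_code.co_name, name)].add(type(value).__name__)
    return type_tracer

def demo(x):
    y = x * 2
    return y

sys.settrace(type_tracer)
try:
    demo(3)       # y is an int here
    demo("ab")    # y is a str here
finally:
    sys.settrace(None)

print(observed_types[("demo", "y")])  # contains 'int' and 'str'
```

Note that this traces every line of every Python frame created while the trace function is installed, which illustrates the performance concern in the question.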

+3  A: 

I don't think you can help making it slow, but it should be possible to detect the address of each variable when you encounter a STORE_FAST, STORE_NAME, or other STORE_* opcode.
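As a static illustration of the opcode idea, the dis module can list the STORE_* instructions in a function's compiled bytecode; this shows where each name is assigned, though recovering runtime values would still require tracing (the `sample` function is a made-up example):

```python
import dis

def sample():
    a = 1
    b = a + 2
    return b

# Collect the names assigned by any STORE_* opcode in the bytecode.
stores = [ins.argval for ins in dis.get_instructions(sample)
          if ins.opname.startswith("STORE_")]
print(stores)  # ['a', 'b']
```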

Whether or not this has been done before, I do not know.

If you need debugging, look at pdb; it will let you step through your code and inspect any variables.

import pdb

def test():
    print(1)
    pdb.set_trace()  # you will enter an interpreter here
    print(2)
Unknown
+1  A: 

What if you monkey-patched object's class or another prototypical object?

This might not be the easiest if you're not using new-style classes.
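You cannot patch `object` itself (its slots are read-only), but a sketch of the same idea, assuming your classes can inherit from a common base, would log the type of every attribute assignment through `__setattr__` (the `Traced` and `Point` names are hypothetical):

```python
# Maps (class name, attribute name) -> type name of the last assigned value.
attribute_types = {}

class Traced:
    def __setattr__(self, name, value):
        # Record the runtime type before performing the normal assignment.
        attribute_types[(type(self).__name__, name)] = type(value).__name__
        object.__setattr__(self, name, value)

class Point(Traced):
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2.5)
print(attribute_types)  # {('Point', 'x'): 'int', ('Point', 'y'): 'float'}
```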

Joe Koberg
+1  A: 

You might want to check out PyChecker's code - it does (I think) what you are looking to do.

Jos
+1  A: 

Pythoscope does something very similar to what you describe: it uses a combination of static information in the form of an AST and dynamic information gathered through sys.settrace.
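A sketch of the static half of such a combination (not Pythoscope's actual code): walk the AST to map each function to the names it defines, which a settrace-based tracer could later annotate with runtime types. The `source` string and `defined` mapping are illustrative:

```python
import ast

source = """
def greet(name):
    message = "Hello, " + name
    return message
"""

# Map each function name to its parameters and assigned local names.
defined = {}
for node in ast.walk(ast.parse(source)):
    if isinstance(node, ast.FunctionDef):
        params = [a.arg for a in node.args.args]
        assigned = [t.id for stmt in ast.walk(node)
                    if isinstance(stmt, ast.Assign)
                    for t in stmt.targets if isinstance(t, ast.Name)]
        defined[node.name] = params + assigned
print(defined)  # {'greet': ['name', 'message']}
```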

BTW, if you have problems refactoring your project, give Pythoscope a try.

Michał Kwiatkowski