tags:
views: 129
answers: 2

I work on a large financial pricing application that includes some long-running calculations. We have identified some functions that can be sped up by the selective application of Psyco. My management have requested an assessment of the costs and benefits of adding Psyco to our stack.

Given the critical nature of my project, it's not acceptable for a "performance enhancement" to potentially reduce reliability. I've read that Psyco buys additional performance at the cost of increased memory use, and I'm worried that this could be a problem.

I'm doing it like this:

import psyco

@psyco.proxy
def my_slow_function(xxx):
    # ... existing calculation body unchanged ...
    pass

In all, we expect to apply Psyco to no more than 15 functions - these are used very heavily. There are thousands of functions in this library, so this affects only a tiny subset of our code. All of the functions are small, mathematical and stateless.
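To illustrate, here is a minimal sketch of this selective approach. The pricing function, its arguments, and its body are hypothetical stand-ins; the try/except guard is an assumption on my part that keeps the module importable on platforms where Psyco is unavailable:

```python
try:
    import psyco  # Psyco exists only for 32-bit CPython up to 2.6
except ImportError:
    psyco = None

def price_option(spot, strike, rate):
    # Hypothetical stand-in for one of our small, stateless pricing functions
    return max(spot - strike, 0.0) * (1.0 + rate)

if psyco is not None:
    # Compile just this one hot function; the other thousands of
    # functions keep running under the plain interpreter.
    price_option = psyco.proxy(price_option)

print(price_option(105.0, 100.0, 0.02))
```

Because the proxied function is a drop-in replacement, callers elsewhere in the library don't need to change.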

  • Is there a significant risk that this will use substantially more memory?
  • Are there any other problems we might encounter when adding this component to our long-established library?

FYI, platform is Python 2.4.4 on Windows 32bit XP

UPDATE: It seems that the main potential risk is a program requiring more memory to run than it did before Psyco was added, so ideally I'd like to find a way to see whether adding Psyco dramatically changes the memory requirements of the system.

+2  A: 

Psyco is a JIT compiler. If your functions are stateless, then there should be almost no drawback except increased memory use.

e-satis
Can we start to quantify how much more memory it is expected to use? For example, will it use twice as much memory, or hundreds of times more?
Salim Fadhley
+3  A: 

Why not try profiling it? Psyco has a pretty detailed logging facility:

memory usage: x+ kb

Psyco's current notion of how much memory it consumes for the emitted machine code and supporting data structures. This is a rough estimation of the memory overhead (the + sign is supposed to remind you that this figure is highly underestimated). Use this info to tune the memory limits (section 3.2.2).

Note also that the memory usage is configurable:

memorymax

Stop when the memory consumed by Psyco reaches the limit (in kilobytes). This limit includes the memory consumed before this profiler started.
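Putting the two together, a sketch like the following would write Psyco's statistics (including the `memory usage` line) to a log file while capping Psyco's own consumption. The keyword names follow my reading of the Psyco 1.x docs, the cap value is arbitrary, and `my_slow_function` is a hypothetical stand-in; the guard keeps the script harmless where Psyco isn't installed:

```python
try:
    import psyco
    # Send Psyco's statistics (including the "memory usage: x+ kb"
    # line) to a log file we can inspect after a run.
    psyco.log('psyco.log')
    # Cap Psyco's own memory consumption at roughly 10 MB; once the
    # limit is reached it stops compiling new code instead of growing.
    psyco.profile(memorymax=10 * 1024)
except ImportError:
    psyco = None  # no Psyco: everything runs under the plain interpreter

def my_slow_function(x):
    # Hypothetical hot function
    return x * x

print(my_slow_function(12))
```

Comparing the logged figure against your process's total footprint should tell you quickly whether Psyco's overhead is material for your system.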

ire_and_curses
Thanks, great answer
Salim Fadhley
What is it about Psyco that makes it use more memory? Is it that it creates copies of the original function objects in memory (which presumably are quite small things), or is it that the functions require more memory to execute?
Salim Fadhley
I believe it's mostly the overhead of runtime information that needs to be gathered before the specialisation processing step can take place. This is necessary in a very high-level language like Python because each operation can be highly dependent on run-time context. This is discussed in detail in the Psyco PEPM '04 paper: http://psyco.sourceforge.net/doc.html
ire_and_curses
By the way, this question is relevant to your more general concern: http://stackoverflow.com/questions/575385/why-not-always-use-psyco-for-python-code/1437939#1437939
ire_and_curses
@Salim Fadhley: I thought your comment was interesting enough to ask as a separate, more general question: http://stackoverflow.com/questions/1438220/why-does-psyco-use-a-lot-of-memory
ire_and_curses