views: 48

answers: 1

I have a long-running process that will fetch 100k rows from the db, generate a web page, and then release all the small objects (lists, tuples and dicts). On Windows, the memory is freed after each request. However, on Linux, the memory of the server keeps growing.
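Roughly, each request does something like this (a simplified stand-in for the real query and rendering code; fetch_rows, handle_request and rss_kb are just illustrative names):

    import gc

    def fetch_rows():
        # Stand-in for the real DB query: ~100k rows of small objects.
        return [{"id": i, "cols": (i, str(i), i * 2)} for i in range(100000)]

    def handle_request():
        rows = fetch_rows()
        page = "".join("<tr><td>%s</td></tr>" % r["id"] for r in rows)
        del rows, page      # release all the small objects
        gc.collect()        # run the cycle collector, just to rule it out

    def rss_kb():
        # Linux only: current resident set size from /proc.
        with open("/proc/self/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])

    print("RSS before: %s kB" % rss_kb())
    handle_request()
    print("RSS after:  %s kB" % rss_kb())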

The following post describes what the problem is and one possible solution.

http://pushingtheweb.com/2010/06/python-and-tcmalloc/

Is there any other way to get around this problem without having to compile my own Python version that uses tcmalloc? That option is going to be very difficult, since Python is controlled by the sysadmin.

A: 

You may be able to compile Python in your own working directory rather than try to have the sysadmin replace the system Python.

First, you should confirm that the tcmalloc solution solves your problem and does not impact performance too much for your application.
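For example, you could run the same request loop under the system Python and under a tcmalloc-linked build and compare wall-clock time and peak RSS. A rough sketch (handle_request here is only a placeholder; swap in your real fetch-and-render code):

    import resource
    import time

    def handle_request():
        # Stand-in for the real work: fetch ~100k rows and render a page.
        rows = [(i, str(i)) for i in range(100000)]
        page = "".join(str(r) for r in rows)
        del rows, page

    start = time.time()
    for _ in range(20):
        handle_request()
    elapsed = time.time() - start

    # ru_maxrss is the peak resident set size; on Linux it is reported in kB.
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("20 requests: %.2fs, peak RSS %d kB" % (elapsed, peak_kb))

Run the script once with the system interpreter and once with your locally built one to get a direct comparison.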

gnibbler
I'm guessing my usage is pretty common and not unique. There must be a way to work with the existing Python allocator so that it frees the memory back to the OS.
Sad
@Sad, you need to confirm that the problem described in the link is the same as your problem. While the memory may not be returned to the OS, it should still at least be freed inside the interpreter to be used for subsequent requests. If you are seeing the memory grow and grow, perhaps your problem is different; a quick check like the sketch below will tell you which case you are in.
gnibbler
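One way to tell the two apart (a small sketch; one_cycle is a placeholder for the real request) is to repeat the allocate/release cycle and print the resident set size each time. A plateau after the first cycle means the interpreter is reusing the memory even though it is not returned to the OS; steady growth suggests a real leak:

    def rss_kb():
        # Linux only: current resident set size from /proc.
        with open("/proc/self/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])

    def one_cycle():
        # Placeholder for one request: allocate many small objects, then drop them.
        data = [{"n": i, "pair": (i, str(i))} for i in range(100000)]
        del data

    for i in range(10):
        one_cycle()
        print("cycle %2d: RSS %s kB" % (i, rss_kb()))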
I've confirmed that this is the same problem. I've used several of the memory profiling options available in Python and can't find any memory leaks, yet the program's memory size keeps growing. Also, the memory can be reused inside the interpreter, and the problem is restricted to Linux. All of that, combined with the fact that my use case is identical to the original post's (allocate a large number of objects and then release them all), makes me pretty comfortable that my problem is the same as the one outlined in the post.
Sad