views: 81
answers: 2

Is it possible for a single process running a 32-bit compiled version of Python on Snow Leopard (a 64-bit machine) to appear to consume more than 4 GB (say, 5.4 GB) of virtual memory, as seen by the top command?

I ran file ...python to check, and the binary was not x86, yet it appeared to be consuming over 5 GB of memory.

My guess is that the libraries in use (RPy) were mmap'ing chunks of data, and that the in-memory cache was showing up under my process's memory footprint.
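
If that guess is right, the effect is easy to reproduce in isolation: mapping a big file inflates the virtual size top reports almost immediately, while resident memory barely moves until the pages are actually touched. A minimal sketch (bigfile.bin is a hypothetical stand-in for whatever data file the library might map):

```
# Minimal sketch: map a large file read-only and watch this process in top.
# VSIZE jumps by roughly the file size right away; RSIZE only grows as
# pages are actually touched. "bigfile.bin" is a hypothetical stand-in.
import mmap
import time

with open("bigfile.bin", "rb") as f:
    m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)  # map the whole file
    print("mapped %d bytes -- now compare VSIZE and RSIZE in top" % m.size())
    time.sleep(60)  # keep the mapping alive long enough to inspect
    m.close()
```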

Or maybe I haven't actually verified that the Python binaries really are 32-bit. Or maybe there's some 32-bit/64-bit commingling going on (libffi?).
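
One way to settle the 32-bit/64-bit question from inside the interpreter itself (a quick sketch; run it under the same python binary that top is reporting on):

```
# Report whether the running interpreter is a 32-bit or 64-bit build.
import platform
import struct
import sys

print(platform.architecture()[0])    # '32bit' or '64bit', based on the executable
print(struct.calcsize("P") * 8)      # pointer size of this process, in bits
print(sys.maxsize > 2 ** 32)         # True only on a 64-bit build
```

The struct/sys checks reflect the process that is actually running, which matters on OS X where a universal binary can launch as either architecture and platform.architecture() can be misleading.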

Totally confused.

+1  A: 

No, it's physically impossible. That doesn't stop the OS assigning the process more than it can use, due to alignment and fragmentation; say, it could be given a whole page without actually mapping all of it in. However, it's impossible for any 32-bit process to actually use over 4 GB, and the practical limit is most likely substantially lower once kernel space takes its share of the address space.
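
For reference, the arithmetic behind that ceiling: a 32-bit pointer can name at most 2^32 distinct byte addresses.

```
>>> 2 ** 32                  # addresses reachable with a 32-bit pointer
4294967296
>>> 2 ** 32 / 1024.0 ** 3    # ...which is exactly 4 GiB
4.0
```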

DeadMG
Most of the time the limit is slightly over 3 GB, and the author of the program needs to specifically ask for the ability to use that much memory (the default limit is 2 GB). This applies to Windows; I don't know about Linux.
PeterK
@PeterK: The exact limit isn't important. What is important is that it's far south of 5.4 GB.
DeadMG
@DeadMG: I know, I just wanted to add some more info.
PeterK
A: 

It is possible if the process is using some kind of insane long/far/extended pointers and mapping data into and out of a 32-bit address space as it needs it, but at that point it hardly qualifies as 32-bit anyway. (Python most definitely does not do this, so @DeadMG's answer is almost certainly what is actually happening.)
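
For illustration only, the "windowing" trick looks something like this in Python: map one slice of a huge file at a time, so the process works through far more data than its 4 GB address space could hold at once. The function name and the 256 MB window size are made up for the sketch, and again, this is not what Python or RPy actually does.

```
# Sketch of windowed access: only WINDOW bytes of the file are mapped at
# any one time, so the total data processed can exceed the address space.
import mmap
import os

WINDOW = 256 * 1024 * 1024  # 256 MB per mapping

def scan_large_file(path):
    total = 0
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        offset = 0
        while offset < size:
            length = min(WINDOW, size - offset)
            # offset must be a multiple of mmap.ALLOCATIONGRANULARITY;
            # stepping by whole WINDOWs keeps it that way.
            window = mmap.mmap(f.fileno(), length,
                               access=mmap.ACCESS_READ, offset=offset)
            try:
                total += window[-1]    # touch the window (Python 3: an int)
            finally:
                window.close()         # unmap before mapping the next slice
            offset += length
    return total
```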

David X