I need a very large list, and am trying to figure out how big I can make it so that it still fits in 1-2GB of RAM. I am using the CPython implementation, on 64 bit (x86_64).
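(For what it's worth, here is a minimal way I know of to confirm the implementation and pointer width from within the interpreter, using only the standard library:)

    import platform, struct, sys

    print(platform.python_implementation())  # 'CPython'
    print(struct.calcsize("P") * 8)          # pointer width in bits: 64 on x86_64
    print(sys.maxsize > 2**32)               # True on a 64-bit build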
Edit: thanks to bua's answer, I have filled in some of the more concrete answers.
What is the memory usage, in bytes, of:

- the list itself: sys.getsizeof([]) == 72
- each list entry (not including the data): sys.getsizeof([0, 1, 2, 3]) == 104, so 8 bytes of overhead per entry
- the data, if it is an integer: sys.getsizeof(2**62) == 24, though this varies with the integer's magnitude: sys.getsizeof(2**63) == 40, sys.getsizeof(2**128) == 48, sys.getsizeof(2**256) == 66
- the data, if it is an object (sizeof(PyObject), I guess): sys.getsizeof(C()) == 72, where C is an empty user-defined class
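For completeness, here is a small script that reproduces these measurements and turns them into a rough capacity estimate. The exact numbers depend on the CPython version and build (the figures above are from a 64-bit build), so treat them as examples rather than constants:

    import sys

    class C(object):
        """Empty user-defined class, matching the C() measurement above."""
        pass

    print(sys.getsizeof([]))            # empty list: header only
    print(sys.getsizeof([0, 1, 2, 3]))  # header + 4 pointers (8 bytes each on 64-bit)

    # Integer objects grow with their magnitude; getsizeof counts the object,
    # not the pointer to it that is stored in the list.
    for exp in (62, 63, 128, 256):
        print(exp, sys.getsizeof(2 ** exp))

    print(sys.getsizeof(C()))           # instance of an empty class

    # Very rough upper bound: a list of n *distinct* small integers costs about
    # getsizeof([]) + n * (pointer + int object) bytes, ignoring the list's
    # over-allocation and the memory allocator's own overhead.
    per_entry = 8 + sys.getsizeof(2 ** 62)   # 8-byte pointer (64-bit) + one int object
    budget = 2 * 1024 ** 3                   # 2 GB
    print(budget // per_entry)               # approximate maximum n

Note that sys.getsizeof on a list counts only the list header and its pointer array; the objects the entries point to have to be added separately, as the estimate above does.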
If you can share more general data about the observed sizes, that would be great. For example:
- Are there special cases (I think immutable values might be shared, so maybe a list of bools doesn't take any extra space for the data; see the sketch below)?
- Perhaps small lists take X bytes overhead but large lists take Y bytes overhead?
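Here is a sketch probing both questions; it is only illustrative, and the exact figures vary across CPython versions. It checks whether booleans are shared objects (so a list of them pays only for the pointers), and how the list's own per-entry overhead behaves as the list grows, including the over-allocation caused by append():

    import sys

    # Shared immutables: True and False are singletons (and CPython caches small
    # ints), so a list of booleans stores 1000 pointers to the same object.
    bools = [True] * 1000
    print(bools[0] is bools[-1])    # True: one shared object for all entries
    print(sys.getsizeof(bools))     # list header + 1000 pointers only

    # Per-entry overhead of the list itself at different sizes.
    empty = sys.getsizeof([])
    for n in (1, 10, 100, 1000, 10**6):
        lst = list(range(n))
        print(n, sys.getsizeof(lst), float(sys.getsizeof(lst) - empty) / n)

    # Lists built by repeated append() over-allocate, so they can end up a bit
    # larger than an equally long list built in one step.
    grown = []
    for i in range(1000):
        grown.append(i)
    print(sys.getsizeof(grown), sys.getsizeof(list(range(1000))))

Running this should show whether the per-entry cost stays at one pointer regardless of length, or changes with size.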