views: 1528

answers: 6

In Python, how big can an array/list get? I need an array of about 12,000 elements... is that okay? Will I still be able to run array/list methods such as sorting, etc.?

Thanks so much, Ed

+2  A: 

12000 elements is nothing in Python... in fact, a list can hold as many elements as the Python interpreter has memory for on your system.

AlbertoPL
+1  A: 

I'd say you're only limited by the total amount of RAM available. Obviously the larger the array the longer operations on it will take.

Wayne Koorts
Generally true, but not all of them -- appending remains amortized constant time independent of the size of the array.
cdleary
Interesting, thanks for the comment.
Wayne Koorts
+10  A: 

Sure, it's OK. Actually, you can easily see for yourself:

l = range(12000)                 # 12000 integers
l = sorted(l, reverse=True)      # sort them in descending order

Running those lines on my machine took:

real    0m0.036s
user    0m0.024s
sys     0m0.004s

But, as everyone else said, the larger the array, the slower the operations will be.

Nadia Alramli
Timing this way can be misleading -- most of the time is spent starting up the Python interpreter. A better way is: python -m timeit "l=range(12000); l=sorted(l, reverse=True)". On my machine this gives about 1/20th of the time for this example.
dF
@dF, you are right about accuracy. Thanks for noting that. I just wanted to prove a point, and the example proves it.
Nadia Alramli
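
A minimal sketch of the in-process timing dF describes, using the standard-library timeit module (the numbers will vary by machine):

# Time list construction plus a descending sort, without interpreter startup cost.
import timeit

elapsed = timeit.timeit(
    "l = sorted(range(12000), reverse=True)",
    number=100,
)
print("per run: %.6f seconds" % (elapsed / 100))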
+3  A: 

In casual code I've created lists with millions of elements. I believe that Python's implementation of lists is bound only by the amount of memory on your system.

In addition, the list methods/functions should continue to work regardless of the size of the list.

If you care about performance, it might be worthwhile to look into a library such as NumPy.
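
For example, a rough sketch of the same sort of operation with NumPy (this assumes NumPy is installed; the array size is arbitrary):

# Sketch: a large numeric array held in a contiguous NumPy array instead of a list.
import numpy as np

a = np.arange(12000)           # 12000 integers in a contiguous array
a_desc = np.sort(a)[::-1]      # ascending sort, then a reversed view for descending order
print(a_desc[:5])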

Doug
+20  A: 

According to the source code, the maximum size of a list is PY_SSIZE_T_MAX/sizeof(PyObject*).

PY_SSIZE_T_MAX is defined in pyport.h to be ((size_t) -1)>>1

On a regular 32-bit system, this is (4294967295 >> 1) / 4, i.e. 2147483647 / 4, or 536,870,911.

Therefore the maximum size of a Python list on a 32-bit system is 536,870,911 elements.

As long as the number of elements you have is equal to or below this, all list functions should operate correctly.
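
As a sketch (not part of the original answer), you can derive the same cap from a running interpreter: sys.maxsize corresponds to PY_SSIZE_T_MAX, and ctypes reports the pointer size:

# Sketch: theoretical list-size cap on the current build.
import sys
import ctypes

pointer_size = ctypes.sizeof(ctypes.c_void_p)   # 4 bytes on 32-bit builds, 8 on 64-bit
max_elements = sys.maxsize // pointer_size      # PY_SSIZE_T_MAX / sizeof(PyObject*)
print(sys.maxsize, pointer_size, max_elements)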

Unknown
+2  A: 

Performance characteristics for lists are described on Effbot.

Python lists are actually implemented as a vector for fast random access, so the container will basically hold as many items as there is space for in memory. (You need space for the pointers contained in the list as well as space in memory for the object(s) being pointed to.)

Appending is O(1) (amortized constant complexity); however, inserting into or deleting from the middle of the sequence requires an O(n) (linear complexity) reordering, which gets slower as the number of elements in your list grows.
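
A small sketch that illustrates the difference (timings are illustrative only and will vary by machine):

# Sketch: amortized O(1) append vs. O(n) insert at the front of a list.
import timeit

append_time = timeit.timeit("l.append(0)", setup="l = []", number=20000)
insert_time = timeit.timeit("l.insert(0, 0)", setup="l = []", number=20000)
print("append: %.4fs  insert(0, ...): %.4fs" % (append_time, insert_time))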

Your sorting question is more nuanced, since the comparison operation can take an unbounded amount of time. If you're performing really slow comparisons, it will take a long time, though it's no fault of Python's list data type.

Reversal just takes the amount of time required to swap all the pointers in the list (necessarily O(n) (linear complexity), since you touch each pointer once).

cdleary