tags: java

views: 73

answers: 4

Which is more expensive and by how much:

List<Card> cardList = getMyList(small); // returns a list of 100 elements
cardList.add(card);

Vs.

List<Card> cardList = getMyList(big); // returns a list of 100,000 elements
cardList.add(card);

I guess my real question is: is it expensive to bring a large list into memory? Or is the list smart enough to only get as big as it needs to be: small when adding, but large when searching?

A: 

A list is a list - there is nothing smart in a linked list that is not documented in the standard algorithm books. In this case, I would NOT pull the whole list into memory - bad design, it uses tons of memory.

The second approach is 1,000 times as expensive as the first, as it has 1,000 times as many elements. List has O(1) add-operation behavior.

TomTom
1. List is just an interface; different implementations may have different costs for various methods. 2. You can't really say whether getting a List of 100,000 items is bad design unless you know what is being done with the list, and whether or not all 100,000 entries are needed at the same time.
matt b
What are you talking about? LinkedList is a "plain old and crappy" linked list, but ArrayList has O(1) operations for *get*, *set*, and *add*. *List* is just an interface providing a basic API for many kinds of lists, which are not linked lists (apart from LinkedList).
Jack
List is not an interface in C#; IList is the interface. It depends on the language. Check the C# references. Show me a link to another language in the post.
TomTom
@TomTom, the question is tagged "Java".
matt b
A: 

I think you are asking if resizing the list is an expensive operation.

The answer is that it depends on the List implementation. Take ArrayList as an example (from its Javadoc):

Each ArrayList instance has a capacity. The capacity is the size of the array used to store the elements in the list. It is always at least as large as the list size. As elements are added to an ArrayList, its capacity grows automatically. The details of the growth policy are not specified beyond the fact that adding an element has constant amortized time cost.

An application can increase the capacity of an ArrayList instance before adding a large number of elements using the ensureCapacity operation. This may reduce the amount of incremental reallocation.
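A minimal sketch of the ensureCapacity advice above (the Card type from the question is replaced with String for brevity, and the element count is made up to match the question):

```java
import java.util.ArrayList;

public class CapacityDemo {
    public static void main(String[] args) {
        ArrayList<String> cards = new ArrayList<>();
        // Pre-size the backing array so the 100,000 adds below
        // never trigger an incremental reallocation.
        cards.ensureCapacity(100_000);
        for (int i = 0; i < 100_000; i++) {
            cards.add("card-" + i);
        }
        System.out.println(cards.size()); // prints 100000
    }
}
```

Even without ensureCapacity, the total cost of the adds is still amortized O(1) each; pre-sizing only avoids the intermediate array copies.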

matt b
+3  A: 

Well, obviously, getting a big list into memory is more expensive than getting a smaller one. The cost factor depends upon the size of your objects and upon the initial heap size. When the JVM has no more free memory, it grows its heap, from its -Xms parameter up to its -Xmx parameter.

However, this is only true if the getMyBigList method creates the objects. If these objects are already loaded in memory, this method will only load a list of 100,000 references into memory, which won't cost you more than a few MB.
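The point that a list holds references, not copies, can be shown directly. A small sketch (StringBuilder stands in for any mutable object, since the question's Card type isn't shown):

```java
import java.util.ArrayList;
import java.util.List;

public class ReferenceDemo {
    public static void main(String[] args) {
        // One object, shared by two lists: each list stores only
        // a reference, not a copy of the object's data.
        StringBuilder card = new StringBuilder("ace");
        List<StringBuilder> a = new ArrayList<>();
        List<StringBuilder> b = new ArrayList<>();
        a.add(card);
        b.add(card);

        // Mutating through one list is visible through the other,
        // because both hold the same reference.
        a.get(0).append("-of-spades");
        System.out.println(b.get(0));           // prints ace-of-spades
        System.out.println(a.get(0) == b.get(0)); // prints true
    }
}
```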

In such a case your limiting factor won't be memory allocation for the JVM, but rather the method you use to load that list.

Are they loaded from the network? Then bandwidth is the limit.

Are they loaded from a magnetic hard disk drive? Then disk throughput is the limit.

Riduidel
A: 

is it expensive to bring a large list into memory? Or is the list smart enough to only get as big as it needs to be: small when adding, but large when searching?

What is "it"? List is an interface. How "smart" the object returned by getMyBigList() is depends entirely on the implementation of that method and the class of the object it returns. Theoretically, it could be some kind of lazy loading "smart" implementation. Most likely, it is not.
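To illustrate what such a "smart" implementation could look like, here is a hypothetical lazy list (the class name and the fabricated elements are invented for this sketch; nothing in the question implies getMyList() actually works this way):

```java
import java.util.AbstractList;

// Hypothetical lazy list: elements are materialized only when
// get() is called, so "bringing the list into memory" costs
// almost nothing up front.
public class LazyCardList extends AbstractList<String> {
    private final int size;

    public LazyCardList(int size) {
        this.size = size;
    }

    @Override
    public String get(int index) {
        // A real implementation might hit a database or a cache here;
        // this sketch fabricates the element on demand.
        return "card-" + index;
    }

    @Override
    public int size() {
        return size;
    }

    public static void main(String[] args) {
        LazyCardList cards = new LazyCardList(100_000);
        System.out.println(cards.size());  // prints 100000
        System.out.println(cards.get(42)); // prints card-42
    }
}
```

Constructing the list is O(1) regardless of its reported size; the cost is paid per element, at access time.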

Michael Borgwardt