views:

224

answers:

4

There's a lot of talk about data structures, but I can't find a simple list of data structures and their practical use out there. I'm trying to study for an interview and I think this would help me out, along with many others. I'm looking for something like this:

Data structure - Example/Used for

Hash table - fast data lookup ...then give an example

Array - ...

Binary tree - ...

If there is a resource like this somewhere, please let me know.

Thanks!

EDIT: I mean wikipedia is good and all, but on most of the pages they don't actually list practical uses. I'm looking for something more than that.

+4  A: 

Wikipedia's List of data structures is a pretty good source. You need to click on each one to see the description, but most descriptions include a common usage scenario.

Reed Copsey
Weird - why the downvotes on this?
Reed Copsey
+2  A: 

Just take a look at the wikipedia list of data structures and click on the links to see what they are good for. It might be sufficient just to look up the headlines.

Dario
Two seconds behind Reed Copsey!
Jonathan Leffler
Coincidence? ;-)
Dario
A: 

I think it is more sensible to just study the features of the data structures.

Based on the scenario your interviewer gives you, pick the appropriate one.

And I'd call it a lie when people claim software engineering and programming aren't creative.

Zigu
A: 

Any ranking of various data structures will be at least partially tied to problem context. It would help to learn how to analyze the time and space performance of algorithms. Typically, "big O notation" is used, e.g. binary search runs in O(log n) time, which means that the time to search for an element grows as the log (in base 2, implicitly) of the number of elements. Intuitively, since every step discards half of the remaining data as irrelevant, doubling the number of elements increases the time by one step. (Binary search scales rather well.)

Space performance concerns how the amount of memory grows for larger data sets. Also, note that big-O notation ignores constant factors: for smaller data sets, an O(n^2) algorithm may still be faster than an O(n * log n) algorithm that has a higher constant factor, since complex algorithms often have more work to do on startup.
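To make the "doubling adds one step" intuition concrete, here's a minimal binary search sketch that counts comparison steps. (Python is my choice here, not something from the question; the function and its step-counting are illustrative.)

```python
def binary_search(sorted_items, target):
    """Return (index, steps): index of target in sorted_items, or (None, steps).

    Each iteration discards half of the remaining range, so the step
    count grows roughly as log2(n).
    """
    lo, hi = 0, len(sorted_items) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid, steps
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, steps

# Doubling the input size adds roughly one step to the worst case:
_, steps_1k = binary_search(list(range(1024)), 0)   # 10 steps
_, steps_2k = binary_search(list(range(2048)), 0)   # 11 steps
```

Running the two searches above shows the log-time behavior directly: twice the data, one extra step.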

Besides time and space, other characteristics include whether a data structure is sorted (trees and skiplists are sorted, hash tables are not), persistence (binary trees can reuse pointers from older versions, while hash tables are modified in place), etc.
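A quick way to see the sorted/unsorted distinction is to contrast a hash table with an ordered structure. This sketch (mine, in Python) uses a sorted list maintained with the standard `bisect` module as a stand-in for a balanced tree; the names are made up for illustration.

```python
import bisect

# Hash table: fast point lookup, but no useful key ordering.
# Sorted structure: ordered iteration and range queries come cheap.
d = {}
ordered_keys = []
for k in [42, 7, 19, 3, 88]:
    d[k] = str(k)
    bisect.insort(ordered_keys, k)  # O(n) insert here; a real tree is O(log n)

# Keys come out of the sorted structure in order...
assert ordered_keys == [3, 7, 19, 42, 88]

# ...which enables range scans, e.g. all keys between 5 and 50:
in_range = ordered_keys[bisect.bisect_left(ordered_keys, 5):
                        bisect.bisect_right(ordered_keys, 50)]
# in_range is [7, 19, 42]; the dict can't answer this without scanning every key.
```

(CPython dicts happen to preserve insertion order, but that's incidental; it's not sorted order, and range queries still require a full scan.)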

While you'll need to learn the behavior of several data structures to be able to compare them, one way to develop a sense for why they differ in performance is to closely study a few. I'd suggest comparing singly-linked lists, binary search trees, and skip lists, all of which are relatively simple, but have very different characteristics. Think about how much work it takes to find a value, add a new value, find all values in order, etc.
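As one way to start that comparison, here's a minimal unbalanced binary search tree in Python (my sketch, not from the answer): find a value, add a value, and walk all values in order, each of which has a very different cost on a linked list.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """BST insert: O(log n) on average, O(n) worst case (no rebalancing)."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def contains(root, key):
    """Search discards half the tree (on average) at each level."""
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

def in_order(root):
    """Yield all keys in sorted order, something a singly-linked list
    only gives you if you paid to keep it sorted on every insert."""
    if root is not None:
        yield from in_order(root.left)
        yield root.key
        yield from in_order(root.right)

root = None
for k in [19, 7, 42, 3, 88]:
    root = insert(root, k)
```

Compare this against a singly-linked list, where `contains` is a full O(n) scan and sorted traversal requires sorting first; that contrast is exactly the kind of feel for trade-offs the answer is suggesting you develop.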

There are various texts on analyzing algorithms / data structure performance that people recommend, but what really made them make sense to me was learning OCaml. Dealing with complex data structures is ML's strong suit, and their behavior is much clearer when you can avoid pointers and memory management as in C. (Learning OCaml just to understand data structures is almost certainly the long way around, though. :) )

silentbicycle