What's the big O for JavaScript's array access when used as a hash?

For example,

 var x = [];
 for (var i = 0; i < 100000; i++) {
    x[i.toString() + 'a'] = 123; // using a string key to illustrate x[alpha]
 }
 alert(x['9999a']); // linear search?

One would hope JS engines do not fall back to a linear O(n) search internally, but is this guaranteed?

+4  A: 

First and foremost, Arrays are in fact hashes. Always. That's why x[5] === x["5"]:

var x = [];
x[5] = 10;
alert( x[5] === x["5"] ); // true

Objects are hashes and Arrays are just special objects. If you want a general hash, use an Object. "Associative arrays" in JavaScript are Objects; Arrays are for numerically indexed data. Arrays have a length property and Array methods like push, pop, sort, etc., which make no sense on hashes.
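A minimal sketch of that distinction, using illustrative names (`scores`, `list` are just examples): string-keyed data goes in a plain Object, numerically indexed data in an Array, and mixing the two is where Arrays behave surprisingly:

```javascript
// A plain Object is the general-purpose hash: keys are strings.
var scores = {};
scores['alice'] = 10;
scores['bob'] = 20;

// An Array is a special object for numeric indices;
// it maintains a length property and supports push, pop, sort, etc.
var list = [];
list.push('first');
list.push('second');

console.log(scores['alice']); // 10
console.log(list.length);     // 2

// Adding a string key to an Array works (it's still an object),
// but it does NOT affect length -- a hint that Arrays are the
// wrong tool for hash-style data:
list['oops'] = true;
console.log(list.length);     // still 2
```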

As for the big O for searching in Objects: it's implementation dependent.

Probably the two best things you can do are:

  1. Check the source code of some browser implementations

  2. Run some benchmarks for large n and draw your own conclusions


The related part of the language specification:

4.3.3 object

An object is a collection of properties and has a single prototype object.

8.6.2 Object Internal Properties and Methods

Array objects have a slightly different implementation of the [[DefineOwnProperty]] internal method. Array objects give special treatment to a certain class of property names.
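The "special treatment" the spec refers to is mainly the length property: writing to a numeric index past the end grows length automatically, and shrinking length deletes elements. A quick illustration:

```javascript
var a = [];
a[0] = 'x';
console.log(a.length); // 1

a[9] = 'y';            // writing past the end grows length
console.log(a.length); // 10

a.length = 1;          // truncating length deletes elements beyond it
console.log(a[9]);     // undefined
```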

galambalazs
+2  A: 

Accessing object properties and array elements in JavaScript is syntactically assumed to be done in constant time: O(1). Performance characteristics are not guaranteed by the ECMAScript specification, but all modern JavaScript engines retrieve object properties in constant time.

Here's a simple example showing how access times grow when the container is 1000 times bigger:

var largeObject = {};
var smallObject = {};

var x, i;

for (i = 0; i < 1000000; i++) {
   largeObject['a' + i] = i;
}

for (i = 0; i < 1000; i++) {
   smallObject['b' + i] = i;
}

console.time('10k Accesses from largeObject');
for (i = 0; i < 10000; i++) x = largeObject['a' + (i % 1000000)];
console.timeEnd('10k Accesses from largeObject');

console.time('10k Accesses from smallObject');
for (i = 0; i < 10000; i++) x = smallObject['b' + (i % 1000)];
console.timeEnd('10k Accesses from smallObject');

Results in Firebug, Firefox 3.6.10 (Mac OS X 10.6.4 - 2.93Ghz Intel Core 2 Duo):

10k Accesses from largeObject: 22ms
10k Accesses from smallObject: 19ms

Results in Chrome Dev Tools 6.0.472:

10k Accesses from largeObject: 15ms
10k Accesses from smallObject: 15ms

Internet Explorer 8.0.7600 with Firebug Lite on Windows 7

10k Accesses from largeObject: 250ms
10k Accesses from smallObject: 219ms
Daniel Vassallo
Can you point to the paragraph in the ECMAScript or JavaScript specifications which says that accessing array elements or object properties is guaranteed to be O(1)? AFAIK, JScript has a perfectly standards-compliant O(n) implementation of objects as simple linked lists.
Jörg W Mittag
@Jörg: It's not mentioned in the spec, so in practice it is implementation dependent... That's why I chose to say that it is "syntactically assumed". Nevertheless, I rephrased my answer to be more accurate.
Daniel Vassallo
I don't know how the IE team can live with such embarrassing results...
ChaosPandion
@Chaos: At least they aren't implementing object properties as a linked list.
Daniel Vassallo
The O(1) is average-case. Worst-case for a hash table is O(n).
Daniel Stutzbach