Java has LinkedHashMap, which gets you 99% of the way to an LRU cache.

Is there a JavaScript implementation of an LRU cache, preferably from a reputable source, that is:

  1. understandable
  2. efficient (amortized O(1) get/put/delete)

? I've been searching on the web but couldn't find one; I thought I found one in Ajax Design Patterns, but it glosses over the sendToTail() method and has O(n) performance (presumably, since the queue and the associative array are kept separate).

I suppose I could write my own, but I've learned the hard way that reinventing the wheel for core algorithms can be hazardous to one's health :/

+2  A: 

This:

http://www.monsur.com/projects/jscache/

seems to fit your case, although setItem (i.e. put) is O(N) in the worst case, which happens when the cache is full on insertion. In that case the cache is scanned to purge expired or least recently used items. getItem is O(1), and expiry is handled in the getItem operation (i.e. if the item being fetched has expired, it is removed and null is returned).

The code is compact enough to be easily understood.

P.S. It might be useful to add to the constructor an option to specify the fillFactor, which is fixed at 0.75 (meaning that when the cache is purged, its size is reduced to at most 3/4 of the maximum size).
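For orientation, here is a rough usage sketch. The constructor argument below is an assumption (a maximum item count); setItem and getItem are the operations described above, but check the project source for the exact API.

// Rough usage sketch of the jscache library linked above.
// The constructor argument is an assumption, not taken from the source.
var cache = new Cache(100);

cache.setItem("user:42", {name: "Ada"}); // put; may trigger an O(N) purge when the cache is full
var user = cache.getItem("user:42");     // get; returns null if the item is missing or expired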

LucaM
Jason S
+1 The implementation has nothing to do with ASP.NET; I think it is worth a look.
Sam Saffron
A: 

It's not an LRU cache, but I've got my own linked map implementation. As it uses a JS object as its store, it will have similar performance characteristics (the wrapper objects and the hash function impose a performance penalty).

Currently, documentation is basically non-existent, but there's a related SO answer.

The each() method calls the callback function with the current key, the current value, and a boolean indicating whether there are more elements.
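Based on that description, iterating with each() might look like this (a sketch, not copied from the library):

map.each(function(key, value, hasNext) {
    // key and value are the current entry; hasNext tells whether more entries follow
    console.log(key, value, hasNext);
});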

Alternatively, looping can be done manually via

// iterate over all entries by advancing the internal cursor with next()
for(var i = map.size; i--; map.next()) {
    var currentKey = map.key();
    var currentValue = map.value();
}
Christoph
A: 

The monsur.com implementation is O(n) on insertion only because it has items which actually expire in real-world time. It is not a simple LRU.

If you only care about keeping the most recently used items, without regard to real-world time, this can be done in O(1). A queue implemented as a doubly linked list is O(1) for insertion or deletion at either end, and that is all you need for a cache. For lookup, a hash map, which JavaScript makes pathetically easy, gives you nearly O(1) lookup (assuming the JavaScript engine uses a good hash map; that's implementation dependent, of course). So you have a linked list of items with a hash map pointing into it. Manipulate the ends of the linked list to put new and requested items at one end and remove old items from the other end.
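A minimal sketch of that design in plain JavaScript (illustrative only; the names and structure here are not from any particular library):

// Minimal LRU cache sketch: plain object for O(1) lookup,
// doubly linked list for O(1) reordering and eviction.
function LRUCache(capacity) {
    this.capacity = capacity;
    this.size = 0;
    this.map = {};      // key -> list node
    this.head = null;   // most recently used
    this.tail = null;   // least recently used
}

// Unlink a node from the list (used before moving it to the front).
LRUCache.prototype._unlink = function(node) {
    if (node.prev) node.prev.next = node.next; else this.head = node.next;
    if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
    node.prev = node.next = null;
};

// Insert a node at the head (most recently used position).
LRUCache.prototype._pushFront = function(node) {
    node.next = this.head;
    if (this.head) this.head.prev = node;
    this.head = node;
    if (!this.tail) this.tail = node;
};

LRUCache.prototype.get = function(key) {
    var node = this.map[key];
    if (!node) return undefined;
    this._unlink(node);         // move the requested item to the front
    this._pushFront(node);
    return node.value;
};

LRUCache.prototype.put = function(key, value) {
    var node = this.map[key];
    if (node) {                 // update an existing entry and refresh its position
        node.value = value;
        this._unlink(node);
        this._pushFront(node);
        return;
    }
    if (this.size >= this.capacity && this.tail) {  // evict the least recently used item
        delete this.map[this.tail.key];
        this._unlink(this.tail);
        this.size--;
    }
    node = {key: key, value: value, prev: null, next: null};
    this.map[key] = node;
    this._pushFront(node);
    this.size++;
};

LRUCache.prototype.remove = function(key) {
    var node = this.map[key];
    if (!node) return;
    this._unlink(node);         // deleting from the middle is O(1) with a doubly linked list
    delete this.map[key];
    this.size--;
};

// Example:
var cache = new LRUCache(2);
cache.put("a", 1);
cache.put("b", 2);
cache.get("a");      // "a" is now the most recently used entry
cache.put("c", 3);   // evicts "b", the least recently used entry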

James Oakley
The linked list needs to be able to delete (but not insert) items from the middle, since items get removed from the LRU cache and reinserted. That's the hard part; your answer seems to gloss over it.
Jason S