views: 1271
answers: 4

Hi all,
I have some C++ code where I need to implement cache replacement using the LRU technique.
So far I know of two methods to implement LRU cache replacement:

  1. Storing a timestamp with each cached item, updating it every time the item is accessed, and comparing the timestamps when a replacement is needed (rough sketch below).
  2. Keeping a stack of cached items and moving an item to the top whenever it is accessed, so the bottom of the stack always holds the LRU candidate.

So, which of these is better to use in production code?
Are there any other, better methods?
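
To make option 1 concrete, here is a rough sketch of what I have in mind (the Entry struct, the TimestampCache class and its member names are just placeholders, not real code from my project):

    #include <cstddef>
    #include <map>
    #include <string>

    struct Entry
    {
        std::string   data;
        unsigned long lastUsed;   // logical timestamp, bumped on every access
    };

    class TimestampCache
    {
    public:
        explicit TimestampCache( std::size_t maxEntries )
            : mMaxEntries( maxEntries ), mClock( 0 ) {}

        void put( const std::string& key, const std::string& data )
        {
            if ( mItems.size() >= mMaxEntries && mItems.find( key ) == mItems.end() )
                evictOldest();
            Entry e = { data, ++mClock };
            mItems[ key ] = e;
        }

        const std::string* get( const std::string& key )
        {
            std::map< std::string, Entry >::iterator it = mItems.find( key );
            if ( it == mItems.end() )
                return 0;                     // miss
            it->second.lastUsed = ++mClock;   // touch on access
            return &it->second.data;
        }

    private:
        // O(n) scan for the smallest timestamp: the main cost of this method
        void evictOldest()
        {
            std::map< std::string, Entry >::iterator oldest = mItems.begin();
            for ( std::map< std::string, Entry >::iterator it = mItems.begin();
                  it != mItems.end(); ++it )
            {
                if ( it->second.lastUsed < oldest->second.lastUsed )
                    oldest = it;
            }
            if ( oldest != mItems.end() )
                mItems.erase( oldest );
        }

        std::map< std::string, Entry > mItems;
        std::size_t   mMaxEntries;
        unsigned long mClock;
    };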

+1  A: 

There is a good guide to one here.

wheaties
+6  A: 

Recently I implemented an LRU cache using a linked list spread over a hash map.

    // headers needed for the declarations below
    #include <list>
    #include <string>
    #include <utility>
    #include <boost/unordered_map.hpp>

    /// Typedef for URL/Entry pair
    typedef std::pair< std::string, Entry > EntryPair;

    /// Typedef for Cache list
    typedef std::list< EntryPair > CacheList;

    /// Typedef for URL-indexed map into the CacheList
    typedef boost::unordered_map< std::string, CacheList::iterator > CacheMap;

    /// Cache LRU list
    CacheList mCacheList;

    /// Cache map into the list
    CacheMap mCacheMap;

It has the advantage of being O(1) for all important operations.

The insertion algorithm:

    // create a new entry
    Entry iEntry( ... );

    // push it to the front of the LRU list
    mCacheList.push_front( std::make_pair( aURL, iEntry ) );

    // add it to the cache map
    mCacheMap[ aURL ] = mCacheList.begin();

    // increase the count of entries
    mEntries++;

    // check if it's time to remove the last element
    if ( mEntries > mMaxEntries )
    {
        // erase the last cache list element from the map
        mCacheMap.erase( mCacheList.back().first );

        // erase it from the list
        mCacheList.pop_back();

        // decrease the count
        mEntries--;
    }
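
For completeness, a sketch of the matching lookup (a hypothetical find method, not shown above) would use std::list::splice to bump an entry to the front in O(1):

    // look up an entry and mark it as most recently used
    Entry* find( const std::string& aURL )
    {
        CacheMap::iterator it = mCacheMap.find( aURL );
        if ( it == mCacheMap.end() )
            return NULL;   // cache miss

        // splice is O(1) and keeps the list iterator stored in the map valid
        mCacheList.splice( mCacheList.begin(), mCacheList, it->second );

        // return the Entry half of the stored pair
        return &it->second->second;
    }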
Kornel Kisielewicz
+1  A: 

In our production environment we use a C++ doubly linked list which is similar to the Linux kernel's linked list. The beauty of it is that you can add an object to as many linked lists as you want, and list operations are fast and simple.
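
A minimal sketch of that idea using boost::intrusive (just an illustration, not our actual production code; the Entry type and tag names are made up) could look like this:

    #include <boost/intrusive/list.hpp>
    #include <string>

    namespace bi = boost::intrusive;

    // two tags so one object can carry two independent hooks
    struct lru_tag {};
    struct all_tag {};

    struct Entry
        : bi::list_base_hook< bi::tag< lru_tag > >
        , bi::list_base_hook< bi::tag< all_tag > >
    {
        std::string url;
        explicit Entry( const std::string& u ) : url( u ) {}
    };

    typedef bi::list< Entry, bi::base_hook< bi::list_base_hook< bi::tag< lru_tag > > > > LruList;
    typedef bi::list< Entry, bi::base_hook< bi::list_base_hook< bi::tag< all_tag > > > > AllList;

    int main()
    {
        Entry e( "http://example.com/" );

        LruList lru;
        AllList all;

        // the hooks live inside Entry, so linking allocates nothing and
        // the same object sits in both lists at once
        lru.push_front( e );
        all.push_back( e );

        // marking an entry as most recently used is a constant-time splice
        lru.splice( lru.begin(), lru, LruList::s_iterator_to( e ) );

        return 0;
    }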

grokus
+1  A: 

For simplicity, maybe you should consider using a Boost.MultiIndex container. If we separate the key from the data, we can support multiple sets of keys on the same data.

From http://old.nabble.com/realization-of-Last-Recently-Used-cache-with-boost%3A%3Amulti_index-td22326432.html :

"...use two indexes: 1) hashed, for searching a value by key; 2) sequenced, for tracking the least recently used items (the get function puts the item at the end of the sequence; if we need to remove items from the cache, we can delete them from the beginning of the sequence)."

Note that the "project" operator "allows the programmer to move between different indices of the same multi_index_container" efficiently.
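
A hedged sketch of that two-index layout (the CacheItem struct and the touch/evictOne helpers are illustrative names, not code from the linked thread):

    #include <boost/multi_index_container.hpp>
    #include <boost/multi_index/hashed_index.hpp>
    #include <boost/multi_index/sequenced_index.hpp>
    #include <boost/multi_index/member.hpp>
    #include <string>

    namespace bmi = boost::multi_index;

    struct CacheItem
    {
        std::string key;
        std::string value;
    };

    // index #0: usage order (sequenced), index #1: hashed lookup by key
    typedef bmi::multi_index_container<
        CacheItem,
        bmi::indexed_by<
            bmi::sequenced<>,
            bmi::hashed_unique< bmi::member< CacheItem, std::string, &CacheItem::key > >
        >
    > Cache;

    // "get": find by key and move the item to the back of the sequence
    bool touch( Cache& cache, const std::string& key )
    {
        Cache::nth_index< 1 >::type& byKey = cache.get< 1 >();
        Cache::nth_index< 1 >::type::iterator it = byKey.find( key );
        if ( it == byKey.end() )
            return false;

        // project the hashed iterator into the sequenced index, then relocate
        cache.get< 0 >().relocate( cache.get< 0 >().end(), cache.project< 0 >( it ) );
        return true;
    }

    // eviction: the least recently used item sits at the front of the sequence
    void evictOne( Cache& cache )
    {
        if ( !cache.get< 0 >().empty() )
            cache.get< 0 >().pop_front();
    }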

William Wong