I'm doing a Dictionary<> lookup in an O(n^2) loop and need it to be ridiculously fast. It's not. Does anyone have any insight into how Dictionary<> is implemented? I'm testing Dictionary performance with an isolated test case after running my code through a profiler and determining that Dictionary lookups are the bulk of the CPU time. My test code is like this:
Int32[] keys = new Int32[10] { 38784, 19294, 109574, 2450985, 5, 398, 98405, 12093, 909802, 38294394 };
Dictionary<Int32, MyData> map = new Dictionary<Int32, MyData>();
//Add a bunch of things to map

Stopwatch timer = new Stopwatch();
timer.Start();
Object item;
for (int i = 0; i < 1000000; i++)
{
    for (int j = 0; j < keys.Length; j++)
    {
        bool isFound = map.ContainsKey(keys[j]);
        if (isFound)
        {
            item = map[keys[j]];
        }
    }
}
timer.Stop();
ContainsKey and map[] are the two slow parts (equally slow). If I use a TryGetValue instead, it's nearly identical in speed to ContainsKey. Here are some interesting facts:
A Dictionary<Guid, T> is about twice as slow as Dictionary<Int32, T>. Dictionary<String, T> is about twice as slow as a Guid dictionary. A Dictionary<Byte, T> is a good 50% faster than using Ints. This leads me to believe that a Dictionary is doing an O(log n) binary search to find the key, and that the comparison operators on the keys are the bottleneck. For some reason, I don't believe it's implemented as a hash table, because .NET already has a Hashtable class, and in my experience it's even slower than Dictionary.
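For reference, here's roughly what the TryGetValue version of my loop looks like, as a self-contained program (MyData here is just a stand-in for my real type):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Stand-in for my real value type.
class MyData { }

class TryGetValueBench
{
    static void Main()
    {
        Int32[] keys = { 38784, 19294, 109574, 2450985, 5, 398, 98405, 12093, 909802, 38294394 };
        var map = new Dictionary<Int32, MyData>();
        foreach (Int32 key in keys)
        {
            map[key] = new MyData();
        }

        Stopwatch timer = Stopwatch.StartNew();
        MyData item;
        for (int i = 0; i < 1000000; i++)
        {
            for (int j = 0; j < keys.Length; j++)
            {
                // One keyed lookup instead of the ContainsKey + indexer pair.
                if (map.TryGetValue(keys[j], out item))
                {
                    // use item here
                }
            }
        }
        timer.Stop();
        Console.WriteLine("Elapsed: {0} ms", timer.ElapsedMilliseconds);
    }
}
```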
The dictionaries I'm building are only accessed by one thread at a time, so read locking is not an issue. RAM is also not an issue. The dictionary will most likely only have about 10 buckets, but each bucket can point to one of about 2,000 possible things. Does anyone have any feedback on how to make this faster? Thanks!
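To be concrete about the shape of data I mean (the type names here are made up for illustration, not my real code):

```csharp
using System;
using System.Collections.Generic;

// Made-up type to illustrate the shape: roughly 10 keys,
// each of which can point to one of about 2,000 candidate objects.
class MyData { }

class DataShape
{
    static void Main()
    {
        // The ~2,000 possible things a key might map to.
        MyData[] pool = new MyData[2000];
        for (int i = 0; i < pool.Length; i++)
        {
            pool[i] = new MyData();
        }

        // The dictionary itself only ever holds about 10 entries.
        Int32[] keys = { 38784, 19294, 109574, 2450985, 5, 398, 98405, 12093, 909802, 38294394 };
        var map = new Dictionary<Int32, MyData>();
        var rng = new Random(1);
        foreach (Int32 key in keys)
        {
            map[key] = pool[rng.Next(pool.Length)];
        }

        Console.WriteLine(map.Count); // 10
    }
}
```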
Mike