I am trying to optimize a piece of .NET 2.0 C# code that looks like this:
Dictionary<myType, string> myDictionary = new Dictionary<myType, string>();
// some other stuff
// inside a loop, check whether the key is present and add it if not
if (!myDictionary.ContainsKey(currentKey))
{
    myDictionary.Add(currentKey, "");
}
It looks like whoever wrote this code used a Dictionary even though one isn't needed (only the keys are used, to store a set of unique values), presumably because it is faster to search than a List of myType objects. Using only the key of a dictionary seems obviously wrong, and I'm trying to understand the best way to fix it.
Questions:
1) Am I right that I would get a good performance boost just by switching to the .NET 3.5 HashSet<T> (roughly as in the sketch after these questions)?
2) What would be the best way to optimize the code above in .NET 2.0, and why?
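For reference, the .NET 3.5 version I have in mind is roughly this sketch (mySet and items are just placeholder names for the example):

HashSet<myType> mySet = new HashSet<myType>();

foreach (myType currentKey in items)
{
    // Add returns false when the item is already present,
    // so no separate Contains check is needed
    mySet.Add(currentKey);
}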
EDIT: This is existing code I am trying to optimize. It loops through tens of thousands of items and calls ContainsKey for each one. There's gotta be a better way of doing it (even in .NET 2.0)! :)
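One idea I've been toying with for the .NET 2.0 case, sketched below with items again standing in for the real collection: if I understand the indexer correctly, assigning through it adds the key when it's missing and just overwrites the dummy value when it's already there, so the ContainsKey + Add pair collapses into a single lookup per item.

Dictionary<myType, string> myDictionary = new Dictionary<myType, string>();

foreach (myType currentKey in items)
{
    // the indexer adds the key if missing, or overwrites the dummy
    // value if present, so no ContainsKey check is needed
    myDictionary[currentKey] = "";
}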