views:

71

answers:

3

When a hash table is initialized, how is memory allocated for it? When we add new members, how does the memory used by the hash table get extended? Does it ever happen that a hash table is unable to store objects beyond a fixed size?

+3  A: 

You can use .NET Reflector to find out.

System.Collections.Hashtable has some hard limits in it:

double num = ((float) capacity) / this.loadFactor;
if (num > 2147483647.0) // int.MaxValue
{
    throw new ArgumentException(Environment.GetResourceString("Arg_HTCapacityOverflow"));
}

Also keep in mind the value of int.MaxValue for capacity (I think capacity might be the same as the bucket count, depending on load factor).
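
For example, here's a minimal sketch of how that limit surfaces (assuming the constructor behaves as the decompiled check above suggests; the exact exception type and message may vary by framework version):

using System;
using System.Collections;

class CapacityLimitDemo
{
    static void Main()
    {
        try
        {
            // Asking for int.MaxValue elements means capacity / loadFactor
            // exceeds int.MaxValue, so the check above should reject it
            // before any bucket array is allocated.
            var table = new Hashtable(int.MaxValue);
        }
        catch (ArgumentException ex)
        {
            Console.WriteLine("Capacity overflow: " + ex.Message);
        }
        catch (OutOfMemoryException)
        {
            // On some runtimes you may simply run out of memory first.
            Console.WriteLine("Out of memory before the capacity check fired.");
        }
    }
}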

If you're hitting that size limit, though, you may want to look into better storage methods than an in-memory hash table CLR object...

Edit:

Memory for the hash table is allocated in this manner:

// Bucket count: a prime at least as large as capacity / loadFactor, never fewer than 11.
int num2 = (num > 11.0) ? HashHelpers.GetPrime((int) num) : 11;
this.buckets = new bucket[num2];

[StructLayout(LayoutKind.Sequential)]
private struct bucket
{
    public object key;    // the stored key
    public object val;    // the stored value
    public int hash_coll; // hash code; the high bit flags a collision
}

See Will's answer for what HashHelpers.GetPrime does.
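
To make the sizing logic concrete, here's a rough sketch of how the initial bucket count comes out of capacity and load factor. Note that GetPrime below is an illustrative stand-in, not the real HashHelpers.GetPrime (which consults a precomputed prime table), and the real constructor also scales the load factor you pass in:

using System;

static class HashtableSizingSketch
{
    // Illustrative stand-in for HashHelpers.GetPrime: smallest odd prime >= min,
    // found by trial division.
    static int GetPrime(int min)
    {
        for (int candidate = min | 1; ; candidate += 2)
        {
            if (IsPrime(candidate)) return candidate;
        }
    }

    static bool IsPrime(int n)
    {
        if (n < 2) return false;
        if (n % 2 == 0) return n == 2;
        for (int i = 3; (long)i * i <= n; i += 2)
            if (n % i == 0) return false;
        return true;
    }

    // Approximates the bucket count chosen at construction time:
    // capacity / loadFactor, rounded up to a prime, with a floor of 11.
    public static int InitialBucketCount(int capacity, float loadFactor)
    {
        double num = capacity / loadFactor;
        return (num > 11.0) ? GetPrime((int)num) : 11;
    }

    static void Main()
    {
        // e.g. room for 100 items at an effective load factor of 0.72 -> 139 buckets
        Console.WriteLine(InitialBucketCount(100, 0.72f));
    }
}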

Merlyn Morgan-Graham
Hi, can you please explain to me how memory is allocated at initialization time?
Tanuja
@Tanuja: Also note that there are other dictionary classes in .NET; we're specifically talking about `System.Collections.Hashtable`, which may not be the optimal implementation for you :)
Merlyn Morgan-Graham
.NET Reflector holds the answers to many of life's questions. Unfortunately, mine's been broken since a recent update... :)
Will A
Thanks a lot.. :)
Tanuja
@Will A: Not that this is the place for it, but I find quitting out of VS entirely, then deleting/re-deploying Reflector has fixed most of the breaks I've experienced. Whenever I've had problems, it was with the list of assemblies it was configured to load.
Merlyn Morgan-Graham
+1  A: 

The Hashtable manages its own size, so you won't run into a situation where you cannot insert an object unless you've run out of memory (or you're trying to insert a duplicate key, of course).
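
A trivial sketch of that behaviour (the growth all happens behind the scenes; Add throws on a duplicate key, while the indexer just overwrites):

using System;
using System.Collections;

class AutoGrowthDemo
{
    static void Main()
    {
        var table = new Hashtable(); // no capacity hint needed; it grows on its own
        for (int i = 0; i < 1000000; i++)
            table[i] = i * 2;
        Console.WriteLine(table.Count); // 1000000

        try
        {
            table.Add(0, "again"); // duplicate key -> ArgumentException
        }
        catch (ArgumentException)
        {
            Console.WriteLine("Duplicate key rejected by Add().");
        }

        table[0] = "again"; // the indexer overwrites instead of throwing
    }
}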

According to the docs:

When the actual load factor reaches the specified load factor, the number of buckets is automatically increased to the smallest prime number that is larger than twice the current number of buckets.
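
That rule is easy to sketch out. Assuming 11 initial buckets (as in the decompiled constructor shown above) and ignoring the real implementation's precomputed prime table, the bucket count grows roughly like this:

using System;

class GrowthSketch
{
    static bool IsPrime(int n)
    {
        if (n < 2) return false;
        if (n % 2 == 0) return n == 2;
        for (int i = 3; (long)i * i <= n; i += 2)
            if (n % i == 0) return false;
        return true;
    }

    // Smallest prime strictly greater than twice the current bucket count.
    static int NextBucketCount(int currentBuckets)
    {
        int candidate = currentBuckets * 2 + 1;
        while (!IsPrime(candidate)) candidate++;
        return candidate;
    }

    static void Main()
    {
        int buckets = 11;
        for (int i = 0; i < 5; i++)
        {
            Console.Write(buckets + " -> ");
            buckets = NextBucketCount(buckets);
        }
        Console.WriteLine(buckets); // 11 -> 23 -> 47 -> 97 -> 197 -> 397
    }
}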

Will A
Having seen Merlyn's response, I really should retract my 'limitless Hashtable' comment - however, if you find yourself in a situation where you're placing 2 billion+ objects into a Hashtable, you're definitely in need of a better storage method!
Will A
+1. Link to relevant docs is also quite helpful :)
Merlyn Morgan-Graham
A: 

Hash table - Wikipedia, the free encyclopedia

http://en.wikipedia.org/wiki/Hash_table

See memory allocated? HashTable?

http://www.dotnetmonster.com/Uwe/Forum.aspx/dotnet-framework/9596/See-memory-allocated-HashTable

ratty