Given that .NET can detect bitness via IntPtr (looking through Reflector, a good amount of it is marked unsafe, though - shame), I've been thinking that GetHashCode returning an int is potentially short-sighted.
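(For anyone unsure what I mean by detecting bitness: IntPtr.Size is a real BCL property and is enough on its own - the Console call below is purely for illustration.)

    // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one,
    // so the runtime already knows the platform's natural word width.
    Console.WriteLine(IntPtr.Size == 8 ? "64-bit process" : "32-bit process");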
I know that ultimately, with a good hashing algorithm, the billions of distinct values offered by Int32 are adequate, but even so, the narrower the possible set of hashes, the more collisions you get, and the more linear searching within buckets a hashed key lookup has to do.
Equally - am I the only one who finds this amusing:
struct Int64
{
    public override int GetHashCode()
    {
        // Folds the upper 32 bits into the lower 32 via XOR (0x20 == 32)
        return ((int) this) ^ ((int) (this >> 0x20));
    }
}
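The upshot, and part of why the 32-bit limit bites in practice, is that distinct Int64 values collide trivially: any two values whose high and low halves XOR to the same result hash identically, and a Dictionary then has to fall back to Equals within the bucket. A quick sketch (the particular values are just illustrative):

    long a = 0L;
    long b = 0x0000000100000001L;          // high half == low half, so they XOR to 0
    Console.WriteLine(a.GetHashCode());    // 0
    Console.WriteLine(b.GetHashCode());    // 0 - collides with a
    // A Dictionary<long, T> holding both keys has to distinguish them
    // by calling Equals - the linear searching mentioned above.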
Whilst Int32's GetHashCode simply returns this.
If IntPtr is out of the question because of performance concerns, perhaps an IHashCode type that implements IEquatable etc. would be better?
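Something along these lines is what I have in mind - IHashCode and HashCode64 are purely hypothetical names, nothing in the BCL, and this is only a sketch of the shape of the abstraction:

    // Hypothetical abstraction - nothing like this exists in the framework.
    // The idea is that the hash is an opaque, equatable value whose width
    // can track the platform rather than being pinned to 32 bits.
    public interface IHashCode : IEquatable<IHashCode>
    {
    }

    public struct HashCode64 : IHashCode
    {
        private readonly long value;
        public HashCode64(long value) { this.value = value; }

        public bool Equals(IHashCode other)
        {
            return other is HashCode64 && ((HashCode64) other).value == this.value;
        }
    }

    // Objects would then expose something like
    //     IHashCode GetWideHashCode();
    // instead of (or alongside) the int-returning GetHashCode.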
As our platforms get larger and larger in terms of memory capacity, disk size etc., surely the days of a 32-bit hash being enough are numbered?
Or is it simply that the overhead of abstracting the hash behind interfaces, or of adapting its size to the platform, outweighs any potential performance benefit?