views:

197

answers:

1

I'm trying to find a reliable way to uniquely identify and track distinct HttpRequests in an ASP.NET web site.

Does anybody know anything about the implementation of HttpRequest.GetHashCode()? Specifically, how often do collisions occur?

I understand that hash codes are not guaranteed to be unique. What I'm trying to understand is, statistically, how often I can expect a hash code to repeat.

The system I have in mind would gracefully handle hash code collisions, but I want to make sure collisions are no more frequent than about 1 in 1,000.
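For a rough sense of scale: if the hash codes were uniformly distributed over all 2^32 possible Int32 values (which HttpRequest.GetHashCode() does not promise), the birthday problem gives the collision odds. A sketch of that estimate, with the threshold values chosen purely for illustration:

```csharp
using System;

class HashCollisionEstimate
{
    // Birthday approximation: probability of at least one collision
    // among n uniformly random 32-bit hash codes is roughly
    //   p ≈ 1 - exp(-n * (n - 1) / 2^33)
    static double CollisionProbability(long n)
    {
        return 1.0 - Math.Exp(-(double)n * (n - 1) / Math.Pow(2, 33));
    }

    static void Main()
    {
        Console.WriteLine(CollisionProbability(1000));   // ≈ 0.0001
        Console.WriteLine(CollisionProbability(77000));  // ≈ 0.5
    }
}
```

So even with a perfectly uniform 32-bit hash, on the order of 77,000 tracked requests already gives roughly a 50% chance of at least one collision; the limiting factor is the 32-bit range, not the quality of any particular GetHashCode implementation.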

+3  A: 

Hash codes are never guaranteed to be unique; that is not their purpose. They are designed to aid equality tests by serving as an early indicator of potential equality between two instances.

In other words, a hash code helps you quickly rule out two instances that are definitely not equal.

Maybe something like this would be best:

class TrackableHttpRequest : IEquatable<TrackableHttpRequest>
{
    // Each wrapper gets its own GUID, so identity is guaranteed unique
    // regardless of any hash code collisions.
    readonly Guid id = Guid.NewGuid();

    public Guid Id { get { return this.id; } }
    public HttpRequest Request { get; set; }

    public override Int32 GetHashCode()
    {
        return this.Id.GetHashCode();
    }

    public override Boolean Equals(Object obj)
    {
        return this.Equals(obj as TrackableHttpRequest);
    }

    public bool Equals(TrackableHttpRequest other)
    {
        if (other == null)
            return false;

        // Two wrappers are equal only if they carry the same GUID.
        return this.Id == other.Id;
    }
}
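A hypothetical usage sketch, assuming the wrapper class above: because equality is GUID-based, instances can safely key a dictionary of in-flight requests (the `Demo` class and `inFlight` name are illustrative, not part of any API).

```csharp
using System;
using System.Collections.Generic;

class Demo
{
    static void Main()
    {
        // Track requests by their wrapper's GUID.
        var inFlight = new Dictionary<Guid, TrackableHttpRequest>();

        // In a real site the Request property would be set per-request.
        var tracked = new TrackableHttpRequest();
        inFlight[tracked.Id] = tracked;

        // Later, the GUID (e.g. stashed in HttpContext.Items) looks it up.
        TrackableHttpRequest found;
        if (inFlight.TryGetValue(tracked.Id, out found))
        {
            Console.WriteLine(found.Equals(tracked)); // True
        }
    }
}
```

Since `Guid.NewGuid()` collisions are effectively impossible in practice, this sidesteps the hash code collision question entirely: the hash code is only a bucket hint, and the GUID is the real identity.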
Andrew Hare