I am computing an MD5 hash, and just want to make sure the result of:
md5.ComputeHash(bytePassword);
is consistent regardless of the server?
e.g. Windows 2003/2008, 32/64-bit, etc.
MD5 hashing is [system/time/anything except the input] independent.
Yes, it's consistent: the MD5 algorithm specification defines the output regardless of platform.
The result of an MD5 hash is a number. The number returned for a given input is always the same, no matter what server or even platform you use.
However, the expression of the number may vary. For example, 1 and 1.0 are the same number, but are expressed differently. Similarly, some platforms will return the hash formatted slightly differently than others. In this case, you have a byte array, and that should be fairly safe to pass around. Just be careful what you do after converting it to a string.
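To illustrate the point, here is a small Java sketch (Java rather than C#, since the answers below reason in Java terms; `md5Hex` is a helper name chosen for this example): the 128-bit digest is the same everywhere, and the uppercase/lowercase hex forms are just two expressions of that one number.

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Demo {
    // Hex rendering is a convention layered on top of MD5, not part of it.
    static String md5Hex(String s) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5")
                .digest(s.getBytes(StandardCharsets.UTF_8));
        return String.format("%032x", new BigInteger(1, digest));
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // Same input bytes -> same 128-bit number, on any OS or architecture.
        String lower = md5Hex("password");
        String upper = lower.toUpperCase();
        System.out.println(lower); // 5f4dcc3b5aa765d61d8327deb882cf99
        System.out.println(upper); // same number, different expression
    }
}
```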
MD5 is independent of operating system and architecture. So it is "consistent".
However, MD5 takes as input an arbitrary sequence of bits, and outputs a sequence of 128 bits. In many situations, you want strings. For instance, you want to hash a password, and the password is initially a string. The conversion of that string into a sequence of bits is not part of MD5 itself, and several conventions exist. I do not know precisely about C#, but the equivalent Java method String.getBytes() will use the "platform default charset", which may vary with the operating system installation. Similarly, the output of MD5 is often converted to a string in hexadecimal notation, which may be uppercase or lowercase or whatever.
So while MD5 itself is consistent, bugs often lurk in the parts that prepare the data for MD5 and post-process its output. Beware.
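A quick Java sketch of the charset pitfall described above (the string "café" and the helper name `md5Hex` are illustrative choices, not from the original answer): the same string produces two different digests depending on how it is converted to bytes, which is exactly the kind of bug that hides outside MD5 itself.

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class CharsetPitfall {
    static String md5Hex(byte[] input) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5").digest(input);
        return String.format("%032x", new BigInteger(1, digest));
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        String password = "café";
        // "é" is one byte in ISO-8859-1 but two bytes in UTF-8, so the
        // bit sequences fed to MD5 differ, and so do the digests:
        String utf8   = md5Hex(password.getBytes(StandardCharsets.UTF_8));
        String latin1 = md5Hex(password.getBytes(StandardCharsets.ISO_8859_1));
        System.out.println(utf8.equals(latin1)); // false
    }
}
```

Always pin the charset explicitly (e.g. `getBytes(StandardCharsets.UTF_8)`) on both the hashing and the verifying side.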