views: 274

answers: 4
The title of the question says it all. I have been researching SHA-1, and in most places I see it being 40 hex characters long, which to me is 640 bit. Couldn't it be represented just as well with only 10 hex characters, 160 bit = 20 byte? And one hex character can represent 2 bytes, right? Why is it twice as long as it needs to be? What am I missing in my understanding?

And couldn't an SHA-1 hash be even just 5 or fewer characters if using Base32 or Base36?
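A minimal Python sketch (standard `hashlib`; the input string is just an arbitrary example) showing the sizes involved:

```python
import hashlib

h = hashlib.sha1(b"example input")   # arbitrary example input

print(len(h.hexdigest()))      # 40 hex characters (textual encoding)
print(len(h.digest()))         # 20 raw bytes
print(len(h.digest()) * 8)     # 160 bits
```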

+14  A: 
KennyTM
You can have 5 characters if you use base `4.3e9`
NullUserException
@NullUserException: Unfortunately, I run out of digits after about base 2.09e6.
Ben Hocking
Thank you so much for this explanation; I knew that my math must have been off. It's been a few years since I had to do some serious math.
AGrunewald
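Picking up on the comment thread above: the number of characters needed to hold 160 bits in a given base is ceil(160 / log2(base)). A rough Python sketch (the helper name is just illustrative):

```python
import math

def chars_needed(bits, base):
    """Minimum number of digits in `base` needed to represent `bits` bits."""
    return math.ceil(bits / math.log2(base))

print(chars_needed(160, 16))     # hex:    40 characters
print(chars_needed(160, 32))     # Base32: 32 characters
print(chars_needed(160, 36))     # Base36: 31 characters
print(chars_needed(160, 64))     # Base64: 27 characters (before padding)
print(chars_needed(160, 2**32))  # base ~4.3e9: 5 "characters"
```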
+3  A: 

Two hex characters make up a range from 0 to 255, i.e. 0x00 == 0 and 0xFF == 255. So two hex characters encode 8 bits (one byte), and 40 hex characters make 20 bytes, i.e. 160 bits, for your SHA-1 digest.
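To illustrate with a quick Python sketch (the 40-character string below is the well-known SHA-1 of the empty string, used only as an example):

```python
# Two hex characters cover 0x00..0xFF, i.e. exactly one 8-bit byte
print(int("00", 16), int("FF", 16))    # 0 255

digest_hex = "da39a3ee5e6b4b0d3255bfef95601890afd80709"  # SHA-1 of ""
raw = bytes.fromhex(digest_hex)
print(len(digest_hex), len(raw), len(raw) * 8)           # 40 20 160
```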

Jim Brissom
+3  A: 

There are two hex characters per 8-bit byte, not two bytes per hex character.

If you are working with 8-bit bytes (as in the SHA-1 definition), then a hex character encodes a single high or low 4-bit nibble within a byte. So it takes two such characters for a full byte.
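A small Python sketch of that nibble-by-nibble encoding (the byte value is an arbitrary example):

```python
byte = 0xAB                 # one example 8-bit byte

high = byte >> 4            # upper 4-bit nibble -> 0xA
low = byte & 0x0F           # lower 4-bit nibble -> 0xB

# Each nibble maps to exactly one hex character, so a byte needs two
print(format(high, "x"), format(low, "x"))   # a b
print(format(byte, "02x"))                   # ab
```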

Jeffrey L Whitledge
A: 

I think the OP's confusion comes from the fact that a string representing an SHA-1 hash takes 40 bytes (at least if you are using ASCII), which equals 320 bits (not 640 bits).

The reason is that the hash is binary data, and the hex string is just an encoding of it. So if you were to use a more efficient encoding (or no encoding at all), it could take up only 160 bits (20 bytes) of space, but the problem with that is that it wouldn't be binary safe.

You could use base64 though, in which case you'd need about 27-28 bytes (or characters) instead of 40 (see this page).
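For comparison, a hedged Python sketch (standard library only; the hashed input is arbitrary) showing the length of the same 20-byte digest under the encodings mentioned:

```python
import base64
import hashlib

digest = hashlib.sha1(b"example input").digest()  # 20 raw bytes

print(len(digest))                    # 20 bytes (raw, not binary-safe as text)
print(len(digest.hex()))              # 40 hex characters
print(len(base64.b32encode(digest)))  # 32 Base32 characters
print(len(base64.b64encode(digest)))  # 28 Base64 characters (incl. padding)
```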

NullUserException