views:

281

answers:

10

Hello,

Whenever I work with hexadecimal values, I convert them (if needed) into decimal in my head or with the help of a converter. I was wondering: if you work long enough with hexadecimal and get used to it, do you just 'see' the value (whatever that means), as when reading decimal? I mean, are you after a while able to read base-16 values as easily as base-10 values? When I write code with hexadecimal values, I do it because it looks sexy and badass (and sometimes because 8 bits fit into two digits, which helps), not because I feel somehow comfortable with 'em. That way I could tell whether there is any point in trying to get used to hexadecimal, so that some day I can work with it with ease.

A: 

Great Question!

I think in terms of Phinary http://en.wikipedia.org/wiki/Golden_ratio_base

Seriously though I had an architecture prof who did not need a disassembler - only a hex editor :)

Hamish Grubijan
+1  A: 

When dealing with flags, I always use hex. When dealing with codes (e.g. character codes), I always use hex. When dealing with actual numbers, I use hex or decimal, depending on what the context calls for.

Anon.
Flags are best done with a binary shift of 1, such as 1 << 47. Makes life easier when you have 64 flags to pack.
Hamish Grubijan
Personally, I use named constants with hex values. Perhaps it's because I don't store flags in 64-bit values - at that point, it becomes two separate 32-bit flags.
Anon.
Yes, 32 bits is probably easier on the eye. Right, imagine 64 lines like these: nameGHKKJH = 0x0000000000001; nameGHKASD = 0x0000000000002; ... nameGHKASD = 0x0008000000000; Now compare this with: nameGHKKJH = 1 << 0; nameGHKASD = 1 << 1; ... nameGHKASD = 1 << 39;
Hamish Grubijan
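The two flag-definition styles debated in the comments above can be sketched side by side (a minimal sketch in Python; the flag names are hypothetical, not from the thread):

```python
# Style 1: explicit hex constants -- the bit pattern is written out.
FLAG_READ  = 0x01
FLAG_WRITE = 0x02
FLAG_EXEC  = 0x04

# Style 2: shifts of 1 -- the bit *position* is stated directly.
FLAG_READ_S  = 1 << 0
FLAG_WRITE_S = 1 << 1
FLAG_EXEC_S  = 1 << 2

# Both styles produce the same values, so masks combine identically.
mask = FLAG_READ | FLAG_EXEC
print(mask == (FLAG_READ_S | FLAG_EXEC_S))  # True
print(hex(mask))                            # 0x5
```

The shift style scales more readably to high bit positions (`1 << 39` versus counting zeros in `0x0008000000000`), which is the point made above.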
+7  A: 

The way the brain works, it gets better if you do something a lot with it. Some people claim "it's like a muscle" but that's simplifying a bit too far.

So yeah, if you do hex conversions day in and out, you'll eventually be able to convert reasonably large numbers on sight, and you can do hex arithmetic without needing to convert into and out of decimal.

I once spent an evening memorizing the first 100 digits of pi.

So what?

Einstein is quoted as never memorizing any phone numbers. He said, "that's what I have a phone book for."

Same deal here. If you want to impress yourself and maybe your friends, sure, why not. But it's a mental exercise, not a truly useful skill.


In coding, unless you're working predominantly with assembler, you'll use hex numbers only rarely. It makes sense to use hex in contexts where the numbers will be bit-twiddled: e.g. you might mask a byte by AND-ing against 0xFF. Numbers associated with bitwise shifts, sign bits, OR and XOR also often make sense as hex. Numbers used for day-to-day arithmetic are usually best expressed in decimal.
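The bit-twiddling contexts named above can be sketched briefly (Python; the example value is arbitrary):

```python
value = 0x12345678

low_byte = value & 0xFF           # mask off the lowest byte
print(hex(low_byte))              # 0x78

high_byte = (value >> 24) & 0xFF  # shift down, then mask
print(hex(high_byte))             # 0x12

sign_bit = value & 0x80000000     # test the 32-bit sign bit
print(sign_bit != 0)              # False
```

Note how each mask is self-describing in hex (`0xFF` is "one byte", `0x80000000` is "the top bit of 32") in a way the decimal equivalents 255 and 2147483648 are not.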

Once you're done with parlor tricks, one of the most important goals of programming is to write code that others (and maybe some day you) can read and understand easily. Using hex in the right places is good; using it in the wrong places is obfuscation and bad.

Carl Smotricz
+1 Cool and very true answer.
BalusC
+6  A: 

You shouldn't use hex because "it looks sexy and badass (and sometimes because 8 bits fit into two digits)". Instead, use whichever number system is more appropriate.

Hexadecimal is more appropriate when you are working with binary bits (e.g. bit masks, flags, etc.), since it corresponds closely to binary, and (with some practice) you can quickly recognize each bit in a hexadecimal number.

Decimal is more appropriate when working with "normal", real-life numbers.
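The close correspondence with binary comes from each hex digit mapping to exactly one 4-bit group, which a short sketch makes visible:

```python
n = 0b1010_0011_1111   # three 4-bit groups

# Each nibble maps to one hex digit: 1010 -> a, 0011 -> 3, 1111 -> f
print(hex(n))          # 0xa3f

# Decimal offers no such alignment with the bit pattern:
print(n)               # 2623
```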

oefe
Well, yeah, I actually use hexadecimal very rarely. Thinking about it, nearly never, as I don't do many bitwise operations. When I use it, it's mainly if I need to write an address or a color, for example. I could write it in decimal, but I just think it looks nicer and more appropriate in hex. I'm not the sort of guy who does it to be cool; that sentence meant more like 'I use it, but I have no real argument why'.
Hoffa
+2  A: 

What a great question (+1 to you). Personally, I use hex to describe color, and I've been using it every day for going on 15 years, so now I can amaze my friends and colleagues by looking at a hex value and saying something like "that's fuchsia". So yes, what I'm thinking is: if you use something a certain way for a long time, your brain will eventually bridge that gap for you.

Nir Gavish
+1  A: 

I don't think you need to practice using hex numbers just for the sake of practicing. If you have an exam for which you need to know hexadecimal, practice. If you are interviewing for a job where you might need to deal with protocol fields or set bits in a register, then practice.

If you use hex numbers often enough in day to day coding, you will eventually acquire the ninja skills you are looking for. If you don't use them that often, then you probably don't need those skills. I wouldn't force it.

Ben Gartner
+1  A: 

I was working predominantly in assembler many years ago and decided that I needed to learn my basic facts in hex (multiplication and addition tables). So I wrote up a couple of charts, stuck them to my wall and started learning. My manager walked by, asked me what they were, and then told me to go out and buy a hex calculator! I never did learn those facts, but I can't help feeling it would have been a good idea to do so, in that context. And I don't think it would have been too hard, either.

Having said this, I agree with the other answers - use hex in the right context; the main aim should be to make code understandable.

Tony van der Peet
+1  A: 

It might interest you that the octal system was invented by King Charles XII of Sweden, who had every intention of forcing it on the country, but had a terrible case of death two years afterwards.

Daniel
A terrible case of death?
Michael Myers
Yep, never recovered from it. So sad. We could all have been ready for binary computers by the time they came. Heck, maybe the difference engine and analytical engine could have used binary (or octal) digits, making them vastly simpler, so that Charles Babbage could actually have finished the latter and the computer age could have begun with steam computers!
Daniel
+1  A: 

I'm sorry. The geek in me took over.

I had to make you this.

http://pap.centelia.net/

It's a little JavaScript game I knocked up to help you become badass at the ol' hexadecimal.

Yes, you can cheat, but that's not really the point!

Paul Alan Taylor
+1  A: 

If you use hexadecimal for long enough, you eventually get used to it, yes. Just like you got used to decimal when you were a kid. Just don't complain when you, for example, flunk a math course because "pi is not equal to 3.243f6a88..." or something of that sort.
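The hex expansion of pi quoted above can be reproduced with a short sketch (digit-by-digit extraction of the fractional part; 8 hex digits is 32 bits, comfortably within double precision):

```python
import math

x = math.pi - 3            # fractional part of pi
digits = []
for _ in range(8):
    x *= 16                # shift one hex digit into the integer part
    d = int(x)
    digits.append("0123456789abcdef"[d])
    x -= d                 # keep only the remaining fraction

print("3." + "".join(digits))  # 3.243f6a88
```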

Also, @oefe: decimal is not more appropriate when working with "normal", real-life numbers; it is simply more popular.

David X