This builds on an already-asked question. Here, though, I'm given a hexadecimal input with a maximum value of 0xFFFF, and I need it converted to binary, so I'd end up with at most 16 bits.
I was wondering if this would be quite simple using 'bitset'. Any ideas?
EDIT:
After getting answers, here's an improved version of the code: http://pastebin.com/f7a6f0a69