I may be missing something in the standard libraries, but I don't think so. Here's my current implementation:
int char2hex(unsigned char c) {
    switch (c) {
        case '0' ... '9':
            return c - '0';
        case 'a' ... 'f':
            return c - 'a' + 10;
        case 'A' ... 'F':
            return c - 'A' + 10;
        default:
            WARNING(@"passed non-hexdigit (%c) to char2hex()", c);
            return 0xFF;
    }
}
- (NSData *)decodeHexString {
    ASSERT([self length] % 2 == 0, @"Attempted to decode an odd-length hex string.");
    NSData *hexData = [self dataUsingEncoding:NSUTF8StringEncoding];
    NSMutableData *resultData = [NSMutableData dataWithLength:[hexData length] / 2];
    const unsigned char *hexBytes = [hexData bytes];
    unsigned char *resultBytes = [resultData mutableBytes];
    for (NSUInteger i = 0; i < [hexData length] / 2; i++) {
        resultBytes[i] = (char2hex(hexBytes[i + i]) << 4) | char2hex(hexBytes[i + i + 1]);
    }
    return resultData;
}
decodeHexString is a category addition on NSString.
What I'm wondering is whether it's worth supporting odd-length hex strings. And if so, how should I?
P.S. Ignore my debugging macros. I'm aware that the case-range syntax in the switch statement is a GCC extension and may not compile everywhere. And yes, the code does work as posted.