What is the best way to convert a string to hex and vice versa in C++?
Example:
A string like "Hello World" to hex format: 48656C6C6F20576F726C64, and from hex 48656C6C6F20576F726C64 back to the string "Hello World".
#include <cstdlib>
#include <iomanip>
#include <sstream>
#include <string>
using namespace std;

string ToHex(const string& s, bool upper_case /* = true */)
{
    ostringstream ret;
    // Cast through unsigned char so bytes >= 0x80 are not sign-extended into negative ints.
    for (string::size_type i = 0; i < s.length(); ++i)
        ret << std::hex << std::setfill('0') << std::setw(2) << (upper_case ? std::uppercase : std::nouppercase) << static_cast<int>(static_cast<unsigned char>(s[i]));
    return ret.str();
}

// Parses an entire hex string as one number; note this is not the inverse of ToHex above.
int FromHex(const string& s) { return strtoul(s.c_str(), NULL, 16); }
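For example, a quick round trip could look like this (the main below is my own sketch, assuming the two functions above are in scope):

#include <iostream>
int main()
{
    std::cout << ToHex("Hello World", true) << std::endl; // 48656C6C6F20576F726C64
    std::cout << FromHex("48") << std::endl;              // 72, i.e. 'H': FromHex parses one number, not a whole string
    return 0;
}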
Simplest example using the Standard Library:
#include <iostream>
using namespace std;
int main()
{
    char c = 'n';
    cout << "HEX " << hex << (int)c << endl; // output in hexadecimal
    cout << "ASC " << c << endl;             // output in ASCII
    return 0;
}
To check the output, codepad returns: 6e
and an online ASCII-to-hexadecimal conversion tool yields 6e as well. So it works.
You can also do this:
#include <iomanip>
#include <sstream>
#include <string>

// Formats a single value (e.g. a char cast to int) as hex, optionally zero-padded to 'width'.
template<class T> std::string toHexString(const T& value, int width) {
    std::ostringstream oss;
    oss << std::hex;
    if (width > 0) {
        oss << std::setw(width) << std::setfill('0');
    }
    oss << value;
    return oss.str();
}
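To hex-encode a whole string with it, call it once per byte; this loop is my own sketch on top of the template above:

#include <iostream>
int main()
{
    std::string s = "Hello World", out;
    for (unsigned char ch : s)                    // unsigned char avoids sign-extension
        out += toHexString(static_cast<int>(ch), 2);
    std::cout << out << std::endl;                // 48656c6c6f20576f726c64 (lowercase)
    return 0;
}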
A string like "Hello World" to hex format: 48656C6C6F20576F726C64.
Ah, here you go:
#include <string>
std::string string_to_hex(const std::string& input)
{
    static const char* const lut = "0123456789ABCDEF";
    size_t len = input.length();

    std::string output;
    output.reserve(2 * len);
    for (size_t i = 0; i < len; ++i)
    {
        // Read the byte as unsigned so values >= 0x80 index the table correctly.
        const unsigned char c = input[i];
        output.push_back(lut[c >> 4]);
        output.push_back(lut[c & 15]);
    }
    return output;
}
#include <algorithm>
#include <stdexcept>

// Expects uppercase hex digits (to match the LUT); throws on odd length or non-hex input.
std::string hex_to_string(const std::string& input)
{
    static const char* const lut = "0123456789ABCDEF";
    size_t len = input.length();
    if (len & 1) throw std::invalid_argument("odd length");

    std::string output;
    output.reserve(len / 2);
    for (size_t i = 0; i < len; i += 2)
    {
        char a = input[i];
        const char* p = std::lower_bound(lut, lut + 16, a);
        if (*p != a) throw std::invalid_argument("not a hex digit");

        char b = input[i + 1];
        const char* q = std::lower_bound(lut, lut + 16, b);
        if (*q != b) throw std::invalid_argument("not a hex digit");

        output.push_back(((p - lut) << 4) | (q - lut));
    }
    return output;
}
(This assumes that a char has 8 bits, so it's not very portable, but you can take it from here.)
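For illustration, a round trip using the two functions above (this main is my own addition):

#include <iostream>
int main()
{
    const std::string msg = "Hello World";
    const std::string hex = string_to_hex(msg);
    std::cout << hex << std::endl;                 // 48656C6C6F20576F726C64
    std::cout << hex_to_string(hex) << std::endl;  // Hello World
    return 0;
}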