I'm trying to convert an integer to a string, and I've run into a problem.
I've got the code written and working for the most part, but it has a small flaw when carrying to the next place. It's easier to show than to describe, so here's an example using base 26 with a character set consisting of the lowercase alphabet:
0 = "a"
1 = "b"
2 = "c"
...
25 = "z"
26 = "ba" (This should equal "aa")
It seems to skip the character at the zero place in the character set in certain situations.
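To make the expected mapping concrete, here it is written out as test data; the entries past 26 are just me extrapolating the same pattern, and the container is only for illustration:

#include <string>
#include <utility>
#include <vector>

// Input/expected pairs for the mapping I'm after.
// Everything past 26 is extrapolated from the pattern above.
const std::vector<std::pair<unsigned long long, std::string>> expected = {
    {0,  "a"},
    {1,  "b"},
    {25, "z"},
    {26, "aa"},  // my code currently produces "ba" here
    {27, "ab"},  // extrapolated
};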
What's confusing me is that I can't see anything wrong with my code. I've been working on this for too long now, and I still can't figure it out.
#include <cstring>
#include <string>

const char* charset = "abcdefghijklmnopqrstuvwxyz";
int charsetLength = std::strlen(charset);
unsigned long long num = 5678; // Some random number, it doesn't matter

std::string key;
do
{
    // Peel off the least significant "digit" and prepend its character
    unsigned int remainder = (num % charsetLength);
    num /= charsetLength;
    key.insert(key.begin(), charset[remainder]);
} while (num);
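To make the symptom concrete, here's a quick reproduction; the encode() function is just the snippet above wrapped up so I can loop over a few values (the name is only for this test):

#include <cstring>
#include <iostream>
#include <string>

// The conversion from above, wrapped in a function purely for this test.
std::string encode(unsigned long long num)
{
    const char* charset = "abcdefghijklmnopqrstuvwxyz";
    const unsigned int charsetLength = std::strlen(charset);
    std::string key;
    do
    {
        unsigned int remainder = num % charsetLength;
        num /= charsetLength;
        key.insert(key.begin(), charset[remainder]);
    } while (num);
    return key;
}

int main()
{
    for (unsigned long long n = 24; n <= 28; ++n)
        std::cout << n << " -> " << encode(n) << "\n";
    // Prints: 24 -> y, 25 -> z, 26 -> ba, 27 -> bb, 28 -> bc
    // I was expecting 26 -> aa after the carry.
}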
I have a feeling the code is tripping up when the modulo returns zero, but I've been at this so long that I can't see how it's happening. Any suggestions are welcome.
EDIT: The fact that the generated string is little endian is irrelevant for my application.