I'm trying to use an integer as the numerical representation of a string, for example, storing "ABCD" as 0x41424344. However, when it comes to output, I've got to convert the integer back into 4 ASCII characters. Right now, I'm using bit shifts and masking, as follows:
int value = 0x41424344;
string s = new string(new char[] {
    (char)(value >> 24 & 0xFF),   // 'A' (masked so a set sign bit can't leak in)
    (char)(value >> 16 & 0xFF),   // 'B'
    (char)(value >> 8 & 0xFF),    // 'C'
    (char)(value & 0xFF)          // 'D'
});
Is there a cleaner way to do this? I've tried various direct casts, but the compiler, as expected, rejected them.
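For instance (a minimal illustration of the kind of cast that fails, since C# defines no conversion between int and string):

    int value = 0x41424344;
    // string s = (string)value;   // compile-time error: cannot convert type 'int' to 'string'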