A friend of mine showed me a situation where reading characters produced unexpected behaviour: reading the character '¤' caused his program to crash. I worked out that '¤' is 164 decimal, which is outside the 7-bit ASCII range.
We noticed the behaviour with '¤', but any character above 127 seems to trigger the problem. The question is: how can we reliably read such characters char by char?
#include <iostream>
#include <iomanip>

using namespace std;

int main(int argc, const char *argv[])
{
    char input;
    // Read one character at a time until the stream fails (e.g. on EOF).
    while (cin >> input)
    {
        cout << input;
        // The cast to int sign-extends when input is negative,
        // which is where the 0xffffff... values below come from.
        cout << " " << setbase(10) << (int)input;
        cout << " 0x" << setbase(16) << (int)input;
        cout << endl;
    }
    return 0;
}
masse@libre:temp/2009-11-30 $ ./a.out
¤
 -62 0xffffffc2
¤ -92 0xffffffa4
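
Here is a minimal sketch of the workaround we are considering (assuming the only issue is that plain char is signed on this platform and the terminal sends '¤' as the UTF-8 bytes 0xC2 0xA4): casting each byte through unsigned char before widening it keeps the value in the 0..255 range instead of sign-extending it.

#include <iostream>
#include <iomanip>

using namespace std;

int main()
{
    char input;
    while (cin >> input)
    {
        // Going through unsigned char keeps the byte value in 0..255,
        // so '¤' prints as 0xc2 0xa4 rather than 0xffffffc2 0xffffffa4.
        unsigned int byte = static_cast<unsigned char>(input);
        cout << input << " " << setbase(10) << byte
             << " 0x" << setbase(16) << byte << endl;
    }
    return 0;
}

Is casting through unsigned char (or reading into an unsigned char in the first place) the right way to handle this, or is there a cleaner approach?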